WorldWideScience

Sample records for exceptional error minimization

  1. Approximate error conjugation gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
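The record above describes the idea only at the claim level. As a rough illustration, the sketch below (Python, with assumed names and a toy tomography-style least-squares objective in which each row of the system matrix plays the role of one ray) approximates the gradient from a random subset of rays, takes a conjugate-gradient-style step with an exact line search on that subset, and projects onto a nonnegativity constraint. It is not the patented method, only a minimal stand-in for the approximate-error idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy tomography-style problem: each row of A plays the role of one "ray",
# b holds the corresponding measurements, and x is constrained to be nonnegative.
n_rays, n_vox = 2000, 100
A = rng.random((n_rays, n_vox))
x_true = np.abs(rng.normal(size=n_vox))
b = A @ x_true + 0.01 * rng.normal(size=n_rays)

def approx_grad(x, rows):
    """Gradient of ||A x - b||^2 estimated from a subset of rays, rescaled to full size."""
    r = A[rows] @ x - b[rows]
    return 2.0 * (n_rays / len(rows)) * (A[rows].T @ r)

x = np.zeros(n_vox)
d = np.zeros(n_vox)
g_prev = None
for _ in range(200):
    rows = rng.choice(n_rays, size=200, replace=False)       # subset of rays for this step
    g = approx_grad(x, rows)
    beta = 0.0 if g_prev is None else (g @ g) / (g_prev @ g_prev)   # Fletcher-Reeves update
    d = -g + beta * d
    Ad = A[rows] @ d
    alpha = -(g @ d) / (2.0 * (n_rays / len(rows)) * (Ad @ Ad) + 1e-12)  # exact step on the subset
    x = np.clip(x + alpha * d, 0.0, None)                    # project onto the nonnegativity constraint
    g_prev = g

print("relative reconstruction error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The subset size trades per-iteration cost against the fidelity of the approximate error, which is the trade-off the claimed embodiments target.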

  2. Error minimizing algorithms for nearest neighbor classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory]; Hush, Don [Los Alamos National Laboratory]; Zimmer, G. Beate [Texas A&M]

    2011-01-03

    Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical error based loss functions. We use the framework to investigate a new cost-sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.

  3. Minimizing Experimental Error in Thinning Research

    Science.gov (United States)

    C. B. Briscoe

    1964-01-01

    Many diverse approaches have been made to prescribing and evaluating thinnings on an objective basis. None of the techniques proposed has been widely accepted. Indeed, none has been proven superior to the others nor even widely applicable. There are at least two possible reasons for this: none of the techniques suggested is of any general utility and/or experimental error...

  4. Minimizing Symbol Error Rate for Cognitive Relaying with Opportunistic Access

    KAUST Repository

    Zafar, Ammar

    2012-12-29

    In this paper, we present an optimal resource allocation scheme (ORA) for an all-participate (AP) cognitive relay network that minimizes the symbol error rate (SER). The SER is derived under different system constraints: both individual and global power constraints, individual constraints only, and global constraints only. Numerical results show that the ORA scheme outperforms the schemes with direct link only and uniform power allocation (UPA) in terms of minimizing the SER for all three constraint cases. Numerical results also show that the individual-constraints-only case provides the best performance at large signal-to-noise ratio (SNR).
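The abstract does not reproduce the SER expression, so the sketch below uses a textbook surrogate instead: the harmonic-mean approximation of the end-to-end SNR of a single amplify-and-forward relay, maximal-ratio combined with the direct link, and the BPSK symbol error probability Q(sqrt(2·SNR)). The channel gains, noise level, and total power are assumed illustrative values; the point is only to show optimal resource allocation (ORA) improving on uniform power allocation (UPA) under a global power constraint, not the paper's multi-relay derivation.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import erfc

def q_func(x):
    return 0.5 * erfc(x / np.sqrt(2))

# Illustrative channel gains (source-destination, source-relay, relay-destination).
g_sd, g_sr, g_rd = 0.3, 1.0, 1.2
N0, P_total = 1.0, 10.0

def ser_bpsk(p_s):
    """Surrogate SER: direct link plus AF relayed link combined with MRC,
    using the standard harmonic-mean end-to-end SNR approximation."""
    p_r = P_total - p_s
    g1, g2 = p_s * g_sr / N0, p_r * g_rd / N0
    snr_relay = g1 * g2 / (g1 + g2 + 1.0)
    snr_total = p_s * g_sd / N0 + snr_relay
    return q_func(np.sqrt(2.0 * snr_total))

res = minimize_scalar(ser_bpsk, bounds=(1e-3, P_total - 1e-3), method='bounded')
print(f"optimal source power : {res.x:.2f} of {P_total}, SER = {res.fun:.3e}")
print(f"uniform allocation   : {P_total/2:.2f} of {P_total}, SER = {ser_bpsk(P_total/2):.3e}")
```

Because the uniform split is a feasible point of the same constrained problem, the optimized allocation can never do worse, which is the qualitative behavior reported above.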

  5. The contour method cutting assumption: error minimization and correction

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory]; Kastengren, Alan L [ANL]

    2010-01-01

    The recently developed contour method can measure a 2-D, cross-sectional residual-stress map. A part is cut in two using a precise and low-stress cutting technique such as electric discharge machining. The contours of the new surfaces created by the cut, which will not be flat if residual stresses are relaxed by the cutting, are then measured and used to calculate the original residual stresses. The precise nature of the assumption about the cut is presented theoretically and is evaluated experimentally. Simply assuming a flat cut is overly restrictive and misleading. The critical assumption is that the width of the cut, when measured in the original, undeformed configuration of the body, is constant. Stresses at the cut tip during cutting cause the material to deform, which causes errors. The effect of such cutting errors on the measured stresses is presented and the important parameters are quantified. Experimental procedures for minimizing these errors are presented, and an iterative finite element procedure to correct for the errors is also presented. The correction procedure is demonstrated on experimental data from a steel beam that was plastically bent to introduce a known profile of residual stresses.

  6. Analytical minimization of synchronicity errors in stochastic identification

    Science.gov (United States)

    Bernal, D.

    2018-01-01

    An approach to minimizing error due to synchronicity faults in stochastic system identification is presented. The scheme is based on shifting the time domain signals so that the phases of the fundamental eigenvector estimated from the spectral density are zero. A threshold on the mean of the amplitude-weighted absolute value of these phases, above which signal shifting is deemed justified, is derived and found to be proportional to the first-mode damping ratio. It is shown that synchronicity faults do not map precisely to phasor multiplications in subspace identification and that the accuracy of spectral-density-estimated eigenvectors, for inputs with arbitrary spectral density, decreases with increasing mode number. The choice of a corrective strategy based on signal alignment, rather than eigenvector adjustment using phasors, follows from these observations. Simulations that include noise and non-classical damping suggest that the scheme can provide sufficient accuracy to be of practical value.
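A minimal sketch of the alignment step, with assumed illustrative values: two sensors record the same first-mode response, one with a synchronization fault; the phase of the cross-spectrum at the fundamental frequency gives the time offset, and a frequency-domain shift realigns the signal. The paper's threshold test and the subspace-identification analysis are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(8)

fs, T = 256.0, 60.0                         # sampling rate [Hz], record length [s]
t = np.arange(0, T, 1 / fs)
f1 = 1.7                                    # fundamental modal frequency [Hz]

# Two sensors seeing the same (noisy) first-mode response; sensor 2 has a sync fault.
delay = 0.040                               # 40 ms acquisition offset
x1 = np.sin(2 * np.pi * f1 * t) + 0.2 * rng.normal(size=t.size)
x2 = 0.8 * np.sin(2 * np.pi * f1 * (t - delay)) + 0.2 * rng.normal(size=t.size)

# Phase of the cross-spectrum at the fundamental frequency.
X1, X2 = np.fft.rfft(x1), np.fft.rfft(x2)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
k = np.argmin(np.abs(freqs - f1))
phase = np.angle(X2[k] * np.conj(X1[k]))    # phase lag of sensor 2 relative to sensor 1
est_delay = -phase / (2 * np.pi * freqs[k])

# Remove the fault by shifting sensor 2 in the frequency domain (fractional-sample shift).
x2_aligned = np.fft.irfft(X2 * np.exp(2j * np.pi * freqs * est_delay), n=t.size)
residual = np.angle(np.fft.rfft(x2_aligned)[k] * np.conj(X1[k]))

print(f"true delay {delay*1e3:.1f} ms, estimated {est_delay*1e3:.1f} ms, "
      f"residual phase after alignment {residual:.4f} rad")
```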

  7. Minimizing the IOL power error induced by keratometric power.

    Science.gov (United States)

    Camps, Vicente J; Piñero, David P; de Fez, Dolores; Mateo, Verónica

    2013-07-01

    To evaluate theoretically in normal eyes the influence on IOL power (PIOL) calculation of the use of a keratometric index (nk), and to analyze and preliminarily validate the use of an adjusted keratometric index (nkadj) in the IOL power calculation (PIOLadj). A model of variable keratometric index (nkadj) for corneal power calculation (Pc) was used for IOL power calculation (named PIOLadj). Theoretical differences (ΔPIOL) between the new proposed formula (PIOLadj) and the formula obtained through Gaussian optics were determined using the Gullstrand and Le Grand eye models. The proposed new formula for IOL power calculation (PIOLadj) was prevalidated clinically in 81 eyes of 81 candidates for corneal refractive surgery and compared with the Haigis, Hoffer Q, Holladay, and SRK/T formulas. A theoretical PIOL underestimation greater than 0.5 diopters was present in most cases when nk = 1.3375 was used. If nkadj was used for Pc calculation, a maximal calculated error in ΔPIOL of ±0.5 diopters at the corneal vertex was observed in most cases, independently of the eye model, r1c, and the desired postoperative refraction. The use of nkadj in IOL power calculation (PIOLadj) could be valid with effective lens position optimization not dependent on the corneal power. The use of a single value of nk for Pc calculation can lead to significant errors in PIOL calculation that may explain some IOL power overestimations with conventional formulas. These inaccuracies can be minimized by using the new PIOLadj based on the nkadj algorithm.
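As a worked illustration of why a single keratometric index overestimates corneal power, the short calculation below uses assumed Gullstrand-style corneal parameters (anterior radius 7.7 mm, posterior radius 6.8 mm, thickness 0.5 mm, refractive indices 1.376 and 1.336). The roughly 0.8 D overestimation relative to Gaussian thick-lens optics is of the same order as the IOL power underestimation described above; these parameter values are illustrative, not the paper's exact data.

```python
# Corneal power from a single keratometric index vs. Gaussian (thick-lens) optics.
# Assumed Gullstrand-style parameters for illustration only.
n_air, n_cornea, n_aqueous = 1.000, 1.376, 1.336
r1, r2, d = 7.7e-3, 6.8e-3, 0.5e-3        # anterior/posterior radii and thickness [m]
nk = 1.3375                               # conventional keratometric index

P1 = (n_cornea - n_air) / r1              # anterior surface power [D]
P2 = (n_aqueous - n_cornea) / r2          # posterior surface power [D]
P_gauss = P1 + P2 - (d / n_cornea) * P1 * P2
P_keratometric = (nk - 1.0) / r1

print(f"Gaussian corneal power : {P_gauss:.2f} D")
print(f"Keratometric power     : {P_keratometric:.2f} D")
print(f"Overestimation         : {P_keratometric - P_gauss:.2f} D")
# Overestimating corneal power by ~0.8 D propagates into an IOL power
# underestimation of a similar order in thin-lens IOL formulas.
```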

  8. Some mathematical refinements concerning error minimization in the genetic code

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.; Kelk, S.M.; Koolen, W.M.; Stougie, L.

    2011-01-01

    The genetic code is known to have a high level of error robustness and has been shown to be very error robust compared to randomly selected codes, but to be significantly less error robust than a certain code found by a heuristic algorithm. We formulate this optimization problem as a Quadratic...

  9. Perceptual learning of degraded speech by minimizing prediction error.

    Science.gov (United States)

    Sohoglu, Ediz; Davis, Matthew H

    2016-03-22

    Human perception is shaped by past experience on multiple timescales. Sudden and dramatic changes in perception occur when prior knowledge or expectations match stimulus content. These immediate effects contrast with the longer-term, more gradual improvements that are characteristic of perceptual learning. Despite extensive investigation of these two experience-dependent phenomena, there is considerable debate about whether they result from common or dissociable neural mechanisms. Here we test single- and dual-mechanism accounts of experience-dependent changes in perception using concurrent magnetoencephalographic and EEG recordings of neural responses evoked by degraded speech. When speech clarity was enhanced by prior knowledge obtained from matching text, we observed reduced neural activity in a peri-auditory region of the superior temporal gyrus (STG). Critically, longer-term improvements in the accuracy of speech recognition following perceptual learning resulted in reduced activity in a nearly identical STG region. Moreover, short-term neural changes caused by prior knowledge and longer-term neural changes arising from perceptual learning were correlated across subjects with the magnitude of learning-induced changes in recognition accuracy. These experience-dependent effects on neural processing could be dissociated from the neural effect of hearing physically clearer speech, which similarly enhanced perception but increased rather than decreased STG responses. Hence, the observed neural effects of prior knowledge and perceptual learning cannot be attributed to epiphenomenal changes in listening effort that accompany enhanced perception. Instead, our results support a predictive coding account of speech perception; computational simulations show how a single mechanism, minimization of prediction error, can drive immediate perceptual effects of prior knowledge and longer-term perceptual learning of degraded speech.

  10. From Errors Treatment to Exceptions Treatment Regarding the Execution Control over Visual Basic Programs

    Directory of Open Access Journals (Sweden)

    Cristina Raluca POPESCU

    2008-01-01

    In order to comply with quality standards and best practices, the execution of professional programs must be rigorously controlled so as to avoid unpredictable situations that might generate anomalies and could lead to computer blockage, forced termination of execution, and data loss. In traditional programming languages, including Visual Basic 6, the concept of error is highly developed. Any situation in which the program fails to execute correctly is considered an error, regardless of whether the anomaly has a software or a hardware cause. Modern platforms, including VB.NET, have introduced a new concept: the exception. Unfortunately, perhaps by mistake, the exception is taken by many IT specialists to mean an exceptional (extraordinary) or rare situation. We agree with the opinion of those IT specialists who assert that an error depends strictly on the programmer, who fails to construct the application's structures correctly, whereas an exception is a situation that does not fit the usual execution intended by the programmer or user, without implying that it occurs more or less often. Designing robust programs implies that they should not terminate abnormally or block, not even upon receiving improper parameters. Two aspects are involved: the behavior regarding low-level errors (caused by the operating system, memory allocation, reading/writing files, or hardware malfunctions) and the reaction to user errors, such as incorrect input data or incorrect sequencing of operations. Regardless of the platform used to design the programs, and regardless of the controversy between specialists, in order for execution to terminate under the program's control, the commands that might generate anomalies and interruptions should be strictly monitored. Implicitly, the execution control...

  11. Linearly convergent stochastic heavy ball method for minimizing generalization error

    KAUST Repository

    Loizou, Nicolas

    2017-10-30

    In this work we establish the first linear convergence result for the stochastic heavy ball method. The method performs SGD steps with a fixed stepsize, amended by a heavy ball momentum term. In the analysis, we focus on minimizing the expected loss and not on finite-sum minimization, which is typically a much harder problem. While in the analysis we constrain ourselves to quadratic loss, the overall objective is not necessarily strongly convex.
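A minimal sketch of the method analyzed above, under assumed toy settings: SGD with a fixed stepsize plus a heavy ball momentum term, applied to a quadratic (least-squares) expected loss defined by a consistent linear system. The stepsize and momentum constants here are illustrative choices, not the constants from the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 5000, 20
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star                           # consistent system -> quadratic expected loss

def sgd_heavy_ball(steps=5000, stepsize=0.01, momentum=0.5):
    """SGD with a fixed stepsize, amended by a heavy ball momentum term."""
    x = np.zeros(d)
    x_prev = x.copy()
    for _ in range(steps):
        i = rng.integers(n)              # one random data point per step
        g = (A[i] @ x - b[i]) * A[i]     # stochastic gradient of 0.5 * (a_i^T x - b_i)^2
        x, x_prev = x - stepsize * g + momentum * (x - x_prev), x
    return x

x_hat = sgd_heavy_ball()
print("distance to minimizer:", np.linalg.norm(x_hat - x_star))
```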

  12. Stable 1-Norm Error Minimization Based Linear Predictors for Speech Modeling

    DEFF Research Database (Denmark)

    Giacobello, Daniele; Christensen, Mads Græsbøll; Jensen, Tobias Lindstrøm

    2014-01-01

    In linear prediction of speech, the 1-norm error minimization criterion has been shown to provide a valid alternative to the 2-norm minimization criterion. However, unlike 2-norm minimization, 1-norm minimization does not guarantee the stability of the corresponding all-pole filter and can generate...... of the shift operator associated with the particular prediction problem considered. The second method uses the alternative Cauchy bound to impose a convex constraint on the predictor in the 1-norm error minimization. These methods are compared with two existing methods: the Burg method, based on the 1-norm...... minimization of the forward and backward prediction error, and the iteratively reweighted 2-norm minimization known to converge to the 1-norm minimization with an appropriate selection of weights. The evaluation gives proof of the effectiveness of the new methods, performing as well as unconstrained 1-norm...
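The abstract mentions iteratively reweighted 2-norm minimization as a route to the 1-norm solution; the sketch below implements that baseline on a synthetic AR signal with sparse, impulse-like excitation (all values assumed for illustration). The stability-constrained predictors proposed in the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "speech-like" signal: an AR(2) process driven by sparse, spiky excitation.
N, order = 400, 2
e = rng.normal(size=N) * 0.1
e[::40] += 3.0                                    # sparse, impulse-like excitation
s = np.zeros(N)
for n in range(2, N):
    s[n] = 1.6 * s[n - 1] - 0.7 * s[n - 2] + e[n]

# Linear-prediction regression: s[n] ~ sum_k a_k * s[n-k].
X = np.column_stack([s[order - k - 1:N - k - 1] for k in range(order)])
y = s[order:]

def lp_1norm_irls(X, y, iters=50, eps=1e-6):
    """1-norm error minimization via iteratively reweighted least squares."""
    a = np.linalg.lstsq(X, y, rcond=None)[0]      # 2-norm solution as the starting point
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - X @ a), eps)
        Xw = X * w[:, None]
        a = np.linalg.solve(X.T @ Xw, Xw.T @ y)   # weighted normal equations
    return a

print("2-norm LP coefficients:", np.linalg.lstsq(X, y, rcond=None)[0])
print("1-norm LP coefficients:", lp_1norm_irls(X, y))
```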

  13. A strategy for minimizing common mode human error in executing critical functions and tasks

    Energy Technology Data Exchange (ETDEWEB)

    Beltracchi, L. (Nuclear Regulatory Commission, Washington, DC (United States)); Lindsay, R.W. (Argonne National Lab., IL (United States))

    1992-01-01

    Human error in the execution of critical functions and tasks can be costly. The Three Mile Island and Chernobyl accidents are examples of the consequences of human error in the nuclear industry, and similar errors could no doubt be cited from other industries. This paper discusses a strategy to minimize common mode human error in the execution of critical functions and tasks. The strategy consists of the use of human redundancy, together with diversity in human cognitive behavior: skill-, rule-, and knowledge-based behavior. The authors contend that the use of diversity in human cognitive behavior is possible and that it minimizes common mode error.

  14. A strategy for minimizing common mode human error in executing critical functions and tasks

    Energy Technology Data Exchange (ETDEWEB)

    Beltracchi, L. [Nuclear Regulatory Commission, Washington, DC (United States); Lindsay, R.W. [Argonne National Lab., IL (United States)

    1992-05-01

    Human error in the execution of critical functions and tasks can be costly. The Three Mile Island and Chernobyl accidents are examples of the consequences of human error in the nuclear industry, and similar errors could no doubt be cited from other industries. This paper discusses a strategy to minimize common mode human error in the execution of critical functions and tasks. The strategy consists of the use of human redundancy, together with diversity in human cognitive behavior: skill-, rule-, and knowledge-based behavior. The authors contend that the use of diversity in human cognitive behavior is possible and that it minimizes common mode error.

  15. Nonlinear control of ships minimizing the position tracking errors

    Directory of Open Access Journals (Sweden)

    Svein P. Berge

    1999-07-01

    In this paper, a nonlinear tracking controller with integral action for ships is presented. The controller is based on state feedback linearization. Exponential convergence of the vessel-fixed position and velocity errors is proven by using Lyapunov stability theory. Since we only have two control devices, a rudder and a propeller, we choose to control the longship and sideship position errors to zero while the heading is stabilized indirectly. A Virtual Reference Point (VRP) is defined at the bow or ahead of the ship and is used for tracking control. It is shown that the distance from the center of rotation to the VRP influences the stability of the zero dynamics. By selecting the VRP at the bow or even ahead of the bow, the damping in yaw can be increased and the zero dynamics is stabilized. Hence, the heading angle will be less sensitive to wind, currents and waves. The control law is simulated using a nonlinear model of the Japanese training ship Shiojimaru with excellent results. Wind forces are added to demonstrate the robustness and performance of the integral controller.

  16. A novel approach to minimize error in the medical domain: cognitive neuroscientific insights into training.

    Science.gov (United States)

    Dror, Itiel

    2011-01-01

    Medical errors are an inevitable outcome of the human cognitive system working within the environment and demands of practicing medicine. Training can play a pivotal role in minimizing error, but the prevailing training is less effective because it focuses directly on error reduction. Based on an understanding of cognitive architecture and how the brain processes information, a new approach is suggested: focusing training on error recovery. This entails specific training in error detection and error mitigation. Such training will not only enable better responses when errors occur, but is also a more effective way to achieve error reduction. The suggested design for error recovery training begins with detecting errors in others, starting with highly visible and even exaggerated errors, advancing to more challenging detections, and finally requiring learners to detect errors within themselves rather than in others. The error mitigation training starts by providing the learners with the correct remedial actions (after they have detected the error). With training, the learners are required to select the appropriate actions among multiple-choice alternatives, and eventually are required to generate the appropriate remedial responses themselves. These exercises can be used for instruction as well as for assessment purposes. Time pressure, distractions, competition and other elements are included so as to make the training more challenging and interactive.

  17. Homodyne laser interferometer involving minimal quadrature phase error to obtain subnanometer nonlinearity.

    Science.gov (United States)

    Cui, Junning; He, Zhangqiang; Jiu, Yuanwei; Tan, Jiubin; Sun, Tao

    2016-09-01

    The demand for minimal cyclic nonlinearity error in laser interferometry is increasing as a result of advanced scientific research projects. Research shows that the quadrature phase error is the main effect that introduces cyclic nonlinearity error, and polarization-mixing cross talk during beam splitting is the main error source that causes the quadrature phase error. In this paper, a new homodyne quadrature laser interferometer configuration based on nonpolarization beam splitting and balanced interference between two circularly polarized laser beams is proposed. Theoretical modeling indicates that the polarization-mixing cross talk is elaborately avoided through nonpolarizing and Wollaston beam splitting, with a minimum number of quadrature phase error sources involved. Experimental results show that the cyclic nonlinearity error of the interferometer is up to 0.6 nm (peak-to-valley value) without any correction and can be further suppressed to 0.2 nm with a simple gain and offset correction method.
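A small sketch of the "simple gain and offset correction" mentioned at the end of the abstract, with assumed illustrative imperfections: quadrature signals with unequal gains and DC offsets trace an off-center ellipse, and normalizing them from their extrema before arctangent demodulation removes most of the resulting cyclic nonlinearity.

```python
import numpy as np

# True optical phase over several fringes.
phi = np.linspace(0, 6 * np.pi, 4000)

# Imperfect quadrature signals: unequal gains and DC offsets (illustrative values).
Ax, Ay, x0, y0 = 1.00, 0.92, 0.05, -0.03
x = Ax * np.cos(phi) + x0
y = Ay * np.sin(phi) + y0

def nonlinearity_nm(x, y, wavelength=633e-9):
    """Peak-to-valley cyclic nonlinearity (in nm of displacement) after atan2 demodulation."""
    est = np.unwrap(np.arctan2(y, x))
    err = est - phi
    err -= err.mean()                               # remove the constant phase offset
    return (wavelength / (4 * np.pi)) * (err.max() - err.min())

# Gain/offset correction estimated from the signal extrema.
xc = (x - (x.max() + x.min()) / 2) / ((x.max() - x.min()) / 2)
yc = (y - (y.max() + y.min()) / 2) / ((y.max() - y.min()) / 2)

print(f"nonlinearity, uncorrected: {nonlinearity_nm(x, y)*1e9:.2f} nm (P-V)")
print(f"nonlinearity, corrected  : {nonlinearity_nm(xc, yc)*1e9:.2f} nm (P-V)")
```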

  18. Does the sensorimotor system minimize prediction error or select the most likely prediction during object lifting?

    Science.gov (United States)

    Cashaback, Joshua G A; McGregor, Heather R; Pun, Henry C H; Buckingham, Gavin; Gribble, Paul L

    2017-01-01

    The human sensorimotor system is routinely capable of making accurate predictions about an object's weight, which allows for energetically efficient lifts and prevents objects from being dropped. Often, however, poor predictions arise when the weight of an object can vary and sensory cues about object weight are sparse (e.g., picking up an opaque water bottle). The question arises, what strategies does the sensorimotor system use to make weight predictions when one is dealing with an object whose weight may vary? For example, does the sensorimotor system use a strategy that minimizes prediction error (minimal squared error) or one that selects the weight that is most likely to be correct (maximum a posteriori)? In this study we dissociated the predictions of these two strategies by having participants lift an object whose weight varied according to a skewed probability distribution. We found, using a small range of weight uncertainty, that four indexes of sensorimotor prediction (grip force rate, grip force, load force rate, and load force) were consistent with a feedforward strategy that minimizes the square of prediction errors. These findings match research in the visuomotor system, suggesting parallels in underlying processes. We interpret our findings within a Bayesian framework and discuss the potential benefits of using a minimal squared error strategy. Using a novel experimental model of object lifting, we tested whether the sensorimotor system models the weight of objects by minimizing lifting errors or by selecting the statistically most likely weight. We found that the sensorimotor system minimizes the square of prediction errors for object lifting. This parallels the results of studies that investigated visually guided reaching, suggesting an overlap in the underlying mechanisms between tasks that involve different sensory systems. Copyright © 2017 the American Physiological Society.
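The dissociation between the two strategies can be made concrete with a short numerical example (weights and probabilities assumed purely for illustration): for a right-skewed weight distribution, the mode (the maximum a posteriori choice) and the mean (which minimizes expected squared prediction error) differ, so the two strategies predict different lifting forces.

```python
import numpy as np

# Skewed distribution of possible object weights (illustrative values, in grams).
weights = np.array([300, 350, 400, 600])
probs   = np.array([0.45, 0.30, 0.15, 0.10])

map_prediction = weights[np.argmax(probs)]       # most likely single weight (mode)
mse_prediction = np.sum(probs * weights)         # mean minimizes E[(w - prediction)^2]

def expected_squared_error(pred):
    return np.sum(probs * (weights - pred) ** 2)

print("MAP prediction (mode):", map_prediction, "g, expected squared error:",
      round(float(expected_squared_error(map_prediction)), 1))
print("MSE prediction (mean):", round(float(mse_prediction), 1), "g, expected squared error:",
      round(float(expected_squared_error(mse_prediction)), 1))
```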

  19. Does the sensorimotor system minimize prediction error or select the most likely prediction during object lifting?

    Science.gov (United States)

    McGregor, Heather R.; Pun, Henry C. H.; Buckingham, Gavin; Gribble, Paul L.

    2016-01-01

    The human sensorimotor system is routinely capable of making accurate predictions about an object's weight, which allows for energetically efficient lifts and prevents objects from being dropped. Often, however, poor predictions arise when the weight of an object can vary and sensory cues about object weight are sparse (e.g., picking up an opaque water bottle). The question arises, what strategies does the sensorimotor system use to make weight predictions when one is dealing with an object whose weight may vary? For example, does the sensorimotor system use a strategy that minimizes prediction error (minimal squared error) or one that selects the weight that is most likely to be correct (maximum a posteriori)? In this study we dissociated the predictions of these two strategies by having participants lift an object whose weight varied according to a skewed probability distribution. We found, using a small range of weight uncertainty, that four indexes of sensorimotor prediction (grip force rate, grip force, load force rate, and load force) were consistent with a feedforward strategy that minimizes the square of prediction errors. These findings match research in the visuomotor system, suggesting parallels in underlying processes. We interpret our findings within a Bayesian framework and discuss the potential benefits of using a minimal squared error strategy. NEW & NOTEWORTHY Using a novel experimental model of object lifting, we tested whether the sensorimotor system models the weight of objects by minimizing lifting errors or by selecting the statistically most likely weight. We found that the sensorimotor system minimizes the square of prediction errors for object lifting. This parallels the results of studies that investigated visually guided reaching, suggesting an overlap in the underlying mechanisms between tasks that involve different sensory systems. PMID:27760821

  20. Robust Least-Squares Support Vector Machine With Minimization of Mean and Variance of Modeling Error.

    Science.gov (United States)

    Lu, Xinjiang; Liu, Wenbo; Zhou, Chuang; Huang, Minghui

    2017-06-13

    The least-squares support vector machine (LS-SVM) is a popular data-driven modeling method that has been successfully applied to a wide range of applications. However, it has some disadvantages, including being ineffective at handling non-Gaussian noise and being sensitive to outliers. In this paper, a robust LS-SVM method is proposed and is shown to have more reliable performance when modeling a nonlinear system under conditions where Gaussian or non-Gaussian noise is present. The construction of a new objective function allows for a reduction of the mean of the modeling error as well as the minimization of its variance, and it does not constrain the mean of the modeling error to zero. This differs from the traditional LS-SVM, which uses a worst-case scenario approach in order to minimize the modeling error and constrains the mean of the modeling error to zero. In doing so, the proposed method takes the modeling error distribution information into consideration and is thus less conservative and more robust with regard to random noise. A solving method is then developed in order to determine the optimal parameters for the proposed robust LS-SVM. An additional analysis indicates that the proposed LS-SVM gives a smaller weight to a large-error training sample and a larger weight to a small-error training sample, and is thus more robust than the traditional LS-SVM. The effectiveness of the proposed robust LS-SVM is demonstrated using both artificial and real-life cases.

  1. TEACHING GRAMMAR-IN-CONTEXT AND ITS IMPACT IN MINIMIZING STUDENTS’ GRAMMATICAL ERRORS

    Directory of Open Access Journals (Sweden)

    Yadhi Nur Amin

    2015-11-01

    This study is conducted to determine the effectiveness of teaching grammar-in-context in minimizing students' grammatical errors in writing. The design of the study was quasi-experimental with a non-randomized pretest-posttest control group. The samples of the study were taken from the population of tenth-grade students. The control group was taught conventional grammar, given separately from writing skills, while the experimental group was treated with teaching grammar-in-context. The results of the study showed that the mean score in the post-test was higher than that in the pretest, and the mean score of the experimental group increased by 16.20 points after the treatment. This result indicates that teaching grammar-in-context is effective in minimizing students' grammatical errors in writing.

  2. Teaching Grammar-in-context and Its Impact in Minimizing Students' Grammatical Errors

    OpenAIRE

    Amin, Yadhi Nur

    2015-01-01

    This study is conducted to determine the effectiveness of teaching grammar-in-context in minimizing students' grammatical errors in writing. The design of the study was quasi-experimental with a non-randomized pretest-posttest control group. The samples of the study were taken from the population of tenth-grade students. The control group was taught conventional grammar, given separately from writing skills. Likewise, the experimental one was treated...

  3. Artificial neural networks as alternative tool for minimizing error predictions in manufacturing ultradeformable nanoliposome formulations.

    Science.gov (United States)

    León Blanco, José M; González-R, Pedro L; Arroyo García, Carmen Martina; Cózar-Bernal, María José; Calle Suárez, Marcos; Canca Ortiz, David; Rabasco Álvarez, Antonio María; González Rodríguez, María Luisa

    2018-01-01

    This work was aimed at determining the feasibility of artificial neural networks (ANN) by implementing backpropagation algorithms with default settings to generate better predictive models than multiple linear regression (MLR) analysis. The study was based on timolol-loaded liposomes. Causal factors were used as tutorial (training) data for the ANN and were fed into the computer program. The number of training cycles was identified in order to optimize the performance of the ANN. The optimization was performed by minimizing the error between the predicted and real response values in the training step. The results showed that training was stopped at 10 000 training cycles with 80% of the pattern values, because at this point the ANN generalizes better. Minimum validation error was achieved at 12 hidden neurons in a single layer. MLR has great prediction ability, with errors between predicted and real values lower than 1% for some of the parameters evaluated. Thus, the performance of this model was compared to that of the MLR using a factorial design. Optimal formulations were identified by minimizing the distance between measured and theoretical parameters, by estimating the prediction errors. Results indicate that the ANN shows much better predictive ability than the MLR model. These findings demonstrate the increased efficiency of the combination of ANN and design of experiments, compared to conventional MLR modeling techniques.

  4. Target Registration Error minimization involving deformable organs using elastic body splines and Particle Swarm Optimization approach.

    Science.gov (United States)

    Spinczyk, Dominik; Fabian, Sylwester

    2017-12-01

    In minimally invasive surgery, one of the main challenges is the precise localization of the target during the intervention. The aim of the study is to present the usability of elastic body splines (EBS) to minimize target registration error (TRE). A method to find the desired EBS parameter values using a Particle Swarm Optimization approach is presented. TRE minimization was achieved for the respiratory phases corresponding to minimum FRE for abdominal (especially liver) surgery. The proposed methodology was verified in experiments conducted on 21 patients diagnosed with liver tumors. The method has been developed to perform operations in real time on a standard workstation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Minimizing pulling geometry errors in atomic force microscope single molecule force spectroscopy.

    Science.gov (United States)

    Rivera, Monica; Lee, Whasil; Ke, Changhong; Marszalek, Piotr E; Cole, Daniel G; Clark, Robert L

    2008-10-01

    In atomic force microscopy-based single molecule force spectroscopy (AFM-SMFS), it is assumed that the pulling angle is negligible and that the force applied to the molecule is equivalent to the force measured by the instrument. Recent studies, however, have indicated that the pulling geometry errors can drastically alter the measured force-extension relationship of molecules. Here we describe a software-based alignment method that repositions the cantilever such that it is located directly above the molecule's substrate attachment site. By aligning the applied force with the measurement axis, the molecule is no longer undergoing combined loading, and the full force can be measured by the cantilever. Simulations and experimental results verify the ability of the alignment program to minimize pulling geometry errors in AFM-SMFS studies.

  6. An Implementation of Error Minimization Data Transmission in OFDM using Modified Convolutional Code

    Directory of Open Access Journals (Sweden)

    Hendy Briantoro

    2016-04-01

    This paper presents error minimization in an OFDM system. Conventional systems usually use channel coding such as a BCH code or a convolutional code, but the performance of these codes is poor when implemented in an OFDM system. The error bits of the OFDM system without channel coding amount to 5.77%; using a convolutional code with code rate 1/2 reduces the error bits only to 3.85%. We therefore propose an OFDM system with a Modified Convolutional Code. In this implementation, we used Software Defined Radio (SDR), namely the Universal Software Radio Peripheral (USRP) NI 2920, as the transmitter and receiver. The OFDM system using the Modified Convolutional Code is able to recover all received characters, reducing the error bits to 0%. The Modified Convolutional Code improves performance by about 1 dB at a BER of 10^-4 compared with the BCH code and the convolutional code, so its performance is better than that of the BCH code or the convolutional code. Keywords: OFDM, BCH Code, Convolutional Code, Modified Convolutional Code, SDR, USRP

  7. The sensorimotor system minimizes prediction error for object lifting when the object's weight is uncertain.

    Science.gov (United States)

    Brooks, Jack; Thaler, Anne

    2017-08-01

    A reliable mechanism to predict the heaviness of an object is important for manipulating an object under environmental uncertainty. Recently, Cashaback et al. (Cashaback JGA, McGregor HR, Pun HCH, Buckingham G, Gribble PL. J Neurophysiol 117: 260-274, 2017) showed that for object lifting the sensorimotor system uses a strategy that minimizes prediction error when the object's weight is uncertain. Previous research demonstrates that visually guided reaching is similarly optimized. Although this suggests a unified strategy of the sensorimotor system for object manipulation, the selected strategy appears to be task dependent and subject to change in response to the degree of environmental uncertainty. Copyright © 2017 the American Physiological Society.

  8. Minimizing the symbol-error-rate for amplify-and-forward relaying systems using evolutionary algorithms

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2015-02-01

    In this paper, a new detector is proposed for an amplify-and-forward (AF) relaying system. The detector is designed to minimize the symbol-error-rate (SER) of the system. The SER surface is non-linear and may have multiple minima; therefore, designing an SER detector for cooperative communications becomes an optimization problem. Evolutionary algorithms have the capability to find the global minimum; therefore, evolutionary algorithms such as particle swarm optimization (PSO) and differential evolution (DE) are exploited to solve this optimization problem. The performance of the proposed detectors is compared with that of conventional detectors such as the maximum likelihood (ML) and minimum mean square error (MMSE) detectors. The simulation results show that the SER performance of the proposed detectors is less than 2 dB away from the ML detector, and a significant improvement in SER performance is observed when comparing with the MMSE detector. The computational complexity of the proposed detector is much less than that of the ML and MMSE algorithms. Moreover, in contrast to the ML and MMSE detectors, the computational complexity of the proposed detectors increases linearly with the number of relays.
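A minimal particle swarm optimization sketch on a multimodal surrogate objective (Rastrigin-like, standing in for the non-linear SER surface with multiple minima); the swarm constants are common textbook choices, and the actual detector design and SER expression from the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

def objective(x):
    """Multimodal surrogate for an SER surface (Rastrigin-like); global minimum at 0."""
    return np.sum(x**2 - 10.0 * np.cos(2 * np.pi * x) + 10.0, axis=-1)

def pso(dim=4, n_particles=40, iters=300, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-5, 5, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), objective(pos)
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        val = objective(pos)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, objective(gbest)

x_best, f_best = pso()
print("best point:", np.round(x_best, 3), "objective value:", round(float(f_best), 4))
```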

  9. Minimalism

    CERN Document Server

    Obendorf, Hartmut

    2009-01-01

    The notion of Minimalism is proposed as a theoretical tool supporting a more differentiated understanding of reduction and thus forms a standpoint that allows definition of aspects of simplicity. This book traces the development of minimalism, defines the four types of minimalism in interaction design, and looks at how to apply it.

  10. Solving the vibroacoustic equations of plates by minimization of error on a sample of observation points.

    Science.gov (United States)

    Collery, Olivier; Guyader, Jean-Louis

    2010-03-01

    In the context of better understanding and predicting sound transmission through heterogeneous fluid-loaded aircraft structures, this paper presents a method for solving the vibroacoustic problem of plates. The present work considers fluid-structure coupling and is applied to simply supported rectangular plates excited mechanically. The proposed method is based on minimizing the error with which the plate vibroacoustic equation of motion is satisfied on a sample of points. Sampling introduces an aliasing effect; this phenomenon is described and resolved using a wavelet-based filter. The proposed approach is validated by presenting very accurate results for the sound radiation of plates immersed in heavy and light fluids. The fluid-structure interaction appears to be very well described, avoiding time-consuming classical calculations of the modal radiation impedances. The focus is also put on different samplings to observe the aliasing effect. As a perspective, sound radiation from a non-homogeneous plate is solved and compared with reference results, demonstrating the power of this method.

  11. Patient Safety Technology Gap: Minimizing Errors in Healthcare through Technology Innovation

    Directory of Open Access Journals (Sweden)

    Deborah Carstens

    2005-04-01

    In a world of ever increasing technological advances, users of technology are at risk of exceeding human memory limitations. A gap analysis was conducted by reviewing literature in the field of human error, specifically transition errors in emergency room (ER) operations, to identify the current state of available technology. The gap analysis revealed the technological needs of ER healthcare workers. The findings indicate the need for technology such as knowledge management or decision support systems in ERs to reduce the potential for error, enhance patient safety, and improve the overall quality of care for the patient.

  12. Optimization of intelligent infusion pump technology to minimize vasopressor pump programming errors.

    Science.gov (United States)

    Vadiei, Nina; Shuman, Carrie A; Murthy, Manasa S; Daley, Mitchell J

    2017-08-01

    There is a lack of data evaluating the impact of hard limit implementation in intelligent infusion pump technology (IIPT). The purpose of this study was to determine if incorporation of vasopressor upper hard limits (UHL) into IIPT increases the efficacy of alerts by preventing pump programming errors. Retrospective review from five hospitals within a single healthcare network between April 1, 2013 and May 31, 2014. A total of 65,680 vasopressor data entries were evaluated; 19,377 prior to hard limit implementation and 46,303 after hard limit implementation. The primary outcome was the percent of effective alerts. The secondary outcome was the proportional dose increase from the soft limit provided. A reduction in alert rate occurred after incorporation of hard limits into the IIPT drug library (pre-UHL 4.7% vs. post-UHL 4.0%), with a subsequent increase in the number of errors prevented as represented by a higher effective alert rate (pre-UHL 23.0% vs. post-UHL 37.3%; p < 0.001). The proportional dose increase was significantly reduced (pre-UHL 188% ± 380% vs. post-UHL 95% ± 128%; p < 0.001). Incorporation of UHLs into IIPT in a multi-site health system with varying intensive care unit and emergency department acuity increases alert effectiveness, reduces dosing errors, and reduces the magnitude of dosing errors that reach the patient.

  13. Bias correction for selecting the minimal-error classifier from many machine learning models.

    Science.gov (United States)

    Ding, Ying; Tang, Shaowu; Liao, Serena G; Jia, Jia; Oesterreich, Steffi; Lin, Yan; Tseng, George C

    2014-11-15

    Supervised machine learning is commonly applied in genomic research to construct a classifier from the training data that is generalizable to predict independent testing data. When test datasets are not available, cross-validation is commonly used to estimate the error rate. Many machine learning methods are available, and it is well known that no universally best method exists in general. It has been a common practice to apply many machine learning methods and report the method that produces the smallest cross-validation error rate. Theoretically, such a procedure produces a selection bias. Consequently, many clinical studies with moderate sample sizes (e.g. n = 30-60) risk reporting a falsely small cross-validation error rate that could not be validated later in independent cohorts. In this article, we illustrate the probabilistic framework of the problem and explore the statistical and asymptotic properties. We propose a new bias correction method based on learning curve fitting by inverse power law (IPL) and compare it with three existing methods: nested cross-validation, weighted mean correction and the Tibshirani-Tibshirani procedure. All methods were compared in simulation datasets, five moderate-size real datasets and two large breast cancer datasets. The results showed that IPL outperforms the other methods in bias correction with smaller variance, and it has the additional advantage of extrapolating error estimates for larger sample sizes, a practical feature for recommending whether more samples should be recruited to improve the classifier and accuracy. An R package 'MLbias' and all source files are publicly available at tsenglab.biostat.pitt.edu/software.htm. Contact: ctseng@pitt.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
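A hedged sketch of the learning-curve idea behind the IPL correction: fit error(n) ≈ a·n^(-b) + c to cross-validation error estimates obtained at several training-set sizes, then extrapolate to larger n. The error values below are assumed illustrative numbers, and the complete bias-correction procedure is in the authors' MLbias package, not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def inverse_power_law(n, a, b, c):
    return a * n ** (-b) + c

# Illustrative cross-validation error estimates at increasing training-set sizes.
n_train = np.array([20, 30, 40, 50, 60])
cv_error = np.array([0.38, 0.33, 0.30, 0.285, 0.275])

params, _ = curve_fit(inverse_power_law, n_train, cv_error, p0=[1.0, 0.5, 0.2],
                      bounds=([0, 0, 0], [np.inf, 2.0, 1.0]))
a, b, c = params
print(f"fitted learning curve: err(n) = {a:.3f} * n^(-{b:.3f}) + {c:.3f}")
for n in (100, 200):
    print(f"extrapolated error at n = {n}: {inverse_power_law(n, *params):.3f}")
```

The fitted asymptote c is one way to read off how much of the reported cross-validation error is irreducible versus attributable to the limited sample size.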

  14. Numerical analysis of magnetic field diffusion in ferromagnetic laminations by minimization of constitutive error

    Energy Technology Data Exchange (ETDEWEB)

    Fresa, R. [Consorzio CREATE, DIIIE, University of Salerno, I-84084 Fisciano (Italy)]; Serpico, C. [Department of Electrical and Computer Engineering, University of Maryland, College Park, Maryland 20742 (United States); Department of Electrical Engineering, University of Naples "Federico II", I-80152 Napoli (Italy)]; Visone, C. [Department of Electrical Engineering, University of Naples "Federico II", I-80152 Napoli (Italy)]

    2000-05-01

    In this article, the diffusion of electromagnetic fields into a ferromagnetic lamination is numerically studied by means of an error-based numerical method. This technique has been developed so far only for the case of nonhysteretic constitutive relations. The generalization to the hysteretic case requires a modification of the technique in order to take into account the evolution of the ''magnetization state'' of the media. Numerical computations obtained by using this approach are reported and discussed. (c) 2000 American Institute of Physics.

  15. Minimal-Entanglement Entanglement-Assisted Quantum Error Correction Codes from Modified Circulant Matrices

    Directory of Open Access Journals (Sweden)

    Duc Manh Nguyen

    2017-07-01

    In this paper, new construction methods for entanglement-assisted quantum error correction codes (EAQECC) from circulant matrices are proposed. We first construct the matrices from two vectors of constraint size and determine the isotropic subgroup. We also propose a method for calculating the entanglement subgroup based on standard forms of binary matrices that satisfy the constraint conditions of the EAQECC. With the isotropic and entanglement subgroups, we determine all the parameters and the minimum distance of the EAQECC. Proposed EAQECCs with small lengths are presented to illustrate the practicality of this construction. Comparison with some earlier constructions of EAQECC shows that the proposed codes are better.

  16. Minimization of Functional Majorant in a Posteriori Error Analysis Based on H(div) Multigrid-Preconditioned CG Method

    Directory of Open Access Journals (Sweden)

    Jan Valdman

    2009-01-01

    We consider a Poisson boundary value problem and its functional a posteriori error estimate derived by S. Repin in 1999. The estimate majorizes the H1 seminorm of the error of the discrete solution computed by the FEM method and contains a free flux variable from the H(div) space. In order to keep the estimate sharp, a procedure for the minimization of the majorant term with respect to the flux variable is introduced, computing the free flux variable from a global linear system of equations. Since the linear system is symmetric and positive definite, a few iterations of a conjugate gradient method with a geometric multigrid preconditioner are applied. Numerical techniques are demonstrated on one benchmark example with a smooth solution on a unit square domain, including the computation of the approximate value of the constant in Friedrichs' inequality.

  17. Estimation of the Coefficient of Variation with Minimum Risk: A Sequential Method for Minimizing Sampling Error and Study Cost.

    Science.gov (United States)

    Chattopadhyay, Bhargab; Kelley, Ken

    2016-01-01

    The coefficient of variation is an effect size measure with many potential uses in psychology and related disciplines. We propose a general theory for sequential estimation of the population coefficient of variation that considers both the sampling error and the study cost, importantly without specific distributional assumptions. Fixed sample size planning methods, commonly used in psychology and related fields, cannot simultaneously minimize both the sampling error and the study cost. The sequential procedure we develop is the first sequential sampling procedure developed for estimating the coefficient of variation. We first present a method of planning a pilot sample size after the research goals are specified by the researcher. Then, after collecting a sample as large as the estimated pilot sample size, a check is performed to assess whether the conditions necessary to stop the data collection have been satisfied. If not, an additional observation is collected and the check is performed again. This process continues, sequentially, until a stopping rule involving a risk function is satisfied. Our method ensures that the sampling error and the study costs are considered simultaneously so that the cost is not higher than necessary for the tolerable sampling error. We also demonstrate a variety of properties of the distribution of the final sample size for five different distributions under a variety of conditions with a Monte Carlo simulation study. In addition, we provide freely available functions via the MBESS package in R to implement the methods discussed.
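A schematic of the sequential idea only (pilot sample, then one observation at a time until a precision criterion is met), using a simplified normal-theory approximation to the standard error of the sample coefficient of variation as the stopping rule. The authors' actual risk-function rule and cost weighting are implemented in the MBESS R package; the rule below is an assumption made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_cv(x):
    return np.std(x, ddof=1) / np.mean(x)

def sequential_cv(draw, pilot_n=10, tol=0.02, max_n=100000):
    """Add one observation at a time until the approximate SE of the CV falls below tol.
    Simplified normal-theory stopping rule for illustration (not the paper's risk function)."""
    x = [draw() for _ in range(pilot_n)]
    while True:
        k = sample_cv(np.array(x))
        n = len(x)
        se_cv = k * np.sqrt((0.5 + k**2) / n)     # large-sample approximation to SE of the CV
        if se_cv < tol or n >= max_n:
            return k, n
        x.append(draw())

draw = lambda: rng.gamma(shape=9.0, scale=2.0)    # positive, right-skewed population, CV = 1/3
cv_hat, n_final = sequential_cv(draw)
print(f"stopped at n = {n_final}, estimated CV = {cv_hat:.3f} (population CV = {1/3:.3f})")
```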

  18. Anticipating, measuring, and minimizing MEMS mirror scan error to improve laser scanning microscopy's speed and accuracy.

    Science.gov (United States)

    Giannini, John P; York, Andrew G; Shroff, Hari

    2017-01-01

    We describe a method to speed up microelectromechanical system (MEMS) mirror scanning by > 20x, while also improving scan accuracy. We use Landweber deconvolution to determine an input voltage which would produce a desired output, based on the measured MEMS impulse response. Since the MEMS is weakly nonlinear, the observed behavior deviates from expectations, and we iteratively improve our input to minimize this deviation. This allows customizable MEMS angle vs. time with <1% deviation from the desired scan pattern. We demonstrate our technique by optimizing a point scanning microscope's raster patterns to image mammal submandibular gland and pollen at ~10 frames/s.
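A minimal sketch of the core step, assuming the mirror can be approximated as a linear time-invariant system with a measured impulse response (here an assumed lightly damped second-order response): Landweber iteration finds an input whose convolution with the impulse response tracks the desired scan waveform. The paper's further iterative refinement that handles the weak nonlinearity is not shown.

```python
import numpy as np

# Assumed (illustrative) MEMS impulse response: lightly damped second-order system.
dt, n_h = 1e-4, 400
t_h = np.arange(n_h) * dt
wn, zeta = 2 * np.pi * 300.0, 0.05
h = np.exp(-zeta * wn * t_h) * np.sin(wn * np.sqrt(1 - zeta**2) * t_h)
h /= h.sum()                                   # unit DC gain: a constant input holds a constant angle

# Desired scan: a triangular angle-vs-time raster line.
n = 1200
desired = np.concatenate([np.linspace(0, 1, n // 2), np.linspace(1, 0, n - n // 2)])

# Causal convolution matrix H so that output = H @ input.
H = np.zeros((n, n))
for k in range(n_h):
    idx = np.arange(n - k)
    H[idx + k, idx] = h[k]

def landweber(H, d, iters=300):
    step = 1.0 / np.linalg.norm(H, 2) ** 2     # step below 2/sigma_max^2 guarantees convergence
    u = np.zeros(d.size)
    for _ in range(iters):
        u = u + step * H.T @ (d - H @ u)       # Landweber update toward H u = d
    return u

u = landweber(H, desired)
print("max tracking error with the shaped input:", np.abs(desired - H @ u).max())
```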

  19. Minimizing Grating Slope Errors in the IEX Monochromator at the Advanced Photon Source

    Energy Technology Data Exchange (ETDEWEB)

    Fisher, M. V.; Assoufid, L.; McChesney, J.; Qian, J.; Reininger, R.; Rodolakis, F

    2016-01-01

    The IEX beamline at the APS is currently in the commissioning phase. The energy resolution of the beamline was missing its original specification by several orders of magnitude. The monochromator, an in-focus VLS-PGM, is currently configured with a high- and a medium-line-density grating. Experimental results indicated that both gratings were contributing to the poor energy resolution, and this led to venting the monochromator to investigate. The initial suspicion was that a systematic error had occurred in the ruling process of the VLS gratings, but that proved not to be the case. Instead, the problem was isolated to mechanical constraints used to mount the gratings into their respective side-cooled holders. Modifications were made to the holders to eliminate problematic constraints without compromising the rest of the design. Metrology performed on the gratings in the original and modified holders demonstrated a 20-fold improvement in the surface profile error, consistent with finite element analysis performed in support of the modifications. The two gratings were successfully reinstalled, and subsequent measurements with beam show a dramatic improvement in energy resolution.

  20. Offline modeling for product quality prediction of mineral processing using modeling error PDF shaping and entropy minimization.

    Science.gov (United States)

    Ding, Jinliang; Chai, Tianyou; Wang, Hong

    2011-03-01

    This paper presents a novel offline modeling approach for product quality prediction in mineral processing, which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role, and the establishment of its predictive model is a key issue for plantwide optimization. For this purpose, a hybrid modeling approach for mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune the model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. Experimental results using real plant data and a comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.
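A hedged sketch of the minimum-entropy selection idea only: for each candidate model parameter, estimate the density of the modeling error with a kernel density estimator and keep the setting whose error entropy is smallest. The paper applies this to an LS-SVM hybrid model of the mixed concentrate grade; here a plain ridge regression on synthetic heavy-tailed data stands in, and all values are assumed for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)

# Synthetic data with non-Gaussian (heavy-tailed) noise.
n, d = 300, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.3 * rng.standard_t(df=3, size=n)

def ridge_fit(X, y, lam):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def error_entropy(residuals, grid_pts=400):
    """Differential entropy of the modeling error, estimated via a Gaussian KDE."""
    kde = gaussian_kde(residuals)
    grid = np.linspace(residuals.min() - 1, residuals.max() + 1, grid_pts)
    p = np.clip(kde(grid), 1e-12, None)
    return -np.trapz(p * np.log(p), grid)

best = None
for lam in [1e-3, 1e-2, 1e-1, 1.0, 10.0, 100.0]:
    w = ridge_fit(X, y, lam)
    h = error_entropy(y - X @ w)
    if best is None or h < best[1]:
        best = (lam, h)
    print(f"lambda = {lam:>7}: modeling-error entropy = {h:.3f}")

print("selected lambda (minimum error entropy):", best[0])
```

Choosing the parameter that concentrates the error density (minimum entropy) rather than the one with the smallest mean squared error is what makes the approach less sensitive to heavy-tailed noise.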

  1. A global conformance quality model. A new strategic tool for minimizing defects caused by variation, error, and complexity

    Energy Technology Data Exchange (ETDEWEB)

    Hinckley, C. Martin [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    1994-01-01

    The performance of Japanese products in the marketplace points to the dominant role of quality in product competition. Our focus is motivated by the tremendous pressure to improve conformance quality by reducing defects to previously unimaginable limits in the range of 1 to 10 parts per million. Toward this end, we have developed a new model of conformance quality that addresses each of the three principal defect sources: (1) variation, (2) human error, and (3) complexity. Although the role of variation in conformance quality is well documented, errors occur so infrequently that their significance is not well known. We have shown that statistical methods are not useful in characterizing and controlling errors, the most common source of defects. Excessive complexity is also a root source of defects, since it increases both errors and variation defects. A missing link in defining a global model has been the lack of a sound correlation between complexity and defects. We have used Design for Assembly (DFA) methods to quantify assembly complexity and have shown that assembly times can be described in terms of the Pareto distribution, in a clear exception to the Central Limit Theorem. Within individual companies we have found defects to be highly correlated with DFA measures of complexity in broad studies covering tens of millions of assembly operations. Applying the global concepts, we predicted that Motorola's Six Sigma method would only reduce defects by roughly a factor of two rather than by orders of magnitude, a prediction confirmed by Motorola's data. We have also shown that the potential defect rates of product concepts can be compared in the earliest stages of development. The global Conformance Quality Model has demonstrated that the best strategy for improvement depends upon the quality control strengths and weaknesses.

  2. [Practical aspects for minimizing errors in the cross-cultural adaptation and validation of quality of life questionnaires].

    Science.gov (United States)

    Lauffer, A; Solé, L; Bernstein, S; Lopes, M H; Francisconi, C F

    2013-01-01

    The development and validation of questionnaires for evaluating quality of life (QoL) has become an important area of research. However, there is a proliferation of non-validated measuring instruments in the health setting that do not contribute to advances in scientific knowledge. To present, through the analysis of available validated questionnaires, a checklist of the practical aspects of how to carry out the cross-cultural adaptation of QoL questionnaires (generic or disease-specific) so that no step is overlooked in the evaluation process, and thus help prevent the elaboration of insufficient or incomplete validations. We consulted basic textbooks and the PubMed database using the keywords quality of life, questionnaires, and gastroenterology, confined to «validation studies» in English, Spanish, and Portuguese, with no time limit, for the purpose of analyzing the translation and validation of the questionnaires available through the Mapi Institute and PROQOLID websites. A checklist is presented to aid in planning and carrying out the cross-cultural adaptation of QoL questionnaires, in conjunction with a glossary of key terms in the area of knowledge. The acronym DSTAC was used, which refers to each of the 5 stages involved in the recommended procedure. In addition, we provide a table of the QoL instruments that have been validated into Spanish. This article provides information on how to adapt QoL questionnaires from a cross-cultural perspective, as well as how to minimize common errors. Copyright © 2012 Asociación Mexicana de Gastroenterología. Published by Masson Doyma México S.A. All rights reserved.

  3. Simulations for Full Unit-memory and Partial Unit-memory Convolutional Codes with Real-time Minimal-byte-error Probability Decoding Algorithm

    Science.gov (United States)

    Vo, Q. D.

    1984-01-01

    A program written to simulate Real-Time Minimal-Byte-Error Probability (RTMBEP) decoding of full unit-memory (FUM) convolutional codes on a 3-bit quantized AWGN channel is described. This program was used to compute the symbol-error probability of FUM codes and to determine the signal-to-noise ratio (SNR) required to achieve a bit error rate (BER) of 10^-6 for corresponding concatenated systems. A (6,6/30) FUM code combined with a 6-bit Reed-Solomon code was found to achieve the required BER at an SNR of 1.886 dB. The RTMBEP algorithm was then modified for decoding partial unit-memory (PUM) convolutional codes, and a simulation program was also written to compute the symbol-error probability of these codes.

  4. Optimization of Machining Parameters for Minimization of Roundness Error in Deep Hole Drilling using Minimum Quantity Lubricant

    Directory of Open Access Journals (Sweden)

    Kamaruzaman Anis Farhan

    2016-01-01

    This paper presents an experimental investigation of deep hole drilling using a CNC milling machine. The experiment investigates the effect of the machining parameters (spindle speed, feed rate, and depth of hole) with minimum quantity lubricant on the roundness error. The experiment was designed using a two-level full factorial design with four center points. Finally, the machining parameters were optimized to obtain the minimum roundness error. The minimum roundness error for deep hole drilling is 0.0266, obtained at a spindle speed of 800 rpm, a feed rate of 60 mm/min, a depth of hole of 70 mm, and a minimum quantity lubricant flow of 30 ml/hr.

  5. Minimizando erros na administração de drogas intravítreas Minimizing errors in intravitreal drug injection

    Directory of Open Access Journals (Sweden)

    Shu I Yeh

    2003-01-01

    Full Text Available OBJETIVO: Avaliar possíveis erros na administração de drogas intravítreas para o tratamento da endoftalmite e propor técnica que seja reprodutível e acessível. MÉTODOS: Avaliação de técnicas utilizadas e aferição dos volumes retidos nas agulhas utilizando balança analítica. RESULTADOS: A média e o desvio padrão dos volumes retidos nas agulhas de 26, 22 (25 x 0,7 mm e 30 x 0,7 mm) e 18 G (gauge) foram 0,051±0,006, 0,056±0,005, 0,055±0,004 e 0,075±0,004, respectivamente, para a marca Ryncos® e 0,050±0,003, 0,056±0,002, 0,063±0,002 e 0,084±0,004, respectivamente, para a marca Becton-Dickinson®. Houve diferença estatisticamente significante entre os volumes retidos das duas marcas para as agulhas de 26, 22 (30 x 0,7 mm) e 18 G, com p = 0,01, p < 0,01 e p < 0,01, respectivamente. PURPOSE: To evaluate the accuracy of intravitreal drug administration in the treatment of endophthalmitis and to suggest a reproducible and accessible technique for this procedure. METHODS: To measure the volumes retained in needles used for the intravitreal injection of antibiotics using an analytical scale. RESULTS: Means and standard deviations of the retained volumes in the 26, 22 (25 x 0.7 mm and 30 x 0.7 mm) and 18 G needles were, respectively, 0.051±0.006, 0.056±0.005, 0.055±0.004 and 0.075±0.004 for Ryncos® needles and 0.050±0.003, 0.056±0.002, 0.063±0.002 and 0.084±0.004 for Becton-Dickinson® needles. There were statistically significant differences in the retained volumes between the two needle brands for the 26, 22 (30 x 0.7 mm) and 18 G needles, with p = 0.01, p < 0.01 and p < 0.01, respectively. No difference was found only for the 25 x 0.7 mm 22 G needle (p = 0.83). CONCLUSION: Most needles used for intravitreal injection hold a retained volume in the reserve compartment of the needle that should be considered during the injection technique, thereby minimizing errors during intravitreal injection of antibiotics.

  6. Towards reporting standards for neuropsychological study results: A proposal to minimize communication errors with standardized qualitative descriptors for normalized test scores.

    Science.gov (United States)

    Schoenberg, Mike R; Rum, Ruba S

    2017-11-01

    Rapid, clear and efficient communication of neuropsychological results is essential to benefit patient care. Errors in communication are a leading cause of medical errors; nevertheless, there remains a lack of consistency in how neuropsychological scores are communicated. A major limitation in the communication of neuropsychological results is the inconsistent use of qualitative descriptors for standardized test scores and the use of vague terminology. A PubMed search from 1 Jan 2007 to 1 Aug 2016 was conducted to identify guidelines or consensus statements for the description and reporting of qualitative terms used to communicate neuropsychological test scores. The review found the use of confusing and overlapping terms to describe various ranges of percentile standardized test scores. In response, we propose a simplified set of qualitative descriptors for normalized test scores (Q-Simple) as a means to reduce errors in communicating test results. The Q-Simple qualitative terms are: 'very superior', 'superior', 'high average', 'average', 'low average', 'borderline' and 'abnormal/impaired'. A case example illustrates the proposed Q-Simple qualitative classification system for communicating neuropsychological results for neurosurgical planning. The Q-Simple qualitative descriptor system is intended to improve and standardize communication of standardized neuropsychological test scores. Research is needed to further evaluate neuropsychological communication errors. Conveying the clinical implications of neuropsychological results in a manner that minimizes the risk of communication errors is a quintessential component of evidence-based practice. Copyright © 2017 Elsevier B.V. All rights reserved.
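
    The proposed Q-Simple labels could be applied programmatically along the following lines; the z-score cut-offs used here are illustrative assumptions, since the abstract does not list the exact boundaries.

```python
# Minimal sketch of mapping a normalized test score to one of the seven
# Q-Simple qualitative descriptors named in the abstract. The z-score
# cut-offs are illustrative assumptions, not the published ones.
def q_simple_descriptor(z: float) -> str:
    """Return a qualitative descriptor for a z-score (mean 0, SD 1)."""
    bands = [
        (2.0, "very superior"),
        (1.3, "superior"),
        (0.7, "high average"),
        (-0.7, "average"),
        (-1.3, "low average"),
        (-2.0, "borderline"),
    ]
    for cutoff, label in bands:
        if z >= cutoff:
            return label
    return "abnormal/impaired"

# Example: a T-score of 35 corresponds to z = (35 - 50) / 10 = -1.5.
print(q_simple_descriptor((35 - 50) / 10))  # -> "borderline" under these cut-offs
```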

  7. Bit Error-Rate Minimizing Detector for Amplify-and-Forward Relaying Systems Using Generalized Gaussian Kernel

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2013-01-01

    In this letter, a new detector is proposed for an amplify-and-forward (AF) relaying system communicating with the assistance of relays. The major goal of this detector is to improve the bit error rate (BER) performance of the receiver. The probability density function is estimated with the help of the kernel density technique. A generalized Gaussian kernel is proposed. This new kernel provides more flexibility and encompasses Gaussian and uniform kernels as special cases. The optimal window width of the kernel is calculated. Simulation results show that a gain of more than 1 dB can be achieved in terms of BER performance compared to the minimum mean square error (MMSE) receiver when communicating over Rayleigh fading channels.
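
    A minimal sketch of the kernel idea, assuming a one-dimensional decision statistic: a generalized Gaussian kernel K(u) proportional to exp(-|u/alpha|^beta) reduces to the Gaussian kernel for beta = 2 and approaches a uniform (box) kernel as beta grows. The window width used below is a hand-picked assumption rather than the optimal value derived in the letter.

```python
# Minimal sketch of a kernel density estimate built from a generalized
# Gaussian kernel, K(u) = beta / (2*alpha*Gamma(1/beta)) * exp(-|u/alpha|**beta).
import numpy as np
from scipy.special import gamma

def gg_kernel(u, alpha, beta):
    norm = beta / (2.0 * alpha * gamma(1.0 / beta))
    return norm * np.exp(-np.abs(u / alpha) ** beta)

def gg_kde(x_eval, samples, alpha=0.5, beta=2.0):
    """Density estimate at x_eval from 1-D samples."""
    u = x_eval[:, None] - samples[None, :]
    return gg_kernel(u, alpha, beta).mean(axis=1)

rng = np.random.default_rng(0)
samples = rng.normal(size=500)             # stand-in for received decision statistics
grid = np.linspace(-4, 4, 9)
print(np.round(gg_kde(grid, samples), 3))  # estimated pdf on the grid
```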

  8. On minimizing assignment errors and the trade-off between false positives and negatives in parentage analysis

    KAUST Repository

    Harrison, Hugo B.

    2013-11-04

    Genetic parentage analyses provide a practical means with which to identify parent-offspring relationships in the wild. In Harrison et al.'s (2013a) study, we compared three methods of parentage analysis and showed that the number and diversity of microsatellite loci were the most important factors defining the accuracy of assignments. Our simulations revealed that an exclusion-Bayes theorem method was more susceptible to false-positive and false-negative assignments than the other methods tested. Here, we analyse and discuss the trade-off between type I and type II errors in parentage analyses. We show that controlling for false-positive assignments, without reporting type II errors, can be misleading. Our findings illustrate the need to estimate and report both the rate of false-positive and the rate of false-negative assignments in parentage analyses. © 2013 John Wiley & Sons Ltd.

  9. Masked Visual Analysis: Minimizing Type I Error in Visually Guided Single-Case Design for Communication Disorders.

    Science.gov (United States)

    Byun, Tara McAllister; Hitchcock, Elaine R; Ferron, John

    2017-06-10

    Single-case experimental designs are widely used to study interventions for communication disorders. Traditionally, single-case experiments follow a response-guided approach, where design decisions during the study are based on participants' observed patterns of behavior. However, this approach has been criticized for its high rate of Type I error. In masked visual analysis (MVA), response-guided decisions are made by a researcher who is blinded to participants' identities and treatment assignments. MVA also makes it possible to conduct a hypothesis test assessing the significance of treatment effects. This tutorial describes the principles of MVA, including both how experiments can be set up and how results can be used for hypothesis testing. We then report a case study showing how MVA was deployed in a multiple-baseline across-subjects study investigating treatment for residual errors affecting rhotics. Strengths and weaknesses of MVA are discussed. Given their important role in the evidence base that informs clinical decision making, it is critical for single-case experimental studies to be conducted in a way that allows researchers to draw valid inferences. As a method that can increase the rigor of single-case studies while preserving the benefits of a response-guided approach, MVA warrants expanded attention from researchers in communication disorders.

  10. Minimally invasive positioning robot system of femoral neck hollow screw implants based on x-ray error correction

    Science.gov (United States)

    Zou, Yunpeng; Xu, Ying; Hu, Lei; Guo, Na; Wang, Lifeng

    2017-01-01

    Femoral neck surgery performed with the traditional free-hand technique suffers from a high failure rate, a high radiation dose, and poor positioning accuracy. This article therefore develops a new positioning robot system for femoral neck hollow screw implantation based on X-ray error correction, building on the X-ray projection principle and the motion principle of the 6-DOF (degree of freedom) serial robot UR (Universal Robots). Compared with a computer-assisted navigation system, this system offers better positioning accuracy and simpler operation, and because no additional visual tracking equipment is required, it also reduces cost considerably. During surgery, the surgeon plans the operation path and the pose of the marker needle on the anteroposterior and lateral X-ray images of the patient. The pixel ratio is then calculated from the ratio of the actual length of the marker line to its length in the image. From the relative position between the planned path and the guide pin, and the fixed relationship between the guide pin and the UR robot, the required robot motion is computed, and the UR robot drives the positioning guide pin to the planned path. The surgeon then checks whether the guide pin coincides with the planned path; if not, the previous steps are repeated until they coincide, which completes the positioning. To verify the positioning accuracy, an error analysis was performed on thirty bone-model experiments. The results show that the motion accuracy of the UR robot is 0.15 mm and the overall error is within 0.8 mm. To verify the clinical feasibility of the system, three clinical cases were analysed. Over the whole positioning process, the X-ray exposure time was 2-3 s, the number of fluoroscopic images was 3-5, and the total positioning time was 7-10 min. The results show that this system can accurately complete the positioning.
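
    The pixel-ratio step lends itself to a short sketch; the marker length, image measurements and offsets below are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the pixel-ratio step described above: the known physical
# length of the marker needle is divided by its measured length in the X-ray
# image, and the resulting mm/pixel ratio converts a planned path drawn on
# the image into a physical displacement for the robot.
import math

def pixel_ratio(marker_length_mm: float, marker_length_px: float) -> float:
    return marker_length_mm / marker_length_px

def image_offset_to_mm(dx_px: float, dy_px: float, ratio: float):
    """Convert an in-image offset (pixels) to a physical offset (mm)."""
    return dx_px * ratio, dy_px * ratio

ratio = pixel_ratio(marker_length_mm=50.0, marker_length_px=125.0)  # 0.4 mm/px
dx_mm, dy_mm = image_offset_to_mm(30.0, -12.5, ratio)
print(f"ratio = {ratio:.2f} mm/px, move = ({dx_mm:.1f}, {dy_mm:.1f}) mm,"
      f" distance = {math.hypot(dx_mm, dy_mm):.1f} mm")
```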

  11. Standard Practice for Minimizing Dosimetry Errors in Radiation Hardness Testing of Silicon Electronic Devices Using Co-60 Sources

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers recommended procedures for the use of dosimeters, such as thermoluminescent dosimeters (TLD's), to determine the absorbed dose in a region of interest within an electronic device irradiated using a Co-60 source. Co-60 sources are commonly used for the absorbed dose testing of silicon electronic devices. Note 1—This absorbed-dose testing is sometimes called “total dose testing” to distinguish it from “dose rate testing.” Note 2—The effects of ionizing radiation on some types of electronic devices may depend on both the absorbed dose and the absorbed dose rate; that is, the effects may be different if the device is irradiated to the same absorbed-dose level at different absorbed-dose rates. Absorbed-dose rate effects are not covered in this practice but should be considered in radiation hardness testing. 1.2 The principal potential error for the measurement of absorbed dose in electronic devices arises from non-equilibrium energy deposition effects in the vicinity o...

  12. Exceptional phenomenology

    DEFF Research Database (Denmark)

    Aggerholm, Kenneth; Moltke Martiny, Kristian

    Through exceptional cases we can gain a deeper understanding of the ordinary. This was already a guiding thread in Merleau-Ponty’s phenomenological investigations, but this paper will take the idea further by grounding the methodology in ‘hands on’ research in elite sport (football) and pathological cases...

  13. Exceptional Reductions

    CERN Document Server

    Marrani, Alessio; Riccioni, Fabio

    2011-01-01

    Starting from basic identities of the group E8, we perform progressive reductions, namely decompositions with respect to the maximal and symmetric embeddings of E7xSU(2) and then of E6xU(1). This procedure provides a systematic approach to the basic identities involving invariant primitive tensor structures of various irreprs. of finite-dimensional exceptional Lie groups. We derive novel identities for E7 and E6, highlighting the E8 origin of some well known ones. In order to elucidate the connections of this formalism to four-dimensional Maxwell-Einstein supergravity theories based on symmetric scalar manifolds (and related to irreducible Euclidean Jordan algebras, the unique exception being the triality-symmetric N = 2 stu model), we then derive a fundamental identity involving the unique rank-4 symmetric invariant tensor of the 0-brane charge symplectic irrepr. of U-duality groups, with potential applications in the quantization of the charge orbits of supergravity theories, as well as in the study of mult...

  14. AMERICAN EXCEPTIONALISM

    Directory of Open Access Journals (Sweden)

    Oana-Andreea Pirnuta

    2017-11-01

    Full Text Available In an interconnected world where foreign relations matter not only for resources or military alliances but also for cultural relationships, it is highly important to have a better understanding of the power relations among nations. Information carries meanings that have important outcomes, thus defining the power of a given nation. Foreign policy is the channel through which global politics is exercised. International politics is a hierarchy of power determined by important cultural, economic and geographical aspects. The reasons and strategies used to reach outcomes in global politics represent the focus of the present paper. The United States has been the leader in international politics since the early 20th century due to its vast resources and wealth as well as its cultural output. America’s interest in preserving a democratic and free world has its foundation in the beliefs and values it stands for. The aim of this paper is to question whether or not there is a concrete premise for the idea of American exceptionalism.

  15. Exception handling for sensor fusion

    Science.gov (United States)

    Chavez, G. T.; Murphy, Robin R.

    1993-08-01

    This paper presents a control scheme for handling sensing failures (sensor malfunctions, significant degradations in performance due to changes in the environment, and errant expectations) in sensor fusion for autonomous mobile robots. The advantages of the exception handling mechanism are that it emphasizes a fast response to sensing failures, is able to use only a partial causal model of sensing failure, and leads to a graceful degradation of sensing if the sensing failure cannot be compensated for. The exception handling mechanism consists of two modules: error classification and error recovery. The error classification module in the exception handler attempts to classify the type and source(s) of the error using a modified generate-and-test procedure. If the source of the error is isolated, the error recovery module examines its cache of recovery schemes, which either repair or replace the current sensing configuration. If the failure is due to an error in expectation or cannot be identified, the planner is alerted. Experiments using actual sensor data collected by the CSM Mobile Robotics/Machine Perception Laboratory's Denning mobile robot demonstrate the operation of the exception handling mechanism.
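
    A minimal sketch of the two-module structure described above (error classification by generate-and-test, then recovery from a cache of schemes); the class and scheme names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an exception handler for sensor fusion: classify the
# failure source with a generate-and-test loop, then look up a cached
# recovery scheme; alert the planner if classification or recovery fails.
class SensingExceptionHandler:
    def __init__(self, recovery_cache):
        # recovery_cache maps a failure source to a repair/replace action.
        self.recovery_cache = recovery_cache

    def classify(self, observation, candidate_sources, test):
        """Generate-and-test: return the first candidate source whose
        failure hypothesis is consistent with the observation, else None."""
        for source in candidate_sources:
            if test(source, observation):
                return source
        return None

    def handle(self, observation, candidate_sources, test, alert_planner):
        source = self.classify(observation, candidate_sources, test)
        if source is None:
            alert_planner("unidentified sensing failure or errant expectation")
            return None
        action = self.recovery_cache.get(source)
        if action is None:
            alert_planner(f"no recovery scheme cached for {source}")
            return None
        return action()  # repair or replace the current sensing configuration

# Example usage with toy stand-ins.
handler = SensingExceptionHandler({"camera": lambda: "switch to sonar"})
result = handler.handle(
    observation={"camera_variance": 9.7},
    candidate_sources=["camera", "sonar"],
    test=lambda s, obs: s == "camera" and obs["camera_variance"] > 5.0,
    alert_planner=print,
)
print(result)
```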

  16. A transactional model for automatic exception handling

    OpenAIRE

    Cabral, Bruno Miguel Brás

    2009-01-01

    Doctoral thesis in Informatics Engineering presented to the Faculty of Sciences and Technology of the University of Coimbra. Exception handling mechanisms have been around for more than 30 years. Although modern exception systems are not very different from the early models, the large majority of modern programming languages rely on exception handling constructs for dealing with errors and abnormal situations. Exceptions have several advantages over other error handling mechanisms, such as the return o...

  17. Trends in Modern Exception Handling

    Directory of Open Access Journals (Sweden)

    Marcin Kuta

    2003-01-01

    Full Text Available Exception handling is nowadays a necessary component of error-proof information systems. The paper presents an overview of techniques and models of exception handling, the problems connected with them, and potential solutions. The implementation of propagation mechanisms and exception handling, and their effect on semantics and overall program efficiency, are also taken into account. The mechanisms presented have been adopted in modern programming languages. In the areas of design, formal methods and formal verification of program properties, exception handling mechanisms are only weakly represented, which leaves a field open for future research.

  18. A simple method for assessment and minimization of errors in determination of electrophoretic or electroosmotic mobilities and velocities associated with the axial electric field distortion.

    Science.gov (United States)

    Nowak, Paweł Mateusz; Woźniakiewicz, Michał; Kościelniak, Paweł

    2015-12-01

    It is commonly accepted that modern CE instruments equipped with an efficient cooling system enable accurate determination of electrophoretic or electroosmotic mobilities. It is also often assumed that the velocity of migration in a given buffer is constant throughout the capillary length. It is simultaneously neglected that the noncooled parts of the capillary produce extensive Joule heating, leading to an axial electric field distortion, which contributes to a difference between the effective and nominal electric field potentials and between the velocities in the cooled and noncooled parts of the capillary. This simplification introduces systematic errors which, however, have so far not been investigated experimentally, and no method has been proposed for their elimination. We show a simple and fast method for estimating and eliminating these errors, based on a combination of long-end and short-end injections. We use it to study the effects caused by variation of temperature, electric field, capillary length, and pH. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. 20 CFR 220.179 - Exceptions to medical improvement.

    Science.gov (United States)

    2010-04-01

    ... disability decision was in error. The Board will apply the exception to medical improvement based on error if... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Exceptions to medical improvement. 220.179... Medical Improvement § 220.179 Exceptions to medical improvement. (a) First group of exceptions to medical...

  20. Avulsion of the left internal mammary artery graft after minimally invasive coronary surgery: fatal complication or medical error? A case report.

    Science.gov (United States)

    Viel, Guido; Balmaceda, Ute; Sperhake, Jan P

    2009-01-01

    Minimally invasive direct coronary artery bypass (MIDCAB) is performed through a left anterior mini-thoracotomy without the use of cardiopulmonary bypass and offers greater potential for rapid recovery, reduced pain and a decreased need for blood transfusion than conventional coronary artery bypass grafting. Few major complications of the MIDCAB procedure have been reported in the literature since the first intervention was performed in 1995, but the most serious one is avulsion of the left internal mammary artery (LIMA) graft near the site of anastomosis with the left anterior descending coronary artery. Forensic issues regarding the role of the surgeon in causing this life-threatening emergency condition have not been discussed. We report here the case of a 48-year-old man who died of massive thoracic bleeding due to avulsion of the LIMA graft 18 days after a MIDCAB procedure. We discuss the probable etiopathogenesis of this fatal complication from a forensic point of view.

  1. Action errors, error management, and learning in organizations.

    Science.gov (United States)

    Frese, Michael; Keith, Nina

    2015-01-03

    Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.

  2. Exceptional composite dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Ballesteros, Guillermo [Universite Paris Saclay, CEA, CNRS, Institut de Physique Theorique, Gif-sur-Yvette (France); Carmona, Adrian [CERN, Theoretical Physics Department, Geneva (Switzerland); Chala, Mikael [Universitat de Valencia y IFIC, Universitat de Valencia-CSIC, Departament de Fisica Teorica, Burjassot, Valencia (Spain)

    2017-07-15

    We study the dark matter phenomenology of non-minimal composite Higgs models with SO(7) broken to the exceptional group G{sub 2}. In addition to the Higgs, three pseudo-Nambu-Goldstone bosons arise, one of which is electrically neutral. A parity symmetry is enough to ensure this resonance is stable. In fact, if the breaking of the Goldstone symmetry is driven by the fermion sector, this Z{sub 2} symmetry is automatically unbroken in the electroweak phase. In this case, the relic density, as well as the expected indirect, direct and collider signals are then uniquely determined by the value of the compositeness scale, f. Current experimental bounds allow one to account for a large fraction of the dark matter of the Universe if the dark matter particle is part of an electroweak triplet. The totality of the relic abundance can be accommodated if instead this particle is a composite singlet. In both cases, the scale f and the dark matter mass are of the order of a few TeV. (orig.)

  3. Error Minimization of Polynomial Approximation of Delta

    Indian Academy of Sciences (India)

    The difference between Universal Time (UT) and Dynamical Time (TD), known as Delta T (ΔT), is tabulated for the first day of each year in the Astronomical Almanac. During the last four centuries it is found that there are large differences between its values for two consecutive years. Polynomial approximations have been ...
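
    A minimal sketch of the underlying task, assuming a handful of approximate Delta T values: fit candidate polynomials to the tabulated data and compare their residual errors. The year/Delta T pairs below are rough illustrative numbers, not the Astronomical Almanac tabulation.

```python
# Minimal sketch of fitting low-degree polynomials to tabulated Delta T values
# and picking the degree with the smallest residual error.
import numpy as np

years = np.array([1990, 1995, 2000, 2005, 2010, 2015], dtype=float)
delta_t = np.array([56.9, 60.8, 63.8, 64.7, 66.1, 67.6])  # seconds, approximate

best = None
for degree in range(1, 4):
    coeffs = np.polyfit(years, delta_t, degree)
    resid = delta_t - np.polyval(coeffs, years)
    rms = float(np.sqrt(np.mean(resid ** 2)))
    if best is None or rms < best[1]:
        best = (degree, rms, coeffs)
    print(f"degree {degree}: RMS error {rms:.3f} s")

print("lowest-error degree:", best[0])
```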

  4. Preliminary evaluation of an algorithm to minimize the power error selection of an aspheric intraocular lens by optimizing the estimation of the corneal power and the effective lens position

    Directory of Open Access Journals (Sweden)

    David P. Piñero

    2016-06-01

    Full Text Available AIM: To evaluate the refractive predictability achieved with an aspheric intraocular lens (IOL) and to develop a preliminary optimized algorithm for the calculation of its power (PIOL). METHODS: This study included 65 eyes implanted with the aspheric IOL LENTIS L-313 (Oculentis GmbH) that were divided into 2 groups: 12 eyes (8 patients) with PIOL ≥ 23.0 D (group A), and 53 eyes (35 patients) with PIOL < 23.0 D (group B). An adjusted IOL power (PIOLadj) was calculated considering a variable refractive index for corneal power estimation, the refractive outcome obtained, and an adjusted effective lens position (ELPadj) according to age and anatomical factors. RESULTS: Postoperative spherical equivalent ranged from -0.75 to +0.75 D and from -1.38 to +0.75 D in groups A and B, respectively. No statistically significant differences were found in groups A (P=0.64) and B (P=0.82) between PIOLadj and the IOL power implanted (PIOLReal). The Bland and Altman analysis showed ranges of agreement between PIOLadj and PIOLReal of +1.11 to -0.96 D and +1.14 to -1.18 D in groups A and B, respectively. Clinically and statistically significant differences were found between PIOLadj and the PIOL obtained with the Hoffer Q and Holladay I formulas. CONCLUSION: The refractive predictability of cataract surgery with implantation of an aspheric IOL can be optimized using paraxial optics combined with linear algorithms to minimize the error associated with the estimation of corneal power and ELP.

  5. Medication Errors

    Science.gov (United States)


  6. Error Patterns

    NARCIS (Netherlands)

    Hoede, C.; Li, Z.

    2001-01-01

    In coding theory the problem of decoding focuses on error vectors. In the simplest situation code words are $(0,1)$-vectors, as are the received messages and the error vectors. Comparison of a received word with the code words yields a set of error vectors. In deciding on the original code word,

  7. Exceptional Cable Television.

    Science.gov (United States)

    Hunt, Edmund B.; Reid, John E., Jr.

    1987-01-01

    Ways in which the resources of a university's special education, communication arts, and library services can be combined with those of special education consortiums or parent organizations to provide exceptional children and their parents and teachers with high-quality cable educational television programs that meet their varied needs are…

  8. On exceptional instanton strings

    NARCIS (Netherlands)

    Del Zotto, M.; Lockhart, G.

    According to a recent classification of 6d (1, 0) theories within F-theory there are only six “pure” 6d gauge theories which have a UV superconformal fixed point. The corresponding gauge groups are SU(3), SO(8), F4, E6, E7, and E8. These exceptional models have BPS strings which are also instantons

  9. Exceptionalism and globalism

    Directory of Open Access Journals (Sweden)

    John Cairns Jr

    2001-03-01

    Full Text Available ABSTRACT: Achieving sustainable use of the planet will require ethical judgments in both sciences and environmental politics. The purpose of this editorial is to discuss two paradigms, exceptionalism and globalism, that are important in this regard. Exceptionalism is the insistence that one set of rules or behaviors is acceptable for an individual or country but that a different set should be used for the rest of the world. For example, the disparity in per capita consumption of resources and economic status has increased dramatically in the last century, but the consumers of great amounts of resources do not feel a proportionate responsibility for addressing this issue. Globalism is defined as individual and societal willingness to diminish, postpone or forgo individual natural resource use to protect and enhance the integrity of the global ecological life support system. Increasing affluence and the still increasing human population, coupled with wide dissemination of information and an increasing awareness that humans occupy a finite planet, exacerbate this already difficult situation. Increased interest in sustainable use of the planet makes open discussion of these issues mandatory because individuals cannot function in isolation from the larger society of which they are a part. Similarly, no country can function in isolation from other countries, which collectively form an interactive mosaic. This discussion identifies some of the crucial issues related to exceptionalism and globalism, which must be addressed before sustainable use of the planet can be achieved.

  10. Taxonomic minimalism.

    Science.gov (United States)

    Beattie, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. Copyright © 1994. Published by Elsevier Ltd.

  11. The Implication of Diagnostic Errors.

    Science.gov (United States)

    Govindarajan, Raghav

    2017-10-01

    Diagnostic errors are mistakes in the diagnostic process that lead to a misdiagnosis, a missed diagnosis, or a delayed diagnosis. While the past decade's impetus to improve patient safety has focused on medication errors, health care-associated infections, and postsurgical complications, diagnostic errors have received comparatively less attention. Diagnostic errors will continue to play a major role in the patient safety and quality improvement movement because of their burden on care and their financial burden. Developing a patient-partnered diagnostic approach with self-reflection and awareness of cognitive biases is the key to minimizing the impact of diagnostic errors.

  12. Exceptionality in vowel harmony

    Science.gov (United States)

    Szeredi, Daniel

    Vowel harmony has been of great interest in phonological research. It has been widely accepted that vowel harmony is a phonetically natural phenomenon, which means that it is a common pattern because it provides advantages to the speaker in articulation and to the listener in perception. Exceptional patterns have proved to be a challenge to the phonetically grounded analysis as they, by their nature, introduce phonetically disadvantageous sequences to the surface form, consisting of harmonically different vowels. Such forms are found, for example, in the Finnish stem tuoli 'chair' or in the Hungarian suffixed form hi:d-hoz 'to the bridge', both word forms containing a mix of front and back vowels. Evidence has recently been presented that there might be a phonetic-level explanation for some exceptional patterns, namely the possibility that some vowels participating in irregular stems (like the vowel [i] in the Hungarian stem hi:d 'bridge' above) differ in some small phonetic detail from vowels in regular stems. The main question has not been raised, though: does this phonetic detail matter for speakers? Would they use these minor differences when they have to categorize a new word as regular or irregular? A different recent trend explains morphophonological exceptionality by looking at the phonotactic regularities characteristic of classes of stems based on their morphological behavior. Studies have shown that speakers are aware of these regularities and use them as cues when they have to decide what class a novel stem belongs to. These sublexical phonotactic regularities have already been shown to be present in some exceptional patterns of vowel harmony, but many questions remain open: how is learning the static generalization linked to learning the allomorph selection facet of vowel harmony? How much does the effect of consonants on vowel harmony matter, when compared to the effect of vowel-to-vowel correspondences? This dissertation aims to test these two ideas

  13. Skylab water balance error analysis

    Science.gov (United States)

    Leonard, J. I.

    1977-01-01

    Estimates of the precision of the net water balance were obtained for the entire Skylab preflight and inflight phases as well as for the first two weeks of flight. Quantitative estimates of both total sampling errors and instrumentation errors were obtained. It was shown that measurement error is minimal in comparison to biological variability and little can be gained from improvement in analytical accuracy. In addition, a propagation of error analysis demonstrated that total water balance error could be accounted for almost entirely by the errors associated with body mass changes. Errors due to interaction between terms in the water balance equation (covariances) represented less than 10% of the total error. Overall, the analysis provides evidence that daily measurements of body water changes obtained from the indirect balance technique are reasonable, precise, and reliable. The method is not biased toward net retention or loss.

  14. Medical error

    African Journals Online (AJOL)

    QuickSilver

    Studies in the USA have shown that medical error is the 8th most common cause of death.2,3. The most common causes of medical error are: administration of the wrong medication or the wrong dose of the correct medication, using the wrong route of administration, and giving a treatment to the wrong patient or at the wrong time.4 ...

  15. Is India the Exception?

    DEFF Research Database (Denmark)

    Nielsen, Klaus; Storm, Rasmus K.

    India is still the extreme under-achiever in international sport competitions. Whereas in China high growth rates have been accompanied by a huge improvement in its ranking in international sport events, a similar impact of extraordinary growth rates is seemingly totally absent in the case of India. Is India an exception? Several econometric studies have shown that income per capita is a significant variable explaining elite sport results, such as results in the Olympic Games. From this stylized fact follows the hypothesis that 'above/below average growth rates lead to relative improvements/deterioration of elite sport results (with a time lag)'. However, this has not previously been tested, and the contingencies explaining the seemingly widely different developments in countries such as China and India have not been explored. This paper tests the above hypothesis by means of a study of the correlation

  16. Is India the Exception?

    DEFF Research Database (Denmark)

    Nielsen, Klaus; Storm, Rasmus K.

    2013-01-01

    India is the extreme under-achiever in international sport competitions. This has only marginally changed with the recent promotion of the Indian economy into the league of BRIC nations. Whereas in China high growth rates have been accompanied by a huge improvement of its performance in international sport events, a similar impact of extraordinary growth rates has been almost totally absent in the case of India. Is India an exception? Several econometric studies have shown that income per capita is a significant variable explaining elite sport results such as results in the Olympic Games. From

  17. Giftedness: an exceptionality examined.

    Science.gov (United States)

    Robinson, A; Clinkenbeard, P R

    1998-01-01

    The study of giftedness has practical origins. High-level performance intrigues people. Theoretically, the study of giftedness is related to the psychology of individual differences and has focused on the constructs of intelligence, creativity, and motivation. At a practical level, the research is largely related to school and family contexts, which develop gifts and talents in children and youth. Although broadened definitions of giftedness have emerged, the most extensive body of research available for review concentrates on intellectual giftedness. The varying definitions of giftedness and the impact of social context and diversity on the development of talent pose significant challenges for the field. Finally, the study of exceptionally advanced performance provides insight into basic psychological processes and the school contexts that develop talents in children and youth.

  18. Refractive Errors

    Science.gov (United States)

    ... halos around bright lights, squinting, headaches, or eye strain. Glasses or contact lenses can usually correct refractive errors. Laser eye surgery may also be a possibility. NIH: National Eye ...

  19. On exceptional instanton strings

    Science.gov (United States)

    Del Zotto, Michele; Lockhart, Guglielmo

    2017-09-01

    According to a recent classification of 6d (1,0) theories within F-theory there are only six "pure" 6d gauge theories which have a UV superconformal fixed point. The corresponding gauge groups are SU(3), SO(8), F4, E6, E7, and E8. These exceptional models have BPS strings which are also instantons for the corresponding gauge groups. For G simply-laced, we determine the 2d N=(0,4) worldsheet theories of such BPS instanton strings by a simple geometric engineering argument. These are given by a twisted S2 compactification of the 4d N=2 theories of type H2, D4, E6, E7 and E8 (and their higher rank generalizations), where the 6d instanton number is mapped to the rank of the corresponding 4d SCFT. This determines their anomaly polynomials and, via topological strings, establishes an interesting relation among the corresponding T2 × S2 partition functions and the Hilbert series for moduli spaces of G instantons. Such relations allow one to bootstrap the corresponding elliptic genera by modularity. As an example of such a procedure, the elliptic genera for a single instanton string are determined. The same method also fixes the elliptic genus for the case of one F4 instanton. These results unveil a rather surprising relation with the Schur index of the corresponding 4d N=2 models.

  20. New Nordic Exceptionalism

    DEFF Research Database (Denmark)

    Danbolt, Mathias

    2016-01-01

    This article takes Kim and Einhorn’s intervention as a starting point for a critical discussion of the history and politics of Nordic image-building. The article suggests that the reason Kim and Einhorn’s speech passed as a serious proposal was due to its meticulous mimicking of two discursive formations that have been central to the debates on the branding of Nordicity over the last decades: on the one hand, the discourse of “Nordic exceptionalism,” that since the 1960s has been central to the promotion of a Nordic political, socio-economic, and internationalist “third way” model, and, on the other hand, the discourse on the “New Nordic,” that emerged out of the New Nordic Food-movement in the early 2000s, and which has given art and culture a privileged role in the international re-fashioning of the Nordic brand. Through an analysis of Kim and Einhorn’s United Nations of Norden (UNN)-performance, the article

  1. [DIAGNOSTIC ERRORS IN INTERNAL MEDICINE].

    Science.gov (United States)

    Schattner, Ami

    2017-02-01

    Diagnostic errors remain an important target in improving the quality of care and achieving better health outcomes. With a relatively steady rate estimated at 10-15% in many settings, research aiming to elucidate mechanisms of error is highly important. Results indicate that not only cognitive mistakes but a number of factors acting together often culminate in a diagnostic error. Far from being 'unpreventable', several methods and techniques are suggested that may show promise in minimizing diagnostic errors. These measures should be further investigated and incorporated into all phases of medical education.

  2. Refractive errors.

    Science.gov (United States)

    Schiefer, Ulrich; Kraus, Christina; Baumbach, Peter; Ungewiß, Judith; Michels, Ralf

    2016-10-14

    All over the world, refractive errors are among the most frequently occurring treatable disturbances of visual function. Ametropias have a prevalence of nearly 70% among adults in Germany and are thus of great epidemiologic and socio-economic relevance. In the light of their own clinical experience, the authors review pertinent articles retrieved by a selective literature search employing the terms "ametropia," "anisometropia," "refraction," "visual acuity," and "epidemiology." In 2011, only 31% of persons over age 16 in Germany did not use any kind of visual aid; 63.4% wore eyeglasses and 5.3% wore contact lenses. Refractive errors were the most common reason for consulting an ophthalmologist, accounting for 21.1% of all outpatient visits. A pinhole aperture (stenopeic slit) is a suitable instrument for the basic diagnostic evaluation of impaired visual function due to optical factors. Spherical refractive errors (myopia and hyperopia), cylindrical refractive errors (astigmatism), unequal refractive errors in the two eyes (anisometropia), and the typical optical disturbance of old age (presbyopia) cause specific functional limitations and can be detected by a physician who does not need to be an ophthalmologist. Simple functional tests can be used in everyday clinical practice to determine quickly, easily, and safely whether the patient is suffering from a benign and easily correctable type of visual impairment, or whether there are other, more serious underlying causes.

  3. Specialized minimal PDFs for optimized LHC calculations

    National Research Council Canada - National Science Library

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-01-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets...

  4. 22 CFR 1423.28 - Briefs in support of exceptions; oppositions to exceptions; cross-exceptions.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Briefs in support of exceptions; oppositions to exceptions; cross-exceptions. 1423.28 Section 1423.28 Foreign Relations FOREIGN SERVICE LABOR RELATIONS BOARD... FOREIGN SERVICE IMPASSE DISPUTES PANEL FOREIGN SERVICE LABOR RELATIONS BOARD AND GENERAL COUNSEL OF THE...

  5. Specialized minimal PDFs for optimized LHC calculations

    NARCIS (Netherlands)

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-01-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct

  6. Assessment of the ''thermal normalization technique'' for measurement of neutron cross sections vs energy. [Above 20 keV, resonance, fission integrals, error minimization

    Energy Technology Data Exchange (ETDEWEB)

    Peelle, R.W.; de Saussure, G.

    1977-01-01

    Refined knowledge of the thermal neutron cross sections of the fissile nuclides and of the (n,α) reaction standards, together with the reasonably well known energy dependence of the latter, has permitted resonance-region and low-keV fissile nuclide cross sections to be based on these standards together with count-rate ratios observed as a function of energy using a pulsed "white" source. As one evaluates cross sections for energies above 20 keV, optimum results require combination of cross section shape measurements with all available absolute measurements. The assumptions of the "thermal normalization method" are reviewed, and an opinion is given of the status of some of the standards required for its use. The complications which may limit the accuracy of results using the method are listed and examples are given. For the 235U(n,f) cross section, the option is discussed of defining resonance-region fission integrals as standards. The area of the approximately 9 eV resonances in this nuclide may be known to one percent accuracy, but at present the fission integral from 0.1 to 1.0 keV is known to no better than about two percent. This uncertainty is based on the scatter among independent results, and has not been reduced by the most recent measurements. This uncertainty now limits the accuracy attainable for the 235U(n,f) cross section below about 50 keV. Suggestions are given to indicate how future detailed work might overcome past sources of error.

  7. 75 FR 28306 - Excepted Service

    Science.gov (United States)

    2010-05-20

    ... MANAGEMENT Excepted Service AGENCY: U.S. Office of Personnel Management (OPM). ACTION: Notice. SUMMARY: This... excepted service as required by 5 CFR 213.103. FOR FURTHER INFORMATION CONTACT: Roland Edwards, Senior Executive Resource Services, Employee Services, 202-606-2246. SUPPLEMENTARY INFORMATION: Appearing in the...

  8. 75 FR 3947 - Excepted Service

    Science.gov (United States)

    2010-01-25

    ... MANAGEMENT Excepted Service AGENCY: U.S. Office of Personnel Management (OPM). ACTION: Notice. SUMMARY: This... excepted service as required by 5 CFR 213.103. FOR FURTHER INFORMATION CONTACT: Roland Edwards, Senior Executive Resource Services, Employee Services, 202-606-2246. SUPPLEMENTARY INFORMATION: Appearing in the...

  9. Optimizing Processes to Minimize Risk

    Science.gov (United States)

    Loyd, David

    2017-01-01

    NASA, like other hazardous industries, has suffered catastrophic losses. Human error will likely never be completely eliminated as a factor in our failures. When you can't eliminate risk, focus on mitigating the worst consequences and recovering operations. Bolstering processes to emphasize the role of integration and problem solving is key to success. Building an effective Safety Culture bolsters skill-based performance that minimizes risk and encourages successful engagement.

  10. Errors in clinical laboratories or errors in laboratory medicine?

    Science.gov (United States)

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes

  11. Minimal Erythema Dose (MED) testing.

    Science.gov (United States)

    Heckman, Carolyn J; Chandler, Rachel; Kloss, Jacqueline D; Benson, Amy; Rooney, Deborah; Munshi, Teja; Darlow, Susan D; Perlis, Clifford; Manne, Sharon L; Oslin, David W

    2013-05-28

    Ultraviolet radiation (UV) therapy is sometimes used as a treatment for various common skin conditions, including psoriasis, acne, and eczema. The dosage of UV light is prescribed according to an individual's skin sensitivity. Thus, to establish the proper dosage of UV light to administer to a patient, the patient is sometimes screened to determine a minimal erythema dose (MED), which is the amount of UV radiation that will produce minimal erythema (sunburn or redness caused by engorgement of capillaries) of an individual's skin within a few hours following exposure. This article describes how to conduct minimal erythema dose (MED) testing. There is currently no easy way to determine an appropriate UV dose for clinical or research purposes without conducting formal MED testing, requiring observation hours after testing, or informal trial and error testing with the risks of under- or over-dosing. However, some alternative methods are discussed.
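
    A graded exposure series of the kind used in MED testing can be generated as below; the starting dose and step size are illustrative assumptions that in practice depend on skin phototype and the phototherapy unit, not values taken from the article.

```python
# Minimal sketch of generating the graded UV exposure series typically used
# in MED testing: a starting dose scaled by a fixed multiplicative step
# across a row of test sites.
def med_dose_series(start_dose_mj_cm2: float, step: float, n_sites: int):
    """Return UV doses (mJ/cm^2) for each test site, lowest to highest."""
    return [round(start_dose_mj_cm2 * step ** i, 1) for i in range(n_sites)]

# Example: six sites, 25% increments starting at 20 mJ/cm^2.
print(med_dose_series(20.0, 1.25, 6))  # [20.0, 25.0, 31.2, 39.1, 48.8, 61.0]
```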

  12. 77 FR 74196 - Draft Guidance for Industry on Safety Considerations for Product Design To Minimize Medication...

    Science.gov (United States)

    2012-12-13

    ... Product Design To Minimize Medication Errors; Availability AGENCY: Food and Drug Administration, HHS... guidance for industry entitled ``Safety Considerations for Product Design to Minimize Medication Errors... using a systems approach to minimize medication errors relating to product design. The draft guidance...

  13. Error Budgeting

    Energy Technology Data Exchange (ETDEWEB)

    Vinyard, Natalia Sergeevna [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Perry, Theodore Sonne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Usov, Igor Olegovich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-04

    We calculate opacity from k(hν) = -ln[T(hν)]/(ρL), where T(hν) is the transmission for photon energy hν, ρ is the sample density, and L is the path length through the sample. The density and path length are measured together by Rutherford backscatter. Δk = (∂k/∂T)ΔT + (∂k/∂(ρL))Δ(ρL). We can re-write this in terms of fractional error as Δk/k = Δln(T)/ln(T) + Δ(ρL)/(ρL). Transmission itself is calculated from T = (U-E)/(V-E) = B/B0, where B is the transmitted backlighter (BL) signal and B0 is the unattenuated backlighter signal. Then ΔT/T = Δln(T) = ΔB/B + ΔB0/B0, and consequently Δk/k = (1/ln T)(ΔB/B + ΔB0/B0) + Δ(ρL)/(ρL). Transmission is measured in the range of 0.2
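
    The error budget above can be evaluated directly; the sketch below, with illustrative input uncertainties, combines the backlighter-signal and areal-density terms into a fractional opacity error.

```python
# Minimal sketch of the fractional-error budget written out above:
# k = -ln(T)/(rho*L) and T = B/B0, so the fractional opacity error combines
# the backlighter-signal errors (scaled by 1/ln T) with the areal-density
# error. The input uncertainties are illustrative assumptions.
import math

def opacity_fractional_error(T, dB_over_B, dB0_over_B0, dRhoL_over_RhoL):
    dT_over_T = dB_over_B + dB0_over_B0          # since T = B/B0
    return abs(dT_over_T / math.log(T)) + dRhoL_over_RhoL

# Example: 40% transmission, 2% error on each backlighter signal,
# 3% error on the Rutherford-backscatter areal density.
print(f"{opacity_fractional_error(0.40, 0.02, 0.02, 0.03):.3f}")  # ~ 0.074
```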

  14. On the Error State Selection for Stationary SINS Alignment and Calibration Kalman Filters-Part II: Observability/Estimability Analysis.

    Science.gov (United States)

    Silva, Felipe O; Hemerly, Elder M; Leite Filho, Waldemar C

    2017-02-23

    This paper presents the second part of a study aiming at the error state selection in Kalman filters applied to the stationary self-alignment and calibration (SSAC) problem of strapdown inertial navigation systems (SINS). The observability properties of the system are systematically investigated, and the number of unobservable modes is established. Through the analytical manipulation of the full SINS error model, the unobservable modes of the system are determined, and the SSAC error states (except the velocity errors) are proven to be individually unobservable. The estimability of the system is determined through the examination of the major diagonal terms of the covariance matrix and their eigenvalues/eigenvectors. Filter order reduction based on observability analysis is shown to be inadequate, and several misconceptions regarding SSAC observability and estimability deficiencies are removed. As the main contributions of this paper, we demonstrate that, except for the position errors, all error states can be minimally estimated in the SSAC problem and, hence, should not be removed from the filter. Corroborating the conclusions of the first part of this study, a 12-state Kalman filter is found to be the optimal error state selection for SSAC purposes. Results from simulated and experimental tests support the outlined conclusions.
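
    A minimal sketch of the two checks mentioned above, applied to a toy linear model rather than the full SINS error model: count unobservable modes from the rank of the observability matrix, and inspect covariance eigenvalues as an estimability hint.

```python
# Minimal sketch: build the observability matrix for a discrete-time error
# model (F, H), count unobservable modes, and inspect covariance eigenvalues
# from a filter run. F, H and P are toy stand-ins, not the SINS error model.
import numpy as np

def observability_matrix(F, H):
    n = F.shape[0]
    blocks = [H @ np.linalg.matrix_power(F, k) for k in range(n)]
    return np.vstack(blocks)

F = np.array([[1.0, 0.1, 0.0],
              [0.0, 1.0, 0.1],
              [0.0, 0.0, 1.0]])      # toy state-transition matrix
H = np.array([[1.0, 0.0, 0.0]])      # toy measurement of the first state only

O = observability_matrix(F, H)
rank = np.linalg.matrix_rank(O)
print(f"unobservable modes: {F.shape[0] - rank}")

# Estimability hint: small eigenvalues of the steady-state covariance indicate
# well-estimated directions; large ones indicate poorly estimable combinations.
P = np.diag([1e-4, 1e-2, 1.0])        # stand-in covariance after filtering
eigvals, eigvecs = np.linalg.eigh(P)
print("covariance eigenvalues:", eigvals)
```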

  15. Numerical optimization with computational errors

    CERN Document Server

    Zaslavski, Alexander J

    2016-01-01

    This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking into account computational errors. The author illustrates that algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods and Newton's meth...

  16. The history of AIDS exceptionalism

    Directory of Open Access Journals (Sweden)

    Smith Julia H

    2010-12-01

    Full Text Available Abstract In the history of public health, HIV/AIDS is unique; it has widespread and long-lasting demographic, social, economic and political impacts. The global response has been unprecedented. AIDS exceptionalism - the idea that the disease requires a response above and beyond "normal" health interventions - began as a Western response to the originally terrifying and lethal nature of the virus. More recently, AIDS exceptionalism came to refer to the disease-specific global response and the resources dedicated to addressing the epidemic. There has been a backlash against this exceptionalism, with critics claiming that HIV/AIDS receives a disproportionate amount of international aid and health funding. This paper situates this debate in historical perspective. By reviewing histories of the disease, policy developments and funding patterns, it charts how the meaning of AIDS exceptionalism has shifted over three decades. It argues that while the connotation of the term has changed, the epidemic has maintained its course, and therefore some of the justifications for exceptionalism remain.

  17. Human Error and Organizational Management

    Directory of Open Access Journals (Sweden)

    Alecxandrina DEACONU

    2009-01-01

    Full Text Available The concern for performance is a topic that raises interest in the business environment but also in other areas that – even if they seem distant from this world – are aware of, interested in or conditioned by economic development. As individual performance is very much influenced by the human resource, we chose to analyze in this paper the mechanisms that generate – consciously or not – human error nowadays. Moreover, the extremely tense Romanian context, where failure is rather the rule than the exception, made us investigate the phenomenon of generating a human error and the ways to diminish its effects.

  18. Exceptional cognitive ability: the phenotype.

    Science.gov (United States)

    Lubinski, David

    2009-07-01

    Characterizing the outcomes related to the phenotype of exceptional cognitive abilities has been feasible in recent years due to the availability of large samples of intellectually precocious adolescents identified by modern talent searches that have been followed-up longitudinally over multiple decades. The level and pattern of cognitive abilities, even among participants within the top 1% of general intellectual ability, are related to differential developmental trajectories and important life accomplishments: The likelihood of earning a doctorate, earning exceptional compensation, publishing novels, securing patents, and earning tenure at a top university (and the academic disciplines within which tenure is most likely to occur) all vary as a function of individual differences in cognitive abilities assessed decades earlier. Individual differences that distinguish the able (top 1 in 100) from the exceptionally able (top 1 in 10,000) during early adolescence matter in life, and, given the heritability of general intelligence, they suggest that understanding the genetic and environmental origins of exceptional abilities should be a high priority for behavior genetic research, especially because the results for extreme groups could differ from the rest of the population. In addition to enhancing our understanding of the etiology of general intelligence at the extreme, such inquiry may also reveal fundamental determinants of specific abilities, like mathematical versus verbal reasoning, and the distinctive phenotypes that contrasting ability patterns are most likely to eventuate in at extraordinary levels.

  19. 78 FR 4881 - Excepted Service

    Science.gov (United States)

    2013-01-23

    ... MANAGEMENT Excepted Service AGENCY: U.S. Office of Personnel Management (OPM). ACTION: Notice. SUMMARY: This... Executive Resources Services, Executive Resources and Employee Development, Employee Services, 202- 606-2246.../2012 Headquarters Services. Office of Assistant Speechwriter......... DD130012 11/9/2012 Secretary of...

  20. Exceptional and Spinorial Conformal Windows

    DEFF Research Database (Denmark)

    Mojaza, Matin; Pica, Claudio; Ryttov, Thomas

    2012-01-01

    We study the conformal window of gauge theories containing fermionic matter fields, where the gauge group is any of the exceptional groups with the fermions transforming according to the fundamental and adjoint representations and the orthogonal groups where the fermions transform according...

  1. Learned Helplessness in Exceptional Children.

    Science.gov (United States)

    Brock, Herman B.; Kowitz, Gerald T.

    The research literature on learned helplessness in exceptional children is reviewed and the authors' efforts to identify and retrain learning disabled (LD) children who have characteristics typical of learned helplessness are reported. Twenty-eight elementary aged LD children viewed as "learned helpless" were randomly assigned to one of four…

  2. The Exceptional State in Africa

    DEFF Research Database (Denmark)

    Suzuki, Shogo

    2013-01-01

    China's relations with African states have undergone significant changes in recent years. China has projected its relationship with Africa as one of equality and ‘mutual help’. Such perceptions of foreign policy stem from the Five Principles of Peaceful Coexistence and the shared experience...... of imperialist domination and economic underdevelopment. Moreover, various public statements by China's elites suggest that China is expected to play a much more prominent, even exceptional role in Africa. This purportedly entails moving beyond the hegemonic West's interventionist aid or security policies......, and is also implicitly designed to highlight the West's shortcomings in promoting African economic growth or peace. Yet where does this perception of exceptionalism come from? Why does Beijing feel that it has to play a leading role in Africa's development? How can Beijing distinguish itself from the nations...

  3. Modeling coherent errors in quantum error correction

    Science.gov (United States)

    Greenbaum, Daniel; Dutton, Zachary

    2018-01-01

    Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ɛ) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ε^-(dn-1) error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.
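    To make the qualitative point concrete, here is a minimal sketch (my own illustration, not code or formulas from the paper) for a single unencoded qubit: a coherent over-rotation by a small angle accumulates across cycles very differently from its Pauli-twirled, stochastic bit-flip approximation. The angle eps, the cycle count, and the closed-form flip probabilities below are assumptions chosen only to display that contrast.

    ```python
    import numpy as np

    # Toy comparison for a single unencoded qubit (not the repetition code):
    # a coherent over-rotation by angle eps per cycle versus its Pauli-twirled
    # approximation, an independent bit flip with probability p = sin^2(eps/2).
    eps = 0.01
    cycles = np.arange(1, 201)

    # Coherent: n rotations compose into one rotation by n*eps, so the flip
    # probability sin^2(n*eps/2) grows quadratically while n*eps is small.
    p_coherent = np.sin(cycles * eps / 2) ** 2

    # Stochastic: the net flip probability (odd number of flips) grows
    # essentially linearly in the number of cycles.
    p = np.sin(eps / 2) ** 2
    p_pauli = 0.5 * (1.0 - (1.0 - 2.0 * p) ** cycles)

    print(f"after 100 cycles: coherent {p_coherent[99]:.4f}, Pauli {p_pauli[99]:.4f}")
    ```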

  4. Geometric phase around exceptional points

    OpenAIRE

    Mailybaev, Alexei; Kirillov, Oleg; Seyranian, Alexander,

    2005-01-01

    A wave function picks up, in addition to the dynamic phase, the geometric (Berry) phase when traversing adiabatically a closed cycle in parameter space. We develop a general multidimensional theory of the geometric phase for (double) cycles around exceptional degeneracies in non-Hermitian Hamiltonians. We show that the geometric phase is exactly π for symmetric complex Hamiltonians of arbitrary dimension and for nonsymmetric non-Hermitian Hamiltonians of dimension 2. For nonsymmetric non-...

  5. Errors and erasures decoding for interference channels

    Science.gov (United States)

    Stuber, G.; Mark, J.; Blake, I.

    This paper examines the gains to be realized by using an errors-and-erasures decoding strategy on a pulse jammed channel, rather than errors-only, for concatenated Reed-Solomon/orthogonal codes. The work is largely computational and serves to quantify the gains achievable in such a situation. Previous studies have shown that the gains of errors-and-erasures over errors-only on the AWGN channel are minimal. The gains reported here for the pulse jammed channel are more substantial, particularly for the lower duty cycles.
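    For context, a standard coding-theory bound (assumed background knowledge, not a result reported in this study) explains why flagging jammed symbols as erasures helps: a Reed-Solomon code of length n and dimension k has minimum distance d_min = n − k + 1, and errors-and-erasures decoding succeeds whenever

    ```latex
    % Standard errors-and-erasures decoding condition (assumed background):
    2e + f \le d_{\min} - 1, \qquad d_{\min} = n - k + 1
    % e = number of symbol errors, f = number of erasures.
    % Example: RS(255,223) has d_min = 33, so it corrects 16 errors alone,
    % or 10 errors together with 12 erasures, since 2(10) + 12 = 32 <= 32.
    ```

    Each erasure thus consumes only half the redundancy that an undetected error does, which is consistent with the larger gains reported here for the pulse jammed channel, where unreliable symbols can be flagged and erased.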

  6. Increasingly minimal bias routing

    Science.gov (United States)

    Bataineh, Abdulla; Court, Thomas; Roweth, Duncan

    2017-02-21

    A system and algorithm configured to generate diversity at the traffic source so that packets are uniformly distributed over all of the available paths, but to increase the likelihood of taking a minimal path with each hop the packet takes. This is achieved by configuring routing biases so as to prefer non-minimal paths at the injection point, but increasingly prefer minimal paths as the packet proceeds, referred to herein as Increasing Minimal Bias (IMB).
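    As a rough sketch of the idea (my own illustration, not taken from the patent record: the function name, the linear bias schedule, and the max_hops parameter are invented here), a per-hop output-port choice under an increasing minimal bias might look like this:

    ```python
    import random

    def choose_output_port(minimal_ports, non_minimal_ports, hops_taken, max_hops=4):
        """Prefer non-minimal ports at injection (for path diversity) and
        increasingly prefer minimal ports as the packet accumulates hops."""
        bias = min(1.0, hops_taken / max_hops)  # 0 at injection, -> 1 with age
        if minimal_ports and (not non_minimal_ports or random.random() < bias):
            return random.choice(minimal_ports)
        return random.choice(non_minimal_ports)

    # At injection (hops_taken=0) the non-minimal ports usually win;
    # after max_hops hops the minimal ports are always chosen.
    print(choose_output_port(["m0"], ["n0", "n1"], hops_taken=0))
    print(choose_output_port(["m0"], ["n0", "n1"], hops_taken=4))
    ```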

  7. CMS : An exceptional load for an exceptional work site

    CERN Multimedia

    2001-01-01

    Components of the CMS vacuum tank have been delivered to the detector assembly site at Cessy. The complete inner shell was delivered to CERN by special convoy while the outer shell is being assembled in situ. The convoy transporting the inner shell of the CMS vacuum tank took a week to cover the distance between Lons-le-Saunier and Point 5 at Cessy. Left: the convoy making its way down from the Col de la Faucille. With lights flashing, flanked by police outriders and with roads temporarily closed, the exceptional load that passed through the Pays de Gex on Monday 20 May was accorded the same VIP treatment as a leading state dignitary. But this time it was not the identity of the passenger but the exceptional size of the object being transported that made such arrangements necessary. A convoy of two lorries was needed to transport the load, an enormous 13-metre long, 6 metre diameter cylinder weighing 120 tonnes. It took a week to cover the 120 kilometres between Lons-le-Saunier and the assembly site for...

  8. Eliminating US hospital medical errors.

    Science.gov (United States)

    Kumar, Sameer; Steinebach, Marc

    2008-01-01

    Healthcare costs in the USA have continued to rise steadily since the 1980s. Medical errors are one of the major causes of deaths and injuries of thousands of patients every year, contributing to soaring healthcare costs. The purpose of this study is to examine what has been done to deal with the medical-error problem in the last two decades and present a closed-loop mistake-proof operation system for surgery processes that would likely eliminate preventable medical errors. The design method used is a combination of creating a service blueprint, implementing the six sigma DMAIC cycle, developing cause-and-effect diagrams as well as devising poka-yokes in order to develop a robust surgery operation process for a typical US hospital. In the improve phase of the six sigma DMAIC cycle, a number of poka-yoke techniques are introduced to prevent typical medical errors (identified through cause-and-effect diagrams) that may occur in surgery operation processes in US hospitals. It is the authors' assertion that implementing the new service blueprint along with the poka-yokes will likely result in the current medical error rate improving significantly, to the six-sigma level. Additionally, designing as many redundancies as possible in the delivery of care will help reduce medical errors. Primary healthcare providers should strongly consider investing in adequate doctor and nurse staffing, and improving their education related to the quality of service delivery to minimize clinical errors. This will lead to an increase in fixed costs, especially in the shorter time frame. This paper focuses on the additional attention needed to make a sound technical and business case for implementing six sigma tools to eliminate medical errors that will enable hospital managers to increase their hospital's profitability in the long run and also ensure patient safety.

  9. Exceptional groups from open strings

    OpenAIRE

    Gaberdiel, Matthias R.; Zwiebach, Barton

    1997-01-01

    We consider type IIB theory compactified on a two-sphere in the presence of mutually nonlocal 7-branes. The BPS states associated with the gauge vectors of exceptional groups are seen to arise from open strings connecting the 7-branes, and multi-pronged open strings capable of ending on more than two 7-branes. These multi-pronged strings are built from open string junctions that arise naturally when strings cross 7-branes. The different string configurations can be multiplied as traditional o...

  10. Exceptional Family Member Program EFM

    Science.gov (United States)

    1996-01-01

    [Scanned-document fragment: points of contact for the Exceptional Family Member Program, including the American Cleft Palate Foundation, an Alzheimer's association, the Easter Seal Society's Early Intervention Program for infants with special needs, the American Liver Foundation, and the National Cancer Institute, together with their contact telephone numbers.]

  11. Exceptional geometry and Borcherds superalgebras

    Energy Technology Data Exchange (ETDEWEB)

    Palmkvist, Jakob [Mitchell Institute for Fundamental Physics and Astronomy, Texas A& M University,College Station, TX 77843 (United States)

    2015-11-05

    We study generalized diffeomorphisms in exceptional geometry with U-duality group E_{n(n)} from an algebraic point of view. By extending the Lie algebra e_n to an infinite-dimensional Borcherds superalgebra, involving also the extension to e_{n+1}, the generalized Lie derivatives can be expressed in a simple way, and the expressions take the same form for any n≤7. The closure of the transformations then follows from the Jacobi identity and the grading of e_{n+1} with respect to e_n.

  12. Exceptional geometry and Borcherds superalgebras

    Science.gov (United States)

    Palmkvist, Jakob

    2015-11-01

    We study generalized diffeomorphisms in exceptional geometry with U-duality group E_{n(n)} from an algebraic point of view. By extending the Lie algebra e_n to an infinite-dimensional Borcherds superalgebra, involving also the extension to e_{n+1}, the generalized Lie derivatives can be expressed in a simple way, and the expressions take the same form for any n ≤ 7. The closure of the transformations then follows from the Jacobi identity and the grading of e_{n+1} with respect to e_n.

  13. Loops in exceptional field theory

    Energy Technology Data Exchange (ETDEWEB)

    Bossard, Guillaume [Centre de Physique Théorique, Ecole Polytechnique, CNRS, Université Paris-Saclay,91128 Palaiseau cedex (France); Kleinschmidt, Axel [Max-Planck-Institut für Gravitationsphysik (Albert-Einstein-Institut),Am Mühlenberg 1, DE-14476 Potsdam (Germany); International Solvay Institutes,ULB-Campus Plaine CP231, BE-1050 Brussels (Belgium)

    2016-01-27

    We study certain four-graviton amplitudes in exceptional field theory in dimensions D≥4 up to two loops. As the formulation is manifestly invariant under the U-duality group E_{11−D}(ℤ), our resulting expressions can be expressed in terms of automorphic forms. In the low energy expansion, we find terms in the M-theory effective action of type R^4, ∇^4R^4 and ∇^6R^4 with automorphic coefficient functions in agreement with independent derivations from string theory. This provides in particular an explicit integral formula for the exact string theory ∇^6R^4 threshold function. We exhibit moreover that the usual supergravity logarithmic divergences cancel out in the full exceptional field theory amplitude, within an appropriately defined dimensional regularisation scheme. We also comment on terms of higher derivative order and the role of the section constraint for possible counterterms.

  14. Correlates of minimal dating.

    Science.gov (United States)

    Leck, Kira

    2006-10-01

    Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.

  15. Minimally invasive orthognathic surgery.

    Science.gov (United States)

    Resnick, Cory M; Kaban, Leonard B; Troulis, Maria J

    2009-02-01

    Minimally invasive surgery is defined as the discipline in which operative procedures are performed in novel ways to diminish the sequelae of standard surgical dissections. The goals of minimally invasive surgery are to reduce tissue trauma and to minimize bleeding, edema, and injury, thereby improving the rate and quality of healing. In orthognathic surgery, there are two minimally invasive techniques that can be used separately or in combination: (1) endoscopic exposure and (2) distraction osteogenesis. This article describes the historical developments of the fields of orthognathic surgery and minimally invasive surgery, as well as the integration of the two disciplines. Indications, techniques, and the most current outcome data for specific minimally invasive orthognathic surgical procedures are presented.

  16. Minimally entangled typical thermal states versus matrix product purifications for the simulation of equilibrium states and time evolution

    Science.gov (United States)

    Binder, Moritz; Barthel, Thomas

    We compare matrix product purifications and minimally entangled typical thermal states (METTS) for the simulation of equilibrium states and finite-temperature response functions of strongly correlated quantum many-body systems. For METTS, we highlight the interplay of statistical and DMRG truncation errors, discuss the use of self-averaging effects, and describe schemes for the computation of response functions. We assess the computation costs and accuracies of the two methods for critical and gapped spin chains and the Bose-Hubbard model. For the same computation cost, purifications yield more accurate results than METTS except for temperatures well below the system's energy gap.

  17. Minimal Super Technicolor

    DEFF Research Database (Denmark)

    Antola, M.; Di Chiara, S.; Sannino, F.

    2011-01-01

    We introduce novel extensions of the Standard Model featuring a supersymmetric technicolor sector (supertechnicolor). As the first minimal conformal supertechnicolor model we consider N=4 Super Yang-Mills which breaks to N=1 via the electroweak interactions. This is a well defined, economical......, between unparticle physics and Minimal Walking Technicolor. We consider also other N =1 extensions of the Minimal Walking Technicolor model. The new models allow all the standard model matter fields to acquire a mass....

  18. 42 CFR 423.578 - Exceptions process.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Exceptions process. 423.578 Section 423.578 Public..., Redeterminations, and Reconsiderations § 423.578 Exceptions process. (a) Requests for exceptions to a plan's tiered... sponsor may design its exception process so that very high cost or unique drugs are not eligible for a...

  19. Least Squared Simulated Errors

    Directory of Open Access Journals (Sweden)

    Peter J. Veazie

    2015-03-01

    Full Text Available Estimation by minimizing the sum of squared residuals is a common method for estimating the parameters of regression functions; however, regression functions are not always known or of interest. Maximizing the likelihood function is an alternative if a distribution can be properly specified. However, cases can arise in which a regression function is not known, no additional moment conditions are indicated, and we have a distribution for the random quantities, but maximum likelihood estimation is difficult to implement. In this article, we present the least squared simulated errors (LSSE) estimator for such cases. The conditions for consistency and asymptotic normality are given. Finite sample properties are investigated via Monte Carlo experiments on two examples. Results suggest LSSE can perform well in finite samples. We discuss the estimator’s limitations and conclude that the estimator is a viable option. We recommend Monte Carlo investigation of any given model to judge bias for a particular finite sample size of interest and discern whether asymptotic approximations or resampling techniques are preferable for the construction of tests or confidence intervals.

  20. Medical errors in neurosurgery

    OpenAIRE

    Rolston, John D.; Zygourakis, Corinna C.; Han, Seunggu J.; Lau, Catherine Y.; Berger, Mitchel S.; Parsa, Andrew T

    2014-01-01

    Background: Medical errors cause nearly 100,000 deaths per year and cost billions of dollars annually. In order to rationally develop and institute programs to mitigate errors, the relative frequency and costs of different errors must be documented. This analysis will permit the judicious allocation of scarce healthcare resources to address the most costly errors as they are identified. Methods: Here, we provide a systematic review of the neurosurgical literature describing medical errors...

  1. Minimizing Mutual Coupling

    DEFF Research Database (Denmark)

    2010-01-01

    Disclosed herein are techniques, systems, and methods relating to minimizing mutual coupling between a first antenna and a second antenna.

  2. Minimally invasive lumbar fusion.

    Science.gov (United States)

    Foley, Kevin T; Holly, Langston T; Schwender, James D

    2003-08-01

    Review article. To provide an overview of current techniques for minimally invasive lumbar fusion. Minimally invasive techniques have revolutionized the management of pathologic conditions in various surgical disciplines. Although these same principles have been used in the treatment of lumbar disc disease for many years, minimally invasive lumbar fusion procedures have only recently been developed. The goals of these procedures are to reduce the approach-related morbidity associated with traditional lumbar fusion, yet allow the surgery to be performed in an effective and safe manner. The authors' clinical experience with minimally invasive lumbar fusion was reviewed, and the pertinent literature was surveyed. Minimally invasive approaches have been developed for common lumbar procedures such as anterior and posterior interbody fusion, posterolateral onlay fusion, and internal fixation. As with all new surgical techniques, minimally invasive lumbar fusion has a learning curve. As well, there are benefits and disadvantages associated with each technique. However, because these techniques are new and evolving, evidence to support their potential benefits is largely anecdotal. Additionally, there are few long-term studies to document clinical outcomes. Preliminary clinical results suggest that minimally invasive lumbar fusion will have a beneficial impact on the care of patients with spinal disorders. Outcome studies with long-term follow-up will be necessary to validate its success and allow minimally invasive lumbar fusion to become more widely accepted.

  3. Light Stops at Exceptional Points

    Science.gov (United States)

    Goldzak, Tamar; Mailybaev, Alexei A.; Moiseyev, Nimrod

    2018-01-01

    Almost twenty years ago, light was slowed down to less than 10^-7 of its vacuum speed in a cloud of ultracold atoms of sodium. Upon a sudden turn-off of the coupling laser, a slow light pulse can be imprinted on cold atoms such that it can be read out and converted into a photon again. In this process, the light is stopped by absorbing it and storing its shape within the atomic ensemble. Alternatively, the light can be stopped at the band edge in photonic-crystal waveguides, where the group speed vanishes. Here, we extend the phenomenon of stopped light to the new field of parity-time (PT) symmetric systems. We show that zero group speed in PT symmetric optical waveguides can be achieved if the system is prepared at an exceptional point, where two optical modes coalesce. This effect can be tuned for optical pulses in a wide range of frequencies and bandwidths, as we demonstrate in a system of coupled waveguides with gain and loss.

  4. Medication errors: prescribing faults and prescription errors.

    Science.gov (United States)

    Velo, Giampaolo P; Minuz, Pietro

    2009-06-01

    1. Medication errors are common in general practice and in hospitals. Both errors in the act of writing (prescription errors) and prescribing faults due to erroneous medical decisions can result in harm to patients. 2. Any step in the prescribing process can generate errors. Slips, lapses, or mistakes are sources of errors, as in unintended omissions in the transcription of drugs. Faults in dose selection, omitted transcription, and poor handwriting are common. 3. Inadequate knowledge or competence and incomplete information about clinical characteristics and previous treatment of individual patients can result in prescribing faults, including the use of potentially inappropriate medications. 4. An unsafe working environment, complex or undefined procedures, and inadequate communication among health-care personnel, particularly between doctors and nurses, have been identified as important underlying factors that contribute to prescription errors and prescribing faults. 5. Active interventions aimed at reducing prescription errors and prescribing faults are strongly recommended. These should be focused on the education and training of prescribers and the use of on-line aids. The complexity of the prescribing procedure should be reduced by introducing automated systems or uniform prescribing charts, in order to avoid transcription and omission errors. Feedback control systems and immediate review of prescriptions, which can be performed with the assistance of a hospital pharmacist, are also helpful. Audits should be performed periodically.

  5. Reexamining the "Serbian exceptionalism" thesis

    Directory of Open Access Journals (Sweden)

    Vujačić Veljko

    2003-01-01

    Full Text Available Although former Yugoslavia constituted what was widely held to be the most "promising" communist country in terms of potentials for economic reform and political democratization, Serbia remained the only East European country in which the former communist elite managed to defeat its opponents in a series of elections and preserve important elements of institutional and ideological continuity with the old system. Moreover, its regime played a conspicuous role in Yugoslavia's violent collapse. In the specialist literature, the "Serbian exceptionalism" thesis has been elaborated in a number of forms. These are critically reviewed in the first part of the paper, classifying the paradigms according to whether they emphasize: (1) Serbian traditionalist, authoritarian, and collectivist political culture, (2) the affinity between traditional Serbian national populism, Russophile anti-Westernism, and communism, (3) the exclusivist and assimilating character of Serbian nationalism, or (4) the appeals of the contemporary Serbian political elite led by S. Milošević. In the second part of the paper an alternative explanation is presented that seeks to be both interpretively adequate and causally plausible. It rests on five basic factors: (1) historical legacy (the distinctive character of the Serbian collective historical experience and the relationship between Serbian and Yugoslav identities); (2) institutional analysis (the unintended consequences of communist federalism); (3) ideology (the revival of narratives of "Serbian victimization" by Serbian intellectuals); (4) leadership and social base (the peculiar nature of Milošević's appeals in the period of the terminal crisis of communism); and (5) the role of the Diaspora (the perceived ethnic threat among Serbs in Croatia and Bosnia).

  6. Medical errors in neurosurgery.

    Science.gov (United States)

    Rolston, John D; Zygourakis, Corinna C; Han, Seunggu J; Lau, Catherine Y; Berger, Mitchel S; Parsa, Andrew T

    2014-01-01

    Medical errors cause nearly 100,000 deaths per year and cost billions of dollars annually. In order to rationally develop and institute programs to mitigate errors, the relative frequency and costs of different errors must be documented. This analysis will permit the judicious allocation of scarce healthcare resources to address the most costly errors as they are identified. Here, we provide a systematic review of the neurosurgical literature describing medical errors at the departmental level. Eligible articles were identified from the PubMed database, and restricted to reports of recognizable errors across neurosurgical practices. We limited this analysis to cross-sectional studies of errors in order to better match systems-level concerns, rather than reviewing the literature for individually selected errors like wrong-sided or wrong-level surgery. Only a small number of articles met these criteria, highlighting the paucity of data on this topic. From these studies, errors were documented in anywhere from 12% to 88.7% of cases. These errors had many sources, of which only 23.7-27.8% were technical, related to the execution of the surgery itself, highlighting the importance of systems-level approaches to protecting patients and reducing errors. Overall, the magnitude of medical errors in neurosurgery and the lack of focused research emphasize the need for prospective categorization of morbidity with judicious attribution. Ultimately, we must raise awareness of the impact of medical errors in neurosurgery, reduce the occurrence of medical errors, and mitigate their detrimental effects.

  7. A novel hybrid total variation minimization algorithm for compressed sensing

    Science.gov (United States)

    Li, Hongyu; Wang, Yong; Liang, Dong; Ying, Leslie

    2017-05-01

    Compressed sensing (CS) is a technology to acquire and reconstruct sparse signals below the Nyquist rate. For images, total variation of the signal is usually minimized to promote sparseness of the image in gradient. However, similar to all L1-minimization algorithms, total variation has the issue of penalizing large gradient, thus causing large errors on image edges. Many non-convex penalties have been proposed to address the issue of L1 minimization. For example, homotopic L0 minimization algorithms have shown success in reconstructing images from magnetic resonance imaging (MRI). Homotopic L0 minimization may suffer from local minimum which may not be sufficiently robust when the signal is not strictly sparse or the measurements are contaminated by noise. In this paper, we propose a hybrid total variation minimization algorithm to integrate the benefits of both L1 and homotopic L0 minimization algorithms for image recovery from reduced measurements. The algorithm minimizes the conventional total variation when the gradient is small, and minimizes the L0 of gradient when the gradient is large. The transition between L1 and L0 of the gradients is determined by an auto-adaptive threshold. The proposed algorithm has the benefits of L1 minimization being robust to noise/approximation errors, and also the benefits of L0 minimization requiring fewer measurements for recovery. Experimental results using MRI data are presented to demonstrate the proposed hybrid total variation minimization algorithm yields improved image quality over other existing methods in terms of the reconstruction accuracy.
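    As a minimal sketch of the kind of penalty being described (my own illustration, not the authors' algorithm; the fixed threshold argument below stands in for the paper's auto-adaptive threshold), the hybrid term behaves like total variation for small gradients and like a constant, L0-style cost for large ones:

    ```python
    import numpy as np

    def hybrid_gradient_penalty(image, threshold):
        """Clipped total-variation penalty: L1 on gradient magnitudes below
        `threshold`, a constant (L0-like) cost above it, so strong edges are
        not penalized in proportion to their height."""
        gx = np.diff(image, axis=0, append=image[-1:, :])  # forward differences
        gy = np.diff(image, axis=1, append=image[:, -1:])
        mag = np.sqrt(gx ** 2 + gy ** 2)
        return float(np.sum(np.minimum(mag, threshold)))
    ```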

  8. Field error lottery

    Science.gov (United States)

    James Elliott, C.; McVey, Brian D.; Quimby, David C.

    1991-07-01

    The level of field errors in a free electron laser (FEL) is an important determinant of its performance. We have computed 3D performance of a large laser subsystem subjected to field errors of various types. These calculations have been guided by simple models such as SWOOP. The technique of choice is use of the FELEX free electron laser code that now possesses extensive engineering capabilities. Modeling includes the ability to establish tolerances of various types: fast and slow scale field bowing, field error level, beam position monitor error level, gap errors, defocusing errors, energy slew, displacement and pointing errors. Many effects of these errors on relative gain and relative power extraction are displayed and are the essential elements of determining an error budget. The random errors also depend on the particular random number seed used in the calculation. The simultaneous display of the performance versus error level of cases with multiple seeds illustrates the variations attributable to stochasticity of this model. All these errors are evaluated numerically for comprehensive engineering of the system. In particular, gap errors are found to place requirements beyond convenient mechanical tolerances of ± 25 μm, and amelioration of these may occur by a procedure using direct measurement of the magnetic fields at assembly time.

  9. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail

    2011-10-30

    A Laguerre minimal surface is an immersed surface in ℝ³ being an extremal of the functional ∫ (H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are up to isometry the surfaces r(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to a graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.

  10. Minimalism. Clip and Save.

    Science.gov (United States)

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism" discussing why it started and its characteristics. Includes learning activities and information on the artist, Donald Judd. Includes a reproduction of one of his art works and discusses its content. (CMK)

  11. Social Factors Contributing to Exceptional Navajo Children

    Science.gov (United States)

    Leslie, Ernest

    1977-01-01

    Factors leading to exceptionality in Navajo children are explored, reactions of Navajo families to exceptionality and mental retardation are considered, and problems in providing special education services to this population are pointed out. (SBH)

  12. Adapting American Policymaking to Overcome American Exceptionalism

    Science.gov (United States)

    2015-04-13

    The thesis begins with the etymology of American exceptionalism and the way in which its connotation has changed throughout American... [The remainder of this record is report-documentation-page and table-of-contents residue: Chapter 1, Introduction (The Etymology and History of American Exceptionalism); Chapter 2, American Exceptionalism, the Early Years.]

  13. ATC operational error analysis.

    Science.gov (United States)

    1972-01-01

    The primary causes of operational errors are discussed and the effects of these errors on an ATC system's performance are described. No attempt is made to specify possible error models for the spectrum of blunders that can occur although previous res...

  14. Drug Errors in Anaesthesiology

    Directory of Open Access Journals (Sweden)

    Rajnish Kumar Jain

    2009-01-01

    Full Text Available Medication errors are a leading cause of morbidity and mortality in hospitalized patients. The incidence of these drug errors during anaesthesia is not certain. They impose a considerable financial burden to health care systems apart from the patient losses. Common causes of these errors and their prevention is discussed.

  15. Error patterns II

    NARCIS (Netherlands)

    Hoede, C.; Li, Z.

    2002-01-01

    In coding theory the problem of decoding focuses on error vectors. In the simplest situation code words are (0,1)-vectors, as are the received messages and the error vectors. Comparison of a received word with the code words yields a set of error vectors. In deciding on the original code word,

  16. Interpretive Error in Radiology.

    Science.gov (United States)

    Waite, Stephen; Scott, Jinel; Gale, Brian; Fuchs, Travis; Kolla, Srinivas; Reede, Deborah

    2017-04-01

    Although imaging technology has advanced significantly since the work of Garland in 1949, interpretive error rates remain unchanged. In addition to patient harm, interpretive errors are a major cause of litigation and distress to radiologists. In this article, we discuss the mechanics involved in searching an image, categorize omission errors, and discuss factors influencing diagnostic accuracy. Potential individual- and system-based solutions to mitigate or eliminate errors are also discussed. Radiologists use visual detection, pattern recognition, memory, and cognitive reasoning to synthesize final interpretations of radiologic studies. This synthesis is performed in an environment in which there are numerous extrinsic distractors, increasing workloads and fatigue. Given the ultimately human task of perception, some degree of error is likely inevitable even with experienced observers. However, an understanding of the causes of interpretive errors can help in the development of tools to mitigate errors and improve patient safety.

  17. Test-Cost-Sensitive Attribute Reduction of Data with Normal Distribution Measurement Errors

    OpenAIRE

    Hong Zhao; Fan Min; William Zhu

    2013-01-01

    The measurement error with normal distribution is universal in applications. Generally, smaller measurement error requires better instrument and higher test cost. In decision making based on attribute values of objects, we shall select an attribute subset with appropriate measurement error to minimize the total test cost. Recently, error-range-based covering rough set with uniform distribution error was proposed to investigate this issue. However, the measurement errors satisfy normal distrib...

  18. Minimalism and Speakers’ Intuitions

    Directory of Open Access Journals (Sweden)

    Matías Gariazzo

    2011-08-01

    Full Text Available Minimalism proposes a semantics that does not account for speakers’ intuitions about the truth conditions of a range of sentences or utterances. Thus, a challenge for this view is to offer an explanation of how its assignment of semantic contents to these sentences is grounded in their use. Such an account was mainly offered by Soames, but also suggested by Cappelen and Lepore. The article criticizes this explanation by presenting four kinds of counterexamples to it, and arrives at the conclusion that minimalism has not successfully answered the above-mentioned challenge.

  19. Minimal Walking Technicolor

    DEFF Research Database (Denmark)

    Frandsen, Mads Toudal

    2007-01-01

    I report on our construction and analysis of the effective low energy Lagrangian for the Minimal Walking Technicolor (MWT) model. The parameters of the effective Lagrangian are constrained by imposing modified Weinberg sum rules and by imposing a value for the S parameter estimated from the under...

  20. Minimally invasive management of urological fistulas.

    Science.gov (United States)

    Núñez Bragayrac, Luciano A; Azhar, Raed A; Sotelo, Rene

    2015-03-01

    Urological fistulas are an underestimated problem worldwide and have devastating consequences for patients. Many urological fistulas result from surgical complications and/or inadequate perinatal obstetric healthcare. Surgical correction is the standard treatment. This article reviews minimally invasive surgical approaches to manage urological fistulas with a particular emphasis on the robotic techniques of fistula correction. In recent years, many surgeons have explored a minimally invasive approach for the management of urological fistulas. Several studies have demonstrated the feasibility of laparoscopic surgery and the reproducibility of reconstructive surgery techniques. Introduction of the robotic platform has provided significant advantages given the improved dexterity and exceptional vision that it confers. Fistulas are a concern worldwide. Laparoscopic surgery correction has been developed through the efforts of several authors, and difficulties such as the increased learning curve have been overcome with innovations, including the robotic platform. Although minimally invasive surgery offers numerous advantages, the most successful approach remains the one with the surgeon is most familiar.

  1. Human error in anesthetic mishaps.

    Science.gov (United States)

    Gaba, D M

    1989-01-01

    While adverse outcomes linked to anesthesia are uncommon in healthy patients, they do occasionally happen. There is rarely a single cause. Anesthesia and surgery bring the patient into a complex world in which innumerable small failings can converge to produce an eventual catastrophe. And for all the technology involved, the anesthesiologist remains the cornerstone of safe anesthesia care, protecting the patient from harm regardless of its source. Responding to the demands of the operating room environment requires on-the-spot decision making in a complex, uncertain, and risky setting. Only responsible, professional human beings acting in concert can perform this task; no machine that we devise now or in the foreseeable future will suffice. I have outlined the components of a dynamic decision-making process that successfully protects patients in almost all cases. However, being human, anesthesiologists do make errors along the way--errors we are just beginning to understand. Sometimes these errors are due to faulty vigilance or incompetence, but usually they are made by appropriately trained, competent practitioners. Anesthesiologists can err in many ways, and recognizing these ways makes it easier to analyze the events leading to an anesthetic accident. More importantly, it better equips us to eliminate or minimize them in the future--and this is the real challenge.

  2. Minimal DBM Substraction

    DEFF Research Database (Denmark)

    David, Alexandre; Håkansson, John; G. Larsen, Kim

    In this paper we present an algorithm to compute DBM substractions with a guaranteed minimal number of splits and disjoint DBMs to avoid any redundance. The substraction is one of the few operations that result in a non-convex zone, and thus, requires splitting. It is of prime importance to reduce...

  3. Minimal constrained supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Cribiori, N. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Dall' Agata, G., E-mail: dallagat@pd.infn.it [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Farakos, F. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Porrati, M. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States)

    2017-01-10

    We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  4. Minimally invasive periodontal therapy.

    Science.gov (United States)

    Dannan, Aous

    2011-10-01

    Minimally invasive dentistry is a concept that preserves dentition and supporting structures. However, minimally invasive procedures in periodontal treatment are supposed to be limited within periodontal surgery, the aim of which is to represent alternative approaches developed to allow less extensive manipulation of surrounding tissues than conventional procedures, while accomplishing the same objectives. In this review, the concept of minimally invasive periodontal surgery (MIPS) is firstly explained. An electronic search for all studies regarding efficacy and effectiveness of MIPS between 2001 and 2009 was conducted. For this purpose, suitable key words from Medical Subject Headings on PubMed were used to extract the required studies. All studies are demonstrated and important results are concluded. Preliminary data from case cohorts and from many studies reveal that the microsurgical access flap, in terms of MIPS, has a high potential to seal the healing wound from the contaminated oral environment by achieving and maintaining primary closure. Soft tissues are mostly preserved and minimal gingival recession is observed, an important feature to meet the demands of the patient and the clinician in the esthetic zone. However, although the potential efficacy of MIPS in the treatment of deep intrabony defects has been proved, larger studies are required to confirm and extend the reported positive preliminary outcomes.

  5. Minimal constrained supergravity

    Directory of Open Access Journals (Sweden)

    N. Cribiori

    2017-01-01

    Full Text Available We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  6. Minimal Walking Technicolor

    DEFF Research Database (Denmark)

    Foadi, Roshan; Frandsen, Mads Toudal; A. Ryttov, T.

    2007-01-01

    , pseudoscalars, vector mesons and other fields predicted by the minimal walking theory. We construct their self-interactions and interactions with standard model fields. Using the Weinberg sum rules, opportunely modified to take into account the walking behavior of the underlying gauge theory, we find...

  7. Intraoral radiographic errors.

    Science.gov (United States)

    Patel, J R

    1979-11-01

    The purpose of this investigation was to examine intraoral radiography with regard to the frequency of errors, the types of error necessitating retakes, and the relationship of error frequency to the teeth area examined and the type of x-ray cone used. The present study used 283 complete mouth radiographic surveys, in which 890 radiographs were found to be clinically unacceptable for one or more errors in technique. Thirteen and one-tenth errors per one hundred radiographs were found in this study. The three major radiographic errors occurring in this study were incorrect film placement (49.9 percent), cone-cutting (20.8 percent), and incorrect vertical angulation (12.5 percent).

  8. Error coding simulations

    Science.gov (United States)

    Noble, Viveca K.

    1993-11-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal to noise needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
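    As a concrete illustration of the error-detection side, the sketch below computes a generic bit-serial 16-bit CRC. The CRC-CCITT polynomial 0x1021 with initial value 0xFFFF is used here as a common choice; treat the exact polynomial and initial value mandated by the CCSDS recommendation as assumptions rather than facts drawn from this report.

    ```python
    def crc16(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
        """Bit-serial 16-bit CRC over `data`, most significant bit first."""
        crc = init
        for byte in data:
            crc ^= byte << 8
            for _ in range(8):
                crc = ((crc << 1) ^ poly) if (crc & 0x8000) else (crc << 1)
                crc &= 0xFFFF
        return crc

    frame = b"telemetry frame payload"
    checksum = crc16(frame)
    # Flipping a single bit in the received frame changes the CRC,
    # so the corruption is detected by comparing checksums.
    corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]
    assert crc16(corrupted) != checksum
    ```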

  9. Compensation of SSC lattice optics in the presence of dipole field errors: Report of the Correction Element Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Bintinger, D.; Chao, A.; Forest, E. [and others

    1989-02-01

    The assignment of the Correction Element Working Group (CEWG) is to advance the designs of various candidate correction schemes to a point where they can be compared and distilled down to a single plan. Choosing among the options often involves consideration of incommensurate factors such as cost, practicality, and theoretical performance. Except for minor issues, the CEWG purpose is to gather and array the facts in a form from which these decisions can be rationally made, but not to make the decisions. The present report analyses various schemes for compensating nonlinear multipole errors in the main arc dipoles of the Superconducting Super Collider. Emphasis is on comparing lumped and distributed compensation, on minimizing the total number of correction elements, and on reducing the sensitivity to closed-orbit errors.

  10. Minimal Composite Inflation

    DEFF Research Database (Denmark)

    Channuie, Phongpichit; Jark Joergensen, Jakob; Sannino, Francesco

    2011-01-01

    We investigate models in which the inflaton emerges as a composite field of a four dimensional, strongly interacting and nonsupersymmetric gauge theory featuring purely fermionic matter. We show that it is possible to obtain successful inflation via non-minimal coupling to gravity, and that the underlying dynamics is preferred to be near conformal. We discover that the compositeness scale of inflation is of the order of the grand unified energy scale.

  11. Minimally symmetric Higgs boson

    Energy Technology Data Exchange (ETDEWEB)

    Low, Ian

    2015-06-01

    Models addressing the naturalness of a light Higgs boson typically employ symmetries, either bosonic or fermionic, to stabilize the Higgs mass. We consider a setup with the minimal amount of symmetries: four shift symmetries acting on the four components of the Higgs doublet, subject to the constraints of linearly realized SU(2)_L × U(1)_Y electroweak symmetry. Up to terms that explicitly violate the shift symmetries, the effective Lagrangian can be derived, irrespective of the spontaneously broken group G in the ultraviolet, and is universal among all models where the Higgs arises as a pseudo-Nambu-Goldstone boson. Very high energy scatterings of vector bosons could provide smoking gun signals of a minimally symmetric Higgs boson.

  12. Audio-Tutorial Programming with Exceptional Children

    Science.gov (United States)

    Hofmeister, Alan

    1973-01-01

    The findings from the application of audio-tutorial programing in three curriculum areas with three groups of exceptional children are reported. The findings suggest that audio-tutorial programing has qualities capable of meeting some of the instructional needs of exceptional children. (Author)

  13. Correction for quadrature errors

    DEFF Research Database (Denmark)

    Netterstrøm, A.; Christensen, Erik Lintz

    1994-01-01

    In high bandwidth radar systems it is necessary to use quadrature devices to convert the signal to/from baseband. Practical problems make it difficult to implement a perfect quadrature system. Channel imbalance and quadrature phase errors in the transmitter and the receiver result in error signals...

  14. Medical error and disclosure.

    Science.gov (United States)

    White, Andrew A; Gallagher, Thomas H

    2013-01-01

    Errors occur commonly in healthcare and can cause significant harm to patients. Most errors arise from a combination of individual, system, and communication failures. Neurologists may be involved in harmful errors in any practice setting and should familiarize themselves with tools to prevent, report, and examine errors. Although physicians, patients, and ethicists endorse candid disclosure of harmful medical errors to patients, many physicians express uncertainty about how to approach these conversations. A growing body of research indicates physicians often fail to meet patient expectations for timely and open disclosure. Patients desire information about the error, an apology, and a plan for preventing recurrence of the error. To meet these expectations, physicians should participate in event investigations and plan thoroughly for each disclosure conversation, preferably with a disclosure coach. Physicians should also anticipate and attend to the ongoing medical and emotional needs of the patient. A cultural change towards greater transparency following medical errors is in motion. Substantial progress is still required, but neurologists can further this movement by promoting policies and environments conducive to open reporting, respectful disclosure to patients, and support for the healthcare workers involved. © 2013 Elsevier B.V. All rights reserved.

  15. Next-to-minimal SOFTSUSY

    Science.gov (United States)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    renormalisation group equations must be consistent with boundary conditions on supersymmetry breaking parameters, as well as on the weak-scale boundary condition on gauge couplings, Yukawa couplings and the Higgs potential parameters. Solution method: Nested iterative algorithm and numerical minimisation of the Higgs potential. Reasons for new version: Major extension to include the next-to-minimal supersymmetric standard model. Summary of revisions: Added additional supersymmetric and supersymmetry breaking parameters associated with the additional gauge singlet. Electroweak symmetry breaking conditions are significantly changed in the next-to-minimal mode, and some sparticle mixing changes. An interface to NMSSMTools has also been included. Some of the object structure has also changed, and the command line interface has been made more user friendly. Restrictions: SOFTSUSY will provide a solution only in the perturbative regime and it assumes that all couplings of the model are real (i.e. CP-conserving). If the parameter point under investigation is non-physical for some reason (for example because the electroweak potential does not have an acceptable minimum), SOFTSUSY returns an error message. Running time: A few seconds per parameter point.

  16. Proofreading for word errors.

    Science.gov (United States)

    Pilotti, Maura; Chodorow, Martin; Agpawa, Ian; Krajniak, Marta; Mahamane, Salif

    2012-04-01

    Proofreading (i.e., reading text for the purpose of detecting and correcting typographical errors) is viewed as a component of the activity of revising text and thus is a necessary (albeit not sufficient) procedural step for enhancing the quality of a written product. The purpose of the present research was to test competing accounts of word-error detection which predict factors that may influence reading and proofreading differently. Word errors, which change a word into another word (e.g., from --> form), were selected for examination because they are unlikely to be detected by automatic spell-checking functions. Consequently, their detection still rests mostly in the hands of the human proofreader. Findings highlighted the weaknesses of existing accounts of proofreading and identified factors, such as length and frequency of the error in the English language relative to frequency of the correct word, which might play a key role in detection of word errors.

  17. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error; of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low-and-middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  18. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error; of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low-and-middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  20. Minimally Invasive Parathyroidectomy

    Directory of Open Access Journals (Sweden)

    Lee F. Starker

    2011-01-01

    Minimally invasive parathyroidectomy (MIP) is an operative approach for the treatment of primary hyperparathyroidism (pHPT). Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT.

  1. Minimally Actuated Serial Robot

    OpenAIRE

    Mann, Moshe P.; Damti, Lior; Zarrouk, David

    2017-01-01

    In this paper, we propose a novel type of serial robot with minimal actuation. The robot is a serial rigid structure consisting of multiple links connected by passive joints and of movable actuators. The novelty of this robot is that the actuators travel over the links to a given joint and adjust the relative angle between the two adjacent links. The joints passively preserve their angles until one of the actuators moves them again. This actuation can be applied to any serial robot with two o...

  2. Exceptional Cosmetic surgeries on $S^3$

    OpenAIRE

    Ravelomanana, Huygens C.

    2015-01-01

    This paper concerns the truly or purely cosmetic surgery conjecture. We give a survey on exceptional surgeries and cosmetic surgeries. We prove that the slope of an exceptional truly cosmetic surgery on a hyperbolic knot in $S^3$ must be $\pm 1$ and the surgery must be toroidal but not Seifert fibred. As a consequence, we show that there are no exceptional truly cosmetic surgeries on certain types of hyperbolic knots in $S^3$. We also give some properties of Heegaard Floer correction terms and to...

  3. Disclosure of medical errors.

    Science.gov (United States)

    Matlow, Anne; Stevens, Polly; Harrison, Christine; Laxer, Ronald M

    2006-12-01

    The 1999 release of the Institute of Medicine's document To Err is Human was akin to removing the lid of Pandora's box. Not only were the magnitude and impact of medical errors now apparent to those working in the health care industry, but consumers of health care were alerted to the occurrence of medical events causing harm. One specific solution advocated was the disclosure to patients and their families of adverse events resulting from medical error. Knowledge of the historical perspective, ethical underpinnings, and medico-legal implications gives us a better appreciation of current recommendations for disclosing adverse events resulting from medical error to those affected.

  4. Human error mitigation initiative (HEMI) : summary report.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Susan M.; Ramos, M. Victoria; Wenner, Caren A.; Brannon, Nathan Gregory

    2004-11-01

    Despite continuing efforts to apply existing hazard analysis methods and comply with requirements, human errors persist across the nuclear weapons complex. Due to a number of factors, current retroactive and proactive methods to understand and minimize human error are highly subjective, inconsistent in numerous dimensions, and are cumbersome to characterize as thorough. An alternative and proposed method begins with leveraging historical data to understand what the systemic issues are and where resources need to be brought to bear proactively to minimize the risk of future occurrences. An illustrative analysis was performed using existing incident databases specific to Pantex weapons operations indicating systemic issues associated with operating procedures that undergo notably less development rigor relative to other task elements such as tooling and process flow. Future recommended steps to improve the objectivity, consistency, and thoroughness of hazard analysis and mitigation were delineated.

  5. Cognitive function in families with exceptional survival

    DEFF Research Database (Denmark)

    Barral, Sandra; Cosentino, Stephanie; Costa, Rosann

    2012-01-01

    members in the offspring generation demonstrate significantly better performance on multiple tasks requiring attention, working memory, and semantic processing when compared with individuals without a family history of exceptional survival, suggesting that cognitive performance may serve as an important...

  6. 48 CFR 8.605 - Exceptions.

    Science.gov (United States)

    2010-10-01

    ... REQUIRED SOURCES OF SUPPLIES AND SERVICES Acquisition From Federal Prison Industries, Inc. 8.605 Exceptions... determination that the FPI item of supply is not comparable to supplies available from the private sector that...

  7. 7 CFR 1944.75 - Exception authority.

    Science.gov (United States)

    2010-01-01

    ...) PROGRAM REGULATIONS (CONTINUED) HOUSING Housing Application Packaging Grants § 1944.75 Exception authority... supported with documentation to explain the adverse effect on the Government's interest and/or impact on the...

  8. Inborn errors of metabolism

    Science.gov (United States)

    ... metabolism. A few of them are: Fructose intolerance, Galactosemia, Maple sugar urine disease (MSUD), Phenylketonuria (PKU), Newborn ... disorder. Alternative Names: Metabolism - inborn errors of. References: Bodamer OA. Approach to inborn ...

  9. Medical Errors Reduction Initiative

    National Research Council Canada - National Science Library

    Mutter, Michael L

    2005-01-01

    The Valley Hospital of Ridgewood, New Jersey, is proposing to extend a limited but highly successful specimen management and medication administration medical errors reduction initiative on a hospital-wide basis...

  10. Exceptional cosmetic surgeries on homology spheres

    OpenAIRE

    Ravelomanana, Huygens C.

    2016-01-01

    The cosmetic surgery conjecture is a longstanding conjecture in 3-manifold theory. We present a theorem about exceptional cosmetic surgery for homology spheres. Along the way we prove that if the surgery is not a small Seifert $\mathbb{Z}/2\mathbb{Z}$-homology sphere or a toroidal irreducible non-Seifert surgery, then there is at most one pair of exceptional truly cosmetic slopes. We also prove that toroidal truly cosmetic surgeries on integer homology spheres must be integer homology spheres.

  11. Design for Error Tolerance

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1983-01-01

    An important aspect of the optimal design of computer-based operator support systems is the sensitivity of such systems to operator errors. The author discusses how a system might allow for human variability with the use of reversibility and observability.

  12. Interruption Practice Reduces Errors

    Science.gov (United States)

    2014-01-01

    dangers of errors at the PCS. Electronic health record systems are used to reduce certain errors related to poor handwriting and dosage...

  13. Inpatients’ medical prescription errors

    Directory of Open Access Journals (Sweden)

    Aline Melo Santos Silva

    2009-09-01

    Objective: To identify and quantify the most frequent errors in inpatients' medical prescriptions. Methods: A survey of prescription errors was performed on inpatients' medical prescriptions, from July 2008 to May 2009, for eight hours a day. Results: A total of 3,931 prescriptions was analyzed and 362 (9.2%) prescription errors were found, which involved the healthcare team as a whole. Among the 16 types of errors detected in prescriptions, the most frequent occurrences were lack of information, such as dose (66 cases, 18.2%) and administration route (26 cases, 7.2%); 45 cases (12.4%) of wrong transcriptions to the information system; 30 cases (8.3%) of duplicate drugs; doses higher than recommended (24 events, 6.6%); and 29 cases (8.0%) of prescriptions with an indication but not specifying allergy. Conclusion: Medication errors are a reality at hospitals. All healthcare professionals are responsible for the identification and prevention of these errors, each one in his/her own area. The pharmacist is an essential professional in the drug therapy process. All hospital organizations need a pharmacist team responsible for medical prescription analyses before preparation, dispensation and administration of drugs to inpatients. This study showed that the pharmacist improves the inpatient's safety and the success of prescribed therapy.

  14. Event detection and exception handling strategies in the ASDEX Upgrade discharge control system

    Energy Technology Data Exchange (ETDEWEB)

    Treutterer, W., E-mail: Wolfgang.Treutterer@ipp.mpg.de; Neu, G.; Rapson, C.; Raupp, G.; Zasche, D.; Zehetbauer, T.

    2013-10-15

    Highlights: •Event detection and exception handling is integrated in control system architecture. •Pulse control with local exception handling and pulse supervision with central exception handling are strictly separated. •Local exception handling limits the effect of an exception to a minimal part of the controlled system. •Central Exception Handling solves problems requiring coordinated action of multiple control components. -- Abstract: Thermonuclear plasmas are governed by nonlinear characteristics: plasma operation can be classified into scenarios with pronounced features like L and H-mode, ELMs or MHD activity. Transitions between them may be treated as events. Similarly, technical systems are also subject to events such as failure of measurement sensors, actuator saturation or violation of machine and plant operation limits. Such situations often are handled with a mixture of pulse abortion and iteratively improved pulse schedule reference programming. In case of protection-relevant events, however, the complexity of even a medium-sized device as ASDEX Upgrade requires a sophisticated and coordinated shutdown procedure rather than a simple stop of the pulse. The detection of events and their intelligent handling by the control system has been shown to be valuable also in terms of saving experiment time and cost. This paper outlines how ASDEX Upgrade's discharge control system (DCS) detects events and handles exceptions in two stages: locally and centrally. The goal of local exception handling is to limit the effect of an unexpected or asynchronous event to a minimal part of the controlled system. Thus, local exception handling facilitates robustness to failures but keeps the decision structures lean. A central state machine deals with exceptions requiring coordinated action of multiple control components. DCS implements the state machine by means of pulse schedule segments containing pre-programmed waveforms to define discharge goal and control
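
    The two-stage scheme described in this abstract (local handling inside each controller, escalation to a central state machine only when coordinated action across components is needed) can be caricatured in a short Python sketch. Everything below is an illustrative sketch with invented class, event and state names; it is not DCS code.

      # Illustrative two-stage exception handling: a local handler contains a fault
      # inside one controller; unresolved exceptions escalate to a central state
      # machine that switches the whole pulse to a coordinated fallback.
      class CentralSupervisor:
          def __init__(self):
              self.state = "NOMINAL"                 # e.g. NOMINAL -> RAMP_DOWN -> TERMINATED
          def escalate(self, source, event):
              print(f"[central] {source}: {event} -> coordinated ramp-down")
              self.state = "RAMP_DOWN"               # would select a pre-programmed fallback segment

      class Controller:
          def __init__(self, name, supervisor):
              self.name, self.supervisor = name, supervisor
          def step(self, measurement):
              try:
                  if measurement is None:
                      raise ValueError("sensor failure")
                  if abs(measurement) > 1.0:
                      raise RuntimeError("actuator saturation")
                  return measurement                 # nominal control action
              except ValueError:
                  return 0.0                         # local handling: substitute a safe output
              except RuntimeError as exc:
                  self.supervisor.escalate(self.name, exc)   # needs coordinated action
                  return 0.0

      supervisor = CentralSupervisor()
      density_ctrl = Controller("density", supervisor)
      for m in (0.3, None, 0.4, 1.7):
          density_ctrl.step(m)
      print(supervisor.state)                        # RAMP_DOWN after the saturation event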

  15. Transanal Minimally Invasive Surgery

    Science.gov (United States)

    deBeche-Adams, Teresa; Nassif, George

    2015-01-01

    Transanal minimally invasive surgery (TAMIS) was first described in 2010 as a crossover between single-incision laparoscopic surgery and transanal endoscopic microsurgery (TEM) to allow access to the proximal and mid-rectum for resection of benign and early-stage malignant rectal lesions. The TAMIS technique can also be used for noncurative intent surgery of more advanced lesions in patients who are not candidates for radical surgery. Proper workup and staging should be done before surgical decision-making. In addition to the TAMIS port, instrumentation and set up include readily available equipment found in most operating suites. TAMIS has proven its usefulness in a wide range of applications outside of local excision, including repair of rectourethral fistula, removal of rectal foreign body, control of rectal hemorrhage, and as an adjunct in total mesorectal excision for rectal cancer. TAMIS is an easily accessible, technically feasible, and cost-effective alternative to TEM. PMID:26491410

  16. Minimal asymmetric dark matter

    Directory of Open Access Journals (Sweden)

    Sofiane M. Boucenna

    2015-09-01

    In the early Universe, any particle carrying a conserved quantum number and in chemical equilibrium with the thermal bath will unavoidably inherit a particle–antiparticle asymmetry. A new particle of this type, if stable, would represent a candidate for asymmetric dark matter (DM) with an asymmetry directly related to the baryon asymmetry. We study this possibility for a minimal DM sector constituted by just one (generic) SU(2)_L multiplet χ carrying hypercharge, assuming that at temperatures above the electroweak phase transition an effective operator enforces chemical equilibrium between χ and the Higgs boson. We argue that limits from DM direct detection searches severely constrain this scenario, leaving as the only possibilities scalar or fermion multiplets with hypercharge y=1, preferentially quintuplets or larger SU(2) representations, and with a mass in the few TeV range.

  17. Minimally extended SILH

    Energy Technology Data Exchange (ETDEWEB)

    Chala, Mikael [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Valencia Univ. (Spain). Dept. de Fisica Teorica y IFIC; Durieux, Gauthier; Matsedonskyi, Oleksii [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Grojean, Christophe [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Humboldt-Univ. Berlin (Germany). Inst. fuer Physik; Lima, Leonardo de [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Univ. Estadual Paulista, Sao Paulo (Brazil). Inst. de Fisica Teorica

    2017-03-15

    Higgs boson compositeness is a phenomenologically viable scenario addressing the hierarchy problem. In minimal models, the Higgs boson is the only degree of freedom of the strong sector below the strong interaction scale. We present here the simplest extension of such a framework with an additional composite spin-zero singlet. To this end, we adopt an effective field theory approach and develop a set of rules to estimate the size of the various operator coefficients, relating them to the parameters of the strong sector and its structural features. As a result, we obtain the patterns of new interactions affecting both the new singlet and the Higgs boson's physics. We identify the characteristics of the singlet field which cause its effects on Higgs physics to dominate over the ones inherited from the composite nature of the Higgs boson. Our effective field theory construction is supported by comparisons with explicit UV models.

  18. Minimal Reducts with Grasp

    Directory of Open Access Journals (Sweden)

    Iris Iddaly Mendez Gurrola

    2011-03-01

    The proper detection of a patient's level of dementia is important in order to offer suitable treatment. The diagnosis is based on certain criteria, reflected in the clinical examinations. From these examinations emerge the limitations and the degree in which each patient is in. In order to reduce the total number of limitations to be evaluated, we used rough set theory; this theory has been applied in areas of artificial intelligence such as decision analysis, expert systems, knowledge discovery, and classification with multiple attributes. In our case the theory is applied to find the minimal set of limitations, or reduct, that generates the same classification as considering all the limitations. To fulfill this purpose we developed a GRASP (Greedy Randomized Adaptive Search Procedure) algorithm.
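
    As a rough illustration of this approach (not the authors' implementation), the sketch below runs a GRASP-style search for a small reduct of a consistent decision table: a greedy randomized construction adds attributes until the subset classifies objects exactly as the full attribute set does, and a simple local search then drops redundant attributes. All function names and the toy data are illustrative.

      # GRASP-style sketch for finding a small reduct of a consistent decision table
      # (rough-set sense): the reduct must classify objects exactly as all attributes do.
      import random

      def n_conflicting_classes(rows, decisions, attrs):
          """Count indiscernibility classes (w.r.t. attrs) that mix several decisions."""
          groups = {}
          for row, dec in zip(rows, decisions):
              groups.setdefault(tuple(row[a] for a in attrs), set()).add(dec)
          return sum(1 for decs in groups.values() if len(decs) > 1)

      def grasp_reduct(rows, decisions, n_attrs, iterations=50, rcl_size=3, seed=0):
          rng, best = random.Random(seed), list(range(n_attrs))
          for _ in range(iterations):
              chosen, remaining = [], list(range(n_attrs))
              # Greedy randomized construction: add attributes until no class mixes decisions.
              while n_conflicting_classes(rows, decisions, chosen) > 0:
                  remaining.sort(key=lambda a: n_conflicting_classes(rows, decisions, chosen + [a]))
                  pick = rng.choice(remaining[:rcl_size])   # restricted candidate list
                  chosen.append(pick)
                  remaining.remove(pick)
              # Local search: drop attributes that turn out to be redundant.
              for a in list(chosen):
                  if n_conflicting_classes(rows, decisions, [x for x in chosen if x != a]) == 0:
                      chosen.remove(a)
              if len(chosen) < len(best):
                  best = sorted(chosen)
          return best

      # Toy usage: five objects, four attributes; a 2-attribute reduct exists (e.g. {0, 2}).
      rows = [(0, 1, 0, 1), (0, 0, 1, 1), (1, 1, 0, 0), (1, 0, 1, 0), (0, 1, 1, 0)]
      decisions = [0, 1, 1, 0, 1]
      print(grasp_reduct(rows, decisions, n_attrs=4))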

  19. Natural minimal dark matter

    CERN Document Server

    Fabbrichesi, Marco

    2016-01-01

    We show how the Higgs boson mass is protected from the potentially large corrections due to the introduction of minimal dark matter if the new physics sector is made supersymmetric. The fermionic dark matter candidate (a 5-plet of $SU(2)_L$) is accompanied by a scalar state. The weak gauge sector is made supersymmetric and the Higgs boson is embedded in a supersymmetric multiplet. The remaining standard model states are non-supersymmetric. Non vanishing corrections to the Higgs boson mass only appear at three-loop level and the model is natural for dark matter masses up to 15 TeV--a value larger than the one required by the cosmological relic density. The construction presented stands as an example of a general approach to naturalness that solves the little hierarchy problem which arises when new physics is added beyond the standard model at an energy scale around 10 TeV.

  20. Error monitoring in musicians

    Directory of Open Access Journals (Sweden)

    Clemens eMaidhof

    2013-07-01

    To err is human, and hence even professional musicians make errors occasionally during their performances. This paper summarizes recent work investigating error monitoring in musicians, i.e. the processes and their neural correlates associated with the monitoring of ongoing actions and the detection of deviations from intended sounds. EEG studies reported an early component of the event-related potential (ERP) occurring before the onsets of pitch errors. This component, which can be altered in musicians with focal dystonia, likely reflects processes of error detection and/or error compensation, i.e. attempts to cancel the undesired sensory consequence (a wrong tone) a musician is about to perceive. Thus, auditory feedback seems not to be a prerequisite for error detection, consistent with previous behavioral results. In contrast, when auditory feedback is externally manipulated and thus unexpected, motor performance can be severely distorted, although not all feedback alterations result in performance impairments. Recent studies investigating the neural correlates of feedback processing showed that unexpected feedback elicits an ERP component after note onsets, which shows larger amplitudes during music performance than during mere perception of the same musical sequences. Hence, these results stress the role of motor actions for the processing of auditory information. Furthermore, recent methodological advances like the combination of 3D motion capture techniques with EEG will be discussed. Such combinations of different measures can potentially help to disentangle the roles of different feedback types, such as proprioceptive and auditory feedback, and in general to arrive at a better understanding of the complex interactions between the motor and auditory domains during error monitoring. Finally, outstanding questions and future directions in this context will be discussed.

  1. Errors and mistakes in breast ultrasound diagnostics

    Directory of Open Access Journals (Sweden)

    Wiesław Jakubowski

    2012-09-01

    Sonomammography is often the first additional examination performed in the diagnostics of breast diseases. The development of ultrasound imaging techniques, particularly the introduction of high frequency transducers, matrix transducers, harmonic imaging and, finally, elastography, has influenced the improvement of breast disease diagnostics. Nevertheless, as in each imaging method, there are errors and mistakes resulting from the technical limitations of the method, breast anatomy (fibrous remodeling), insufficient sensitivity and, in particular, specificity. Errors in breast ultrasound diagnostics can be divided into those impossible to avoid and those potentially possible to reduce. In this article the most frequently made errors in ultrasound have been presented, including the ones caused by the presence of artifacts resulting from volumetric averaging in the near and far field, artifacts in cysts or in dilated lactiferous ducts (reverberations, comet tail artifacts, lateral beam artifacts), and improper setting of general enhancement or time gain curve or range. Errors dependent on the examiner, resulting in the wrong BIRADS-usg classification, are divided into negative and positive errors. The sources of these errors have been listed. The methods of minimizing the number of errors made have been discussed, including the ones related to the appropriate examination technique, taking into account data from the case history and the use of the greatest possible number of additional options such as: harmonic imaging, color and power Doppler and elastography. In the article examples of errors resulting from the technical conditions of the method have been presented, and those dependent on the examiner which are related to the great diversity and variation of ultrasound images of pathological breast lesions.

  2. Lessons from aviation - the role of checklists in minimally invasive cardiac surgery.

    Science.gov (United States)

    Hussain, S; Adams, C; Cleland, A; Jones, P M; Walsh, G; Kiaii, B

    2016-01-01

    We describe an adverse event during minimally invasive cardiac surgery that resulted in a multi-disciplinary review of intra-operative errors and the creation of a procedural checklist. This checklist aims to prevent errors of omission and communication failures that result in increased morbidity and mortality. We discuss the application of the aviation-led "threats and errors model" to medical practice and the role of checklists and other strategies aimed at reducing medical errors. © The Author(s) 2015.

  3. Statistical tests to compare motif count exceptionalities

    Directory of Open Access Journals (Sweden)

    Vandewalle Vincent

    2007-03-01

    Background: Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Just comparing the motif count p-values in each sequence is indeed not sufficient to decide if this motif is significantly more exceptional in one sequence compared to the other one. A statistical test is required. Results: We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion: The exact binomial test is particularly adapted for small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use.
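
    A minimal sketch of the exact binomial comparison described here (not code from the paper): model the motif count in each sequence as Poisson with intensity proportional to sequence length, condition on the total count, and test whether the split between the two sequences departs from the length ratio. Composition corrections and overlap handling are omitted; scipy >= 1.7 is assumed for scipy.stats.binomtest, and all names and numbers are illustrative.

      # Exact binomial test comparing the occurrence rate of one motif in two sequences,
      # following the Poisson-count idea described above (simplified: length-based expectation).
      from scipy.stats import binomtest

      def compare_motif_counts(count_a, len_a, count_b, len_b):
          """Test H0: the motif occurs at the same per-base rate in both sequences."""
          total = count_a + count_b
          p0 = len_a / (len_a + len_b)      # expected share of occurrences in sequence A under H0
          result = binomtest(count_a, n=total, p=p0, alternative="two-sided")
          return result.pvalue

      # Illustrative numbers: 43 occurrences in a 4.0 Mb backbone vs 5 in 0.6 Mb of variable loops.
      print(compare_motif_counts(43, 4.0e6, 5, 0.6e6))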

  4. Learning from Errors

    Directory of Open Access Journals (Sweden)

    MA. Lendita Kryeziu

    2015-06-01

    "Errare humanum est" is a well-known and widespread Latin proverb which states that to err is human, and that people make mistakes all the time. However, what counts is that people must learn from mistakes. On these grounds Steve Jobs stated: "Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations." Similarly, in learning a new language, learners make mistakes, thus it is important to accept them, learn from them, discover the reason why they make them, improve and move on. The significance of studying errors is described by Corder as follows: "There have always been two justifications proposed for the study of learners' errors: the pedagogical justification, namely that a good understanding of the nature of error is necessary before a systematic means of eradicating them could be found, and the theoretical justification, which claims that a study of learners' errors is part of the systematic study of the learners' language which is itself necessary to an understanding of the process of second language acquisition" (Corder, 1982: 1). Thus the importance and the aim of this paper is analyzing errors in the process of second language acquisition and the way we teachers can benefit from mistakes to help students improve themselves while giving the proper feedback.

  5. Swarm robotics and minimalism

    Science.gov (United States)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  6. Minimal dilaton model

    Directory of Open Access Journals (Sweden)

    Oda Kin-ya

    2013-05-01

    Both the ATLAS and CMS experiments at the LHC have reported the observation of a particle of mass around 125 GeV which is consistent with the Standard Model (SM) Higgs boson, but with an excess of events beyond the SM expectation in the diphoton decay channel at each of them. There still remains room for the logical possibility that we are not seeing the SM Higgs but something else. Here we introduce the minimal dilaton model in which the LHC signals are explained by an extra singlet scalar of mass around 125 GeV that slightly mixes with the SM Higgs heavier than 600 GeV. When this scalar has a vacuum expectation value well beyond the electroweak scale, it can be identified as a linearly realized version of a dilaton field. Though the current experimental constraints from the Higgs search disfavor such a region, the singlet scalar model itself still provides a viable alternative to the SM Higgs in interpreting its search results.

  7. Minimal Higgs inflation

    Directory of Open Access Journals (Sweden)

    Debaprasad Maity

    2017-06-01

    In this paper we propose minimal Higgs inflation scenarios by non-polynomial modification of the Higgs potential. The modification is done in such a way that it creates a flat plateau for a huge range of field values at the inflationary energy scale μ ≃ λ^(1/4)α. Assuming a perturbative Higgs quartic coupling, λ ≃ O(1), our model predictions for all the cosmologically relevant quantities, (ns, r, dns/dk), fit extremely well with the observations made by PLANCK. For both models the inflation energy scale turned out to be μ ≃ (10^14, 10^15) GeV. Considering the observed central value of the scalar spectral index, ns = 0.968, the models predict e-folding numbers N = (52, 47). Within a wide range of viable parameter space, we found that the prediction for the tensor-to-scalar ratio, r ≤ 10^−5, is far below the current experimental limit. The prediction for the running of the scalar spectral index, dns/dk, remains very small. We also computed the background-field-dependent unitarity scale Λ(h), which turned out to be much larger than the aforementioned inflationary energy scale.

  8. Errors in Neonatology

    Directory of Open Access Journals (Sweden)

    Antonio Boldrini

    2013-06-01

    Introduction: Danger and errors are inherent in human activities. In medical practice errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main error domains are: medication and total parenteral nutrition, resuscitation and respiratory care, invasive procedures, nosocomial infections, patient identification, and diagnostics. Risk factors include patients' size, prematurity, vulnerability and underlying disease conditions, but also multidisciplinary teams, working conditions causing fatigue, and the large variety of treatment and investigative modalities needed. Discussion and Conclusions: In our opinion, it is hardly possible to change human beings, but it is likely possible to change the conditions under which they work. Voluntary error reporting systems can help in preventing adverse events. Education and re-training by means of simulation can be an effective strategy too. In Pisa (Italy), Nina (ceNtro di FormazIone e SimulazioNe NeonAtale) is a simulation center that offers the possibility of continuous retraining for technical and non-technical skills to optimize neonatological care strategies. Furthermore, we have been working on a novel skill trainer for mechanical ventilation (MEchatronic REspiratory System SImulator for Neonatal Applications, MERESSINA). Finally, in our opinion national health policy indirectly influences the risk for errors. Proceedings of the 9th International Workshop on Neonatology · Cagliari (Italy) · October 23rd-26th, 2013 · Learned lessons, changing practice and cutting-edge research

  9. Spent fuel bundle counter sequence error manual - BRUCE NGS

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, L.E

    1992-03-20

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously.

  10. Spent fuel bundle counter sequence error manual - DARLINGTON NGS

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, L.E

    1992-03-25

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously.

  11. Post-exceptionalism in public policy

    DEFF Research Database (Denmark)

    Daugbjerg, Carsten; Feindt, Peter H.

    2017-01-01

    Framing the special issue on the transformation of Food and Agricultural Policy, this article introduces the concept of post-exceptionalism in public policies. The analysis of change in agri-food policy serves as a generative example to conceptualize current transformations in sectoral policy......, institutions, interest constellations and policy instruments. It reflects the more complex, open, contested and fluid nature of contemporary policy fields that nevertheless still maintain their policy heritage. Discussing stability, the authors distinguish between complementary and tense post-exceptionalism....

  12. Session-based Choreography with Exceptions

    DEFF Research Database (Denmark)

    Carbone, Marco

    2009-01-01

    Choreography has recently emerged as a pragmatic and concise way of describing communication-based systems such as web services and financial protocols. Recent studies have investigated the transition from the design stage of a system to its implementation, providing an automatic way of mapping ... a choreography into executable code. In this work, we focus on an extension of choreography with a communication-based (interactional) exception mechanism by giving its formal semantics. In particular, we discuss through some examples how interactional exceptions at choreography level can be implemented into end...

  13. Exceptional memory performance in the Long Life Family Study

    DEFF Research Database (Denmark)

    Barral, Sandra; Cosentino, Stephanie; Costa, Rosann

    2013-01-01

    Research to understand variability at the highest end of the cognitive performance distribution has been scarce. Our aim was to define a cognitive endophenotype based on exceptional episodic memory (EM) performance and to investigate familial aggregation of EM in families from the Long Life Family...... Study (LLFS). Using a sample of 1911 nondemented offspring of long-lived probands, we created a quantitative phenotype, EM (memory z ≥ 1.5), and classified LLFS families as EM and non-EM families based on the number of EM offspring. We then assessed differences in memory performance between LLFS...... relatives in the parental generation of EM families and those in non-EM families using multivariate analysis adjusted for APOE Apolipoprotein E genotype. LLFS relatives in the proband generation from EM families showed better EM performance than those from non-EM families (β = 0.74, standard error = 0.19, p...

  14. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  15. Syllabification, Tone Marking, and Minimality in Eleme Ngulube ...

    African Journals Online (AJOL)

    Mrs Afam

    3. eee. 'what is it?' This researcher has carefully analysed all exceptions and found that they are morphologically complex, such that long vowels either (a) belong to different morphemes or (b) belong to a morpheme which has undergone the vowel-spreading rule induced by minimality. It is also significant that all of these ...

  16. Minimal prosodic stems/words in Malawian Tonga: A Morpheme ...

    African Journals Online (AJOL)

    Winfred Mkochi

    The paper aims to investigate the real size of the minimal .... constraint grammar defining the canonical shapes as unmarked. ..... Mkochi (2009, 2014) makes an error of judgement as my recordings and my own.

  17. An analysis of tracking error in image-guided neurosurgery.

    Science.gov (United States)

    Gerard, Ian J; Collins, D Louis

    2015-10-01

    This study quantifies some of the technical and physical factors that contribute to error in image-guided interventions. Errors associated with tracking, tool calibration and registration between a physical object and its corresponding image were investigated and compared with theoretical descriptions of these errors. A precision milled linear testing apparatus was constructed to perform the measurements. The tracking error was shown to increase in linear fashion with distance normal to the camera, and the tracking error ranged between 0.15 and 0.6 mm. The tool calibration error increased as a function of distance from the camera and the reference tool (0.2-0.8 mm). The fiducial registration error was shown to improve when more points were used up until a plateau value was reached which corresponded to the total fiducial localization error ([Formula: see text]0.8 mm). The target registration error distributions followed a [Formula: see text] distribution with the largest error and variation around fiducial points. To minimize errors, tools should be calibrated as close as possible to the reference tool and camera, and tools should be used as close to the front edge of the camera throughout the intervention, with the camera pointed in the direction where accuracy is least needed during surgery.
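
    The behaviour described above, where the fiducial registration error (FRE) levels off at a value corresponding to the fiducial localization error as more points are used, can be reproduced with a small simulation. This is not the authors' code: numpy is assumed, and the noise level, geometry and names are invented for illustration.

      # Simulation sketch: fiducial registration error (FRE) versus number of fiducials
      # for rigid point-based registration with isotropic localization noise.
      import numpy as np

      def rigid_fit(src, dst):
          """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch algorithm)."""
          cs, cd = src.mean(axis=0), dst.mean(axis=0)
          H = (src - cs).T @ (dst - cd)
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
          R = Vt.T @ D @ U.T
          return R, cd - R @ cs

      def mean_fre(n_fid, fle_sigma=0.4, trials=200, seed=1):
          rng = np.random.default_rng(seed)
          fres = []
          for _ in range(trials):
              pts = rng.uniform(-80, 80, size=(n_fid, 3))               # "image-space" fiducials, mm
              noisy = pts + rng.normal(0, fle_sigma, size=pts.shape)    # localized "patient-space" points
              R, t = rigid_fit(noisy, pts)
              fres.append(np.sqrt(np.mean(np.sum((noisy @ R.T + t - pts) ** 2, axis=1))))
          return float(np.mean(fres))

      # FRE approaches the localization error as the number of fiducials grows.
      for n in (3, 4, 6, 10, 20):
          print(n, round(mean_fre(n), 3))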

  18. Error Control for Network-on-Chip Links

    CERN Document Server

    Fu, Bo

    2012-01-01

    As technology scales into nanoscale regime, it is impossible to guarantee the perfect hardware design. Moreover, if the requirement of 100% correctness in hardware can be relaxed, the cost of manufacturing, verification, and testing will be significantly reduced. Many approaches have been proposed to address the reliability problem of on-chip communications. This book focuses on the use of error control codes (ECCs) to improve on-chip interconnect reliability. Coverage includes detailed description of key issues in NOC error control faced by circuit and system designers, as well as practical error control techniques to minimize the impact of these errors on system performance. Provides a detailed background on the state of error control methods for on-chip interconnects; Describes the use of more complex concatenated codes such as Hamming Product Codes with Type-II HARQ, while emphasizing integration techniques for on-chip interconnect links; Examines energy-efficient techniques for integrating multiple error...
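
    As a toy stand-in for the link-level error control codes discussed in this book summary (the book covers more elaborate schemes such as Hamming Product Codes with Type-II HARQ), the sketch below implements a classic Hamming(7,4) encoder/decoder that corrects any single bit flip on a link. Function names are illustrative.

      # Hamming(7,4) sketch: single-error correction for a 7-bit word.
      def hamming74_encode(d):                  # d: four data bits [d1, d2, d3, d4]
          d1, d2, d3, d4 = d
          p1 = d1 ^ d2 ^ d4                     # parity over codeword positions 1, 3, 5, 7
          p2 = d1 ^ d3 ^ d4                     # parity over positions 2, 3, 6, 7
          p3 = d2 ^ d3 ^ d4                     # parity over positions 4, 5, 6, 7
          return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

      def hamming74_decode(c):                  # c: seven received bits, at most one flipped
          s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
          s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
          s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
          syndrome = s1 + 2 * s2 + 4 * s3       # 0 means no detected error
          if syndrome:
              c = c[:]
              c[syndrome - 1] ^= 1              # flip the erroneous position
          return [c[2], c[4], c[5], c[6]]       # recovered data bits

      word = hamming74_encode([1, 0, 1, 1])
      word[5] ^= 1                              # inject a single-bit link error
      assert hamming74_decode(word) == [1, 0, 1, 1]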

  19. Learning Disabilities - Programs: Exceptional Child Bibliography Series.

    Science.gov (United States)

    Council for Exceptional Children, Reston, VA. Information Center on Exceptional Children.

    One in a series of over 50 similar selected listings relating to handicapped and gifted children, the bibliography contains 96 references selected from Exceptional Child Education Abstracts concerning programing for children with learning disabilities. References include conference papers, journal articles, texts for parents and teachers, and…

  20. Post-exceptionalism in public policy

    NARCIS (Netherlands)

    Daugbjerg, Carsten; Feindt, Peter H.

    2017-01-01

    Framing the special issue on the transformation of Food and Agricultural Policy, this article introduces the concept of post-exceptionalism in public policies. The analysis of change in agri-food policy serves as a generative example to conceptualize current transformations in sectoral policy

  1. 7 CFR 774.24 - Exception.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Exception. 774.24 Section 774.24 Agriculture Regulations of the Department of Agriculture (Continued) FARM SERVICE AGENCY, DEPARTMENT OF AGRICULTURE... interest of the Government and not inconsistent with the authorizing statute or other applicable law. ...

  2. 7 CFR 773.23 - Exception.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Exception. 773.23 Section 773.23 Agriculture Regulations of the Department of Agriculture (Continued) FARM SERVICE AGENCY, DEPARTMENT OF AGRICULTURE... Government and not inconsistent with the authorizing statute or other applicable law. ...

  3. FAPE Model of Exceptional Student Education Leadership

    Science.gov (United States)

    Dubberly, Russell G.

    2012-01-01

    The FAPE Model of Exceptional Education Leadership is defined as facilitative, affiliative, praising and rewarding, and experiential and empirical. The FAPE administrator uses a facilitative approach that guides and coaches to help employees find a pathway to success. This leader works to build emotional capacity between all members of the…

  4. Leiomyosarcoma of the Penis, an Exceptional Entity

    Directory of Open Access Journals (Sweden)

    Edwin Javier Romero Gonzalez

    2015-05-01

    In tumors of the penis, mesenchymal tumors are extremely rare and, within them, sarcomas are exceptional. We report a patient with a sarcomatous lesion treated with conservative surgery, with a good surgical outcome, and review the literature to present the latest advances in the treatment of this unusual entity.

  5. Seeing and Supporting Twice-Exceptional Learners

    Science.gov (United States)

    Lee, Chin-Wen; Ritchotte, Jennifer A.

    2018-01-01

    Through a four-part discussion, this essay advocates for seeing the characteristics and special needs of gifted students with disabilities and using best practices to support their learning. Part 1 delineates the evolution of the legislative acts and professional initiatives regarding twice exceptionality. Part 2 discusses the educational rights…

  6. Transition and Students with Twice Exceptionality

    Science.gov (United States)

    Prior, Susan

    2013-01-01

    "Twice exceptional" is one of the terms used to describe students who have giftedness and a disability. This is a small heterogeneous population of individual learners who are underserved in special, gifted, and mainstream education settings. Despite the availability of research on transition for students with disabilities, there is…

  7. 32 CFR 811.1 - Exceptions.

    Science.gov (United States)

    2010-07-01

    ... Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE SALES AND SERVICES RELEASE, DISSEMINATION, AND SALE OF VISUAL INFORMATION MATERIALS § 811.1 Exceptions. The regulations in this part do not apply to: (a) Visual information (VI) materials made for the Air Force Office of Special Investigations for use...

  8. 12 CFR 229.13 - Exceptions.

    Science.gov (United States)

    2010-01-01

    ...) Redeposited checks. Sections 229.10(c) and 229.12 do not apply to a check that has been returned unpaid and redeposited by the customer or the depositary bank. This exception does not apply— (1) To a check that has... doubt collectibility—(1) In general. Sections 229.10(c) and 229.12 do not apply to a check deposited in...

  9. 31 CFR 211.3 - Exceptions.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Exceptions. 211.3 Section 211.3 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE DELIVERY OF CHECKS AND WARRANTS TO ADDRESSES OUTSIDE THE...

  10. 31 CFR 101.7 - Exceptions.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Exceptions. 101.7 Section 101.7 Money and Finance: Treasury Regulations Relating to Money and Finance MITIGATION OF FORFEITURE OF... smelting the gold coins exceeds the value of the gold bullion to be returned. ...

  11. Working with Navajo Parents of Exceptional Children.

    Science.gov (United States)

    Jones, Doris; And Others

    Undergraduate students at Northern Arizona University interviewed and surveyed 20 staff members at Kayenta Unified School District (KUSD) on the Navajo Reservation and 14 parents of exceptional Navajo children enrolled in KUSD. Both groups were asked to identify challenges affecting the working relationship between parents and school on a rural…

  12. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes: How Numbers Protect Themselves. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 1, Issue 10. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  13. Random errors revisited

    DEFF Research Database (Denmark)

    Jacobsen, Finn

    2000-01-01

    the random errors of estimates of the sound intensity in, say, one-third octave bands from the power and cross power spectra of the signals from an intensity probe determined with a dual channel FFT analyser. This is not very practical, though. In this paper it is demonstrated that one can predict the random...

  14. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March 1997, pp 33-47. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/03/0033-0047

  15. Errors in uroradiology

    Energy Technology Data Exchange (ETDEWEB)

    Viamonte, M. Jr. (Miami Univ., Miami Beach, FL (United States). Dept. of Radiology)

    1992-01-01

    This book, covering errors in urologic radiology, takes into account the imaging modalities presently used for examining the urinary tract: excretory urography, ultrasonography, computerized tomography and angiography. The author gives examples of anatomical variations, developmental anomalies, and benign conditions that simulate neoplasias and lead to mistakes. (orig.) With 148 figs.

  16. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes: The Hamming Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 1, January ... Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  17. Orwell's Instructive Errors

    Science.gov (United States)

    Julian, Liam

    2009-01-01

    In this article, the author talks about George Orwell, his instructive errors, and the manner in which Orwell pierced worthless theory, faced facts and defended decency (with fluctuating success), and largely ignored the tradition of accumulated wisdom that has rendered him a timeless teacher--one whose inadvertent lessons, while infrequently…

  18. Error management in audit firms: Error climate, type, and originator

    NARCIS (Netherlands)

    Gold, A.H.; Gronewold, U.; Salterio, S.E.

    2014-01-01

    This paper examines how the treatment of audit staff who discover errors in audit files by superiors affects their willingness to report these errors. The way staff are treated by superiors is labelled as the audit office error management climate. In a "blame-oriented" climate errors are not

  19. Goldmann tonometer error correcting prism: clinical evaluation

    Directory of Open Access Journals (Sweden)

    McCafferty S

    2017-05-01

    Purpose: To clinically evaluate a modified applanating surface Goldmann tonometer prism designed to substantially negate errors due to patient variability in biomechanics. Methods: A modified Goldmann prism with a correcting applanation tonometry surface (CATS) was mathematically optimized to minimize the intraocular pressure (IOP) measurement error due to patient variability in corneal thickness, stiffness, curvature, and tear film adhesion force. A comparative clinical study of 109 eyes measured IOP with CATS and Goldmann prisms. The IOP measurement differences between the CATS and Goldmann prisms were correlated to corneal thickness, hysteresis, and curvature. Results: The CATS tonometer prism, in correcting for Goldmann central corneal thickness (CCT) error, demonstrated a reduction to <±2 mmHg in 97% of a standard CCT population. This compares to only 54% with CCT error <±2 mmHg using the Goldmann prism. Equal reductions of ~50% in errors due to corneal rigidity and curvature were also demonstrated. Conclusion: The results validate the CATS prism's improved accuracy and expected reduced sensitivity to Goldmann errors without IOP bias, as predicted by mathematical modeling. The CATS replacement for the Goldmann prism does not change Goldmann measurement technique or interpretation. Keywords: glaucoma, tonometry, Goldmann, IOP, intraocular pressure, applanation tonometer, corneal biomechanics, CATS tonometer, CCT, central corneal thickness, tonometer error

  20. Optimizing Neural Network Architectures Using Generalization Error Estimators

    DEFF Research Database (Denmark)

    Larsen, Jan

    1994-01-01

    This paper addresses the optimization of neural network architectures. It is suggested to optimize the architecture by selecting the model with minimal estimated averaged generalization error. We consider a least-squares (LS) criterion for estimating neural network models, i.e., the associated...... model weights are estimated by minimizing the LS criterion. The quality of a particular estimated model is measured by the average generalization error. This is defined as the expected squared prediction error on a novel input-output sample averaged over all possible training sets. An essential part...... of the suggested architecture optimization scheme is to calculate an estimate of the average generalization error. We suggest using the GEN-estimator which allows for dealing with nonlinear, incomplete models, i.e., models which are not capable of modeling the underlying nonlinear relationship perfectly. In most...
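
    The selection principle, estimating the average generalization error of each candidate architecture and keeping the minimizer, can be sketched as follows. The sketch substitutes a simple held-out validation error for the paper's GEN-estimator and fits a network with fixed random hidden units by least squares so the example stays short; numpy is assumed, and all names and data are illustrative.

      # Sketch: pick the architecture with the smallest estimated generalization error.
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(-3, 3, size=(200, 1))
      y = np.sin(x[:, 0]) + 0.1 * rng.normal(size=200)          # noisy target
      x_tr, y_tr, x_va, y_va = x[:120], y[:120], x[120:], y[120:]

      def fit_and_score(n_hidden):
          W = rng.normal(size=(1, n_hidden))                     # fixed random hidden weights
          b = rng.normal(size=n_hidden)
          H_tr = np.tanh(x_tr @ W + b)                           # hidden activations
          w_out, *_ = np.linalg.lstsq(H_tr, y_tr, rcond=None)    # LS criterion for output weights
          H_va = np.tanh(x_va @ W + b)
          return np.mean((H_va @ w_out - y_va) ** 2)             # estimated generalization error

      candidates = [1, 2, 4, 8, 16, 32, 64]
      errors = {n: fit_and_score(n) for n in candidates}
      best = min(errors, key=errors.get)
      print("selected hidden units:", best, "estimated generalization error:", round(errors[best], 4))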

  1. Anatomic, clinical, and neuropsychological correlates of spelling errors in Primary Progressive Aphasia

    OpenAIRE

    Shim, HyungSub; Hurley, Robert S.; Rogalski, Emily; Mesulam, M.-Marsel

    2012-01-01

    This study evaluates spelling errors in the three subtypes of primary progressive aphasia (PPA): agrammatic (PPA-G), logopenic (PPA-L), and semantic (PPA-S). Forty-one PPA patients and 36 age-matched healthy controls were administered a test of spelling. The total number of errors and the types of errors in spelling to dictation of regular words, exception words and nonwords were recorded. Error types were classified based on phonetic plausibility. In the first analysis, scores were evaluated by...

  2. Automatic Error Analysis Using Intervals

    Science.gov (United States)

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
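
    The interval approach can be sketched without the INTLAB toolbox mentioned above (which is MATLAB-based): represent each measured quantity as an interval [x − δ, x + δ] and evaluate the formula with interval operations, so the width of the result bounds the propagated error. The small class below supports only the operations the example needs, and the example formula and values are made up for illustration.

      # Minimal interval-arithmetic sketch for automatic error bounds.
      class Interval:
          def __init__(self, lo, hi):
              self.lo, self.hi = min(lo, hi), max(lo, hi)
          def __add__(self, other):
              return Interval(self.lo + other.lo, self.hi + other.hi)
          def __sub__(self, other):
              return Interval(self.lo - other.hi, self.hi - other.lo)
          def __mul__(self, other):
              products = [self.lo * other.lo, self.lo * other.hi,
                          self.hi * other.lo, self.hi * other.hi]
              return Interval(min(products), max(products))
          def __truediv__(self, other):
              assert other.lo > 0 or other.hi < 0, "divisor interval must not contain zero"
              return self * Interval(1.0 / other.hi, 1.0 / other.lo)
          def __repr__(self):
              mid, rad = (self.lo + self.hi) / 2, (self.hi - self.lo) / 2
              return f"{mid:.6g} +/- {rad:.2g}"

      def measured(value, uncertainty):
          return Interval(value - uncertainty, value + uncertainty)

      # Illustrative formula with uncertain inputs; the printed radius bounds the error.
      V, I, L, R0 = measured(5.0, 0.05), measured(0.25, 0.002), measured(2.0, 0.01), measured(3.0, 0.1)
      print((V * L) / (V - I * R0))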

  3. Sources of Error in Mammalian Genetic Screens

    Directory of Open Access Journals (Sweden)

    Laura Magill Sack

    2016-09-01

    Genetic screens are invaluable tools for dissection of biological phenomena. Optimization of such screens to enhance discovery of candidate genes and minimize false positives is thus a critical aim. Here, we report several sources of error common to pooled genetic screening techniques used in mammalian cell culture systems, and demonstrate methods to eliminate these errors. We find that reverse transcriptase-mediated recombination during retroviral replication can lead to uncoupling of molecular tags, such as DNA barcodes (BCs), from their associated library elements, leading to chimeric proviral genomes in which BCs are paired to incorrect ORFs, shRNAs, etc. This effect depends on the length of homologous sequence between unique elements, and can be minimized with careful vector design. Furthermore, we report that residual plasmid DNA from viral packaging procedures can contaminate transduced cells. These plasmids serve as additional copies of the PCR template during library amplification, resulting in substantial inaccuracies in measurement of initial reference populations for screen normalization. The overabundance of template in some samples causes an imbalance between PCR cycles of contaminated and uncontaminated samples, which results in a systematic artifactual depletion of GC-rich library elements. Elimination of contaminating plasmid DNA using the bacterial endonuclease Benzonase can restore faithful measurements of template abundance and minimize GC bias.

  4. Learning from Galileo's errors

    CERN Document Server

    Bernieri, Enrico

    2012-01-01

    Four hundred years after its publication, Galileo's masterpiece Sidereus Nuncius is still a mine of useful information for historians of science and astronomy. In his short book Galileo reports a large amount of data that, despite its age, has not yet been fully explored. In this paper Galileo's first observations of Jupiter's satellites are quantitatively re-analysed by using modern planetarium software. All the angular records reported in the Sidereus Nuncius are, for the first time, compared with satellites' elongations carefully reconstructed taking into account software accuracy and the indeterminacy of observation time. This comparison allows us to derive the experimental errors of Galileo's measurements and gives us direct insight into the effective angular resolution of Galileo's observations. Until now, historians of science have mainly obtained these indirectly and they are often not correctly estimated. Furthermore, a statistical analysis of Galileo's experimental errors shows an asymmetrical distr...

  5. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  6. Correction of refractive errors

    Directory of Open Access Journals (Sweden)

    Vladimir Pfeifer

    2005-10-01

    Full Text Available Background: Spectacles and contact lenses are the most frequently used, the safest and the cheapest way to correct refractive errors. The development of keratorefractive surgery has brought new opportunities for correction of refractive errors in patients who need to be less dependent on spectacles or contact lenses. Until recently, RK was the most commonly performed refractive procedure for nearsighted patients. Conclusions: The introduction of the excimer laser in refractive surgery has given new opportunities for remodelling the cornea. The laser energy can be delivered on the stromal surface, as in PRK, or deeper in the corneal stroma by means of lamellar surgery. In LASIK the flap is created with a microkeratome, in LASEK with ethanol, and in epi-LASIK the ultra-thin flap is created mechanically.

  7. Is There Muslim Exceptionalism in Democracy Research?

    DEFF Research Database (Denmark)

    Hariri, Jacob Gerner

    Casual observation suggests a negative association between Islam and democratic rule, as very few Muslim countries can be considered democracies. Recent research has confirmed this observation by documenting that the negative association is robust to different democracy indices, different samples, and, also, to alternative theories of the causes and correlates of democracy. This paper presents evidence against the notion of Muslim exceptionalism in democracy research. Thus, outside the European continent, territories that were governed earlier and more consistently by state organizations up...

  8. Discover Aggregates Exceptions over Hidden Web Databases

    OpenAIRE

    Suhaim, Saad Bin; Liu, Weimo; Zhang, Nan

    2016-01-01

    Nowadays, many web databases "hidden" behind their restrictive search interfaces (e.g., Amazon, eBay) contain rich and valuable information that is of significant interest to various third parties. Recent studies have demonstrated the possibility of estimating/tracking certain aggregate queries over dynamic hidden web databases. Nonetheless, tracking all possible aggregate query answers to report interesting findings (i.e., exceptions), while still adhering to the stringent query-count limit...

  9. How to make a minimal genome for synthetic minimal cell.

    Science.gov (United States)

    Zhang, Liu-Yan; Chang, Su-Hua; Wang, Jing

    2010-05-01

    As a key focus of synthetic biology, building a minimal artificial cell has given rise to many discussions. A synthetic minimal cell will provide an appropriate chassis to integrate functional synthetic parts, devices and systems with functions that cannot generally be found in nature. The design and construction of a functional minimal genome is a key step while building such a cell/chassis, since all the cell functions can be traced back to the genome. Various approaches, based on bioinformatics and molecular biology, have been developed to derive essential genes and minimal gene sets for the synthetic minimal genome. Experiments on streamlining the genomes of model bacteria revealed that genome reduction led to unanticipated beneficial properties, such as high electroporation efficiency and accurate propagation of recombinant genes and plasmids that were unstable in other strains. Recent achievements in chemical synthesis technology for large DNA segments, together with the rapid development of whole-genome sequencing, have shifted the emphasis from the synthesis of genes to the assembly of whole genomes from oligonucleotides, and thus created strong preconditions for the synthesis of an artificial minimal genome. In this article, we review briefly the history and current state of research in this field and summarize the main methods for making a minimal genome. We also discuss the impacts of a minimized genome on the metabolism and regulation of an artificial cell.

  10. Managing residual refractive error after cataract surgery.

    Science.gov (United States)

    Sáles, Christopher S; Manche, Edward E

    2015-06-01

    We present a review of keratorefractive and intraocular approaches to managing residual astigmatic and spherical refractive error after cataract surgery, including laser in situ keratomileusis (LASIK), photorefractive keratectomy (PRK), arcuate keratotomy, intraocular lens (IOL) exchange, piggyback IOLs, and light-adjustable IOLs. Currently available literature suggests that laser vision correction, whether LASIK or PRK, yields more effective and predictable outcomes than intraocular surgery. Piggyback IOLs with a rounded-edge profile implanted in the sulcus may be superior to IOL exchange, but both options present potential risks that likely outweigh the refractive benefits except in cases with large residual spherical errors. The light-adjustable IOL may provide an ideal treatment for pseudophakic ametropia by obviating the need for secondary invasive procedures after cataract surgery, but it is not widely available nor has it been sufficiently studied. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  11. Reporting Self-Made Errors: The Impact of Organizational Error-Management Climate and Error Type

    NARCIS (Netherlands)

    Gold, A.H.; Gronewold, U.; Salterio, S.E.

    2013-01-01

    We study how an organization's error-management climate affects organizational members' beliefs about other members' willingness to report errors that they discover when chance of error detection by superiors and others is extremely low. An error-management climate, as a component of the

  12. Clinimetrics corner: a closer look at the minimal clinically important difference (MCID)

    OpenAIRE

    Wright, Alexis; Hannon, Joseph; Hegedus, Eric J.; Kavchak, Alicia Emerson

    2012-01-01

    Minimal clinically important difference (MCID) scores are commonly used by clinicians when determining patient response to treatment and to guide clinical decision-making during the course of treatment. For research purposes, the MCID score is often used in sample size calculations for adequate powering of a study to minimize the false-positives (type 1 errors) and the false-negatives (type 2 errors). For clinicians and researchers alike, it is critical that the MCID score is a valid and stab...

  13. Error sensitivity analysis in 10-30-day extended range forecasting by using a nonlinear cross-prediction error model

    Science.gov (United States)

    Xia, Zhiye; Xu, Lisheng; Chen, Hongbin; Wang, Yongqian; Liu, Jinbao; Feng, Wenlan

    2017-06-01

    Extended range forecasting of 10-30 days, which lies between medium-term and climate prediction in terms of timescale, plays a significant role in decision-making processes for the prevention and mitigation of disastrous meteorological events. The sensitivity of initial error, model parameter error, and random error in a nonlinear cross-prediction error (NCPE) model, and their stability in the prediction validity period in 10-30-day extended range forecasting, are analyzed quantitatively. The associated sensitivity of precipitable water, temperature, and geopotential height during cases of heavy rain and hurricane is also discussed. The results are summarized as follows. First, the initial error and random error interact. When the ratio of random error to initial error is small (10^-6 to 10^-2), minor variation in random error cannot significantly change the dynamic features of a chaotic system, and therefore random error has minimal effect on the prediction. When the ratio is large (10^-1 to 10^2, i.e., random error dominates), attention should be paid to the random error instead of only the initial error. When the ratio is around 10^-2 to 10^-1, both influences must be considered. Their mutual effects may bring considerable uncertainty to extended range forecasting, and de-noising is therefore necessary. Second, in terms of model parameter error, the embedding dimension m should be determined by the factual nonlinear time series. The dynamic features of a chaotic system cannot be depicted because of the incomplete structure of the attractor when m is small. When m is large, prediction indicators can vanish because of the scarcity of phase points in phase space. A method for overcoming the cut-off effect (m > 4) is proposed. Third, for heavy rains, precipitable water is more sensitive to the prediction validity period than temperature or geopotential height; however, for hurricanes, geopotential height is most sensitive, followed by precipitable water.

  14. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
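
    The core idea can be reduced to a few lines of code. The sketch below is a minimal illustration of the general approach and not the SORREL library: it trains a linear model that predicts each updated cell of a 1-D heat-equation stencil from its neighbourhood, then flags residuals far above the training range as possible soft errors; the stencil, threshold and injected fault are illustrative assumptions.

        # Minimal sketch of the general approach (not the SORREL library).
        import numpy as np

        rng = np.random.default_rng(0)
        alpha = 0.1

        def step(u):
            # explicit 1-D heat stencil: u_new[i] = u[i] + alpha*(u[i-1] - 2*u[i] + u[i+1])
            un = u.copy()
            un[1:-1] = u[1:-1] + alpha * (u[:-2] - 2 * u[1:-1] + u[2:])
            return un

        # Training data: (left, centre, right) neighbourhood -> updated centre value.
        u = rng.random(200)
        X, y = [], []
        for _ in range(50):
            un = step(u)
            X.append(np.stack([u[:-2], u[1:-1], u[2:]], axis=1))
            y.append(un[1:-1])
            u = un
        X, y = np.concatenate(X), np.concatenate(y)
        coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

        # Detection threshold: a few times the largest residual seen during training.
        train_resid = np.abs(np.c_[X, np.ones(len(X))] @ coef - y)
        threshold = 5 * train_resid.max()

        u_next = step(u)
        u_next[100] += 0.5                                  # inject a bit-flip-like corruption
        features = np.c_[u[:-2], u[1:-1], u[2:], np.ones(len(u) - 2)]
        resid = np.abs(features @ coef - u_next[1:-1])
        print("flagged cells:", np.nonzero(resid > threshold)[0] + 1)

    Because the regression model is cheap to evaluate, checks of this kind can run alongside the main computation with modest overhead; choosing the threshold is the usual trade-off between detection rate and false alarms.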

  15. Minimal massive 3D gravity

    NARCIS (Netherlands)

    Bergshoeff, Eric; Hohm, Olaf; Merbis, Wout; Routh, Alasdair J.; Townsend, Paul K.

    2014-01-01

    We present an alternative to topologically massive gravity (TMG) with the same 'minimal' bulk properties; i.e. a single local degree of freedom that is realized as a massive graviton in linearization about an anti-de Sitter (AdS) vacuum. However, in contrast to TMG, the new 'minimal massive gravity'

  16. Energy minimization of ferromagnetic particles

    Science.gov (United States)

    Qu, Heliang; Li, Jiangyu

    2004-07-01

    In this paper, we present an energy-minimization theory of ferromagnetic particles to characterize the magnetization reversal and hysteresis loop of ferromagnetic polycrystals, with the inter-granular magneto-static interactions accounted for through the effective medium approximation. The energy-minimizing magnetization distribution is determined, and the remanence and coercivity are predicted.

  17. A Review On Numerical Error Correction Using Various Techniques

    Directory of Open Access Journals (Sweden)

    Iqra Ahmed

    2015-07-01

    Full Text Available Abstract For decades, the role of symbolic computation in real-time calculations has been impossible to ignore. In any automated machine that performs approximate calculations, wherever there are inputs and corresponding outputs, some error is inevitable; that error can, however, be minimized by using suitable algorithms. This study focuses on techniques used for error correction in numeric and symbolic computations. After reviewing the different techniques, we analyse them against a set of parameters. The experimental results show how these algorithms perform in terms of accuracy, performance, cost, validity, safety, security, reliability and power consumption.

  18. Local media influence on opting out from an exception from informed consent trial.

    Science.gov (United States)

    Nelson, Maria J; DeIorio, Nicole M; Schmidt, Terri; Griffiths, Denise; Daya, Mohamud; Haywood, Liana; Zive, Dana; Newgard, Craig D

    2010-01-01

    News media are used for community education and notification in exception from informed consent clinical trials, yet their effectiveness as an added safeguard in such research remains unknown. We assessed the number of callers requesting opt-out bracelets after each local media report and described the errors and content within each media report. We undertook a descriptive analysis of local media trial coverage (newspaper, television, radio, and Web log) and opt-out requests during a 41-month period at a single site participating in an exception from informed consent out-of-hospital trial. Two nontrial investigators independently assessed 41 content-based media variables (including background, trial information, graphics, errors, publication information, and assessment) with a standardized, semiqualitative data collection tool. Major errors were considered serious misrepresentation of the trial purpose or protocol, whereas minor errors included misinformation unlikely to mislead the lay reader about the trial. We plotted the temporal relationship between opt-out bracelet requests and media reports. Descriptive information about the news sources and the trial coverage are presented. We collected 39 trial-related media reports (33 newspaper, 1 television, 1 radio, and 4 blogs). There were 13 errors in 9 (23%) publications, 7 of which were major and 6 minor. Of 384 requests for 710 bracelets, 310 requests (80%) occurred within 4 days after trial media coverage. Graphic timeline representation of the data suggested a close association between media reports about the trial and requests for opt-out bracelets. According to results from a single site, local media coverage for an exception from informed consent clinical trial had a substantial portion of errors and appeared closely associated with opt-out requests. Copyright 2008. Published by Mosby, Inc.

  19. Minimal Webs in Riemannian Manifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen

    2008-01-01

    For a given combinatorial graph $G$ a {\\it geometrization} $(G, g)$ of the graph is obtained by considering each edge of the graph as a $1-$dimensional manifold with an associated metric $g$. In this paper we are concerned with {\\it minimal isometric immersions} of geometrized graphs $(G, g)$ into Riemannian manifolds $(N^{n}, h)$. Such immersions we call {\\em{minimal webs}}. They admit a natural 'geometric' extension of the intrinsic combinatorial discrete Laplacian. The geometric Laplacian on minimal webs enjoys standard properties such as the maximum principle and the divergence theorems, which are of instrumental importance for the applications. We apply these properties to show that minimal webs in ambient Riemannian spaces share several analytic and geometric properties with their smooth (minimal submanifold) counterparts in such spaces. In particular we use appropriate versions of the divergence...

  20. Waste minimization handbook, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  1. Minimal Surfaces for Hitchin Representations

    DEFF Research Database (Denmark)

    Li, Qiongling; Dai, Song

    2018-01-01

    Given a reductive representation $\rho: \pi_1(S)\rightarrow G$, there exists a $\rho$-equivariant harmonic map $f$ from the universal cover of a fixed Riemann surface $\Sigma$ to the symmetric space $G/K$ associated to $G$. If the Hopf differential of $f$ vanishes, the harmonic map is then minimal. In this paper, we investigate the properties of immersed minimal surfaces inside the symmetric space associated to a subloci of the Hitchin component: the $q_n$ and $q_{n-1}$ case. First, we show that the pullback metric of the minimal surface dominates a constant multiple of the hyperbolic metric in the same conformal class and has a strong rigidity property. Secondly, we show that the immersed minimal surface is never tangential to any flat inside the symmetric space. As a direct corollary, the pullback metric of the minimal surface is always strictly negatively curved. In the end, we find a fully decoupled system...

  2. Perancangan Fasilitas Kerja untuk Mereduksi Human Error

    Directory of Open Access Journals (Sweden)

    Harmein Nasution

    2012-01-01

    Full Text Available Work equipment and environments that are not designed ergonomically can cause physical exhaustion in workers. As a result of that physical exhaustion, many defects in the production lines can occur due to human error, and musculoskeletal complaints can also arise. To overcome those effects, we employed methods for analyzing the workers' posture based on the SNQ (Standard Nordic Questionnaire), PLIBEL, QEC (Quick Exposure Check) and biomechanics. Moreover, we applied those methods to design rolling machines and egrek grips ergonomically, so that the defects on those production lines can be minimized.

  3. Refractive errors and school performance in Brazzaville, Congo ...

    African Journals Online (AJOL)

    Background: Wearing glasses before ten years is becoming more common in developed countries. In black Africa, for cultural or irrational reasons, this attitude remains exceptional. This situation is a source of amblyopia and learning difficulties. Objective: To determine the role of refractive errors in school performance in ...

  4. Exceptional Points in three-dimensional Nanostructures

    CERN Document Server

    Kodigala, Ashok; Kanté, Boubacar

    2016-01-01

    Exceptional points (EPs) are degeneracies in open wave systems where at least two energy levels and their corresponding eigenstates coalesce. We report evidence of the existence of EPs in 3D plasmonic nanostructures. The systems are composed of coupled plasmonic nanoresonators and can be judiciously and systematically driven to EPs by controlling symmetry-compatible modes via their near-field and far-field interactions. The proposed platform opens the way to the investigation of EPs for enhanced light-matter interactions and applications in communication, sensing and imaging.

  5. Exceptional Family Transitional Training Program (EFTTP)

    Science.gov (United States)

    2009-01-01

    [The indexed record consists only of a directory of Exceptional Family Member Program points of contact (installation addresses, telephone and fax numbers, e.g. Fort Monroe, Fort Myer, US NSGA RAF Menwith Hill, Army Community Service EFMP); no abstract text is recoverable.]

  6. Exceptional Points and Dynamical Phase Transitions

    Directory of Open Access Journals (Sweden)

    I. Rotter

    2010-01-01

    Full Text Available In the framework of non-Hermitian quantum physics, the relation between exceptional points, dynamical phase transitions and the counterintuitive behavior of quantum systems at high level density is considered. The theoretical results obtained for open quantum systems, and proven experimentally some years ago on a microwave cavity, may explain environmentally induced effects (including dynamical phase transitions) which have been observed in various experimental studies. They also agree (qualitatively) with the experimental results reported recently in PT-symmetric optical lattices.

  7. Exceptional Antibodies Produced by Successive Immunizations.

    Directory of Open Access Journals (Sweden)

    Patricia J Gearhart

    2015-12-01

    Full Text Available Antibodies stand between us and pathogens. Viruses mutate quickly to avoid detection, and antibodies mutate at similar rates to hunt them down. This death spiral is fueled by specialized proteins and error-prone polymerases that change DNA sequences. Here, we explore how B lymphocytes stay in the race by expressing activation-induced deaminase, which unleashes a tsunami of mutations in the immunoglobulin loci. This produces random DNA substitutions, followed by selection for the highest affinity antibodies. We may be able to manipulate the process to produce better antibodies by expanding the repertoire of specific B cells through successive vaccinations.

  8. Exceptional Antibodies Produced by Successive Immunizations.

    Science.gov (United States)

    Gearhart, Patricia J; Castiblanco, Diana P; Russell Knode, Lisa M

    2015-12-01

    Antibodies stand between us and pathogens. Viruses mutate quickly to avoid detection, and antibodies mutate at similar rates to hunt them down. This death spiral is fueled by specialized proteins and error-prone polymerases that change DNA sequences. Here, we explore how B lymphocytes stay in the race by expressing activation-induced deaminase, which unleashes a tsunami of mutations in the immunoglobulin loci. This produces random DNA substitutions, followed by selection for the highest affinity antibodies. We may be able to manipulate the process to produce better antibodies by expanding the repertoire of specific B cells through successive vaccinations.

  9. Los errores cometidos

    Directory of Open Access Journals (Sweden)

    José Martínez Terrero

    2015-01-01

    Full Text Available During the Encuentro sobre Comunicación Alternativa y Popular, José Martínez Terrero, SJ, reviewed the main currents of alternative and popular communication, from the 1960s to the present, recovering their conceptual and practical contributions as well as the errors committed. Below we publish an extract that highlights this critical and self-critical reflection.

  10. Performance, postmodernity and errors

    DEFF Research Database (Denmark)

    Harder, Peter

    2013-01-01

    with the prestige variety, and conflate non-standard variation with parole/performance and class both as erroneous. Nowadays the anti-structural sentiment of present-day linguistics makes it tempting to confuse the rejection of ideal abstract structure with a rejection of any distinction between grammatical...... as deviant from the perspective of function-based structure and discuss to what extent the recognition of a community langue as a source of adaptive pressure may throw light on different types of deviation, including language handicaps and learner errors....

  11. Nonresponse Error in Mail Surveys: Top Ten Problems

    Directory of Open Access Journals (Sweden)

    Jeanette M. Daly

    2011-01-01

    Full Text Available Conducting mail surveys can result in nonresponse error, which occurs when the potential participant is unwilling to participate or impossible to contact. Nonresponse can result in a reduction in precision of the study and may bias results. The purpose of this paper is to describe, and make readers aware of, a top ten list of mailed-survey problems that affect the response rate, encountered over time across different research projects utilizing the Dillman Total Design Method. Ten nonresponse error problems were identified, such as an inserter machine getting the sequence out of order, capitalization in databases, and mailings discarded by the postal service. These ten mishaps can potentiate nonresponse errors, but there are ways to minimize their frequency. Suggestions offered stem from our own experiences during research projects. Our goal is to increase researchers' knowledge of nonresponse error problems and to offer solutions which can decrease nonresponse error in future projects.

  12. GLOBAL RISKS AND INSTRUMENTS OF ITS MINIMIZATION

    Directory of Open Access Journals (Sweden)

    O. Havryliuk

    2014-06-01

    Full Text Available It is argued that economic globalization leads to the formation of macroeconomic, political and other risks that can grow into global risks affecting, without exception, all national economies and creating a serious threat to national economic security. The emphasis is on the negative elements of the set of global risks, on their development, and on the possibility of minimizing them using a number of tools. Ensuring the resilience of the state to external risks demands continuous monitoring and forecasting of world processes and the use of economic instruments of rapid response to prevent negative consequences. The essence of the category of "risk" is revealed and deepened. The global risks that inevitably affect the economic security of Ukraine are identified. It is shown that the emergence of these global risks has a negative impact on the economic security of Ukraine.

  13. Minimal Poems Written in 1979 Minimal Poems Written in 1979

    Directory of Open Access Journals (Sweden)

    Sandra Sirangelo Maggio

    2008-04-01

    Full Text Available The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I have seen a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity, in this new literary trend known as Minimalism.

  14. On the Scalar Manifold of Exceptional Supergravity

    CERN Document Server

    Cacciatori, Sergio L; Marrani, Alessio

    2012-01-01

    We construct two parametrizations of the non compact exceptional Lie group G=E7(-25), based on a fibration which has the maximal compact subgroup K=(E6 x U(1))/Z_3 as a fiber. It is well known that G plays an important role in the N=2 d=4 magic exceptional supergravity, where it describes the U-duality of the theory and where the symmetric space M=G/K gives the vector multiplets' scalar manifold. First, by making use of the exponential map, we compute a realization of G/K, that is based on the E6 invariant d-tensor, and hence exhibits the maximal possible manifest [(E6 x U(1))/Z_3]-covariance. This provides a basis for the corresponding supergravity theory, which is the analogue of the Calabi-Vesentini coordinates. Then we study the Iwasawa decomposition. Its main feature is that it is SO(8)-covariant and therefore it highlights the role of triality. Along the way we analyze the relevant chain of maximal embeddings which leads to SO(8). It is worth noticing that being based on the properties of a "mixed" Freu...

  15. Phase error in Fourier transform spectrometers employing polarization interferometers

    Science.gov (United States)

    Wang, Yifan; Kudenov, Michael W.; Craven-Jones, Julia

    2014-05-01

    Phase error is common in reflective interferometers, such as the Michelson. This yields highly asymmetric interferograms that complicate the post-processing of single-sided interference data. Common methods of compensating for phase errors include the Mertz, Forman, and Cannes phase correction techniques. However, birefringent interferometers often have highly symmetric interferograms; thus, compensating for phase errors may represent an unnecessary and/or detrimental step in post processing. In this paper, an analysis of the phase error generated by the Infrared Hyperspectral Imaging Polarimeter (IHIP) is conducted. First, a model of the IHIP is presented that quantifies the phase error in the system. The error associated with calculating spectra from single-sided interferograms, using Mertz phase correction and simple single-sided to double-sided mirroring, is then investigated and compared to "true" double-sided Cannes phase corrected spectra. These error calculations are set within the context of measurements taken from a Michelson interferometer-based Fourier transform spectrometer. Results demonstrate that the phase error of the IHIP is comparatively small and that Mertz phase correction may not be necessary to minimize error in the spectral calculation.
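
    The practical point, that a small phase error makes explicit correction largely unnecessary, can be seen in a toy calculation. The sketch below is illustrative only and is neither the Mertz algorithm nor the IHIP processing chain: it imposes a known phase error of adjustable size on a synthetic spectrum and compares the uncorrected real part of the complex spectrum with the ideally phase-corrected result Re{C(v) exp(-i*phi(v))}; the band shape and phase model are made up for the example.

        # Illustrative only (not the Mertz algorithm or the IHIP pipeline).
        import numpy as np

        freqs = np.fft.rfftfreq(4096)                       # 0 .. 0.5 cycles/sample
        true_spec = np.exp(-((freqs - 0.15) / 0.02) ** 2)   # synthetic emission band

        for scale in (0.05, 5.0):                           # small vs large phase error
            phi = scale * 2 * np.pi * (freqs - 0.15)        # toy linear phase error
            C = true_spec * np.exp(1j * phi)                # complex spectrum as "measured"

            uncorrected = C.real                            # skip phase correction entirely
            corrected = np.real(C * np.exp(-1j * phi))      # correction with the known phase

            print(f"phase-error scale {scale}: "
                  f"max error uncorrected {np.abs(uncorrected - true_spec).max():.2e}, "
                  f"corrected {np.abs(corrected - true_spec).max():.2e}")

    With the small phase error the uncorrected real part already reproduces the band almost exactly, while the large phase error leaves a visible distortion that only the phase-corrected reconstruction removes.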

  16. Minimal flows and their extensions

    CERN Document Server

    Auslander, J

    1988-01-01

    This monograph presents developments in the abstract theory of topological dynamics, concentrating on the internal structure of minimal flows (actions of groups on compact Hausdorff spaces for which every orbit is dense) and their homomorphisms (continuous equivariant maps). Various classes of minimal flows (equicontinuous, distal, point distal) are intensively studied, and a general structure theorem is obtained. Another theme is the ``universal'' approach - entire classes of minimal flows are studied, rather than flows in isolation. This leads to the consideration of disjointness of flows, w

  17. Improving the error backpropagation algorithm with a modified error function.

    Science.gov (United States)

    Oh, S H

    1997-01-01

    This letter proposes a modified error function to improve the error backpropagation (EBP) algorithm of multilayer perceptrons (MLPs) which suffers from slow learning speed. To accelerate the learning speed of the EBP algorithm, the proposed method reduces the probability that output nodes are near the wrong extreme value of sigmoid activation function. This is acquired through a strong error signal for the incorrectly saturated output node and a weak error signal for the correctly saturated output node. The weak error signal for the correctly saturated output node, also, prevents overspecialization of learning for training patterns. The effectiveness of the proposed method is demonstrated in a handwritten digit recognition task.
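
    A hedged sketch of the underlying mechanism follows (the exact modified error function of the letter is not reproduced here). It compares the output-layer error signal of the standard sum-of-squares cost, which carries the sigmoid-derivative factor and therefore almost vanishes for an incorrectly saturated output, with a cross-entropy-style cost whose gradient drops that factor: the signal stays strong for wrong outputs and weak for correctly saturated ones, which is the behavior the letter aims for.

        # Hedged sketch -- not necessarily the exact error function of the letter.
        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def delta_sse(y, t):
            # d/dnet of 0.5*(t - y)^2 with y = sigmoid(net): (y - t) * y * (1 - y)
            return (y - t) * y * (1.0 - y)

        def delta_ce(y, t):
            # d/dnet of the cross-entropy cost with y = sigmoid(net): (y - t)
            return y - t

        cases = [("incorrectly saturated", sigmoid(-8.0), 1.0),
                 ("correctly saturated",  sigmoid(+8.0), 1.0),
                 ("undecided",            sigmoid(0.0),  1.0)]

        for name, y, t in cases:
            print(f"{name:22s} y={y:.4f}  |delta_sse|={abs(delta_sse(y, t)):.2e}  "
                  f"|delta_ce|={abs(delta_ce(y, t)):.2e}")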

  18. Transient error approximation in a Lévy queue

    NARCIS (Netherlands)

    B. Mathijsen (Britt); A.P. Zwart (Bert)

    2017-01-01

    textabstractMotivated by a capacity allocation problem within a finite planning period, we conduct a transient analysis of a single-server queue with Lévy input. From a cost minimization perspective, we investigate the error induced by using stationary congestion measures as opposed to

  19. 75 FR 15371 - Time Error Correction Reliability Standard

    Science.gov (United States)

    2010-03-29

    ... undertake an actual modification to their generation dispatch to correct for Time Error, it must be... generally requires a description and analysis of final rules that will have significant economic impact on a... stated objectives of a proposed rule and that minimize any significant economic impact on a substantial...

  20. [The error, source of learning].

    Science.gov (United States)

    Joyeux, Stéphanie; Bohic, Valérie

    2016-05-01

    The error itself is not recognised as a fault. It is the intentionality which differentiates between an error and a fault. An error is unintentional while a fault is a failure to respect known rules. The risk of error is omnipresent in health institutions. Public authorities have therefore set out a series of measures to reduce this risk. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  1. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  2. Specialized minimal PDFs for optimized LHC calculations

    CERN Document Server

    Carrazza, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-04-15

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically on their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top quark pair, and electroweak gauge boson physics, and determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4 and 11 Hessian eigenvectors respectively are enough to fully describe the corresp...

  3. Specialized minimal PDFs for optimized LHC calculations.

    Science.gov (United States)

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-01-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically as regards their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top-quark pair, and electroweak gauge boson physics, and we determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4, and 11 Hessian eigenvectors, respectively, are enough to fully describe the corresponding processes.
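
    The reduction step can be pictured with a small synthetic example. The sketch below is not the actual SM-PDF algorithm; it only illustrates the general idea that, if the shifts of the target observables along the prior Hessian eigenvectors span a low-dimensional subspace, a singular-value decomposition recovers the few directions needed to preserve their uncertainties. All numbers are synthetic.

        # Synthetic illustration of the reduction idea (not the actual SM-PDF code).
        import numpy as np

        rng = np.random.default_rng(1)
        n_eig, n_obs = 30, 5                     # prior Hessian eigenvectors, target observables

        # Shift table: D[i, k] = change of observable i along eigenvector k.  The
        # shifts are built to live mostly in a 3-dimensional subspace plus noise.
        basis = rng.normal(size=(3, n_eig))
        D = rng.normal(size=(n_obs, 3)) @ basis + 0.01 * rng.normal(size=(n_obs, n_eig))

        full_unc = np.sqrt((D ** 2).sum(axis=1))           # symmetric Hessian sum in quadrature

        # Keep the leading right-singular directions and project the shifts onto them.
        U, S, Vt = np.linalg.svd(D, full_matrices=False)
        for kept in (2, 3, 4):
            D_red = D @ Vt[:kept].T                        # shifts along the reduced directions
            red_unc = np.sqrt((D_red ** 2).sum(axis=1))
            worst = np.max(np.abs(red_unc - full_unc) / full_unc)
            print(f"{kept} directions kept: worst relative change of uncertainty {worst:.3%}")

    In this toy setup three retained directions already reproduce the full uncertainties to well below a percent, mirroring the observation that a handful of eigenvectors can suffice for a given class of processes.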

  4. Heart bypass surgery - minimally invasive

    Science.gov (United States)


  5. Luc Besson : entre exception culturelle et Hollywood

    OpenAIRE

    Warczinski, Anne

    2013-01-01

    "The Franco-French cultural exception is dead" (quoted in L'Express, 19.11.2009). This is what Jean-Marie Messier, chairman and chief executive officer of Vivendi (Altman 2002), declared at the time of the merger of his company with the American production company Universal in 2001 in New York (Vidal/Ministère de la culture et de la communication 2002). This declaration provoked outrage both within the French government and among film professionals, all the more so as Mess...

  6. Transporting "exceptional cargo" on the CERN sites

    CERN Multimedia

    EN Department

    2012-01-01

    When the Transport Service is managing "exceptional cargo", the driver and the escort are often in charge of an operation involving equipment worth many hundred thousand francs. Equipment that may well be irreplaceable for a facility or an experiment.   The members of the Transport Service who carry out these tasks are very professional and are – needless to say – highly concentrated on the job. They count on your understanding and support in the traffic on site. Their convoys are – for good reasons – moving slowly. Kindly do not overtake, do not cut in in front of them and do not drive too closely. Respect the escort and do not position yourself between the truck and the escort vehicles. The EN department counts on your courtesy on the road.  

  7. Exceptionally elevated triglyceride in severe lipemia retinalis.

    Science.gov (United States)

    Yin, Han Y; Warman, Roberto; Suh, Edward H; Cheng, Anny Ms

    2016-01-01

    To report a case of successful treatment for severe lipemia retinalis with extreme severe hypertriglyceridemia (sHTG). Observational case report. A 6-week-old infant with severe lipemia retinalis manifested diffuse creamy retinal vessels complicated with vulvar xanthomas. Extreme sHTG at 185-fold the normal level was reported. Chromosome microarray and lipid gene sequencing confirmed a homozygous lipoprotein lipase gene coding mutation. Under strict adherence to a high medium-chain triglycerides formula and discontinuation of breast milk, the lipemia retinalis and vulval lesions resolved along with a stable plasma lipid level throughout the follow-up period of 6 months. Strict adherence to a low-fat diet without breast milk appears to be effective in treating infants with severe lipemia retinalis associated with exceptionally high triglycerides.

  8. An evaluation of graphical context as a means for ameliorating the effects of registration error.

    Science.gov (United States)

    Robertson, Cindy M; MacIntyre, Blair; Walker, Bruce N

    2009-01-01

    An ongoing research problem in Augmented Reality (AR) is to improve tracking and display technology in order to minimize registration errors. However, perfect registration is not always necessary for users to understand the intent of an augmentation. This paper describes the results of an experiment to evaluate the effects of registration error in a Lego block placement task and the effectiveness of graphical context at ameliorating these effects. Three types of registration error were compared: no error, fixed error and random error. These three errors were evaluated with no context present and some graphical context present. The results of this experiment indicated that adding graphical context to a scene in which some registration error is present can allow a person to effectively operate in such an environment, in this case completing the Lego block placement task with a reduced number of errors made and in a shorter amount of time.

  9. Error sensitivity to refinement: a criterion for optimal grid adaptation

    Science.gov (United States)

    Luchini, Paolo; Giannetti, Flavio; Citro, Vincenzo

    2017-12-01

    Most indicators used for automatic grid refinement are suboptimal, in the sense that they do not really minimize the global solution error. This paper concerns a new indicator, related to the sensitivity map of global stability problems, suitable for an optimal grid refinement that minimizes the global solution error. The new criterion is derived from the properties of the adjoint operator and provides a map of the sensitivity of the global error (or its estimate) to a local mesh refinement. Examples are presented both for a scalar partial differential equation and for the system of Navier-Stokes equations. In the last case, we also present a grid-adaptation algorithm, based on the new estimator and on the FreeFem++ software, that improves the accuracy of the solution by almost two orders of magnitude by redistributing the nodes of the initial computational mesh.

  10. Rapid mapping of volumetric errors

    Energy Technology Data Exchange (ETDEWEB)

    Krulewich, D.; Hale, L.; Yordy, D.

    1995-09-13

    This paper describes a relatively inexpensive, fast, and easy to execute approach to mapping the volumetric errors of a machine tool, coordinate measuring machine, or robot. An error map is used to characterize a machine or to improve its accuracy by compensating for the systematic errors. The method consists of three steps: (1) modeling the relationship between the volumetric error and the current state of the machine; (2) acquiring error data based on length measurements throughout the work volume; and (3) optimizing the model to the particular machine.
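
    Steps (1) and (3) can be illustrated with a toy model. The sketch below is not the authors' procedure: it assumes a simple XY error model (per-axis scale errors plus one squareness term), generates noisy synthetic point-error measurements over the work volume, and identifies the parameters by linear least squares; a real implementation would infer the model from length measurements instead, as the paper describes.

        # Toy identification of a volumetric error model (not the authors' model).
        import numpy as np

        rng = np.random.default_rng(2)
        true_params = np.array([50e-6, -30e-6, 80e-6])   # x-scale, y-scale, squareness (rad)

        def volumetric_error(xy, p):
            sx, sy, sq = p
            ex = sx * xy[:, 0] + sq * xy[:, 1]           # x error: scale plus squareness
            ey = sy * xy[:, 1]                           # y error: scale only
            return np.column_stack([ex, ey])

        # "Measurements": commanded XY positions over the work volume plus sensor noise.
        xy = rng.uniform(0.0, 500.0, size=(40, 2))       # mm
        meas = volumetric_error(xy, true_params) + rng.normal(0.0, 1e-4, size=(40, 2))

        # Stack the x- and y-error equations into one linear system A @ p = b.
        A = np.zeros((80, 3))
        A[:40, 0] = xy[:, 0]                             # d(ex)/d(x-scale)
        A[:40, 2] = xy[:, 1]                             # d(ex)/d(squareness)
        A[40:, 1] = xy[:, 1]                             # d(ey)/d(y-scale)
        b = np.concatenate([meas[:, 0], meas[:, 1]])

        fit, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("identified parameters:", fit)
        print("true parameters:      ", true_params)

    Once such parameters are identified, the fitted model can be evaluated anywhere in the work volume to compensate the systematic part of the error, which is the purpose of the error map described above.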

  11. Controlling errors in unidosis carts

    Directory of Open Access Journals (Sweden)

    Inmaculada Díaz Fernández

    2010-01-01

    Full Text Available Objective: To identify errors in the unidosis system carts. Method: For two months, the Pharmacy Service controlled medication either returned or missing from the unidosis carts, both in the pharmacy and in the wards. Results: Uncorrected unidosis carts show a 0.9% rate of medication errors (264), versus 0.6% (154) in unidosis carts previously revised. In carts not revised, 70.83% of the errors are mainly caused when setting up the unidosis carts. The rest are due to a lack of stock or unavailability (21.6%), errors in the transcription of medical orders (6.81%) or boxes that had not been emptied previously (0.76%). The errors found in the units correspond to errors in the transcription of the treatment (3.46%), non-receipt of the unidosis copy (23.14%), the patient not taking the medication (14.36%) or being discharged without medication (12.77%), medication not provided by nurses (14.09%), medication withdrawn from the stocks of the unit (14.62%), and errors of the pharmacy service (17.56%). Conclusions: Unidosis carts need to be revised, and a computerized prescription system is needed to avoid errors in transcription. Discussion: A high percentage of medication errors is caused by human error. If unidosis carts are looked over before being sent to hospitalization units, the error diminishes to 0.3%.

  12. Standardizing Medication Error Event Reporting in the U.S. Department of Defense

    National Research Council Canada - National Science Library

    Nosek, Ronald A., Jr; McMeekin, Judy; Rake, Geoffrey W

    2005-01-01

    ...) began an aggressive examination of medical errors and the strategies for minimizing them. A primary goal was the creation of a standardized medication event reporting system, including a central registry for the compilation of reported data...

  13. 24 CFR 401.411 - Guidelines for determining exception rents.

    Science.gov (United States)

    2010-04-01

    ... exception rents. 401.411 Section 401.411 Housing and Urban Development Regulations Relating to Housing and... RESTRUCTURING PROGRAM (MARK-TO-MARKET) Restructuring Plan § 401.411 Guidelines for determining exception rents. (a) When do exception rents apply? (1) The Restructuring Plan may provide for exception rents...

  14. Minimally invasive operative care. I. Minimal intervention and concepts for minimally invasive cavity preparations.

    Science.gov (United States)

    Peters, M C; McLean, M E

    2001-01-01

    From the mainly reparative dentistry of the 20th century, contemporary dentistry shifts towards a minimal intervention (MI) approach encompassing up-to-date caries diagnosis and risk assessment before arriving at a treatment decision. An overview is provided of incorporating MI philosophy into the field of operative dentistry. The ultimate goal of MI is to extend the lifetime of restored teeth with as little intervention as possible. When operative care is indicated, it should be aimed at "prevention of extension." Black's principles for cavity design are considered and put in the perspective of minimally invasive operative care. Guiding principles for contemporary adhesive cavities are reviewed. Contemporary operative care should be based on a minimally invasive approach. Minimal intervention is not just a technique, it is a philosophy!

  15. Minimally invasive cervical spine surgery.

    Science.gov (United States)

    Skovrlj, Branko; Qureshi, Sheeraz A

    2017-06-01

    Degenerative disorders of the cervical spine requiring surgical intervention have become increasingly more common over the past decade. Traditionally, open surgical approaches have been the mainstay of surgical treatment. More commonly, minimally invasive techniques are being developed with the intent to decrease surgical morbidity and iatrogenic spinal instability. This study will review four minimally invasive cervical techniques that have been increasingly utilized in the treatment of degenerative cervical spine disease. A series of PubMed-National Library of Medicine searches were performed. Only articles in English journals or with published with English language translations were included. Level of evidence of the selected articles was assessed. The significant incidence of postoperative dysphagia following ACDF has led to the development and increased use of zero-profile, stand-alone anterior cervical cages. The currently available literature examining the safety and effectiveness of zero-profile interbody devices supports the use of these devices in patients undergoing single-level ACDF. A multitude of studies demonstrating the significant incidence and impact of axial neck pain following open posterior spine surgery have led to a wave of research and development of techniques aimed at minimizing posterior cervical paraspinal disruption while achieving appropriate neurological decompression and/or spinal fixation. The currently available literature supports the use of minimally invasive posterior cervical laminoforaminotomy for the treatment of single-level radiculopathy. The literature suggests that fluoroscopically-assisted percutaneous cervical lateral mass screw fixation appears to be a technically feasible, safe and minimally invasive technique. Based on the currently available literature it appears that the DTRAX® expandable cage system is an effective minimally invasive posterior cervical technique for the treatment of single-level cervical

  16. Triphasic MRI of pelvic organ descent: sources of measurement error

    Energy Technology Data Exchange (ETDEWEB)

    Morren, Geert L. [Bowel and Digestion Centre, The Oxford Clinic, 38 Oxford Terrace, Christchurch (New Zealand)]. E-mail: geert_morren@hotmail.com; Balasingam, Adrian G. [Christchurch Radiology Group, P.O. Box 21107, 4th Floor, Leicester House, 291 Madras Street, Christchurch (New Zealand); Wells, J. Elisabeth [Department of Public Health and General Medicine, Christchurch School of Medicine, St. Elmo Courts, Christchurch (New Zealand); Hunter, Anne M. [Christchurch Radiology Group, P.O. Box 21107, 4th Floor, Leicester House, 291 Madras Street, Christchurch (New Zealand); Coates, Richard H. [Christchurch Radiology Group, P.O. Box 21107, 4th Floor, Leicester House, 291 Madras Street, Christchurch (New Zealand); Perry, Richard E. [Bowel and Digestion Centre, The Oxford Clinic, 38 Oxford Terrace, Christchurch (New Zealand)

    2005-05-01

    Purpose: To identify sources of error when measuring pelvic organ displacement during straining using triphasic dynamic magnetic resonance imaging (MRI). Materials and methods: Ten healthy nulliparous women underwent triphasic dynamic 1.5 T pelvic MRI twice, with 1 week between studies. The bladder was filled with 200 ml of a saline solution, and the vagina and rectum were opacified with ultrasound gel. T2-weighted images in the sagittal plane were analysed twice by each of the two observers in a blinded fashion. Horizontal and vertical displacement of the bladder neck, bladder base, introitus vaginae, posterior fornix, cul-de-sac, pouch of Douglas, anterior rectal wall, anorectal junction and change of the vaginal axis were measured eight times in each volunteer (two images, each read twice by two observers). Variance components were calculated for subject, observer, week, interactions of these three factors, and pure error. An overall standard error of measurement was calculated for a single observation by one observer on a film from one woman at one visit. Results: For the majority of anatomical reference points, the range of displacements measured was wide and the overall measurement error was large. Intra-observer error and week-to-week variation within a subject were important sources of measurement error. Conclusion: Important sources of measurement error when using triphasic dynamic MRI to measure pelvic organ displacement during straining were identified. Recommendations to minimize those errors are made.

  17. Minimal but non-minimal inflation and electroweak symmetry breaking

    Energy Technology Data Exchange (ETDEWEB)

    Marzola, Luca [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia); Institute of Physics, University of Tartu,Ravila 14c, 50411 Tartu (Estonia); Racioppi, Antonio [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia)

    2016-10-07

    We consider the most minimal scale invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, that plays the rôle of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10^-3, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  18. Heavy Stable Isotopes: From Exceptional to Expected

    Science.gov (United States)

    Anbar, A.

    2006-12-01

    Less than a decade ago, the stable isotope geochemistry of transition metals and other "heavy" elements was a highly specialized niche confined to a few seemingly exceptional elements. This situation was transformed by the development and refinement of MC-ICP-MS techniques, particularly in the last five years. Measurable stable isotope variations turn out to be ubiquitous across the periodic table, from Li to Hg. It is now safe to assume that the isotopic composition of any element with two or more stable isotopes is measurably variable. What was once exceptional is now expected. Among the first of these new systems to be explored were Fe and Mo isotopes. A number of lessons emerging from this work can be applied to the development of other isotope systems. Most important is that initial expectations are often wrong. For example, based on their environmental chemistries it was expected that redox reactions should produce some of the largest isotope effects for both elements. In the case of Fe, theoretical and experimental studies converge to convincingly indicate that a fractionation of ~ 1.5 ‰/amu occurs between Fe(III) and Fe(II) aquo complexes at equilibrium (e.g., Welch et al., 2003; Anbar et al., 2005). Consistent with these findings, most natural variations of Fe are < 1.5 ‰/amu (e.g., Johnson et al., 2004). This redox-related fractionation is at the heart of emerging interpretations of variations in the isotopic composition of Fe and their application to understanding ancient ocean redox (e.g., Dauphas et al., 2004; Rouxel et al., 2005). In contrast, Mo isotope variations turn out to be controlled only indirectly by redox conditions. Instead, one of the most important Mo isotope effects in the environment appears to be a fractionation of ~ 1 ‰/amu during adsorption of Mo to Mn-oxides (Barling et al., 2001; Siebert et al., 2003). This fractionation has been reproduced in the laboratory (Barling and Anbar, 2004) and appears to be an equilibrium isotope

  19. An approach to improving the structure of error-handling code in the linux kernel

    DEFF Research Database (Denmark)

    Saha, Suman; Lawall, Julia; Muller, Gilles

    2011-01-01

    The C language does not provide any abstractions for exception handling or other forms of error handling, leaving programmers to devise their own conventions for detecting and handling errors. The Linux coding style guidelines suggest placing error handling code at the end of each function, where...... an automatic program transformation that transforms error-handling code into this style. We have applied our transformation to the Linux 2.6.34 kernel source code, on which it reorganizes the error handling code of over 1800 functions, in about 25 minutes....

  20. Input error versus output error model reference adaptive control

    Science.gov (United States)

    Bodson, Marc; Sastry, Shankar

    1987-01-01

    Algorithms for model reference adaptive control were developed in recent years, and their stability and convergence properties have been investigated. Typical algorithms in continuous time involve strictly positive real (SPR) conditions on the reference model, while similar discrete time algorithms do not require such conditions. It is shown how the algorithms differ in their use of an input error versus an output error, and a continuous time input error adaptive control algorithm is presented which does not involve SPR conditions. The connections with other schemes are discussed. The input error scheme has general stability and convergence properties that are similar to those of the output error scheme. However, analysis using averaging methods reveals some preferable convergence properties of the input error scheme. Several other advantages are also discussed.

  1. The Error Reporting in the ATLAS TDAQ System

    Science.gov (United States)

    Kolos, Serguei; Kazarov, Andrei; Papaevgeniou, Lykourgos

    2015-05-01

    The ATLAS Error Reporting provides a service that allows experts and shift crew to track and address errors relating to the data taking components and applications. This service, called the Error Reporting Service (ERS), gives software applications the opportunity to collect and send comprehensive data about run-time errors to a place where it can be intercepted in real-time by any other system component. Other ATLAS online control and monitoring tools use the ERS as one of their main inputs to address system problems in a timely manner and to improve the quality of acquired data. The actual destination of the error messages depends solely on the run-time environment in which the online applications are operating. When an application sends information to ERS, depending on the configuration, it may end up in a local file, a database, or distributed middleware which can transport it to an expert system or display it to users. Thanks to the open framework design of ERS, new information destinations can be added at any moment without touching the reporting and receiving applications. The ERS Application Program Interface (API) is provided in three programming languages used in the ATLAS online environment: C++, Java and Python. All APIs use exceptions for error reporting, but each of them exploits advanced features of a given language to simplify the end-user program writing. For example, because declaring rich hierarchies of exception classes in C++ requires substantial boilerplate, a number of macros have been designed to generate hierarchies of C++ exception classes at compile time. Using this approach a software developer can write a single line of code to generate boilerplate code for a fully qualified C++ exception class declaration with an arbitrary number of parameters and multiple constructors, which encapsulates all relevant static information about the given type of issue. When a corresponding error occurs at run time, the program just needs to create an instance of that class passing relevant values to one
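
    The reporting pattern described here, issues as exception classes carrying static context, with run-time-configurable destinations, can be sketched in a few lines. The code below is an illustration of that pattern only; it is not the real ERS API, and all class and function names are invented for the example.

        # Illustration of the pattern only -- NOT the real ATLAS ERS API.
        import datetime

        class Issue(Exception):
            """Base issue: records a message, a severity and a timestamp."""
            severity = "ERROR"
            def __init__(self, message, **context):
                super().__init__(message)
                self.context = context
                self.time = datetime.datetime.now()

        class FileReadError(Issue):
            """Example issue type; static information lives in the class itself."""
            severity = "ERROR"

        _handlers = []                  # configured per run: log file, GUI, database, ...

        def add_handler(handler):
            _handlers.append(handler)

        def report(issue):
            for handler in _handlers:
                handler(issue)

        # Configuration for a local run: just print to the terminal.
        add_handler(lambda issue: print(
            f"[{issue.severity}] {issue.time:%H:%M:%S} {issue} {issue.context}"))

        try:
            raise FileReadError("cannot open calibration file", path="/data/calib.db")
        except Issue as issue:
            report(issue)

    Because the reporting call only walks a list of handlers, where a message ends up is decided entirely by the configuration, which mirrors the decoupling between reporting applications and message destinations described in the abstract.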

  2. [Exceptional etiology of acute renal failure: Burkitt's lymphoma].

    Science.gov (United States)

    Dial, Cherif; Doh, Kwame; Thiam, Ibou; Faye, Mariam; Woto-Gaye, Gisèle

    2018-02-05

    Burkitt's lymphoma (BL) is an exceptional cause of acute renal failure (ARF). The origin of the tumor clone may be lymphoid follicles secondary to renal Epstein-Barr virus (EBV) infection. With the presentation of this clinical case, the pathogenesis, diagnostic criteria and evolution of this extremely rare condition are discussed. A 4-year-old patient with a recent history of acute osteomyelitis of the right thigh presented with ARF without indications of post-infectious glomerulonephritis. Ultrasound showed enlarged kidneys without dilation of the excretory cavities. Diffuse interstitial infiltration by medium-sized atypical lymphoid cells was noted on renal biopsy. The tumor cells expressed CD20, CD10, Bcl6, and Ki67 but not Bcl2 or CD3. The search for an EBV infection was positive. A few days after diagnosis, the evolution was spontaneously fatal. BL of the kidney is a rare condition that accounts for less than 1 % of kidney tumors and is associated almost invariably with EBV infection. The diagnosis is confirmed histologically by renal biopsy, and the Malbrain criteria affirm the primary character of the lymphoma. BL of the kidney is a diagnostic and therapeutic emergency and may be fatal. Copyright © 2018 Société francophone de néphrologie, dialyse et transplantation. Published by Elsevier Masson SAS. All rights reserved.

  3. Exceptional groups, symmetric spaces and applications

    Energy Technology Data Exchange (ETDEWEB)

    Cerchiai, Bianca L.; Cacciatori, Sergio L.

    2009-03-31

    In this article we provide a detailed description of a technique to obtain a simple parameterization for different exceptional Lie groups, such as G{sub 2}, F{sub 4} and E{sub 6}, based on their fibration structure. For the compact case, we construct a realization which is a generalization of the Euler angles for SU(2), while for the non compact version of G{sub 2(2)}/SO(4) we compute the Iwasawa decomposition. This allows us to obtain not only an explicit expression for the Haar measure on the group manifold, but also for the cosets G{sub 2}/SO(4), G{sub 2}/SU(3), F{sub 4}/Spin(9), E{sub 6}/F{sub 4} and G{sub 2(2)}/SO(4) that we used to find the concrete realization of the general element of the group. Moreover, as a by-product, in the simplest case of G{sub 2}/SO(4), we have been able to compute an Einstein metric and the vielbein. The relevance of these results in physics is discussed.

  4. Exceptionally elevated triglyceride in severe lipemia retinalis

    Directory of Open Access Journals (Sweden)

    Yin HY

    2016-10-01

    Full Text Available Han Y Yin,1–3 Roberto Warman,2,4 Edward H Suh,2 Anny MS Cheng2,3 1Wayne State University, School of Medicine, Detroit, MI, 2Department of Ophthalmology, Florida International University, Herbert Wertheim College of Medicine, 3Ocular Surface Center, Miami, 4Division of Pediatric Ophthalmology, Nicklaus Children’s Hospital, Miami, FL, USA Purpose: To report a case of successful treatment for severe lipemia retinalis with extremely severe hypertriglyceridemia (sHTG). Design: Observational case report. Observations: A 6-week-old infant with severe lipemia retinalis manifested diffuse creamy retinal vessels complicated with vulvar xanthomas. Extreme sHTG at 185-fold the normal level was reported. Chromosome microarray and lipid gene sequencing confirmed a homozygous lipoprotein lipase gene coding mutation. Results: Under strict adherence to a formula high in medium-chain triglycerides and discontinuation of breast milk, the lipemia retinalis and vulval lesions resolved along with a stable plasma lipid level throughout the follow-up period of 6 months. Conclusion: Strict adherence to a low-fat diet without breast milk appears to be effective in treating infants with severe lipemia retinalis associated with exceptionally high triglycerides. Keywords: hypertriglyceride, infant, lipemia retinalis, lipoprotein lipase gene

  5. Exceptional longevity is associated with decreased reproduction.

    Science.gov (United States)

    Tabatabaie, Vafa; Atzmon, Gil; Rajpathak, Swapnil N; Freeman, Ruth; Barzilai, Nir; Crandall, Jill

    2011-12-01

    A number of leading theories of aging, namely The Antagonistic Pleiotropy Theory (Williams, 1957), The Disposable Soma Theory (Kirkwood, 1977) and most recently The Reproductive-Cell Cycle Theory (Bowen and Atwood, 2004, 2010) suggest a tradeoff between longevity and reproduction. While there has been an abundance of data linking longevity with reduced fertility in lower life forms, human data have been conflicting. We assessed this tradeoff in a cohort of genetically and socially homogenous Ashkenazi Jewish centenarians (average age ~100 years). As compared with an Ashkenazi cohort without exceptional longevity, our centenarians had fewer children (2.01 vs 2.53, p<0.0001), were older at first childbirth (28.0 vs 25.6, p<0.0001), and at last childbirth (32.4 vs 30.3, p<0.0001). The smaller number of children was observed for male and female centenarians alike. The lower number of children in both genders together with the pattern of delayed reproductive maturity is suggestive of constitutional factors that might enhance human life span at the expense of reduced reproductive ability.

  6. Medication error in anaesthesia and critical care: A cause for concern

    Directory of Open Access Journals (Sweden)

    Dilip Kothari

    2010-01-01

    Full Text Available Medication error is a major cause of morbidity and mortality in medical profession, and anaesthesia and critical care are no exception to it. Man, medicine, machine and modus operandi are the main contributory factors to it. In this review, incidence, types, risk factors and preventive measures of the medication errors are discussed in detail.

  7. Contour Error Map Algorithm

    Science.gov (United States)

    Merceret, Francis; Lane, John; Immer, Christopher; Case, Jonathan; Manobianco, John

    2005-01-01

    The contour error map (CEM) algorithm and the software that implements the algorithm are means of quantifying correlations between sets of time-varying data that are binarized and registered on spatial grids. The present version of the software is intended for use in evaluating numerical weather forecasts against observational sea-breeze data. In cases in which observational data come from off-grid stations, it is necessary to preprocess the observational data to transform them into gridded data. First, the wind direction is gridded and binarized so that D(i,j;n) is the input to CEM based on forecast data and d(i,j;n) is the input to CEM based on gridded observational data. Here, i and j are spatial indices representing 1.25-km intervals along the west-to-east and south-to-north directions, respectively; and n is a time index representing 5-minute intervals. A binary value of D or d = 0 corresponds to an offshore wind, whereas a value of D or d = 1 corresponds to an onshore wind. CEM includes two notable subalgorithms: One identifies and verifies sea-breeze boundaries; the other, which can be invoked optionally, performs an image-erosion function for the purpose of attempting to eliminate river-breeze contributions in the wind fields.
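
    As a minimal sketch of the binarization step described above, a wind direction can be mapped to the binary onshore/offshore value d used as CEM input. The onshore sector chosen below assumes an east-facing coastline and is not taken from the CEM software.

        #include <stdio.h>

        /* Hypothetical sketch of the binarization step: map a wind direction
         * (meteorological degrees, the direction the wind blows FROM) to the
         * binary onshore/offshore value used as CEM input.  The onshore
         * sector is an assumption for an east-facing coastline. */
        static int binarize_wind(double dir_deg)
        {
            /* Winds blowing from the sector 45..135 degrees (from the ocean
             * east of the coast) are counted as onshore. */
            return (dir_deg >= 45.0 && dir_deg <= 135.0) ? 1 : 0;
        }

        int main(void)
        {
            double samples[] = { 30.0, 90.0, 180.0, 120.0 };   /* made-up data */
            for (int k = 0; k < 4; ++k)
                printf("dir=%6.1f  d=%d\n", samples[k], binarize_wind(samples[k]));
            return 0;
        }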

  8. Error analysis in laparoscopic surgery

    Science.gov (United States)

    Gantert, Walter A.; Tendick, Frank; Bhoyrul, Sunil; Tyrrell, Dana; Fujino, Yukio; Rangel, Shawn; Patti, Marco G.; Way, Lawrence W.

    1998-06-01

    Iatrogenic complications in laparoscopic surgery, as in any field, stem from human error. In recent years, cognitive psychologists have developed theories for understanding and analyzing human error, and the application of these principles has decreased error rates in the aviation and nuclear power industries. The purpose of this study was to apply error analysis to laparoscopic surgery and evaluate its potential for preventing complications. Our approach is based on James Reason's framework using a classification of errors according to three performance levels: at the skill-based performance level, slips are caused by attention failures, and lapses result from memory failures. Rule-based mistakes constitute the second level. Knowledge-based mistakes occur at the highest performance level and are caused by shortcomings in conscious processing. These errors committed by the performer 'at the sharp end' occur in typical situations which are often brought about by already built-in latent system failures. We present a series of case studies in laparoscopic surgery in which errors are classified and the influence of intrinsic failures and extrinsic system flaws is evaluated. Most serious technical errors in laparoscopic surgery stem from a rule-based or knowledge-based mistake triggered by cognitive underspecification due to incomplete or illusory visual input information. Error analysis in laparoscopic surgery should be able to improve human performance, and it should detect and help eliminate system flaws. Complication rates in laparoscopic surgery due to technical errors can thus be considerably reduced.

  9. Sepsis: Medical errors in Poland.

    Science.gov (United States)

    Rorat, Marta; Jurek, Tomasz

    2016-01-01

    Health, safety and medical errors are currently the subject of worldwide discussion. The authors analysed medico-legal opinions trying to determine types of medical errors and their impact on the course of sepsis. The authors carried out a retrospective analysis of 66 medico-legal opinions issued by the Wroclaw Department of Forensic Medicine between 2004 and 2013 (at the request of the prosecutor or court) in cases examined for medical errors. Medical errors were confirmed in 55 of the 66 medico-legal opinions. The age of victims varied from 2 weeks to 68 years; 49 patients died. The analysis revealed medical errors committed by 113 health-care workers: 98 physicians, 8 nurses and 8 emergency medical dispatchers. In 33 cases, an error was made before hospitalisation. Hospital errors occurred in 35 victims. Diagnostic errors were discovered in 50 patients, including 46 cases of sepsis being incorrectly recognised and insufficient diagnoses in 37 cases. Therapeutic errors occurred in 37 victims, organisational errors in 9 and technical errors in 2. In addition to sepsis, 8 patients also had a severe concomitant disease and 8 had a chronic disease. In 45 cases, the authors observed glaring errors, which could incur criminal liability. There is an urgent need to introduce a system for reporting and analysing medical errors in Poland. The development and popularisation of standards for identifying and treating sepsis across basic medical professions is essential to improve patient safety and survival rates. Procedures should be introduced to prevent health-care workers from administering incorrect treatment in such cases. © The Author(s) 2015.

  10. Inelastic scattering with Chebyshev polynomials and preconditioned conjugate gradient minimization.

    Science.gov (United States)

    Temel, Burcin; Mills, Greg; Metiu, Horia

    2008-03-27

    We describe and test an implementation, using a basis set of Chebyshev polynomials, of a variational method for solving scattering problems in quantum mechanics. This minimum error method (MEM) determines the wave function Psi by minimizing the least-squares error in the function (H Psi - E Psi), where E is the desired scattering energy. We compare the MEM to an alternative, the Kohn variational principle (KVP), by solving the Secrest-Johnson model of two-dimensional inelastic scattering, which has been studied previously using the KVP and for which other numerical solutions are available. We use a conjugate gradient (CG) method to minimize the error, and by preconditioning the CG search, we are able to greatly reduce the number of iterations necessary; the method is thus faster and more stable than a matrix inversion, as is required in the KVP. Also, we avoid errors due to scattering off of the boundaries, which presents substantial problems for other methods, by matching the wave function in the interaction region to the correct asymptotic states at the specified energy; the use of Chebyshev polynomials allows this boundary condition to be implemented accurately. The use of Chebyshev polynomials allows for a rapid and accurate evaluation of the kinetic energy. This basis set is as efficient as plane waves but does not impose an artificial periodicity on the system. There are problems in surface science and molecular electronics which cannot be solved if periodicity is imposed, and the Chebyshev basis set is a good alternative in such situations.
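
    The conjugate gradient iteration at the core of the method is standard; as a generic illustration of CG minimization only (not the Chebyshev-basis scattering operator or the preconditioner of the paper), the sketch below solves a small, made-up symmetric positive-definite system.

        #include <stdio.h>
        #include <math.h>

        #define N 3

        static void matvec(const double A[N][N], const double x[N], double y[N])
        {
            for (int i = 0; i < N; ++i) {
                y[i] = 0.0;
                for (int j = 0; j < N; ++j)
                    y[i] += A[i][j] * x[j];
            }
        }

        static double dot(const double a[N], const double b[N])
        {
            double s = 0.0;
            for (int i = 0; i < N; ++i)
                s += a[i] * b[i];
            return s;
        }

        int main(void)
        {
            /* Made-up symmetric positive-definite system A x = b. */
            double A[N][N] = { {4, 1, 0}, {1, 3, 1}, {0, 1, 2} };
            double b[N] = { 1, 2, 3 };
            double x[N] = { 0, 0, 0 };
            double r[N], p[N], Ap[N];

            matvec(A, x, r);
            for (int i = 0; i < N; ++i) { r[i] = b[i] - r[i]; p[i] = r[i]; }
            double rs = dot(r, r);

            /* Standard (unpreconditioned) conjugate gradient iteration. */
            for (int it = 0; it < 100 && sqrt(rs) > 1e-12; ++it) {
                matvec(A, p, Ap);
                double alpha = rs / dot(p, Ap);
                for (int i = 0; i < N; ++i) { x[i] += alpha * p[i]; r[i] -= alpha * Ap[i]; }
                double rs_new = dot(r, r);
                double beta = rs_new / rs;
                for (int i = 0; i < N; ++i) p[i] = r[i] + beta * p[i];
                rs = rs_new;
            }

            printf("solution: %g %g %g\n", x[0], x[1], x[2]);
            return 0;
        }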

  11. Minimally Invasive Video-Assisted versus Minimally Invasive Nonendoscopic Thyroidectomy

    Directory of Open Access Journals (Sweden)

    Zdeněk Fík

    2014-01-01

    Full Text Available Minimally invasive video-assisted thyroidectomy (MIVAT and minimally invasive nonendoscopic thyroidectomy (MINET represent well accepted and reproducible techniques developed with the main goal to improve cosmetic outcome, accelerate healing, and increase patient’s comfort following thyroid surgery. Between 2007 and 2011, a prospective nonrandomized study of patients undergoing minimally invasive thyroid surgery was performed to compare advantages and disadvantages of the two different techniques. There were no significant differences in the length of incision to perform surgical procedures. Mean duration of hemithyroidectomy was comparable in both groups, but it was more time consuming to perform total thyroidectomy by MIVAT. There were more patients undergoing MIVAT procedures without active drainage in the postoperative course and we also could see a trend for less pain in the same group. This was paralleled by statistically significant decreased administration of both opiates and nonopiate analgesics. We encountered two cases of recurrent laryngeal nerve palsies in the MIVAT group only. MIVAT and MINET represent safe and feasible alternative to conventional thyroid surgery in selected cases and this prospective study has shown minimal differences between these two techniques.

  12. Minimizing Costs Can Be Costly

    Directory of Open Access Journals (Sweden)

    Rasmus Rasmussen

    2010-01-01

    Full Text Available A quite common practice, even in the academic literature, is to simplify a decision problem and model it as a cost-minimizing problem. In fact, some types of models have been standardized to minimization problems, like Quadratic Assignment Problems (QAPs), where a maximization formulation would be treated as a “generalized” QAP and not solvable by many of the software packages specially designed for QAPs. Ignoring revenues when modeling a decision problem works only if costs can be separated from the decisions influencing revenues. More often than we think this is not the case, and minimizing costs will not lead to maximized profit. This will be demonstrated using spreadsheets to solve a small example. The example is also used to demonstrate other pitfalls in network models: the inability to generally balance the problem or allocate costs in advance, and the tendency to anticipate a specific type of solution and thereby make constraints too limiting when formulating the problem.

  13. Minimal Marking: A Success Story

    Directory of Open Access Journals (Sweden)

    Anne McNeilly

    2014-11-01

    Full Text Available The minimal-marking project conducted in Ryerson’s School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The “minimal-marking” concept (Haswell, 1983), which requires dramatically more student engagement, resulted in more successful learning outcomes for surface-level knowledge acquisition than the more traditional approach of “teacher-corrects-all.” Results suggest it would be effective, not just for grammar, punctuation, and word usage, the objective here, but for any material that requires rote-memory learning, such as the Associated Press or Canadian Press style rules used by news publications across North America.

  14. Techniques for correcting transmission errors in video adaptive delta modulation channels

    Science.gov (United States)

    Scheinberg, N.; Schilling, D. L.

    1976-01-01

    We investigated the effects of channel errors on an adaptive delta modulator used to encode video signals. The investigation revealed that channel errors caused a permanent shift in the dc level of the delta modulator's estimate. Errors in the step size, on the other hand, were transitory and had no noticeable effect on the received pictures. We then presented three error correction schemes to minimize the visibility of the channel errors in the received picture. The first scheme required the transmitter to periodically send the correct dc level of the estimate to the receiver. The second method employed a leaky integrator, and the third method used line-to-line interpolation.
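
    The leaky-integrator idea in the second scheme can be sketched in a few lines: multiplying the decoder's estimate by a factor slightly below one at each step makes the effect of an inverted bit decay instead of persisting as a permanent dc shift. The leak factor, step size and bit pattern below are arbitrary illustrative values.

        #include <stdio.h>

        /* Minimal sketch of a delta-modulation decoder with a leaky
         * integrator.  bits[] are the received sign bits, step is a fixed
         * step size (a real adaptive delta modulator would vary it), and
         * LEAK < 1 makes the influence of any single channel error decay. */
        #define LEAK 0.99

        int main(void)
        {
            int bits[] = { 1, 1, 0, 1, 0, 0, 1, 1 };   /* hypothetical data */
            double step = 0.1;
            double estimate = 0.0;

            for (int n = 0; n < 8; ++n) {
                estimate = LEAK * estimate + (bits[n] ? step : -step);
                printf("n=%d  estimate=%.4f\n", n, estimate);
            }
            return 0;
        }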

  15. Accuracy enhancement of dual rotating mueller matrix imaging polarimeter by diattenuation and retardance error calibration approach

    Science.gov (United States)

    Bhattacharyya, Kaustav; Serrano-García, David Ignacio; Otani, Yukitoshi

    2017-06-01

    We present a new calibration method to minimize the errors due to non-ideal retarders in a dual rotating retarder Mueller matrix polarimeter. To increase the accuracy of the dual rotating retarder polarimeter, it is necessary to compensate for the errors caused by the inaccuracy of the retarders. Although a calibration method for retardance already exists, its accuracy is limited because only retardance errors are considered. In the proposed model we add calibration of the diattenuation error of the retarder along with the retardance error of the standard model. An enhancement in the accuracy of the system is obtained. The proposed model is described with equations, and supporting experimental results are presented.

  16. Does Minimally Invasive Spine Surgery Minimize Surgical Site Infections?

    Science.gov (United States)

    Patel, Ravish Shammi; Dutta, Shumayou

    2016-01-01

    Study Design Retrospective review of prospectively collected data. Purpose To evaluate the incidence of surgical site infections (SSIs) in minimally invasive spine surgery (MISS) in a cohort of patients and compare with available historical data on SSI in open spinal surgery cohorts, and to evaluate additional direct costs incurred due to SSI. Overview of Literature SSI can lead to prolonged antibiotic therapy, extended hospitalization, repeated operations, and implant removal. Small incisions and minimal dissection intrinsic to MISS may minimize the risk of postoperative infections. However, there is a dearth of literature on infections after MISS and their additional direct financial implications. Methods All patients from January 2007 to January 2015 undergoing posterior spinal surgery with a tubular retractor system and microscope in our institution were included. The procedures performed included tubular discectomies, tubular decompressions for spinal stenosis and minimally invasive transforaminal lumbar interbody fusion (TLIF). The incidence of postoperative SSI was calculated and compared to the range of cited SSI rates from published studies. Direct costs were calculated from medical billing for index cases and for patients with SSI. Results A total of 1,043 patients underwent 763 noninstrumented surgeries (discectomies, decompressions) and 280 instrumented (TLIF) procedures. The mean age was 52.2 years with a male:female ratio of 1.08:1. Three infections were encountered with fusion surgeries (mean detection time, 7 days). All three required wound wash and debridement, with one patient requiring unilateral implant removal. Additional direct cost due to infection was $2,678 per 100 MISS-TLIF. SSI increased hospital expenditure per patient 1.5-fold after instrumented MISS. Conclusions Overall infection rate after MISS was 0.29%, with SSI rate of 0% in non-instrumented MISS and 1.07% with instrumented MISS. MISS can markedly reduce the SSI rate and can be an

  17. Anatomic, clinical, and neuropsychological correlates of spelling errors in primary progressive aphasia.

    Science.gov (United States)

    Shim, Hyungsub; Hurley, Robert S; Rogalski, Emily; Mesulam, M-Marsel

    2012-07-01

    This study evaluates spelling errors in the three subtypes of primary progressive aphasia (PPA): agrammatic (PPA-G), logopenic (PPA-L), and semantic (PPA-S). Forty-one PPA patients and 36 age-matched healthy controls were administered a test of spelling. The total number of errors and types of errors in spelling to dictation of regular words, exception words and nonwords, were recorded. Error types were classified based on phonetic plausibility. In the first analysis, scores were evaluated by clinical diagnosis. Errors in spelling exception words and phonetically plausible errors were seen in PPA-S. Conversely, PPA-G was associated with errors in nonword spelling and phonetically implausible errors. In the next analysis, spelling scores were correlated to other neuropsychological language test scores. Significant correlations were found between exception word spelling and measures of naming and single word comprehension. Nonword spelling correlated with tests of grammar and repetition. Global language measures did not correlate significantly with spelling scores, however. Cortical thickness analysis based on MRI showed that atrophy in several language regions of interest were correlated with spelling errors. Atrophy in the left supramarginal gyrus and inferior frontal gyrus (IFG) pars orbitalis correlated with errors in nonword spelling, while thinning in the left temporal pole and fusiform gyrus correlated with errors in exception word spelling. Additionally, phonetically implausible errors in regular word spelling correlated with thinning in the left IFG pars triangularis and pars opercularis. Together, these findings suggest two independent systems for spelling to dictation, one phonetic (phoneme to grapheme conversion), and one lexical (whole word retrieval). Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Minimal Flavor Constraints for Technicolor

    DEFF Research Database (Denmark)

    Sakuma, Hidenori; Sannino, Francesco

    2010-01-01

    We analyze the constraints on the vacuum polarization of the standard model gauge bosons from a minimal set of flavor observables valid for a general class of models of dynamical electroweak symmetry breaking. We will show that the constraints have a strong impact on the self-coupling and mas....

  19. The Usability-Error Ontology

    DEFF Research Database (Denmark)

    Elkin, Peter L.; Beuscart-zephir, Marie-Catherine; Pelayo, Sylvia

    2013-01-01

    in patients coming to harm. Often the root cause analysis of these adverse events can be traced back to Usability Errors in the Health Information Technology (HIT) or its interaction with users. Interoperability of the documentation of HIT related Usability Errors in a consistent fashion can improve our...... ability to do systematic reviews and meta-analyses. In an effort to support improved and more interoperable data capture regarding Usability Errors, we have created the Usability Error Ontology (UEO) as a classification method for representing knowledge regarding Usability Errors. We expect the UEO...... will grow over time to support an increasing number of HIT system types. In this manuscript, we present this Ontology of Usability Error Types and specifically address Computerized Physician Order Entry (CPOE), Electronic Health Records (EHR) and Revenue Cycle HIT systems....

  20. A Year of Exceptional Achievements FY 2008

    Energy Technology Data Exchange (ETDEWEB)

    devore, L; Chrzanowski, P

    2008-11-06

    2008 highlights: (1) Stockpile Stewardship and Complex Transformation - LLNL achieved scientific breakthroughs that explain some of the key 'unknowns' in nuclear weapons performance and are critical to developing the predictive science needed to ensure the safety, reliability, and security of the U.S. nuclear deterrent without nuclear testing. In addition, the National Ignition Facility (NIF) passed 99 percent completion, an LLNL supercomputer simulation won the 2007 Gordon Bell Prize, and a significant fraction of our inventory of special nuclear material was shipped to other sites in support of complex transformation. (2) National and Global Security - Laboratory researchers delivered insights, technologies, and operational capabilities that are helping to ensure national security and global stability. Of particular note, they developed advanced detection instruments that provide increased speed, accuracy, specificity, and resolution for identifying and characterizing biological, chemical, nuclear, and high-explosive threats. (3) Exceptional Science and Technology - The Laboratory continued its tradition of scientific excellence and technical innovation. LLNL scientists made significant contributions to Nobel Prize-winning work on climate change. LLNL also received three R&D 100 awards and six Nanotech 50 awards, and dozens of Laboratory scientists and engineers were recognized with professional awards. These honors provide valuable confirmation that peers and outside experts recognize the quality of our staff and our work. (4) Enhanced Business and Operations - A major thrust under LLNS is to make the Laboratory more efficient and cost competitive. We achieved roughly $75 million in cost savings for support activities through organizational changes, consolidation of services, improved governance structures and work processes, technology upgrades, and systems shared with Los Alamos National Laboratory. We realized nonlabor cost savings of $23 million

  1. VIOLENCE AGAINST TEACHERS- RULE OR EXCEPTION?

    Directory of Open Access Journals (Sweden)

    Siniša Opić

    2013-12-01

    Full Text Available The objective of this study is to examine the prevalence of violence against teachers by students. The study included 175 teachers from five primary and five secondary schools. The age of respondents (teachers) ranges from 20 to 65, with the average age being 44.33 years. The instrument used assessed violence against teachers and consisted of data about the characteristics of respondents and the frequency and type of violence experienced from students. The results suggest that violence against teachers in the primary and secondary schools in Zagreb taken into the sample is very much present. Since 74.3% of teachers have experienced violence from their students during the year, that kind of behavior is more of a rule than an exception. Students in primary and secondary schools show violent behavior against their teachers at an equal level. Male teachers, as opposed to female teachers, are more frequently victims of violent behavior (posting inappropriate content online) from their students. Also, there is a statistically significant negative correlation between age (years of service in school) and the frequency of experienced violence from students.

  2. Processor register error correction management

    Energy Technology Data Exchange (ETDEWEB)

    Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.

    2016-12-27

    Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.
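
    Purely as an illustration of the idea of an error-correction table (the text above does not specify a layout), such a table can be as simple as a list of sensitive logical registers paired with the registers chosen to hold their duplicates; the register numbers below are arbitrary.

        #include <stdio.h>

        /* Illustrative sketch only: logical registers judged "sensitive" by a
         * compiler, each paired with the register holding its duplicate. */
        struct reg_protect_entry {
            int sensitive_reg;   /* logical register to protect */
            int duplicate_reg;   /* register holding the copy   */
        };

        static const struct reg_protect_entry table[] = {
            { 3, 12 },
            { 5, 13 },
        };

        int main(void)
        {
            for (size_t i = 0; i < sizeof table / sizeof table[0]; ++i)
                printf("r%d duplicated into r%d\n",
                       table[i].sensitive_reg, table[i].duplicate_reg);
            return 0;
        }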

  3. GOMOS data characterisation and error estimation

    Directory of Open Access Journals (Sweden)

    J. Tamminen

    2010-10-01

    Full Text Available The Global Ozone Monitoring by Occultation of Stars (GOMOS instrument uses stellar occultation technique for monitoring ozone, other trace gases and aerosols in the stratosphere and mesosphere. The self-calibrating measurement principle of GOMOS together with a relatively simple data retrieval where only minimal use of a priori data is required provides excellent possibilities for long-term monitoring of atmospheric composition.

    GOMOS uses about 180 of the brightest stars as its light source. Depending on the individual spectral characteristics of the stars, the signal-to-noise ratio of GOMOS varies from star to star, resulting also in varying accuracy of retrieved profiles. We present here an overview of the GOMOS data characterisation and error estimation, including modeling errors, for O3, NO2, NO3, and aerosol profiles. The retrieval error (precision of night-time measurements in the stratosphere is typically 0.5–4% for ozone, about 10–20% for NO2, 20–40% for NO3 and 2–50% for aerosols. Mesospheric O3, up to 100 km, can be measured with 2–10% precision. The main sources of the modeling error are incompletely corrected scintillation, inaccurate aerosol modeling, uncertainties in cross sections of trace gases and in atmospheric temperature. The sampling resolution of GOMOS varies depending on the measurement geometry. In the data inversion a Tikhonov-type regularization with pre-defined target resolution requirement is applied leading to 2–3 km vertical resolution for ozone and 4 km resolution for other trace gases and aerosols.

  4. Exceptionally prolonged tooth formation in elasmosaurid plesiosaurians.

    Science.gov (United States)

    Kear, Benjamin P; Larsson, Dennis; Lindgren, Johan; Kundrát, Martin

    2017-01-01

    Elasmosaurid plesiosaurians were globally prolific marine reptiles that dominated the Mesozoic seas for over 70 million years. Their iconic body-plan incorporated an exceedingly long neck and small skull equipped with prominent intermeshing 'fangs'. How this bizarre dental apparatus was employed in feeding is uncertain, but fossilized gut contents indicate a diverse diet of small pelagic vertebrates, cephalopods and epifaunal benthos. Here we report the first plesiosaurian tooth formation rates as a mechanism for servicing the functional dentition. Multiple dentine thin sections were taken through isolated elasmosaurid teeth from the Upper Cretaceous of Sweden. These specimens revealed an average of 950 daily incremental lines of von Ebner, and infer a remarkably protracted tooth formation cycle of about 2-3 years-other polyphyodont amniotes normally take ~1-2 years to form their teeth. Such delayed odontogenesis might reflect differences in crown length and function within an originally uneven tooth array. Indeed, slower replacement periodicity has been found to distinguish larger caniniform teeth in macrophagous pliosaurid plesiosaurians. However, the archetypal sauropterygian dental replacement system likely also imposed constraints via segregation of the developing tooth germs within discrete bony crypts; these partly resorbed to allow maturation of the replacement teeth within the primary alveoli after displacement of the functional crowns. Prolonged dental formation has otherwise been linked to tooth robustness and adaption for vigorous food processing. Conversely, elasmosaurids possessed narrow crowns with an elongate profile that denotes structural fragility. Their apparent predilection for easily subdued prey could thus have minimized this potential for damage, and was perhaps coupled with selective feeding strategies that ecologically optimized elasmosaurids towards more delicate middle trophic level aquatic predation.

  5. Structure Errors in System Identification

    Science.gov (United States)

    Bekey, G. A.; Hadaegh, F. Y.

    1984-01-01

    An approach to system identification is presented which explicitly takes structure errors into account and hence provides a systematic way for answering questions concerning the magnitude of estimated parameter errors resulting from structural errors. It is indicated that, from this point of view, it is possible to define near equivalence between process and model and to obtain meaningful theoretical results on solution error in system identification. It remains to apply these results to large realistic problems such as those involving models of complex man-machine systems.

  6. Identifying Error in AUV Communication

    National Research Council Canada - National Science Library

    Coleman, Joseph; Merrill, Kaylani; O'Rourke, Michael; Rajala, Andrew G; Edwards, Dean B

    2006-01-01

    Mine Countermeasures (MCM) involving Autonomous Underwater Vehicles (AUVs) are especially susceptible to error, given the constraints on underwater acoustic communication and the inconstancy of the underwater communication channel...

  7. Heuristic errors in clinical reasoning.

    Science.gov (United States)

    Rylander, Melanie; Guerrasio, Jeannette

    2016-08-01

    Errors in clinical reasoning contribute to patient morbidity and mortality. The purpose of this study was to determine the types of heuristic errors made by third-year medical students and first-year residents. This study surveyed approximately 150 clinical educators, inquiring about the types of heuristic errors they observed in third-year medical students and first-year residents. Anchoring and premature closure were the two most common errors observed amongst third-year medical students and first-year residents. There was no difference in the types of errors observed in the two groups. Errors in clinical reasoning contribute to patient morbidity and mortality. Clinical educators perceived that both third-year medical students and first-year residents committed similar heuristic errors, implying that additional medical knowledge and clinical experience do not affect the types of heuristic errors made. Further work is needed to help identify methods that can be used to reduce heuristic errors early in a clinician's education. © 2015 John Wiley & Sons Ltd.

  8. Measurement Error and Equating Error in Power Analysis

    Science.gov (United States)

    Phillips, Gary W.; Jiang, Tao

    2016-01-01

    Power analysis is a fundamental prerequisite for conducting scientific research. Without power analysis the researcher has no way of knowing whether the sample size is large enough to detect the effect he or she is looking for. This paper demonstrates how psychometric factors such as measurement error and equating error affect the power of…

  9. A Hybrid Unequal Error Protection / Unequal Error Resilience ...

    African Journals Online (AJOL)

    The quality layers are then assigned an Unequal Error Resilience to synchronization loss by unequally allocating the number of headers available for synchronization to them. Following that Unequal Error Protection against channel noise is provided to the layers by the use of Rate Compatible Punctured Convolutional ...

  10. An ECMA-55 Minimal BASIC Compiler for x86-64 Linux®

    Directory of Open Access Journals (Sweden)

    John Gatewood Ham

    2014-10-01

    Full Text Available This paper describes a new non-optimizing compiler for the ECMA-55 Minimal BASIC language that generates x86-64 assembler code for use on the x86-64 Linux® [1] 3.x platform. The compiler was implemented in C99 and the generated assembly language is in the AT&T style and is for the GNU assembler. The generated code is stand-alone and does not require any shared libraries to run, since it makes system calls to the Linux® kernel directly. The floating point math uses the Single Instruction Multiple Data (SIMD) instructions and the compiler fully implements all of the floating point exception handling required by the ECMA-55 standard. This compiler is designed to be small, simple, and easy to understand for people who want to study a compiler that actually implements full error checking on floating point on x86-64 CPUs, even if those people have little programming experience. The generated assembly code is also designed to be simple to read.

  11. Chemical basis for minimal cognition

    DEFF Research Database (Denmark)

    Hanczyc, Martin; Ikegami, Takashi

    -movement. Different from the mere physical-chemical process, any life system preserves its own identity and consistency with respect to the environment. This homeostasis, rooted in the sensory-motor couplings, will organize minimal cognition (see also Ikegami, T. et al., 2008, BioSys., 91, p. 388)...

  12. A Defense of Semantic Minimalism

    Science.gov (United States)

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  13. Minimally invasive aortic valve replacement

    DEFF Research Database (Denmark)

    Foghsgaard, Signe; Schmidt, Thomas Andersen; Kjaergard, Henrik K

    2009-01-01

    In this descriptive prospective study, we evaluate the outcomes of surgery in 98 patients who were scheduled to undergo minimally invasive aortic valve replacement. These patients were compared with a group of 50 patients who underwent scheduled aortic valve replacement through a full sternotomy...

  14. Harm minimization among teenage drinkers

    DEFF Research Database (Denmark)

    Jørgensen, Morten Hulvej; Curtis, Tine; Christensen, Pia Haudrup

    2007-01-01

    . In regulating the social context of drinking they relied on their personal experiences more than on formalized knowledge about alcohol and harm, which they had learned from prevention campaigns and educational programmes. CONCLUSIONS: In this study we found that teenagers may help each other to minimize alcohol...

  15. Une Exception Francaise: Les Grandes Ecoles (A French Exception: The Great Schools).

    Science.gov (United States)

    Kimmel, Alain

    1996-01-01

    Examines the role of exceptional schools in France that have produced famous personages such as Charles de Gaulle and Jean-Paul Sartre. The schools reviewed include L'Ecole Nationale d'Administration, L'Ecole Polytechnique, L'Ecole Normale Superieure, L'Ecole des Hautes Etudes Commerciales, Saint-Cyr, and L'Institut d'Etudes Politiques de Paris.…

  16. Influence of errors in the dimensions of a switched parasitic array on gain and impedance match

    CSIR Research Space (South Africa)

    Mofolo, MRO

    2012-09-01

    Full Text Available on the performance of the antenna system, for the given antenna specifications. Index Terms: Monte Carlo simulation; Impedance; Gain; Antenna arrays; Error analysis; ESPAR. I. INTRODUCTION With an increasing demand for wireless broadband, especially for rural....g. mechanical dimensions, may be as a result of random errors. Although random errors may be minimal, they have an influence on antenna performance attributes such as gain, side lobes, and beam positioning precision [8-11] and need to be quantified...

  17. Sensor Interaction as a Source of the Electromagnetic Field Measurement Error

    Directory of Open Access Journals (Sweden)

    Hartansky R.

    2014-12-01

    Full Text Available The article deals with the analytical calculation and numerical simulation of the interactive influence of electromagnetic sensors. The sensors are components of a field probe, and their mutual influence causes measurement error. The electromagnetic field probe contains three mutually perpendicular sensors in order to measure the electric field vector. The sensor error is evaluated as a function of the relative position of the sensors. Based on this, recommendations are proposed for electromagnetic field probe construction to minimize the sensor interaction and the measurement error.

  18. Error coding simulations in C

    Science.gov (United States)

    Noble, Viveca K.

    1994-10-01

    When data is transmitted through a noisy channel, errors are produced within the data rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
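
    The innermost (n, n-16) code mentioned above appends 16 CRC parity bits. A compact bitwise CRC-16 sketch is shown below; the CCITT polynomial 0x1021 with initial value 0xFFFF is assumed, which matches the CRC commonly used in CCSDS telemetry, but the simulator's exact parameters should be taken from the CCSDS recommendation.

        #include <stdio.h>
        #include <stdint.h>

        /* Bitwise CRC-16 over a byte buffer.  Polynomial 0x1021 and initial
         * value 0xFFFF are assumed (CRC-16/CCITT-FALSE); no input/output
         * reflection and no final XOR. */
        static uint16_t crc16(const uint8_t *data, size_t len)
        {
            uint16_t crc = 0xFFFF;
            for (size_t i = 0; i < len; ++i) {
                crc ^= (uint16_t)data[i] << 8;
                for (int bit = 0; bit < 8; ++bit)
                    crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                         : (uint16_t)(crc << 1);
            }
            return crc;
        }

        int main(void)
        {
            const uint8_t msg[] = "123456789";
            printf("CRC-16 = 0x%04X\n", crc16(msg, 9));  /* expected 0x29B1 */
            return 0;
        }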

  19. Predictive error analysis for a water resource management model

    Science.gov (United States)

    Gallagher, Mark; Doherty, John

    2007-02-01

    SummaryIn calibrating a model, a set of parameters is assigned to the model which will be employed for the making of all future predictions. If these parameters are estimated through solution of an inverse problem, formulated to be properly posed through either pre-calibration or mathematical regularisation, then solution of this inverse problem will, of necessity, lead to a simplified parameter set that omits the details of reality, while still fitting historical data acceptably well. Furthermore, estimates of parameters so obtained will be contaminated by measurement noise. Both of these phenomena will lead to errors in predictions made by the model, with the potential for error increasing with the hydraulic property detail on which the prediction depends. Integrity of model usage demands that model predictions be accompanied by some estimate of the possible errors associated with them. The present paper applies theory developed in a previous work to the analysis of predictive error associated with a real world, water resource management model. The analysis offers many challenges, including the fact that the model is a complex one that was partly calibrated by hand. Nevertheless, it is typical of models which are commonly employed as the basis for the making of important decisions, and for which such an analysis must be made. The potential errors associated with point-based and averaged water level and creek inflow predictions are examined, together with the dependence of these errors on the amount of averaging involved. Error variances associated with predictions made by the existing model are compared with "optimized error variances" that could have been obtained had calibration been undertaken in such a way as to minimize predictive error variance. The contributions by different parameter types to the overall error variance of selected predictions are also examined.

  20. Slope Error Measurement Tool for Solar Parabolic Trough Collectors: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Stynes, J. K.; Ihas, B.

    2012-04-01

    The National Renewable Energy Laboratory (NREL) has developed an optical measurement tool for parabolic solar collectors that measures the combined errors due to absorber misalignment and reflector slope error. The combined absorber alignment and reflector slope errors are measured using a digital camera to photograph the reflected image of the absorber in the collector. Previous work using the image of the reflection of the absorber finds the reflector slope errors from the reflection of the absorber and an independent measurement of the absorber location. The accuracy of the reflector slope error measurement is thus dependent on the accuracy of the absorber location measurement. By measuring the combined reflector-absorber errors, the uncertainty in the absorber location measurement is eliminated. The related performance merit, the intercept factor, depends on the combined effects of the absorber alignment and reflector slope errors. Measuring the combined effect provides a simpler measurement and a more accurate input to the intercept factor estimate. The minimal equipment and setup required for this measurement technique make it ideal for field measurements.

  1. Minimal cut sets in biochemical reaction networks

    National Research Council Canada - National Science Library

    Klamt, Steffen; Gilles, Ernst Dieter

    2004-01-01

    .... We introduce the concept of minimal cut sets for biochemical networks. A minimal cut set (MCS) is a minimal (irreducible) set of reactions in the network whose inactivation will definitely lead to a failure in certain network functions...

  2. On the geometry of thin exceptional sets in Manin's conjecture

    DEFF Research Database (Denmark)

    Lehmann, Brian; Tanimoto, Sho

    2017-01-01

    Manin’s Conjecture predicts the rate of growth of rational points of a bounded height after removing those lying on an exceptional set. We study whether the exceptional set in Manin’s Conjecture is a thin set....

  3. Dual Processing and Diagnostic Errors

    Science.gov (United States)

    Norman, Geoff

    2009-01-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical,…

  4. Barriers to Medical Error Reporting.

    Science.gov (United States)

    Poorolajal, Jalal; Rezaie, Shirin; Aghighi, Negar

    2015-01-01

    This study was conducted to explore the prevalence of medical error underreporting and associated barriers. This cross-sectional study was performed from September to December 2012. Five hospitals, affiliated with Hamadan University of Medical Sciences, in Hamedan, Iran were investigated. A self-administered questionnaire was used for data collection. Participants consisted of physicians, nurses, midwives, residents, interns, and staffs of radiology and laboratory departments. Overall, 50.26% of subjects had committed but not reported medical errors. The main reasons mentioned for underreporting were lack of effective medical error reporting system (60.0%), lack of proper reporting form (51.8%), lack of peer supporting a person who has committed an error (56.0%), and lack of personal attention to the importance of medical errors (62.9%). The rate of committing medical errors was higher in men (71.4%), age of 50-40 years (67.6%), less-experienced personnel (58.7%), educational level of MSc (87.5%), and staff of radiology department (88.9%). This study outlined the main barriers to reporting medical errors and associated factors that may be helpful for healthcare organizations in improving medical error reporting as an essential component for patient safety enhancement.

  5. Barriers to medical error reporting

    Directory of Open Access Journals (Sweden)

    Jalal Poorolajal

    2015-01-01

    Full Text Available Background: This study was conducted to explore the prevalence of medical error underreporting and associated barriers. Methods: This cross-sectional study was performed from September to December 2012. Five hospitals, affiliated with Hamadan University of Medical Sciences, in Hamedan, Iran were investigated. A self-administered questionnaire was used for data collection. Participants consisted of physicians, nurses, midwives, residents, interns, and staffs of radiology and laboratory departments. Results: Overall, 50.26% of subjects had committed but not reported medical errors. The main reasons mentioned for underreporting were lack of effective medical error reporting system (60.0%), lack of proper reporting form (51.8%), lack of peer supporting a person who has committed an error (56.0%), and lack of personal attention to the importance of medical errors (62.9%). The rate of committing medical errors was higher in men (71.4%), age of 50-40 years (67.6%), less-experienced personnel (58.7%), educational level of MSc (87.5%), and staff of radiology department (88.9%). Conclusions: This study outlined the main barriers to reporting medical errors and associated factors that may be helpful for healthcare organizations in improving medical error reporting as an essential component for patient safety enhancement.

  6. Explaining Errors in Children's Questions

    Science.gov (United States)

    Rowland, Caroline F.

    2007-01-01

    The ability to explain the occurrence of errors in children's speech is an essential component of successful theories of language acquisition. The present study tested some generativist and constructivist predictions about error on the questions produced by ten English-learning children between 2 and 5 years of age. The analyses demonstrated that,…

  7. Onorbit IMU alignment error budget

    Science.gov (United States)

    Corson, R. W.

    1980-01-01

    Errors from the Star Tracker, Crew Optical Alignment Sight (COAS), and Inertial Measurement Unit (IMU), which form a complex navigation system with a multitude of error sources, were combined. A complete list of the system errors is presented. The errors were combined in a rational way to yield an estimate of the IMU alignment accuracy for STS-1. The expected standard deviation in the IMU alignment error for STS-1 type alignments was determined to be 72 arc seconds per axis for star tracker alignments and 188 arc seconds per axis for COAS alignments. These estimates are based on current knowledge of the star tracker, COAS, IMU, and navigation base error specifications, and were partially verified by preliminary Monte Carlo analysis.
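
    Combining independent error sources "in a rational way" usually means root-sum-squaring their one-sigma contributions. The sketch below shows that combination with placeholder values; the actual STS-1 budget numbers are not reproduced.

        #include <stdio.h>
        #include <math.h>

        /* Root-sum-square combination of independent 1-sigma error sources,
         * all expressed in arc seconds.  The values are placeholders. */
        int main(void)
        {
            double sources[] = { 30.0, 45.0, 20.0, 40.0 };
            double sum_sq = 0.0;

            for (int i = 0; i < 4; ++i)
                sum_sq += sources[i] * sources[i];

            printf("combined 1-sigma error: %.1f arc seconds\n", sqrt(sum_sq));
            return 0;
        }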

  8. Prioritising interventions against medication errors

    DEFF Research Database (Denmark)

    Lisby, Marianne; Pape-Larsen, Louise; Sørensen, Ann Lykkegaard

    experts appointed by 13 healthcare-, professional- and scientific organisations in Denmark. Test of definition: The definition was applied to historic data from a somatic hospital (2003; 64 patients) [2] and further, prospectively tested in comparable studies of medication errors in a psychiatric hospital...... errors was compared between the somatic hospital (2003), the nursing homes and the psychiatric hospital whereas comparison of prescribing errors included all four clinical settings. Results: Definition: The expert panel reached consensus of the following definition “An error in the stages...... of the medication process - ordering, transcribing, dispensing, administering and monitoring the effect - causing harm or implying a risk of harming the patient”. In addition, consensus for 60 of 76 error types covering all stages in the medication process was achieved. Test of definition: The definition...

  9. Statistical Primer for Athletic Trainers: The Essentials of Understanding Measures of Reliability and Minimal Important Change.

    Science.gov (United States)

    Riemann, Bryan L; Lininger, Monica R

    2018-01-01

      To describe the concepts of measurement reliability and minimal important change.   All measurements have some magnitude of error. Because clinical practice involves measurement, clinicians need to understand measurement reliability. The reliability of an instrument is integral in determining if a change in patient status is meaningful.   Measurement reliability is the extent to which a test result is consistent and free of error. Three perspectives on reliability (relative reliability, systematic bias, and absolute reliability) are often reported. However, absolute reliability statistics, such as the minimal detectable difference, are most relevant to clinicians because they provide an expected error estimate. The minimal important difference is the smallest change in a treatment outcome that the patient would identify as important.   Clinicians should use absolute reliability characteristics, preferably the minimal detectable difference, to determine the extent of error around a patient's measurement. The minimal detectable difference, coupled with an appropriately estimated minimal important difference, can assist the practitioner in identifying clinically meaningful changes in patients.
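
    Commonly used formulas relate these quantities: the standard error of measurement is often taken as SEM = SD * sqrt(1 - ICC), and the 95% minimal detectable change as MDC95 = 1.96 * sqrt(2) * SEM. The sketch below applies these conventional formulas to made-up numbers; the article should be consulted for its exact definitions.

        #include <stdio.h>
        #include <math.h>

        /* Standard error of measurement and 95% minimal detectable change,
         * computed from a between-subject SD and a test-retest ICC.  The
         * numbers below are made up for illustration. */
        int main(void)
        {
            double sd  = 5.0;    /* standard deviation of the measure  */
            double icc = 0.90;   /* intraclass correlation coefficient */

            double sem   = sd * sqrt(1.0 - icc);
            double mdc95 = 1.96 * sqrt(2.0) * sem;

            printf("SEM   = %.2f\n", sem);
            printf("MDC95 = %.2f\n", mdc95);
            return 0;
        }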

  10. "No admittance except on business" 

    Directory of Open Access Journals (Sweden)

    Mathilde Bourrier

    2011-04-01

    “No admittance except on business”. Issues in negotiating entry into organisations. This article has a history. The first reason for wanting to write it was linked to the development in 2007 of a master's course in sociology on the conditions associated with being able to enter organisations. To my great surprise, and in spite of my efforts during the preparation of the course, I found few texts which discussed real entry conditions. This article thus reflects that disappointment. The second reason concerns repeated observations of the hardening of conditions surrounding studies undertaken within organisations, whether in France or the United States. Reports I have received confirm my observation that the field of high-risk organisations has become increasingly difficult for young researchers or doctoral students to work in freely. Paradoxically, whilst more efforts are undertaken so that sociologists (amongst others) have access to risk industries, researchers are confronted with conditions that are often rigid and hardly generous. As for the issues themselves that are studied, they are closely aligned with the industrialists’ own managerial questions. Ethnographic studies by immersion are abandoned in favour of action-research, in the form of theses obliged, at the end of the day, to make proposals for improvements and recommendations about management tools. I have a feeling that what is happening in high-risk organisations is also the case for the sociology of organisations in general. In the first part of this article, I propose a partial review of the way that the sociology of organisations has treated the question of entering organisations. I will reflect on why there are so few works devoted to this question and on what the consequences may well be for the field itself. Secondly, I will examine the possibilities offered today as well as the efforts to be made in order to either resolutely pursue “embedded sociology” or break away from it radically.

  11. Robust super-resolution by minimizing a Gaussian-weighted L2 error norm

    NARCIS (Netherlands)

    Pham, T.Q.; Vliet, L.J. van; Schutte, K.

    2008-01-01

    Super-resolution restoration is the problem of restoring a high-resolution scene from multiple degraded low-resolution images under motion. Due to imaging blur and noise, this problem is ill-posed. Additional constraints such as smoothness of the solution via regularization is often required to
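
    One common form of a Gaussian-weighted L2 error norm down-weights each residual by a Gaussian of its own magnitude, so that large outliers contribute little; whether this matches the exact norm of the paper should be checked against the full text. A minimal sketch with made-up residuals:

        #include <stdio.h>
        #include <math.h>

        /* Sketch of a Gaussian-weighted sum-of-squares error: each residual is
         * down-weighted by a Gaussian of its own magnitude, so large outliers
         * contribute little.  sigma and the residuals are made-up values. */
        static double gaussian_weighted_l2(const double *r, int n, double sigma)
        {
            double total = 0.0;
            for (int i = 0; i < n; ++i) {
                double w = exp(-(r[i] * r[i]) / (2.0 * sigma * sigma));
                total += w * r[i] * r[i];
            }
            return total;
        }

        int main(void)
        {
            double residuals[] = { 0.1, -0.2, 0.05, 5.0 };   /* last one is an outlier */
            printf("weighted error = %.4f\n", gaussian_weighted_l2(residuals, 4, 0.5));
            return 0;
        }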

  12. MINIMIZING COMPUTATIONAL ERRORS OF TSUNAMI WAVE-RAY AND TRAVEL TIME

    Directory of Open Access Journals (Sweden)

    Andrei G. Marchuk

    2008-01-01

    Full Text Available There are many methods for computing tsunami kinematics directly and inversely. The direct detection of waves in the deep ocean makes it possible to establish tsunami source characteristics and origin. Thus, the accuracy of computational methods is very important in obtaining reliable results. In a non-homogeneous medium, where tsunami wave propagation velocity varies, it is not easy to determine a wave-ray that connects two given points along a path. The present study proposes a modification of the methodology for determining tsunami travel times and wave-ray paths. An approximate ray trace path can be developed from a source origin point to any other point on a computational grid by solving the direct problem, and thus obtaining the tsunami travel times. The initial ray approximation can be optimized with the use of an algorithm that calculates all potential variations and applies corrections to travel-time values. Such an algorithm was tested in an area with model bathymetry and compared with a non-optimized method. The latter exceeded the optimized method by one minute of travel time for every hour of tsunami propagation time.
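
    The record does not include the computational details of the wave-ray algorithm. As a rough, hedged illustration of the direct problem only, the Python sketch below integrates travel time along a pre-chosen ray path over sampled bathymetry using the long-wave phase speed sqrt(g*h); the depth profile, sampling step and path are invented for the example and are not from the paper.

      import numpy as np

      G = 9.81  # gravitational acceleration, m/s^2

      def travel_time_along_path(depths_m, step_m):
          """Integrate travel time along a sampled ray path.

          depths_m : water depth sampled at equally spaced points along the path (m)
          step_m   : distance between consecutive samples (m)
          Long (shallow-water) tsunami waves propagate at c = sqrt(g * h).
          """
          speeds = np.sqrt(G * np.asarray(depths_m, dtype=float))
          # travel time of each segment, using the mean speed of its endpoints
          seg_speeds = 0.5 * (speeds[:-1] + speeds[1:])
          return float(np.sum(step_m / seg_speeds))

      # Hypothetical path: depth shoaling from 4000 m to 100 m over 200 km
      depths = np.linspace(4000.0, 100.0, 201)
      print(f"travel time: {travel_time_along_path(depths, step_m=1000.0) / 60:.1f} min")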

  13. A two-step scheme for the advection equation with minimized dissipation and dispersion errors

    Science.gov (United States)

    Takacs, L. L.

    1985-01-01

    A two-step advection scheme of the Lax-Wendroff type is derived which has accuracy and phase characteristics similar to those of a third-order scheme. The scheme is exactly third-order accurate in time and space for uniform flow. The new scheme is compared with other currently used methods, and is shown to simulate well the advection of localized disturbances with steep gradients. The scheme is derived for constant flow and generalized to two-dimensional nonuniform flow.
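
    Takacs' exact scheme is not reproduced in the abstract. As a hedged illustration of the family it belongs to, the sketch below implements the classical two-step (Richtmyer) Lax-Wendroff update for one-dimensional linear advection with periodic boundaries; the grid size, advection speed and Courant number are arbitrary choices, not values from the paper.

      import numpy as np

      def lax_wendroff_two_step(u, c, dx, dt):
          """One Richtmyer two-step Lax-Wendroff update for u_t + c u_x = 0 (periodic BCs)."""
          nu = c * dt / dx                      # Courant number, must satisfy |nu| <= 1
          up = np.roll(u, -1)                   # u_{j+1}
          # Step 1: provisional values at half points (j+1/2, n+1/2)
          u_half = 0.5 * (u + up) - 0.5 * nu * (up - u)
          # Step 2: full update using the half-point values as fluxes
          return u - nu * (u_half - np.roll(u_half, 1))

      # Advect a narrow pulse once around a periodic domain
      n, c, dx = 200, 1.0, 1.0 / 200
      dt = 0.4 * dx / c
      x = np.linspace(0.0, 1.0, n, endpoint=False)
      u = np.exp(-300 * (x - 0.5) ** 2)
      for _ in range(int(round(1.0 / (c * dt)))):
          u = lax_wendroff_two_step(u, c, dx, dt)
      print("max error after one revolution:", np.abs(u - np.exp(-300 * (x - 0.5) ** 2)).max())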

  14. An Implementation of Error Minimization Position Estimate in Wireless Inertial Measurement Unit using Modification ZUPT

    Directory of Open Access Journals (Sweden)

    Adytia Darmawan

    2016-12-01

    Full Text Available Position estimation using a WIMU (Wireless Inertial Measurement Unit) is one of the emerging technologies in the field of indoor positioning systems. A WIMU can detect movement and does not depend on GPS signals. The position is then estimated using a modified ZUPT (Zero Velocity Update) method that uses Filter Magnitude Acceleration (FMA), Variance Magnitude Acceleration (VMA) and Angular Rate (AR) estimation. The performance of this method was validated on a six-legged robot navigation system. Experimental results show that the combination of VMA-AR gives the best position estimation.
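
    The paper's FMA, VMA and AR detectors are only named in the abstract, so the sketch below is a hedged illustration of the general zero-velocity-update idea rather than the authors' method: stance intervals are flagged when the variance of the acceleration magnitude over a sliding window falls below a threshold, and the integrated velocity is reset there. The sampling period, window length and threshold are hypothetical.

      import numpy as np

      def zupt_velocity(acc, dt, win=20, var_thresh=0.05):
          """Integrate acceleration to velocity, resetting it to zero whenever the
          variance of the acceleration magnitude over a sliding window is small
          (a simple variance-of-magnitude zero-velocity detector).

          acc : (N, 3) accelerometer samples with gravity already removed (m/s^2)
          dt  : sample period (s)
          """
          acc = np.asarray(acc, dtype=float)
          mag = np.linalg.norm(acc, axis=1)
          vel = np.zeros_like(acc)
          for k in range(1, len(acc)):
              vel[k] = vel[k - 1] + acc[k] * dt          # naive integration
              lo = max(0, k - win)
              if np.var(mag[lo:k + 1]) < var_thresh:     # stationary window detected
                  vel[k] = 0.0                           # zero-velocity update
          return vel

      # position follows from a second integration: pos[k] = pos[k-1] + vel[k] * dt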

  15. 40 CFR 51.930 - Mitigation of Exceptional Events.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Mitigation of Exceptional Events. 51... Mitigation of Exceptional Events. (a) A State requesting to exclude air quality data due to exceptional events must take appropriate and reasonable actions to protect public health from exceedances or...

  16. 50 CFR 14.62 - Exceptions to import declaration requirements.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 1 2010-10-01 2010-10-01 false Exceptions to import declaration requirements. 14.62 Section 14.62 Wildlife and Fisheries UNITED STATES FISH AND WILDLIFE SERVICE, DEPARTMENT OF... § 14.62 Exceptions to import declaration requirements. (a) Except for wildlife requiring a permit...

  17. 21 CFR 1307.03 - Exceptions to regulations.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Exceptions to regulations. 1307.03 Section 1307.03 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF JUSTICE MISCELLANEOUS General Information § 1307.03 Exceptions to regulations. Any person may apply for an exception to the application of any...

  18. 49 CFR 173.4a - Excepted quantities.

    Science.gov (United States)

    2010-10-01

    ..., the document must include the statement “Dangerous Goods in Excepted Quantities” and indicate the... statement “Dangerous Goods in Excepted Quantities” and indicate the number of packages. (i) Training. Each... 49 Transportation 2 2010-10-01 2010-10-01 false Excepted quantities. 173.4a Section 173.4a...

  19. Detecting and explaining business exceptions for risk assessment

    NARCIS (Netherlands)

    Liu, L.; Daniëls, H.A.M.; Hofman, W.; Hammoudi, S.; Maciaszek, L.; Cordeiro, J.; Dietz, J.

    2013-01-01

    Systematic risk analysis can be based on causal analysis of business exceptions. In this paper we describe the concepts of automatic analysis for exceptional patterns which are hidden in a large set of business data. These exceptions are worth investigating further to determine their causes

  20. Minimalism and the Pragmatic Frame

    Directory of Open Access Journals (Sweden)

    Ana Falcato

    2016-02-01

    Full Text Available In the debate between literalism and contextualism in semantics, Kent Bach’s project is often taken to stand on the latter side of the divide. In this paper I argue this is a misleading assumption and justify it by contrasting Bach’s assessment of the theoretical eliminability of minimal propositions arguably expressed by well-formed sentences with standard minimalist views, and by further contrasting his account of the division of interpretative processes ascribable to the semantics and pragmatics of a language with a parallel analysis carried out by the most radical opponent to semantic minimalism, i.e., by occasionalism. If my analysis proves right, the sum of its conclusions amounts to a refusal of Bach’s main dichotomies.

  1. Minimal universal quantum heat machine

    Science.gov (United States)

    Gelbwaser-Klimovsky, D.; Alicki, R.; Kurizki, G.

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed.

  2. Construction schedules slack time minimizing

    Science.gov (United States)

    Krzemiński, Michał

    2017-07-01

    The article presents two original models for minimizing the downtime of work brigades. The models have been developed for construction schedules executed using the uniform work method. Application of flow shop models is possible and useful for the implementation of large objects, which can be divided into plots. The article also presents a condition describing which model should be used, as well as a brief example of schedule optimization. The optimization results confirm the legitimacy of the work on the newly-developed models.

  3. Torsional Rigidity of Minimal Submanifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen; Palmer, Vicente

    2006-01-01

    We prove explicit upper bounds for the torsional rigidity of extrinsic domains of minimal submanifolds $P^m$ in ambient Riemannian manifolds $N^n$ with a pole $p$. The upper bounds are given in terms of the torsional rigidities of corresponding Schwarz symmetrizations of the domains in warped product spaces. [...] We characterize when the upper bounds for the torsional rigidity are actually attained and give conditions under which the geometric average of the stochastic mean exit time for Brownian motion at infinity is finite.

  4. The incidence of hiatal hernia after minimally invasive esophagectomy.

    Science.gov (United States)

    Bronson, Nathan W; Luna, Renato A; Hunter, John G; Dolan, James P

    2014-05-01

    Minimally invasive esophagectomy (MIE) has evolved as a means to minimize the morbidity of an operation which is traditionally associated with a significant risk. However, this approach may have its own unique postoperative complications. In this study, we describe the incidence and outcomes of hiatal hernia in a cohort of MIE patients. Clinical follow-up data on 114 patients who had undergone minimally invasive esophagectomy between 2003 and 2011 were retrospectively reviewed. Clinical presentation and computed tomography (CT) scans of the chest and abdomen were used to establish the diagnosis of hiatal herniation after minimally invasive esophagectomy. Age, gender, presenting complaint, comorbid conditions, clinical tumor stage, surgical specimen size, length and cost of hospital admissions, operation performed for hiatal herniation, and mortality were all recorded for analysis. Nine (8%) of the 114 patients who underwent MIE had postoperative hiatal herniation. Five of these patients were asymptomatic. All patients except two who presented emergently were repaired laparoscopically on an elective basis. The average length of stay after hiatal hernia repair was 5.5 days (range 2-12) at an average charge of $40,785 (range $25,264-$83,953). At follow-up, one patient complained of symptoms associated with reflux. Hiatal herniation is not a rare event after MIE. It is also associated with significant health-care cost and may be lethal. Most occurrences appear to be asymptomatic and, if detected, can be repaired with good resolution of symptoms, minimal associated morbidity, and no mortality.

  5. Minimally Invasive Heart Valve Surgery.

    Science.gov (United States)

    Bouhout, Ismail; Morgant, Marie-Catherine; Bouchard, Denis

    2017-09-01

    Minimally invasive valve surgery represents a recent and significant advance in modern heart surgery. Indeed, many less invasive approaches for both the aortic and mitral valves have been developed in the past 2 decades. These procedures were hypothesized to result in less operative trauma, which might translate into better patient outcomes. However, this clinical benefit remains controversial in the literature. The aim of this review is to discuss the evidence surrounding minimally invasive heart valve surgery in the current era. A systematic search of the literature from 2006-2016 was performed looking for articles reporting early or late outcomes after minimally invasive valve surgery. Less invasive valve surgery is safe and provides long-term surgical outcomes similar to those of standard sternotomy. In addition, these approaches result in a reduction in overall hospital length of stay and may mitigate the risk of early morbidity, mainly postoperative bleeding, transfusions, and ventilation duration. Copyright © 2017 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  6. Minimally invasive total knee arthroplasty.

    Science.gov (United States)

    Bonutti, Peter M; Mont, Michael A; McMahon, Margo; Ragland, Phillip S; Kester, Mark

    2004-01-01

    Currently, minimally invasive total knee arthroplasty is defined as an incision length of [...]. Other criteria of this definition are: 1. The amount of soft-tissue dissection (including muscle, ligament, and capsular damage). 2. Patellar retraction or eversion. 3. Tibiofemoral dislocation. Minimally invasive surgery should not be considered to be a cosmetic procedure but rather one that addresses patients' concerns with regard to postoperative pain and slow rehabilitation. Standard total knee arthroplasties provide pain relief, but returning to activities of daily living remains a challenge for some individuals, who may take several weeks to recover. Several studies have demonstrated long-term success (at more than ten years) of standard total knee arthroplasties. However, many patients remain unsatisfied with the results of the surgery. In a study of functional limitations of patients with a Knee Society score of > or = 90 points after total knee arthroplasty, only 35% of patients stated that they had no limitations. This finding was highlighted in a study by Dickstein et al., in which one-third of the elderly patients who underwent knee replacement were unhappy with the outcome at six and twelve months postoperatively. Although many surgeons utilize objective functional scoring systems to evaluate outcome, it is likely that the criteria for a successful result of total knee arthroplasty differ between the patient and the surgeon. This was evident in a report by Bullens et al., who concluded that surgeons are more satisfied with the results of total knee arthroplasty than are their patients. Trousdale et al. showed that, in addition to concerns about long-term functional outcome, patients' major concerns were postoperative pain and the time required for recovery. Patients undergoing total knee arthroplasty have specific functional goals, such as climbing stairs, squatting, kneeling, and returning to some level of low-impact sports after surgery. Our clinical investigations demonstrated that

  7. Automatic Generation of Minimal Cut Sets

    Directory of Open Access Journals (Sweden)

    Sentot Kromodimoeljo

    2015-06-01

    Full Text Available A cut set is a collection of component failure modes that could lead to a system failure. Cut Set Analysis (CSA) is applied to critical systems to identify and rank system vulnerabilities at design time. Model checking tools have been used to automate the generation of minimal cut sets but are generally based on checking reachability of system failure states. This paper describes a new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT Analyser that supports the generation of multiple counterexamples. The approach enables a broader class of system failures to be analysed, by generalising from failure state formulae to failure behaviours expressed in LTL. The traditional approach to CSA using model checking requires the model or system failure to be modified, usually by hand, to eliminate already-discovered cut sets, and the model checker to be rerun, at each step. By contrast, the new approach works incrementally and fully automatically, thereby removing the tedious and error-prone manual process and resulting in significantly reduced computation time. This in turn enables larger models to be checked. Two different strategies for using BT Analyser for CSA are presented. There is generally no single best strategy for model checking: their relative efficiency depends on the model and property being analysed. Comparative results are given for the A320 hydraulics case study in the Behavior Tree modelling language.
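
    BT Analyser's LTL-based, incremental procedure is far more capable than this, but the basic object it produces can be illustrated with a brute-force sketch: enumerate combinations of component failure modes against a Boolean system-failure predicate and keep only the minimal ones. The toy pump/valve system is invented for the example.

      from itertools import combinations

      def minimal_cut_sets(components, system_fails, max_order=None):
          """Enumerate minimal cut sets by brute force.

          components   : list of component failure modes
          system_fails : predicate taking a set of failed components and returning
                         True if the system fails
          A cut set is minimal if no proper subset of it is itself a cut set.
          """
          max_order = max_order or len(components)
          minimal = []
          for order in range(1, max_order + 1):
              for combo in combinations(components, order):
                  failed = set(combo)
                  # skip supersets of already-found minimal cut sets
                  if any(mcs <= failed for mcs in minimal):
                      continue
                  if system_fails(failed):
                      minimal.append(failed)
          return minimal

      # Toy system: pump P1 in series with a redundant pair of valves (V1 parallel V2)
      fails = lambda f: "P1" in f or {"V1", "V2"} <= f
      print(minimal_cut_sets(["P1", "V1", "V2"], fails))   # [{'P1'}, {'V1', 'V2'}]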

  8. Linear network error correction coding

    CERN Document Server

    Guang, Xuan

    2014-01-01

    There are two main approaches in the theory of network error correction coding. In this SpringerBrief, the authors summarize some of the most important contributions following the classic approach, which represents messages by sequences, similar to algebraic coding, and also briefly discuss the main results following the other approach, which uses the theory of rank metric codes for network error correction, representing messages by subspaces. This book starts by establishing the basic linear network error correction (LNEC) model and then characterizes two equivalent descriptions. Distances an

  9. Urgent listing exceptions and outcomes in pediatric heart transplantation: Comparison to standard criteria patients.

    Science.gov (United States)

    Davies, Ryan R; McCulloch, Michael A; Haldeman, Shylah; Gidding, Samuel S; Pizarro, Christian

    2017-03-01

    United Network for Organ Sharing (UNOS) policy enables listing exceptions to avoid penalizing patients with waitlist mortality not captured by standard criteria. Outcomes among patients listed by exception have not been analyzed. We performed a retrospective analysis of pediatric (≤17 years of age, n = 4,706) listings (2006 to 2015) for primary, isolated heart transplantation within the UNOS data set, assessing Status 1A exception (n = 211, 4.5%) use across regions and patient characteristics and evaluating waitlist outcomes compared with candidates listed using standard criteria. Death or removal for reason other than transplant did not differ between exception and standard criteria patients at 1 month (11.7% vs 16.2%, p = not statistically significant [NS]), 2 months (18.2% vs 29.0%, p = 0.11) or overall (16.1% vs 22.0%, p = NS) on the waitlist. Rates were higher than among Status 1B patients (1 month: 2.8%; 2 months: 5.6%; overall: 14.9%; p < 0.0001). The cumulative incidence of competing risks (transplantation, death/removal for reasons other than transplant and removal) did not differ when comparing Status 1A exception patients with Status 1A standard criteria patients. Use of 1A exceptions varied across UNOS regions (1.9% to 22.3%, p < 0.0001). Risk-adjusted modeling identified patients more (hypertrophic cardiomyopathy: odds ratio [OR] = 2.8, 95% confidence interval [CI] 1.5 to 5.0; restrictive cardiomyopathy: OR = 2.7, 95% CI 1.7 to 4.3) and less (low socioeconomic status: OR = 0.7, 95% CI 0.5 to 1.0) likely to use an exception. Use of exceptions was uncorrelated with regional outcomes. Waitlist mortality among Status 1A exception patients is similar to that among those listed by standard criteria. However, variation in exception use across geography and demography may contribute to inequities in access to transplantation, particularly for those with low socioeconomic status. Standardization of practices may decrease regional variation and minimize

  10. Barriers to adverse event and error reporting in anesthesia.

    Science.gov (United States)

    Heard, Gaylene C; Sanderson, Penelope M; Thomas, Rowan D

    2012-03-01

    [...] reporting, and legislated protection of reports from legal discoverability. The majority of anesthesiologists in our study did not agree that the attitudinal/emotional barriers surveyed would influence reporting of an unspecified adverse event caused by error, with the exception of the barrier of being concerned about blame by colleagues. The probable influence of 6 perceived barriers to reporting a specified adverse event of anaphylaxis differed with the presence or absence of error. Anesthesiologists in our study supported assistive reporting strategies. There seem to be some differences between our results and previously published research for other physician groups.

  11. Errors and complications in laparoscopic surgery

    Directory of Open Access Journals (Sweden)

    Liviu Drăghici

    2017-05-01

    Full Text Available Background. In laparoscopic surgery errors are unavoidable, and proper acknowledgment of them is required to reduce the risk of intraoperative complications and to accurately assess the appropriate therapeutic approach. Fortunately, their frequency is low and cannot overshadow the benefits of laparoscopic surgery. Materials and Methods. We made an epidemiological investigation in the General Surgery Department of Emergency Clinical Hospital "St. John" Bucharest, analyzing 20 years of experience in laparoscopic surgery, during 1994-2014. We wanted to identify evolution trends in complications of laparoscopic surgery, analyzing the dynamics of errors occurring in all patients with laparoscopic procedures. Results. We recorded 26847 laparoscopic interventions with a total of 427 intra- or postoperative complications that required 160 conversions and 267 reinterventions to resolve these complications. The average frequency of complications was 15.9‰ (15.9 per 1,000 cases). In the period under review, laparoscopic procedures gained good momentum in our department. The number of minimally invasive interventions increased almost 10 times, from 266 cases operated laparoscopically in 1995 to 2638 cases in 2008. Annual growth of the number of laparoscopic procedures has surpassed the growth in the number of complications. Conclusions. The laborious work of laparoscopic surgery and a specialized centre with a well-trained team of surgeons provide the premises for good performance, even in the assimilation of new and difficult procedures.

  12. ERROR CONVERGENCE ANALYSIS FOR LOCAL HYPERTHERMIA APPLICATIONS

    Directory of Open Access Journals (Sweden)

    NEERU MALHOTRA

    2016-01-01

    Full Text Available The accuracy of the numerical solution of an electromagnetic problem is greatly influenced by the convergence of the solution obtained. In order to quantify the correctness of the numerical solution, the errors produced on solving the partial differential equations need to be analyzed. Mesh quality is another parameter that affects convergence. The various quality metrics depend on the type of solver used for numerical simulation. The paper focuses on comparing the performance of iterative solvers used in COMSOL Multiphysics software. The modeling of a coaxial coupled waveguide applicator operating at 485 MHz has been done for local hyperthermia applications using an adaptive finite element method. The 3D heat distribution within the muscle phantom, depicting a spherical lesion and a localized heating pattern, confirms the proper selection of the solver. The convergence plots are obtained during simulation of the problem using GMRES (generalized minimal residual) and geometric multigrid linear iterative solvers. The best error convergence is achieved by using the nonlinear multigrid solver and further introducing adaptivity in the nonlinear solver.
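
    The COMSOL model itself cannot be reproduced from the abstract. As a generic, hedged illustration of recording the convergence history of a GMRES solve, the sketch below uses SciPy's gmres on a small sparse test system (a 1D Laplacian standing in for the discretized electromagnetic problem); the matrix size and solver settings are arbitrary.

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import gmres

      # Small sparse test system (1D Laplacian), a stand-in for the real problem
      n = 64
      A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
      b = np.ones(n)

      residuals = []
      x, info = gmres(A, b, restart=n,
                      callback=residuals.append, callback_type="pr_norm")

      print("converged" if info == 0 else f"info = {info}")
      print("iterations recorded:", len(residuals))
      print("final relative residual:", residuals[-1])
      # 'residuals' is the convergence history one would plot to compare solvers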

  13. Detection and Classification of Measurement Errors in Bioimpedance Spectroscopy

    OpenAIRE

    David Ayllón; Roberto Gil-Pita; Fernando Seoane

    2016-01-01

    Bioimpedance spectroscopy (BIS) measurement errors may be caused by parasitic stray capacitance, impedance mismatch, cross-talking or their very likely combination. An accurate detection and identification is of extreme importance for further analysis because in some cases and for some applications, certain measurement artifacts can be corrected, minimized or even avoided. In this paper we present a robust method to detect the presence of measurement artifacts and identify what kind of measur...

  14. QuorUM: An Error Corrector for Illumina Reads.

    Directory of Open Access Journals (Sweden)

    Guillaume Marçais

    Full Text Available Illumina sequencing data can provide high coverage of a genome by relatively short (most often 100 bp to 150 bp) reads at a low cost. Even with a low (advertised) 1% error rate, 100× coverage Illumina data on average has an error in some read at every base in the genome. These errors make handling the data more complicated because they result in a large number of low-count erroneous k-mers in the reads. However, there is enough information in the reads to correct most of the sequencing errors, thus making subsequent use of the data (e.g. for mapping or assembly) easier. Here we use the term "error correction" to denote the reduction in errors due to both changes in individual bases and trimming of unusable sequence. We developed an error correction software called QuorUM. QuorUM is mainly aimed at error correcting Illumina reads for subsequent assembly. It is designed around the novel idea of minimizing the number of distinct erroneous k-mers in the output reads and preserving the most true k-mers, and we introduce a composite statistic π that measures how successful we are at achieving this dual goal. We evaluate the performance of QuorUM by correcting actual Illumina reads from genomes for which a reference assembly is available. We produce trimmed and error-corrected reads that result in assemblies with longer contigs and fewer errors. We compared QuorUM against several published error correctors and found that it is the best performer in most metrics we use. QuorUM is efficiently implemented making use of current multi-core computing architectures and it is suitable for large data sets (1 billion bases checked and corrected per day per core). We also demonstrate that a third-party assembler (SOAPdenovo) benefits significantly from using QuorUM error-corrected reads. QuorUM error corrected reads result in a factor of 1.1 to 4 improvement in N50 contig size compared to using the original reads with SOAPdenovo for the data sets investigated
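
    QuorUM's actual algorithm is considerably more sophisticated; the sketch below only illustrates the underlying k-mer intuition the abstract relies on: a sequencing error creates k-mers that occur very few times, so low-count k-mers within a read flag positions to trim or correct. The reads, k and count threshold are toy values.

      from collections import Counter

      def kmer_counts(reads, k):
          counts = Counter()
          for read in reads:
              for i in range(len(read) - k + 1):
                  counts[read[i:i + k]] += 1
          return counts

      def suspicious_positions(read, counts, k, min_count=2):
          """Indices of k-mers in a read whose count is below min_count (likely errors)."""
          return [i for i in range(len(read) - k + 1)
                  if counts[read[i:i + k]] < min_count]

      # Toy data: the last read carries a single-base error (G -> T)
      reads = ["ACGTACGTAC", "ACGTACGTAC", "ACGTACGTAC", "ACGTACTTAC"]
      counts = kmer_counts(reads, k=4)
      print(suspicious_positions(reads[-1], counts, k=4))   # k-mers covering the error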

  15. Comparison of Neural Network Error Measures for Simulation of Slender Marine Structures

    DEFF Research Database (Denmark)

    Christiansen, Niels H.; Voie, Per Erlend Torbergsen; Winther, Ole

    2014-01-01

    Training of an artificial neural network (ANN) adjusts the internal weights of the network in order to minimize a predefined error measure. This error measure is given by an error function. Several different error functions are suggested in the literature. However, the far most common measure for regression is the mean square error. This paper looks into the possibility of improving the performance of neural networks by selecting or defining error functions that are tailor-made for a specific objective. A neural network trained to simulate tension forces in an anchor chain on a floating offshore platform is designed and tested. The purpose of setting up the network is to reduce calculation time in a fatigue life analysis. Therefore, the networks trained on different error functions are compared with respect to accuracy of rain flow counts of stress cycles over a number of time series simulations.
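
    The anchor-chain network and rain-flow comparison cannot be reconstructed from the abstract. As a hedged, minimal illustration of swapping the error function, the sketch below fits a single constant prediction to synthetic data by gradient descent, once under the mean square error and once under a tailor-made asymmetric error that penalizes under-prediction more heavily (yielding an expectile above the mean); all data and weights are made up.

      import numpy as np

      rng = np.random.default_rng(0)
      y = rng.normal(loc=10.0, scale=2.0, size=5000)   # synthetic "tension force" samples

      def fit_constant(grad_of_error, lr=0.01, steps=5000):
          """Fit a single constant prediction c by gradient descent on mean(error(y - c))."""
          c = 0.0
          for _ in range(steps):
              r = y - c
              c += lr * np.mean(grad_of_error(r))      # dJ/dc = -mean(error'(r))
          return c

      # Mean square error: error(r) = r^2, error'(r) = 2r  -> converges to the mean
      c_mse = fit_constant(lambda r: 2.0 * r)
      # Tailor-made asymmetric error: under-prediction (r > 0) penalized 9x more
      # -> converges to a value above the mean (an expectile), a more conservative estimate
      c_asym = fit_constant(lambda r: np.where(r > 0.0, 18.0 * r, 2.0 * r))
      print(round(c_mse, 2), round(c_asym, 2))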

  16. Coping with human errors through system design: Implications for ecological interface design

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Vicente, Kim J.

    1989-01-01

    Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimize the potential for control interference and support recovery from errors. In other words, the focus should be on both the interferences causing error and on the opportunity for error recovery left to the operator.

  17. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-11-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal families of a given surface. The classification of minimal families of curves can be reduced to the classification of minimal families which cover weak Del Pezzo surfaces. We classify the minimal families of weak Del Pezzo surfaces and present a table with the number of minimal families of each weak Del Pezzo surface up to Weyl equivalence. As an application of this classification we generalize some results of Schicho. We classify algebraic surfaces that carry a family of conics. We determine the minimal lexicographic degree for the parametrization of a surface that carries at least 2 minimal families. © 2014 Elsevier B.V.

  18. Asymmetrical Search Errors in Infancy

    Science.gov (United States)

    Butterworth, George

    1976-01-01

    To establish the spatial generality of perseverative errors in infant manual search, a group of infants aged 8-11 months performed Piaget's Stage IV task with an object hidden at successive locations in the vertical plane. (Author/JMB)

  19. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying

    2009-08-27

    Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
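
    The measurement-error correction itself is beyond a short sketch, so the code below only shows the building block the method starts from: ordinary linear quantile regression obtained by minimizing the pinball (check) loss, here with SciPy's general-purpose optimizer on synthetic heteroscedastic data. The data, quantile levels and optimizer choice are illustrative assumptions.

      import numpy as np
      from scipy.optimize import minimize

      def pinball_loss(beta, X, y, tau):
          """Check-function loss whose minimizer estimates the tau-th conditional quantile."""
          r = y - X @ beta
          return np.mean(np.maximum(tau * r, (tau - 1.0) * r))

      rng = np.random.default_rng(1)
      n = 1000
      x = rng.uniform(0, 10, n)
      y = 1.0 + 0.5 * x + rng.normal(scale=1.0 + 0.2 * x)   # heteroscedastic noise
      X = np.column_stack([np.ones(n), x])                  # intercept + covariate

      for tau in (0.25, 0.5, 0.75):
          beta = minimize(pinball_loss, x0=np.zeros(2), args=(X, y, tau),
                          method="Nelder-Mead").x
          print(tau, np.round(beta, 2))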

  20. The uncorrected refractive error challenge

    Directory of Open Access Journals (Sweden)

    Kovin Naidoo

    2016-11-01

    Full Text Available Refractive error affects people of all ages, socio-economic status and ethnic groups. The most recent statistics estimate that, worldwide, 32.4 million people are blind and 191 million people have vision impairment. Vision impairment has been defined based on distance visual acuity only, and uncorrected distance refractive error (mainly myopia) is the single biggest cause of worldwide vision impairment. However, when we also consider near visual impairment, it is clear that even more people are affected. From research it was estimated that the number of people with vision impairment due to uncorrected distance refractive error was 107.8 million, and the number of people affected by uncorrected near refractive error was 517 million, giving a total of 624.8 million people.

  1. Comprehensive Error Rate Testing (CERT)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) implemented the Comprehensive Error Rate Testing (CERT) program to measure improper payments in the Medicare...

  2. Errors in laparoscopic surgery: what surgeons should know.

    Science.gov (United States)

    Galleano, R; Franceschi, A; Ciciliot, M; Falchero, F; Cuschieri, A

    2011-04-01

    Some two decades after its introduction, minimal access surgery (MAS) is still evolving. Undoubtedly, its significant uptake worldwide is due to its clinical benefits to patient outcome. These benefits include reduced traumatic insult, reduction of pain, earlier return to bowel function, decreased disability, shorter hospitalization and better cosmetic results. Nonetheless, complications due to the laparoscopic approach are not rare, as documented by several studies on task-specific or procedure-related MAS morbidity. In all these instances, error analysis research has demonstrated that an understanding of the underlying causes of these complications requires a comprehensive approach addressing the entire system related to the procedure for identification and characterization of the errors ultimately responsible for the morbidity. The present review covers the definition, taxonomy and incidence of errors in medicine with special reference to MAS. In addition, possible root causes of adverse events in laparoscopy are explored and existing methods to study errors are reviewed. Finally, specific areas requiring further human factors research to enhance the safety of patients undergoing laparoscopic operations are identified. The hope is that awareness of the causes and mechanisms of errors may reduce the incidence of errors in clinical practice for the final benefit of the patients.

  3. About the ZOOM minimization package

    Energy Technology Data Exchange (ETDEWEB)

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  4. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product type codes that cover a single OTN frame or a small number of such frames. In particular we argue that a three-error correcting BCH is the best choice for the component code in such systems.

  5. Errors in Chemical Sensor Measurements

    Directory of Open Access Journals (Sweden)

    Artur Dybko

    2001-06-01

    Full Text Available Various types of errors during the measurements of ion-selective electrodes, ion-sensitive field effect transistors, and fibre optic chemical sensors are described. The errors were divided according to their nature and place of origin into chemical, instrumental and non-chemical. The influence of interfering ions, leakage of the membrane components, liquid junction potential as well as sensor wiring, ambient light and temperature is presented.

  6. Quantum error correction for beginners

    Science.gov (United States)

    Devitt, Simon J.; Munro, William J.; Nemoto, Kae

    2013-07-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation is now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future.
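
    In the spirit of the review's example-driven introduction, the sketch below simulates the simplest ingredient of quantum error correction, the three-qubit bit-flip repetition code viewed classically: each bit is copied three times, independent bit flips are applied, and majority voting fails only when two or more copies flip, giving a logical error rate of 3p^2 - 2p^3. The error probabilities and trial count are arbitrary.

      import numpy as np

      def logical_error_rate(p, trials=200_000, rng=np.random.default_rng(2)):
          """Monte Carlo estimate of the logical error rate of the 3-bit repetition code
          under independent bit-flip noise with physical error probability p."""
          flips = rng.random((trials, 3)) < p        # which of the 3 copies flipped
          # majority vote fails when 2 or 3 copies are flipped
          return np.mean(flips.sum(axis=1) >= 2)

      for p in (0.01, 0.05, 0.1):
          print(p, logical_error_rate(p), 3 * p**2 - 2 * p**3)   # simulation vs analytic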

  7. Medical Error and Moral Luck.

    Science.gov (United States)

    Hubbeling, Dieneke

    2016-09-01

    This paper addresses the concept of moral luck. Moral luck is discussed in the context of medical error, especially an error of omission that occurs frequently, but only rarely has adverse consequences. As an example, a failure to compare the label on a syringe with the drug chart results in the wrong medication being administered and the patient dies. However, this error may have previously occurred many times with no tragic consequences. Discussions on moral luck can highlight conflicting intuitions. Should perpetrators receive a harsher punishment because of an adverse outcome, or should they be dealt with in the same way as colleagues who have acted similarly, but with no adverse effects? An additional element to the discussion, specifically with medical errors, is that according to the evidence currently available, punishing individual practitioners does not seem to be effective in preventing future errors. The following discussion, using relevant philosophical and empirical evidence, posits a possible solution for the moral luck conundrum in the context of medical error: namely, making a distinction between the duty to make amends and assigning blame. Blame should be assigned on the basis of actual behavior, while the duty to make amends is dependent on the outcome.

  8. Error image aware content restoration

    Science.gov (United States)

    Choi, Sungwoo; Lee, Moonsik; Jung, Byunghee

    2015-12-01

    As the resolution of TV significantly increased, content consumers have become increasingly sensitive to the subtlest defect in TV contents. This rising standard in quality demanded by consumers has posed a new challenge in today's context where the tape-based process has transitioned to the file-based process: the transition necessitated digitalizing old archives, a process which inevitably produces errors such as disordered pixel blocks, scattered white noise, or totally missing pixels. Unsurprisingly, detecting and fixing such errors require a substantial amount of time and human labor to meet the standard demanded by today's consumers. In this paper, we introduce a novel, automated error restoration algorithm which can be applied to different types of classic errors by utilizing adjacent images while preserving the undamaged parts of an error image as much as possible. We tested our method on error images detected from our quality check system in the KBS (Korean Broadcasting System) video archive. We are also implementing the algorithm as a plugin of a well-known NLE (non-linear editing) system, which is a familiar tool for quality control agents.
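
    The KBS algorithm is not described in enough detail to reproduce, so the sketch below is a hedged, minimal version of the stated idea: repair only the flagged error pixels of a frame using the pixel-wise temporal median of adjacent frames, leaving undamaged pixels untouched. The frames and the error mask are assumed to be provided by an upstream detector.

      import numpy as np

      def restore_from_neighbours(frame, prev_frame, next_frame, error_mask):
          """Replace only the flagged pixels of 'frame' with the median of the
          corresponding pixels in the adjacent frames; all other pixels are preserved.

          frame, prev_frame, next_frame : (H, W) or (H, W, C) arrays of the same shape
          error_mask                    : boolean (H, W) array, True where pixels are damaged
          """
          stacked = np.stack([prev_frame, frame, next_frame], axis=0)
          candidate = np.median(stacked, axis=0)          # temporal median, robust to one outlier
          restored = frame.copy()
          restored[error_mask] = candidate[error_mask]
          return restored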

  9. The effect of aging on the brain network for exception word reading.

    Science.gov (United States)

    Provost, Jean-Sebastien; Brambati, Simona M; Chapleau, Marianne; Wilson, Maximiliano A

    2016-11-01

    Cognitive and computational models of reading aloud agree on the existence of two procedures for reading. Pseudowords (e.g., atendier) are correctly read through subword processes only while exception words (e.g., pint) are only correctly read via whole-words processes. Regular words can be correctly read by means of either way. Previous behavioral studies showed that older adults relied more on whole-word processing for reading. The aim of the present fMRI study was to verify whether this larger whole-word reliance for reading in older adults was reflected by changes in the pattern of brain activation. Both young and elderly participants read aloud pseudowords, exception and regular words in the scanner. Behavioral results reproduced those of previous studies showing that older adults made significantly less errors when reading exception words. Neuroimaging results showed significant activation of the left anterior temporal lobe (ATL), a key region implicated in whole-word reading for exception word reading in both young and elderly participants. Critically, ATL activation was also found for regular word reading in the elderly. No differences were observed in the pattern of activation between regular and pseudowords in the young. In conclusion, these results extend evidence on the critical role of the left ATL for exception word reading to elderly participants. Additionally, our study shows for the first time from a developmental point of view that the behavioral changes found in reading during normal aging also have a brain counterpart in the reading network changes that sustain exception and regular word reading in the elderly. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Grammatical Error Analysis in Recount Text Made by the Students of Cokroaminoto University of Palopo

    Directory of Open Access Journals (Sweden)

    Hermini Hermini

    2014-02-01

    Full Text Available This study aimed to find out (1) grammatical errors in recount text made by the English Department students of the second and the sixth semester of Cokroaminoto University of Palopo, (2) the frequent grammatical errors made by the second and the sixth semester students of the English Department, and (3) the difference of grammatical errors made by the second and the sixth semester students. The sample of the study was 723 sentences made by 30 students of the second semester and 30 students of the sixth semester in academic year 2013/2014, taken by cluster random sampling technique. The sentences were 337 (46.61%) simple sentences, 83 (11.48%) compound sentences, 218 (30.15%) complex sentences, and 85 (11.76%) compound-complex sentences. The data were collected by using two kinds of instruments, namely a writing test to find the students' grammatical errors and a questionnaire to find the solution to prevent or minimize errors. Data on the students' errors were analyzed by using descriptive statistics. The results of the study showed that the students made 832 errors classified into 13 types of errors, which consisted of 140 (16.82%) errors in production of verb, 110 (13.22%) errors in preposition, 106 (12.74%) errors in distribution of verb, 98 (11.77%) miscellaneous errors, 82 (9.85%) errors in missing subject, 67 (8.05%) errors in part of speech, 61 (7.33%) errors in irregular verbs, 58 (6.97%) other errors in verb groups, 52 (6.25%) errors in the use of article, 24 (2.88%) errors in gerund, 18 (2.16%) errors in infinitive, 11 (1.32%) errors in pronoun/case, and 5 (0.6%) errors in questions. The top six frequent grammatical errors made by the students were production of verb group, preposition, distribution of verb group, miscellaneous errors, missing subject, and part of speech. The difference between the two groups was the frequency of committing errors such as part of speech, irregular verb, infinitive verbs, and other errors in verb.

  11. A Six Sigma Trial For Reduction of Error Rates in Pathology Laboratory.

    Science.gov (United States)

    Tosuner, Zeynep; Gücin, Zühal; Kiran, Tuğçe; Büyükpinarbaşili, Nur; Turna, Seval; Taşkiran, Olcay; Arici, Dilek Sema

    2016-01-01

    A major target of quality assurance is the minimization of error rates in order to enhance patient safety. Six Sigma is a method targeting zero error (3.4 errors per million events) used in industry. The five main principles of Six Sigma are defining, measuring, analysis, improvement and control. Using this methodology, the causes of errors can be examined and process improvement strategies can be identified. The aim of our study was to evaluate the utility of Six Sigma methodology in error reduction in our pathology laboratory. The errors encountered between April 2014 and April 2015 were recorded by the pathology personnel. Error follow-up forms were examined by the quality control supervisor, administrative supervisor and the head of the department. Using Six Sigma methodology, the rate of errors was measured monthly and the distribution of errors at the preanalytic, analytic and postanalytical phases was analysed. Improvement strategies were reclaimed in the monthly intradepartmental meetings and the control of the units with high error rates was provided. Fifty-six (52.4%) of 107 recorded errors in total were at the pre-analytic phase. Forty-five errors (42%) were recorded as analytical and 6 errors (5.6%) as post-analytical. Two of the 45 errors were major irrevocable errors. The error rate was 6.8 per million in the first half of the year and 1.3 per million in the second half, decreasing by 79.77%. The Six Sigma trial in our pathology laboratory provided the reduction of the error rates mainly in the pre-analytic and analytic phases.

  12. Medication Distribution in Hospital: Errors Observed X Errors Perceived

    Directory of Open Access Journals (Sweden)

    G. N. Silva

    2013-07-01

    Full Text Available Abstract: The aim of the present study was to compare errors committed in the distribution of medications at a hospital pharmacy with those perceived by staff members involved in the distribution process. Medications distributed to the medical and surgical wards were analyzed. The drugs were dispensed in individualized doses per patient, separated by administration time in boxes or plastic bags for 24 hours of care and using the carbon copy of the prescription. Nineteen staff members involved in the drug-dispensing process were also interviewed. In the observation phase, 1963 drugs dispensed in 259 prescriptions were analyzed, with a total of 61 dispensing errors (3.2% of the medications). The most frequent errors were omission of the prescribed medication (23%) and distribution of non-prescribed medication (14.8%). In the interviews, the main errors perceived by the staff were medications dispensed at a concentration other than that prescribed (22%) and the distribution of non-prescribed medication or medication different from that prescribed (20%). Differences were found between the most frequent errors observed and those reported by staff members. Nonetheless, the views of the staff proved coherent with the literature on this issue. Keywords: medication errors, hospital medication system.

  13. Anisotropic mesh adaptation for solution of finite element problems using hierarchical edge-based error estimates

    Energy Technology Data Exchange (ETDEWEB)

    Lipnikov, Konstantin [Los Alamos National Laboratory; Agouzal, Abdellatif [UNIV DE LYON; Vassilevski, Yuri [Los Alamos National Laboratory

    2009-01-01

    We present a new technology for generating meshes minimizing the interpolation and discretization errors or their gradients. The key element of this methodology is construction of a space metric from edge-based error estimates. For a mesh with $N_h$ triangles, the error is proportional to $N_h^{-1}$ and the gradient of error is proportional to $N_h^{-1/2}$, which are optimal asymptotics. The methodology is verified with numerical experiments.

  14. Accidental iatrogenic intoxications by cytotoxic drugs: error analysis and practical preventive strategies.

    Science.gov (United States)

    Zernikow, B; Michel, E; Fleischhack, G; Bode, U

    1999-07-01

    Drug errors are quite common. Many of them become harmful only if they remain undetected, ultimately resulting in injury to the patient. Errors with cytotoxic drugs are especially dangerous because of the highly toxic potential of the drugs involved. For medico-legal reasons, only 1 case of accidental iatrogenic intoxication by cytotoxic drugs tends to be investigated at a time, because the focus is placed on individual responsibility rather than on system errors. The aim of our study was to investigate whether accidental iatrogenic intoxications by cytotoxic drugs are faults of either the individual or the system. The statistical analysis of distribution and quality of such errors, and the in-depth analysis of contributing factors delivered a rational basis for the development of practical preventive strategies. A total of 134 cases of accidental iatrogenic intoxication by a cytotoxic drug (from literature reports since 1966 identified by an electronic literature survey, as well as our own unpublished cases) underwent a systematic error analysis based on a 2-dimensional model of error generation. Incidents were classified by error characteristics and point in time of occurrence, and their distribution was statistically evaluated. The theories of error research, informatics, sensory physiology, cognitive psychology, occupational medicine and management have helped to classify and depict potential sources of error as well as reveal clues for error prevention. Monocausal errors were the exception. In the majority of cases, a confluence of unfavourable circumstances either brought about the error, or prevented its timely interception. Most cases with a fatal outcome involved erroneous drug administration. Object-inherent factors were the predominant causes. A lack of expert as well as general knowledge was a contributing element. In error detection and prevention of error sequelae, supervision and back-checking are essential. Improvement of both the individual

  15. Generalized Punctured Convolutional Codes with Unequal Error Protection

    Directory of Open Access Journals (Sweden)

    Marcelo Eduardo Pellenz

    2009-01-01

    Full Text Available We conduct a code search restricted to the recently introduced class of generalized punctured convolutional codes (GPCCs) to find good unequal error protection (UEP) convolutional codes for a prescribed minimal trellis complexity. The trellis complexity is taken to be the number of symbols per information bit in the “minimal” trellis module for the code. The GPCC class has been shown to possess codes with good distance properties under this decoding complexity measure. New good UEP convolutional codes and their respective effective free distances are tabulated for a variety of code rates and “minimal” trellis complexities. These codes can be used in several applications that require different levels of protection for their bits, such as the hierarchical digital transmission of video or images.

  16. Laparoscopic colonic resection in inflammatory bowel disease: minimal surgery, minimal access and minimal hospital stay.

    LENUS (Irish Health Repository)

    Boyle, E

    2008-11-01

    Laparoscopic surgery for inflammatory bowel disease (IBD) is technically demanding but can offer improved short-term outcomes. The introduction of minimally invasive surgery (MIS) as the default operative approach for IBD, however, may have inherent learning curve-associated disadvantages. We hypothesise that the establishment of MIS as the standard operative approach does not increase patient morbidity as assessed in the initial period of its introduction into a specialised unit, and that it confers earlier postoperative gastrointestinal recovery and reduced hospitalisation compared with conventional open resection.

  17. Predictors of Errors of Novice Java Programmers

    Science.gov (United States)

    Bringula, Rex P.; Manabat, Geecee Maybelline A.; Tolentino, Miguel Angelo A.; Torres, Edmon L.

    2012-01-01

    This descriptive study determined which of the sources of errors would predict the errors committed by novice Java programmers. Descriptive statistics revealed that the respondents perceived that they committed the identified eighteen errors infrequently. Thought error was perceived to be the main source of error during the laboratory programming…

  18. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    Science.gov (United States)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, which allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from the knowledge of error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. Any adaptation of the quantum error correction code or its implementation circuit is not required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. surface code. A Gaussian processes algorithm is used to estimate and predict error rates based on error correction data in the past. We find that using these estimated error rates, the probability of error correction failures can be significantly reduced by a factor increasing with the code distance.
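
    The syndrome-based estimator behind the protocol is not spelled out in the abstract; the sketch below only illustrates the final step it describes, fitting a Gaussian process to past error-rate estimates and predicting the current rate with an uncertainty band, using scikit-learn. The drift model, kernel and data are invented.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(3)
      t = np.linspace(0.0, 10.0, 60)[:, None]           # times of past error-rate estimates
      true_rate = 0.01 + 0.004 * np.sin(0.6 * t[:, 0])  # slowly drifting physical error rate
      observed = true_rate + 0.001 * rng.normal(size=t.shape[0])   # noisy estimates

      kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=1e-6)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, observed)

      t_now = np.array([[10.5]])                        # predict slightly ahead of the data
      mean, std = gp.predict(t_now, return_std=True)
      print(f"predicted error rate: {mean[0]:.4f} +/- {std[0]:.4f}")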

  19. Making operations on standard-library containers strongly exception safe

    DEFF Research Database (Denmark)

    Katajainen, Jyrki

    2007-01-01

    An operation on an element container is said to provide a strong guarantee of exception safety if, in case an exception is thrown, the operation leaves the container in the state in which it was before the operation. In this paper, we explore how to adjust operations on C++ standard-library containers to provide the strong guarantee of exception safety, instead of the default guarantee, without violating the stringent performance requirements specified in the C++ standard. In particular, we show that every strongly exception-safe operation on dynamic arrays and ordered dictionaries is only

  20. American Exceptionalism: Essential Context for National Security Strategy Development

    National Research Council Canada - National Science Library

    McNevin, David T

    2006-01-01

    ...." However, there are key components of American national character that give rise to the notion of "American exceptionalism" that are evident when one examines American historical political thought...

  1. Modelo de error en imágenes comprimidas con wavelets Error Model in Wavelet-compressed Images

    Directory of Open Access Journals (Sweden)

    Gloria Puetamán G.

    2007-06-01

    Full Text Available In this paper we study image compression as a way to compare the Wavelet and Fourier models, by minimizing the error function. The particular problem we consider is to determine a basis {ei} that minimizes the error function between the original image and the one recovered after compression. It should be noted that there are many applications, for example in medicine or astronomy, where no deterioration of the image is acceptable, because all the information contained in it, even what might be regarded as noise, is considered essential.
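
    The article's wavelet-versus-Fourier comparison is not reproduced here. As a hedged illustration of the error-minimization criterion in the simplest orthonormal setting, the sketch below keeps the k largest-magnitude coefficients of an orthonormal discrete Fourier transform, which minimizes the L2 reconstruction error for that coefficient budget; the test signal and budgets are arbitrary.

      import numpy as np

      def compress_keep_largest(signal, k):
          """Keep the k largest-magnitude coefficients of an orthonormal DFT and invert.

          For an orthonormal transform, the L2 reconstruction error equals the L2 norm
          of the discarded coefficients, so keeping the largest ones minimizes it.
          """
          coeffs = np.fft.fft(signal, norm="ortho")
          idx = np.argsort(np.abs(coeffs))[:-k]     # indices of all but the k largest
          coeffs[idx] = 0.0
          return np.real(np.fft.ifft(coeffs, norm="ortho"))

      x = np.linspace(0, 1, 512, endpoint=False)
      signal = np.sin(2 * np.pi * 5 * x) + 0.3 * np.sign(np.sin(2 * np.pi * 13 * x))
      for k in (8, 32, 128):
          err = np.linalg.norm(signal - compress_keep_largest(signal, k))
          print(k, round(float(err), 4))            # error shrinks as the budget k grows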

  2. Minimally invasive spine stabilisation with long implants

    Science.gov (United States)

    Logroscino, Carlo Ambrogio; Proietti, Luca

    2009-01-01

    Originally aimed at treating degenerative syndromes of the lumbar spine, percutaneous minimally invasive posterior fixation is nowadays even more frequently used to treat some thoracolumbar fractures. According to the modern principles of saving segment of motion, a short implant (one level above and one level below the injured vertebra) is generally used to stabilise the injured spine. Although the authors generally use a short percutaneous fixation in treating thoracolumbar fractures with good results, they observed some cases in which the high fragmentation of the vertebral body and the presence of other associated diseases (co-morbidities) did not recommend the use of a short construct. The authors identified nine cases, in which a long implant (two levels above and two levels below the injured vertebra) was performed by a percutaneous minimally invasive approach. Seven patients (five males/two females) were affected by thoracolumbar fractures. T12 vertebra was involved in three cases, L1 in two cases, T10 and L2 in one case, respectively. Two fractures were classified as type A 3.1, two as A 3.2, two as A 3.3 and one as B 2.3, according to Magerl. In the present series, there were also two patients affected by a severe osteolysis of the spine (T9 and T12) due to tumoral localisation. All patients operated on with long instrumentation had a good outcome with prompt and uneventful clinical recovery. At the 1-year follow-up, all patients except one, who died 11 months after the operation, did not show any radiologic signs of mobilisation or failure of the implant. Based on the results of the present series, the long percutaneous fixation seems to represent an effective and safe system to treat particular cases of vertebral lesions. In conclusion, the authors believe that a long implant might be an alternative surgical method compared to more aggressive or demanding procedures, which in a few patients could represent an overtreatment. PMID:19399530

  3. The minimal flavour violating axion

    Science.gov (United States)

    Arias-Aragón, F.; Merlo, L.

    2017-10-01

    The solution to the Strong CP problem is analysed within the Minimal Flavour Violation (MFV) context. An Abelian factor of the complete flavour symmetry of the fermionic kinetic terms may play the role of the Peccei-Quinn symmetry in traditional axion models. Its spontaneous breaking, due to the addition of a complex scalar field to the Standard Model scalar spectrum, generates the MFV axion, which may redefine away the QCD theta parameter. It differs from the traditional QCD axion in its couplings, which are governed by the fermion charges under the axial Abelian symmetry. It is also distinct from the so-called Axiflavon, as the MFV axion does not describe flavour violation, while it does induce flavour non-universality effects. The MFV axion phenomenology is discussed considering astrophysical, collider and flavour data.

  4. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    Science.gov (United States)

    Tabata, Minoru

    2016-03-01

    Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, a right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as a shorter length of stay, less blood transfusion, and better cosmesis. However, it is also known to have disadvantages such as longer cardiopulmonary bypass and aortic cross-clamp times, and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR.

  5. Strategies to Minimize Antibiotic Resistance

    Directory of Open Access Journals (Sweden)

    Sang Hee Lee

    2013-09-01

    Full Text Available Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics.

  6. Strategies to minimize antibiotic resistance.

    Science.gov (United States)

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-09-12

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics.

  7. Orbit IMU alignment: Error analysis

    Science.gov (United States)

    Corson, R. W.

    1980-01-01

    A comprehensive accuracy analysis of orbit inertial measurement unit (IMU) alignments using the shuttle star trackers was completed and the results are presented. Monte Carlo techniques were used in a computer simulation of the IMU alignment hardware and software systems to: (1) determine the expected Space Transportation System 1 Flight (STS-1) manual mode IMU alignment accuracy; (2) investigate the accuracy of alignments in later shuttle flights when the automatic mode of star acquisition may be used; and (3) verify that an analytical model previously used for estimating the alignment error is a valid model. The analysis results do not differ significantly from expectations. The standard deviation in the IMU alignment error for STS-1 alignments was determined to be 68 arc seconds per axis. This corresponds to a 99.7% probability that the magnitude of the total alignment error is less than 258 arc seconds.
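
    The quoted relation between the 68 arc-second per-axis standard deviation and the 258 arc-second 99.7% bound on the total error magnitude can be checked with a small Monte Carlo experiment. The sketch below is illustrative only; it assumes independent, zero-mean Gaussian per-axis errors, which is our assumption and not a statement about the original simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_axis = 68.0            # per-axis alignment error, arc seconds (from the abstract)
n_trials = 200_000

# Assumed model: independent zero-mean Gaussian errors on the three IMU axes.
errors = rng.normal(0.0, sigma_axis, size=(n_trials, 3))
magnitude = np.linalg.norm(errors, axis=1)

p997 = np.percentile(magnitude, 99.7)
print(f"99.7th percentile of total alignment error: {p997:.0f} arc seconds")
# Under these assumptions this lands near the 258 arc seconds quoted in the abstract.
```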

  8. Negligence, genuine error, and litigation

    Directory of Open Access Journals (Sweden)

    Sohn DH

    2013-02-01

    Full Text Available David H Sohn, Department of Orthopedic Surgery, University of Toledo Medical Center, Toledo, OH, USA. Abstract: Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine or of system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment into more rational oversight systems, such as health courts or no-fault systems, may reap both quantitative and qualitative benefits for a less costly and safer health system. Keywords: medical malpractice, tort reform, no fault compensation, alternative dispute resolution, system errors

  9. Medication errors in pediatric inpatients

    DEFF Research Database (Denmark)

    Rishoej, Rikke Mie; Almarsdóttir, Anna Birna; Christesen, Henrik Thybo

    2017-01-01

    The aim was to describe medication errors (MEs) in hospitalized children reported to the national mandatory reporting and learning system, the Danish Patient Safety Database (DPSD). MEs were extracted from DPSD from the 5-year period of 2010-2014. We included reports from public hospitals...... on patients aged 0-17 years and categorized by reporters as medication-related. Reports from psychiatric wards and outpatient clinics were excluded. A ME was defined as any medication-related error occurring in the medication process whether harmful or not. MEs were categorized as harmful if they resulted...... in actual harm or interventions to prevent harm. MEs were further categorized according to occurrence in the medication process, type of error, and the medicines involved. A total of 2071 MEs including 487 harmful MEs were identified. Most MEs occurred during prescribing (40.8%), followed by dispensing (38...

  10. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.

  11. A Hybrid Unequal Error Protection / Unequal Error Resilience ...

    African Journals Online (AJOL)

    admpather

    Following that Unequal Error Protection against channel noise is provided to the layers by the use of ... of the Peak to Peak Signal to Noise power Ratio (PSNR) and the Mean Structural Similarity. Index (MSSIM) metric. Keywords: ..... compensation. IEEE Trans. on Circuits and Systems for Video Technology. pp. 438–452.

  12. Student Errors in Fractions and Possible Causes of These Errors

    Science.gov (United States)

    Aksoy, Nuri Can; Yazlik, Derya Ozlem

    2017-01-01

    In this study, it was aimed to determine the errors and misunderstandings of 5th and 6th grade middle school students in fractions and operations with fractions. For this purpose, the case study model, which is a qualitative research design, was used in the research. In the study, maximum diversity sampling, which is a purposeful sampling method,…

  13. Medication errors: definitions and classification

    Science.gov (United States)

    Aronson, Jeffrey K

    2009-01-01

    To understand medication errors and to identify preventive strategies, we need to classify them and define the terms that describe them. The four main approaches to defining technical terms consider etymology, usage, previous definitions, and the Ramsey–Lewis method (based on an understanding of theory and practice). A medication error is ‘a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient’. Prescribing faults, a subset of medication errors, should be distinguished from prescription errors. A prescribing fault is ‘a failure in the prescribing [decision-making] process that leads to, or has the potential to lead to, harm to the patient’. The converse of this, ‘balanced prescribing’ is ‘the use of a medicine that is appropriate to the patient's condition and, within the limits created by the uncertainty that attends therapeutic decisions, in a dosage regimen that optimizes the balance of benefit to harm’. This excludes all forms of prescribing faults, such as irrational, inappropriate, and ineffective prescribing, underprescribing and overprescribing. A prescription error is ‘a failure in the prescription writing process that results in a wrong instruction about one or more of the normal features of a prescription’. The ‘normal features’ include the identity of the recipient, the identity of the drug, the formulation, dose, route, timing, frequency, and duration of administration. Medication errors can be classified, invoking psychological theory, as knowledge-based mistakes, rule-based mistakes, action-based slips, and memory-based lapses. This classification informs preventive strategies. PMID:19594526

  14. Analysis of Medication Error Reports

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Young, Jonathan; Santell, John; Hicks, Rodney; Posse, Christian; Fecht, Barbara A.

    2004-11-15

    In medicine, as in many areas of research, technological innovation and the shift from paper based information to electronic records has created a climate of ever increasing availability of raw data. There has been, however, a corresponding lag in our abilities to analyze this overwhelming mass of data, and classic forms of statistical analysis may not allow researchers to interact with data in the most productive way. This is true in the emerging area of patient safety improvement. Traditionally, a majority of the analysis of error and incident reports has been carried out based on an approach of data comparison, and starts with a specific question which needs to be answered. Newer data analysis tools have been developed which allow the researcher to not only ask specific questions but also to “mine” data: approach an area of interest without preconceived questions, and explore the information dynamically, allowing questions to be formulated based on patterns brought up by the data itself. Since 1991, United States Pharmacopeia (USP) has been collecting data on medication errors through voluntary reporting programs. USP’s MEDMARXsm reporting program is the largest national medication error database and currently contains well over 600,000 records. Traditionally, USP has conducted an annual quantitative analysis of data derived from “pick-lists” (i.e., items selected from a list of items) without an in-depth analysis of free-text fields. In this paper, the application of text analysis and data analysis tools used by Battelle to analyze the medication error reports already analyzed in the traditional way by USP is described. New insights and findings were revealed including the value of language normalization and the distribution of error incidents by day of the week. The motivation for this effort is to gain additional insight into the nature of medication errors to support improvements in medication safety.

  15. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Full Text Available Simulation experiments performed while solving multidisciplinary engineering and scientific problems require joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions, while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers execution control and data exchange rules that should be imposed by the integration environment in the case of an error encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement for the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate result data. Error handling rules are formulated on the basic pattern level and on the level of a composite task that can combine several basic patterns as next-level subtasks. The cases where workflow behavior may differ, depending on the user's purposes, when an error takes place, and the possible error handling options that can be specified by the user are also noted in the work.
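
    To make the error-handling options discussed in this abstract concrete, the following hypothetical Python sketch (step names and policies are invented, not taken from the paper) wraps each tool invocation in an integration workflow and applies a per-step policy when a tool fails: abort the workflow, retry the step, or skip it and propagate a missing-result marker.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical per-step error-handling policies.
ABORT, RETRY, SKIP = "abort", "retry", "skip"

class WorkflowAborted(Exception):
    """Raised when a failed step invalidates the rest of the workflow."""

@dataclass
class Step:
    name: str
    tool: Callable[[Optional[float]], float]   # the integrated software tool
    on_error: str = ABORT                      # policy chosen by the user
    retries: int = 2

def run_step(step: Step, data: Optional[float]) -> Optional[float]:
    attempts = 1 + (step.retries if step.on_error == RETRY else 0)
    for attempt in range(attempts):
        try:
            return step.tool(data)             # normal data flow
        except Exception as exc:               # "abnormal behaviour" of the tool
            print(f"{step.name}: error on attempt {attempt + 1}: {exc}")
    if step.on_error == SKIP:
        return None                            # missing intermediate result, flow continues
    raise WorkflowAborted(step.name)

def run_workflow(steps, data=None):
    for step in steps:
        data = run_step(step, data)
    return data

# Toy composition of weakly coupled subtasks.
steps = [
    Step("preprocess", lambda x: 1.0),
    Step("solver", lambda x: 1.0 / 0.0, on_error=SKIP),                # always fails, result skipped
    Step("postprocess", lambda x: (x if x is not None else 0.0) + 1.0),
]
try:
    print("result:", run_workflow(steps))
except WorkflowAborted as failed:
    print("workflow aborted at step:", failed)
```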

  16. Spent fuel bundle counter sequence error manual - KANUPP (125 MW) NGS

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, L.E

    1992-03-20

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message may contain adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously.

  17. Spent fuel bundle counter sequence error manual - RAPPS (200 MW) NGS

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, L.E

    1992-03-24

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously.

  18. 45 CFR 670.6 - Prior possession exception.

    Science.gov (United States)

    2010-10-01

    ... CONSERVATION OF ANTARCTIC ANIMALS AND PLANTS Prohibited Acts, Exceptions § 670.6 Prior possession exception. (a... captivity on or before October 28, 1978; or (2) Any offspring of such mammal, bird, or plant. (b... such act was not held in captivity on or before October 28, 1978, or was not an offspring referred to...

  19. 46 CFR 150.150 - Exceptions to the compatibility chart.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Exceptions to the compatibility chart. 150.150 Section... CARGOES COMPATIBILITY OF CARGOES § 150.150 Exceptions to the compatibility chart. The Commandant (G-MSO... 1, the Compatibility Chart. ...

  20. 34 CFR 300.220 - Exception for prior local plans.

    Science.gov (United States)

    2010-07-01

    ... CHILDREN WITH DISABILITIES Local Educational Agency Eligibility § 300.220 Exception for prior local plans... 34 Education 2 2010-07-01 2010-07-01 false Exception for prior local plans. 300.220 Section 300... effective date of the Individuals with Disabilities Education Improvement Act of 2004, the applicable...

  1. 75 FR 82410 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2010-12-30

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Education... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Albuquerque, New... Act of 2004 (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on...

  2. 77 FR 28897 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2012-05-16

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Albuquerque, New... Act of 2004 (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on...

  3. 77 FR 47873 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2012-08-10

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Washington, DC The... (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on Thursday, September...

  4. 75 FR 50780 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2010-08-17

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Washington, DC. The... (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on Thursday, September...

  5. 76 FR 40929 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2011-07-12

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Tampa, Florida. The... (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on Sunday, September...

  6. 77 FR 16062 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2012-03-19

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Albuquerque, New... Act of 2004 (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on...

  7. 78 FR 42105 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2013-07-15

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Albuquerque, New... Act of 2004 (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on...

  8. 76 FR 17965 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2011-03-31

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Riverside... Act of 2004 (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on...

  9. 75 FR 23288 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2010-05-03

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Bureau of Indian Education (BIE) is announcing that the Advisory Board for Exceptional Children (Advisory... of the Individuals with Disabilities Education Act of 2004 (IDEA) for Indian children with...

  10. 77 FR 70807 - Advisory Board for Exceptional Children

    Science.gov (United States)

    2012-11-27

    ... Bureau of Indian Affairs Advisory Board for Exceptional Children AGENCY: Bureau of Indian Affairs... Advisory Board for Exceptional Children (Advisory Board) will hold its next meeting in Washington, DC. The... (IDEA) for Indian children with disabilities. DATES: The Advisory Board will meet on Thursday, January...

  11. 29 CFR 1606.3 - The national security exception.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false The national security exception. 1606.3 Section 1606.3... DISCRIMINATION BECAUSE OF NATIONAL ORIGIN § 1606.3 The national security exception. It is not an unlawful employment practice to deny employment opportunities to any individual who does not fulfill the national...

  12. Designing Inclusive Learning for Twice Exceptional Students in Minecraft

    Science.gov (United States)

    O'Sullivan, Muireann; Robb, Nigel; Howell, Stephen; Marshall, Kevin; Goodman, Lizbeth

    2017-01-01

    Twice exceptional learners are intellectually or creatively gifted yet also experience one or more learning difficulties. These students face a unique set of challenges in educational settings. Recommended strategies for accommodating twice exceptional learners focus on--among other things--(1) providing freedom and variety, so that students can…

  13. 49 CFR 173.156 - Exceptions for ORM materials.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Exceptions for ORM materials. 173.156 Section 173... for Hazardous Materials Other Than Class 1 and Class 7 § 173.156 Exceptions for ORM materials. (a... this part. (b) ORM-D. Packagings for ORM-D materials are specified according to hazard class in §§ 173...

  14. 18 CFR 1312.5 - Permit requirements and exceptions.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Permit requirements and exceptions. 1312.5 Section 1312.5 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY PROTECTION OF ARCHAEOLOGICAL RESOURCES: UNIFORM REGULATIONS § 1312.5 Permit requirements and exceptions. (a...

  15. Analysis for detecting and explaining exceptions in business data

    NARCIS (Netherlands)

    Liu, L.; Daniëls, H.A.M.; Lux Wigand, Dianne; et al,

    2013-01-01

    In this paper we describe the concepts of automatic analysis for the exceptional patterns which are hidden in a large set of business data. These exceptions are interesting to be investigated further for their causes and explanations, as they provide important decision support. The analysis process

  16. 42 CFR 413.184 - Payment exception: Pediatric patient mix.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Payment exception: Pediatric patient mix. 413.184 Section 413.184 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN...-Stage Renal Disease (ESRD) Services and Organ Procurement Costs § 413.184 Payment exception: Pediatric...

  17. An Approach for Search Based Testing of Null Pointer Exceptions

    NARCIS (Netherlands)

    Romano, D.; Di Penta, M.; Antoniol, G.

    2011-01-01

    Uncaught exceptions, and in particular null pointer exceptions (NPEs), constitute a major cause of crashes for software systems. Although tools for the static identification of potential NPEs exist, there is need for proper approaches able to identify system execution scenarios causing NPEs. This

  18. 49 CFR 173.310 - Exceptions for radiation detectors.

    Science.gov (United States)

    2010-10-01

    ...-GENERAL REQUIREMENTS FOR SHIPMENTS AND PACKAGINGS Gases; Preparation and Packaging § 173.310 Exceptions... from the specification packaging in this subchapter and, except when transported by air, from labeling... fragment upon impact. (b) Radiation detectors must not have a design pressure exceeding 4.83 MPa (700 psig...

  19. Adversity and Pitfalls of Twice-Exceptional Urban Learners

    Science.gov (United States)

    Mayes, Renae D.; Moore, James L., III.

    2016-01-01

    Current research provides unique insights into the experiences and context of twice-exceptional students in K-12 schools. However, within this literature, a critical gap exists concerning the voices of twice-exceptional African American students and their families. The current qualitative study examined the perceptions, attitudes, and experiences…

  20. Cultural Considerations for Twice-Exceptional Children from Asian Families

    Science.gov (United States)

    Park, Soeun

    2015-01-01

    Since the term twice-exceptional entered the field of gifted education, many studies have investigated the population of students who possess both giftedness and disabilities. It has been shown that there are challenges to recognizing twice-exceptional children due to current screening and identification processes. For this reason,…

  1. 14 CFR 399.21 - Charter exemptions (except military).

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Charter exemptions (except military). 399.21 Section 399.21 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION... Authority § 399.21 Charter exemptions (except military). In deciding applications for exemptions from...

  2. 19 CFR 134.32 - General exceptions to marking requirements.

    Science.gov (United States)

    2010-04-01

    ...; DEPARTMENT OF THE TREASURY COUNTRY OF ORIGIN MARKING Exceptions to Marking Requirements § 134.32 General exceptions to marking requirements. The articles described or meeting the specified conditions set forth...): (a) Articles that are incapable of being marked; (b) Articles that cannot be marked prior to shipment...

  3. 12 CFR 313.54 - Exception to due process procedures.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Exception to due process procedures. 313.54 Section 313.54 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION PROCEDURE AND RULES OF PRACTICE PROCEDURES FOR CORPORATE DEBT COLLECTION Salary Offset § 313.54 Exception to due process procedures. (a) The...

  4. Students' Understanding of Advanced Properties of Java Exceptions

    Science.gov (United States)

    Rashkovits, Rami; Lavy, Ilana

    2012-01-01

    This study examines how Information Systems Engineering School students on the verge of their graduation understand the mechanism of exception handling. The main contributions of this paper are as follows: we construct a questionnaire aimed at examining students' level of understanding concerning exceptions; we classify and analyse the students'…

  5. Prescription errors and the impact of computerized prescription order entry system in a community-based hospital.

    Science.gov (United States)

    Jayawardena, Suriya; Eisdorfer, Jacob; Indulkar, Shalaka; Pal, Sethi Ajith; Sooriabalan, Danushan; Cucco, Robert

    2007-01-01

    Adverse drug events occur often in hospitals. They can be prevented to a large extent by minimizing the human errors of prescription writing. To evaluate the efficacy of a computerized prescription order entry (CPOE) system with the help of ancillary support in minimizing prescription errors. Retrospective study carried out in a community-based urban teaching hospital in south Brooklyn, NY from January 2004 to January 2005. Errors were categorized into inappropriate dosage adjustment for creatinine clearance, duplication, incorrect orders, allergy verification, and incomplete orders. The pharmacists identified the type of error, the severity of error, the class of drug involved, and the department that made the error. A total of 466,311 prescriptions were entered in the period of 1 year. There were 3513 errors during this period (7.53 errors per 1000 prescriptions). More than half of these errors were made by the internal medicine specialty. In our study, 50% of the errors were severe errors (overdosing medications with narrow therapeutic index or over-riding allergies), 46.28% were moderate errors (overdosing, wrong dosing, duplicate orders, or prescribing multiple antibiotics), and 3.71% were not harmful errors (wrong dosing or incomplete orders). The errors were also categorized according to the class of medication. Errors in antibiotic prescription accounted for 53.9% of all errors. The pharmacist detected all these prescription errors as the prescriptions were reviewed in the CPOE system. Prescription errors are common medical errors seen in hospitals. The CPOE system has prevented and alerted the prescriber and pharmacist to dosage errors and allergies. Involvement of the pharmacist in reviewing the prescription and alerting the physician has minimized prescription errors to a great degree in our hospital setting. The incidence of prescription errors before the CPOE has been reported to range from 3 to 99 per 1000 prescriptions. The disparity could be due to
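
    As a quick arithmetic check on the headline figures above (a hedged sketch that simply reproduces the counts and percentages quoted in the abstract, not a recomputation from raw data):

```python
# Sanity-check of the rates reported in the abstract.
prescriptions = 466_311
errors = 3_513

rate_per_1000 = errors / prescriptions * 1000
print(f"errors per 1000 prescriptions: {rate_per_1000:.2f}")   # ~7.53, as reported

# Severity breakdown quoted in the abstract (percentages of all 3513 errors).
severity = {"severe": 50.0, "moderate": 46.28, "not harmful": 3.71}
for label, pct in severity.items():
    print(f"{label:>12}: ~{errors * pct / 100:.0f} errors")
```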

  6. The valuation error in the compound values

    Directory of Open Access Journals (Sweden)

    Marina Ciuna

    2013-08-01

    Full Text Available In appraising the “valore di trasformazione” (transformation value), the valuation error is composed of the error on market value and the error on construction cost. In appraising the “valore complementare” (complementary value), the valuation error is composed of the error on the market value of the complex real property and the error on the market value of the residual part. The final error is a function of the partial errors and can be studied using estimative and market ratios. The application of compound values to real estate appraisal can lead to unacceptable errors when carried out in an expert appraisal.

  7. Errores innatos del metabolismo: Enfermedades lisosomales / Inborn Errors of Metabolism: Lysosomal Diseases

    Directory of Open Access Journals (Sweden)

    Caridad Menéndez Saínz

    2002-03-01

    Full Text Available Among the inborn errors of metabolism are the lysosomal storage diseases, or lysosomal enzymopathies, which are characterized by a specific enzymatic deficiency, the excretion of metabolites in urine, and the accumulation of non-degraded compounds in various organs and tissues, causing their dysfunction. These diseases have an autosomal recessive pattern of inheritance, except for Fabry's disease and Hunter's disease, in which the pattern of inheritance is X-linked. They have a low incidence in general, although there are populations in which some of them show a high incidence. Their importance lies in their magnitude as a health problem, given the poor quality of life of these patients and their premature death, which is why the birth of newly affected children should be prevented.

  8. Space Science IS Accessible to Students with Exceptional Needs: Results from Exceptional Needs Workshops

    Science.gov (United States)

    Runyon, C. J.; Merritt, M.; Guimond, K.

    2003-12-01

    The majority of students with disabilities in the US are required to achieve the same academic levels as their non-impaired peers. Unfortunately, there are few specialized materials to help these exceptional students. To assist students in meeting their goals, SERCH, a NASA Office of Space Science Broker/Facilitator, has been working with NASA education product developers and educators from informal and formal settings to identify what kinds of materials they need and what mediums will work best. As a result of both direct classroom observations and hands-on workshops we have begun generating adaptive lesson plans that meet the national standards for Science, Technology, Engineering and Mathematics. During the workshops, participants simulate various disabilities (e.g., hearing, vision, orthopedic impairments, learning difficulties) while working through Space Science activities and discuss necessary adaptations/modifications in real-time. For example, we modified the Solar System Distance activity first designed by ASU to include the use of larger beads or pom-poms instead of the suggested small plastic beads. This simple adaptation permits students with orthopedic impairments to more readily take part in the lesson and to actively "observe" the distance between the planets. Examples of this activity and more will be illustrated. In addition to making modifications and suggestions for adaptations, workshop participants shared many simple recommendations that can help ALL learners participate more readily in classroom activities and discussions. Among these are: (1) Use simple, sans-serif fonts and high contrast presentation media (e.g., white text on black is most effective); (2) Repetition and use of multiple presentation modes are very helpful; (3) Actively involve the learner; and (4) Keep things simple to begin with, then work toward the more complex - think of the audience, the ultimate user.

  9. Pre-analytical workstations: a tool for reducing laboratory errors.

    Science.gov (United States)

    Da Rin, Giorgio

    2009-06-01

    Laboratory testing, a highly complex process commonly called the total testing process (TTP), is usually subdivided into three traditional (pre-, intra-, and post-) analytical phases. The majority of errors in TTP originate in the pre-analytical phase, being due to individual or system design defects. In order to reduce errors in TTP, the pre-analytical phase should therefore be prioritized. In addition to developing procedures, providing training, and improving interdepartmental cooperation, information technology and robotics may be tools to reduce errors in specimen collection and pre-analytical sample handling. It has been estimated that >2000 clinical laboratories worldwide use total or subtotal automation supporting pre-analytic activities, with a high rate of increase compared to 2007; the need to reduce errors seems to be the catalyst for increasing the use of robotics. Automated systems to prevent medical personnel from drawing blood from the wrong patient were introduced commercially in the early 1990s. Correct patient identification and test tube labelling before phlebotomy are of extreme importance for patient safety in TTP, but currently few laboratories are interested in such products. At San Bassiano hospital, the implementation of advanced information technology and robotics in the pre-analytical phase (specimen collection and pre-analytical sample handling) has improved the accuracy and clinical efficiency of the laboratory process and created a TTP that minimizes errors.

  10. Error Sources in Processing LIDAR Based Bridge Inspection

    Science.gov (United States)

    Bian, H.; Chen, S. E.; Liu, W.

    2017-09-01

    Bridge inspection is a critical task in infrastructure management and is facing unprecedented challenges after a series of bridge failures. The prevailing visual inspection was insufficient in providing reliable and quantitative bridge information, although a systematic quality management framework was built to ensure visual bridge inspection data quality and to minimize errors during the inspection process. LiDAR-based remote sensing is recommended as an effective tool in overcoming some of the disadvantages of visual inspection. In order to evaluate the potential of applying this technology in bridge inspection, some of the error sources in LiDAR-based bridge inspection are analysed. The scanning angle variance in field data collection and the different algorithm design in scanning data processing are found to be factors that introduce errors into inspection results. Besides studying the error sources, advanced considerations should be placed on improving the inspection data quality, and statistical analysis might be employed in the future to evaluate the inspection operation process, which contains a series of uncertain factors. Overall, the development of a reliable bridge inspection system requires not only the improvement of data processing algorithms, but also systematic considerations to mitigate possible errors in the entire inspection workflow. If LiDAR or some other technology can be accepted as a supplement for visual inspection, the current quality management framework will be modified or redesigned, and this would be as urgent as the refinement of inspection techniques.

  11. ERROR SOURCES IN PROCESSING LIDAR BASED BRIDGE INSPECTION

    Directory of Open Access Journals (Sweden)

    H. Bian

    2017-09-01

    Full Text Available Bridge inspection is a critical task in infrastructure management and is facing unprecedented challenges after a series of bridge failures. The prevailing visual inspection was insufficient in providing reliable and quantitative bridge information, although a systematic quality management framework was built to ensure visual bridge inspection data quality and to minimize errors during the inspection process. LiDAR-based remote sensing is recommended as an effective tool in overcoming some of the disadvantages of visual inspection. In order to evaluate the potential of applying this technology in bridge inspection, some of the error sources in LiDAR-based bridge inspection are analysed. The scanning angle variance in field data collection and the different algorithm design in scanning data processing are found to be factors that introduce errors into inspection results. Besides studying the error sources, advanced considerations should be placed on improving the inspection data quality, and statistical analysis might be employed in the future to evaluate the inspection operation process, which contains a series of uncertain factors. Overall, the development of a reliable bridge inspection system requires not only the improvement of data processing algorithms, but also systematic considerations to mitigate possible errors in the entire inspection workflow. If LiDAR or some other technology can be accepted as a supplement for visual inspection, the current quality management framework will be modified or redesigned, and this would be as urgent as the refinement of inspection techniques.

  12. A Method to Reduce Non-Nominal Troposphere Error

    Science.gov (United States)

    Wang, Zhipeng; Xin, Pumin; Li, Rui; Wang, Shujing

    2017-01-01

    Under abnormal troposphere conditions, a Ground-Based Augmentation System (GBAS) is unable to eliminate the troposphere delay, resulting in non-nominal troposphere error. This paper analyzes the troposphere meteorological data of eight International GNSS Monitoring Assessment System (iGMAS) stations and 10 International GNSS Service (IGS) stations in China and records the most serious conditions during 2015 and 2016. Simulations show that the average increase in Vertical Protection Level (VPL) of all visible satellites under non-nominal troposphere is 2.32 m and that more satellites increase the VPL. To improve GBAS integrity, this paper proposes a satellite selection method to reduce the non-nominal troposphere error. First, the number of satellites in the optimal subset is determined to be 16 based on the relationship among VPL, non-nominal troposphere error and satellite geometry. Second, the distributions of the optimal satellites are determined. Finally, optimal satellites are selected in different elevation ranges. Results show that the average VPL increase caused by non-nominal troposphere error is 1.15 m using the proposed method. Compared with the brute-force method and the greedy method, the running rate of the proposed method is improved by 390.91% and 111.65%, respectively. In summary, the proposed method balances the satellite geometry and non-nominal troposphere error while minimizing the VPL and improving the running rate. PMID:28758983
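
    The abstract describes a three-step selection of 16 satellites spread over elevation ranges. The Python sketch below is a generic, hypothetical illustration of elevation-binned subset selection and is not the paper's algorithm; the bin edges, per-bin quotas, and scoring rule are invented assumptions.

```python
import numpy as np

# Hypothetical elevation-binned satellite selection (not the paper's method).
# Each visible satellite: (id, elevation_deg, azimuth_deg). Assumption: low-elevation
# satellites carry more non-nominal troposphere error, so they are penalized, but every
# elevation bin keeps some satellites to preserve geometry.
rng = np.random.default_rng(2)
n_visible = 24
sats = [(i, float(rng.uniform(5, 90)), float(rng.uniform(0, 360))) for i in range(n_visible)]

bins = [(5, 15), (15, 30), (30, 60), (60, 90)]   # elevation ranges (assumed)
quota = {b: 4 for b in bins}                     # 4 per bin -> 16 satellites total

selected = []
for lo, hi in bins:
    in_bin = [s for s in sats if lo <= s[1] < hi]
    # Invented score: inside a bin, prefer higher elevation (smaller troposphere delay).
    in_bin.sort(key=lambda s: -s[1])
    selected.extend(in_bin[: quota[(lo, hi)]])

print(f"selected {len(selected)} of {len(sats)} visible satellites")
for sid, el, az in sorted(selected, key=lambda s: s[1]):
    print(f"  sat {sid:02d}  elev {el:5.1f}  az {az:6.1f}")
```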

  13. Cervical spine reposition errors after cervical flexion and extension.

    Science.gov (United States)

    Wang, Xu; Lindstroem, René; Carstens, Niels Peter Bak; Graven-Nielsen, Thomas

    2017-03-13

    Upright head and neck position has been frequently applied as a baseline for the diagnosis of neck problems. However, the variance of the position after cervical motions has never been demonstrated. Thus, it is unclear if the baseline position varies evenly across the cervical joints. The purpose was to assess reposition errors of the upright cervical spine. Cervical reposition errors were measured in twenty healthy subjects (6 females) using video-fluoroscopy. Two flexion movements were performed with a 20 s interval; the same was repeated for extension, with an interval of 5 min between flexion and extension movements. Cervical joint positions were assessed with anatomical landmarks and external markers in a Matlab program. Reposition errors were extracted in degrees (initial position minus reposition) as constant errors (CEs) and absolute errors (AEs). Twelve of twenty-eight CEs (7 joints times 4 repositions) exceeded the minimal detectable change (MDC), while all AEs exceeded the MDC. Averaged AEs across the cervical joints were larger after 5 min intervals compared to 20 s intervals (p […]) […] flexion and extension movements in healthy adults.
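
    The two error measures named in the abstract reduce to simple arithmetic on the joint angles; the small illustrative computation below uses invented per-joint values, not study data.

```python
import numpy as np

# Constant error (CE): initial position minus reposition, sign preserved.
# Absolute error (AE): magnitude of that difference.
initial = np.array([12.4, 8.1, 5.3, 3.0, 2.2, 1.5, 0.9])       # degrees, invented per-joint values
reposition = np.array([11.6, 9.0, 5.0, 3.5, 1.8, 1.9, 0.6])

ce = initial - reposition
ae = np.abs(ce)

print("CE per joint (deg):", np.round(ce, 2))
print("AE per joint (deg):", np.round(ae, 2))
print(f"mean AE across joints: {ae.mean():.2f} deg")
```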

  14. Detection and Classification of Measurement Errors in Bioimpedance Spectroscopy.

    Directory of Open Access Journals (Sweden)

    David Ayllón

    Full Text Available Bioimpedance spectroscopy (BIS) measurement errors may be caused by parasitic stray capacitance, impedance mismatch, cross-talking or their very likely combination. An accurate detection and identification is of extreme importance for further analysis because in some cases and for some applications, certain measurement artifacts can be corrected, minimized or even avoided. In this paper we present a robust method to detect the presence of measurement artifacts and identify what kind of measurement error is present in BIS measurements. The method is based on supervised machine learning and uses a novel set of generalist features for measurement characterization in different immittance planes. Experimental validation has been carried out using a database of complex spectra BIS measurements obtained from different BIS applications and containing six different types of errors, as well as error-free measurements. The method obtained a low classification error (0.33%) and has shown good generalization. Since both the features and the classification schema are relatively simple, the implementation of this pre-processing task in the current hardware of bioimpedance spectrometers is possible.

  15. Detection and Classification of Measurement Errors in Bioimpedance Spectroscopy.

    Science.gov (United States)

    Ayllón, David; Gil-Pita, Roberto; Seoane, Fernando

    2016-01-01

    Bioimpedance spectroscopy (BIS) measurement errors may be caused by parasitic stray capacitance, impedance mismatch, cross-talking or their very likely combination. An accurate detection and identification is of extreme importance for further analysis because in some cases and for some applications, certain measurement artifacts can be corrected, minimized or even avoided. In this paper we present a robust method to detect the presence of measurement artifacts and identify what kind of measurement error is present in BIS measurements. The method is based on supervised machine learning and uses a novel set of generalist features for measurement characterization in different immittance planes. Experimental validation has been carried out using a database of complex spectra BIS measurements obtained from different BIS applications and containing six different types of errors, as well as error-free measurements. The method obtained a low classification error (0.33%) and has shown good generalization. Since both the features and the classification schema are relatively simple, the implementation of this pre-processing task in the current hardware of bioimpedance spectrometers is possible.
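
    The two records above describe supervised classification of BIS measurement errors. The hedged Python sketch below shows the general shape of such a pipeline with scikit-learn; the features, labels, and classifier are synthetic stand-ins and do not reproduce the paper's feature set or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical sketch of supervised classification of BIS measurement errors.
# Features and labels are synthetic; the paper's actual features (derived from
# different immittance planes) and classifier are not reproduced here.
rng = np.random.default_rng(3)
n_per_class = 200
classes = ["error-free", "stray capacitance", "impedance mismatch", "cross-talking"]

X, y = [], []
for k, label in enumerate(classes):
    # Each class gets its own cluster of 8 generic spectrum-shape features.
    X.append(rng.normal(loc=k, scale=0.8, size=(n_per_class, 8)))
    y += [label] * n_per_class
X = np.vstack(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"classification error: {1 - accuracy_score(y_test, clf.predict(X_test)):.3f}")
```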

  16. Cyclone Simulation via Action Minimization

    Science.gov (United States)

    Plotkin, D. A.; Weare, J.; Abbot, D. S.

    2016-12-01

    A postulated impact of climate change is an increase in intensity of tropical cyclones (TCs). This hypothesized effect results from the fact that TCs are powered by subsaturated boundary layer air picking up water vapor from the surface ocean as it flows inwards towards the eye. This water vapor serves as the energy input for TCs, which can be idealized as heat engines. The inflowing air has a temperature nearly identical to that of the surface ocean; therefore, warming of the surface leads to a warmer atmospheric boundary layer. By the Clausius-Clapeyron relationship, warmer boundary layer air can hold more water vapor and thus results in more energetic storms. Changes in TC intensity are difficult to predict due to the presence of fine structures (e.g. convective structures and rainbands) with length scales of less than 1 km, while general circulation models (GCMs) generally have horizontal resolutions of tens of kilometers. The models are therefore unable to capture these features, which are critical to accurately simulating cyclone structure and intensity. Further, strong TCs are rare events, meaning that long multi-decadal simulations are necessary to generate meaningful statistics about intense TC activity. This adds to the computational expense, making it yet more difficult to generate accurate statistics about long-term changes in TC intensity due to global warming via direct simulation. We take an alternative approach, applying action minimization techniques developed in molecular dynamics to the WRF weather/climate model. We construct artificial model trajectories that lead from quiescent (TC-free) states to TC states, then minimize the deviation of these trajectories from true model dynamics. We can thus create Monte Carlo model ensembles that are biased towards cyclogenesis, which reduces computational expense by limiting time spent in non-TC states. This allows for: 1) selective interrogation of model states with TCs; 2) finding the likeliest paths for
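
    The core idea of minimizing a trajectory's deviation from the model dynamics can be illustrated far from WRF. The toy Python sketch below (entirely my construction, not the authors' setup) pins the endpoints of a path between a "quiescent" and a "TC-like" state of a simple bistable map and minimizes the sum of squared deviations from the dynamics over the interior points.

```python
import numpy as np
from scipy.optimize import minimize

# Toy illustration of trajectory "action" minimization (not WRF, not the authors' code).
# Model: a simple bistable 1D map standing in for quiescent vs cyclone states.
def model_step(x):
    return x + 0.1 * (x - x**3)        # fixed points near x = -1 (quiescent) and x = +1 ("TC")

T = 40                                  # number of interior time steps
x_start, x_end = -1.0, 1.0              # endpoints are pinned

def action(interior):
    """Sum of squared deviations of the trajectory from the model dynamics."""
    path = np.concatenate(([x_start], interior, [x_end]))
    return np.sum((path[1:] - model_step(path[:-1])) ** 2)

x0 = np.linspace(x_start, x_end, T + 2)[1:-1]        # straight-line initial guess
res = minimize(action, x0, method="L-BFGS-B")
print(f"minimized action: {res.fun:.4f}")
```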

  17. Mini-Med School Planning Guide

    Science.gov (United States)

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  18. Having Fun with Error Analysis

    Science.gov (United States)

    Siegel, Peter

    2007-01-01

    We present a fun activity that can be used to introduce students to error analysis: the M&M game. Students are told to estimate the number of individual candies plus uncertainty in a bag of M&M's. The winner is the group whose estimate brackets the actual number with the smallest uncertainty. The exercise produces enthusiastic discussions and…

  19. and Correlated Error-Regressor

    African Journals Online (AJOL)

    Nekky Umera

    in queuing theory and econometrics, where the usual assumption of independent error terms may not be plausible in most cases. Also, when using time-series data on a number of micro-economic units, such as households and service oriented channels, where the stochastic disturbance terms in part reflect variables which ...

  20. Typical errors of ESP users

    Science.gov (United States)

    Eremina, Svetlana V.; Korneva, Anna A.

    2004-07-01

    The paper presents an analysis of the errors made by ESP (English for specific purposes) users that are considered typical. They occur as a result of misuse of the resources of English grammar and tend to be persistent. Their origin and places of occurrence are also discussed.

  1. Learner Corpora without Error Tagging

    Directory of Open Access Journals (Sweden)

    Rastelli, Stefano

    2009-01-01

    Full Text Available The article explores the possibility of adopting a form-to-function perspective when annotating learner corpora in order to get deeper insights about systematic features of interlanguage. A split between forms and functions (or categories) is desirable in order to avoid the "comparative fallacy" and because – especially in basic varieties – forms may precede functions (e.g., what resembles a "noun" might have a different function, or a function may show up in unexpected forms). In the computer-aided error analysis tradition, all items produced by learners are traced to a grid of error tags which is based on the categories of the target language. In contrast, we believe it is possible to record and make retrievable both words and sequences of characters independently from their functional-grammatical label in the target language. For this purpose, at the University of Pavia we adapted a probabilistic POS tagger designed for L1 to work on L2 data. Despite the criticism that this operation can raise, we found that it is better to work with "virtual categories" rather than with errors. The article outlines the theoretical background of the project and shows some examples in which the potential of SLA-oriented (non-error-based) tagging is possibly made clearer.

  2. Serial and spatial error correlation

    NARCIS (Netherlands)

    Elhorst, J. Paul

    This paper demonstrates that jointly modeling serial and spatial error correlation results in a trade-off between the serial and spatial autocorrelation coefficients. Ignoring this trade-off causes inefficiency and may lead to nonstationarity. (C) 2008 Elsevier B.V. All rights reserved.

  3. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  4. Magnitude control of commutator errors

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Wesseling, P.; Oñate, E.; Périaux, J.

    2006-01-01

    Non-uniform filtering of the Navier-Stokes equations expresses itself, next to the turbulent stresses, in additional closure terms known as commutator errors. These terms require explicit subgrid modeling if the non-uniformity of the filter is sufficiently pronounced. We derive expressions for the

  5. Theory of Test Translation Error

    Science.gov (United States)

    Solano-Flores, Guillermo; Backhoff, Eduardo; Contreras-Nino, Luis Angel

    2009-01-01

    In this article, we present a theory of test translation whose intent is to provide the conceptual foundation for effective, systematic work in the process of test translation and test translation review. According to the theory, translation error is multidimensional; it is not simply the consequence of defective translation but an inevitable fact…

  6. Error signals driving locomotor adaptation

    DEFF Research Database (Denmark)

    Choi, Julia T; Jensen, Peter; Nielsen, Jens Bo

    2016-01-01

    anaesthesia (n = 5) instead of repetitive nerve stimulation. Foot anaesthesia reduced ankle adaptation to external force perturbations during walking. Our results suggest that cutaneous input plays a role in force perception, and may contribute to the 'error' signal involved in driving walking adaptation when...

  7. What Is a Reading Error?

    Science.gov (United States)

    Labov, William; Baker, Bettina

    2010-01-01

    Early efforts to apply knowledge of dialect differences to reading stressed the importance of the distinction between differences in pronunciation and mistakes in reading. This study develops a method of estimating the probability that a given oral reading that deviates from the text is a true reading error by observing the semantic impact of the…

  8. Error Detection in Numeric Codes

    Indian Academy of Sciences (India)

    Admin

    engineering at IIT Patna. His interests include watching and playing cricket, listening to music and playing sitar. His research interests include cryptography and pattern recognition. This article investigates the efficiency of four commonly used methods for detecting the most frequent types of errors committed by individuals.

  9. Against explanatory minimalism in psychiatry

    Directory of Open Access Journals (Sweden)

    Tim Thornton

    2015-12-01

    Full Text Available The idea that psychiatry contains, in principle, a series of levels of explanation has been criticised not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation respectively and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of level of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  10. Against Explanatory Minimalism in Psychiatry.

    Science.gov (United States)

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  11. Minimalism through intraoperative functional mapping.

    Science.gov (United States)

    Berger, M S

    1996-01-01

    Intraoperative stimulation mapping may be used to avoid unnecessary risk to functional regions subserving language and sensori-motor pathways. Based on the data presented here, language localization is variable in the entire population, with only certainty existing for the inferior frontal region responsible for motor speech. Anatomical landmarks such as the anterior temporal tip for temporal lobe language sites and the posterior aspect of the lateral sphenoid wing for the frontal lobe language zones are unreliable in avoiding postoperative aphasias. Thus, individual mapping to identify essential language sites has the greatest likelihood of avoiding permanent deficits in naming, reading, and motor speech. In a similar approach, motor and sensory pathways from the cortex and underlying white matter may be reliably stimulated and mapped in both awake and asleep patients. Although these techniques require additional operative time and nominally priced equipment, the result is often gratifying, as postoperative morbidity has been greatly reduced in the process of incorporating these surgical strategies. The patient's quality of life is improved in terms of seizure control, with or without antiepileptic drugs. This avoids having to perform a second costly operative procedure, which is routinely done when extraoperative stimulation and recording are done via subdural grids. In addition, an aggressive tumor resection at the initial operation lengthens the time to tumor recurrence and often obviates the need for a subsequent reoperation. Thus, intraoperative functional mapping may best be referred to as a surgical technique that results in "minimalism in the long term".

  12. Robotic assisted minimally invasive surgery

    Directory of Open Access Journals (Sweden)

    Palep Jaydeep

    2009-01-01

    Full Text Available The term "robot" was coined by the Czech playwright Karel Capek in 1921 in his play Rossum's Universal Robots. The word "robot" is from the Czech word robota, which means forced labor. The era of robots in surgery commenced in 1994 when the first AESOP (voice controlled camera holder) prototype robot was used clinically in 1993 and then marketed as the first surgical robot ever in 1994 by the US FDA. Since then many robot prototypes like the Endoassist (Armstrong Healthcare Ltd., High Wycombe, Buck, UK) and the FIPS endoarm (Karlsruhe Research Center, Karlsruhe, Germany) have been developed to add to the functions of the robot and try and increase its utility. Integrated Surgical Systems (now Intuitive Surgery, Inc.) redesigned the SRI Green Telepresence Surgery system and created the daVinci Surgical System® classified as a master-slave surgical system. It uses true 3-D visualization and EndoWrist®. It was approved by the FDA in July 2000 for general laparoscopic surgery, and in November 2002 for mitral valve repair surgery. The da Vinci robot is currently being used in various fields such as urology, general surgery, gynecology, cardio-thoracic, pediatric and ENT surgery. It provides several advantages over conventional laparoscopy such as 3D vision, motion scaling, intuitive movements, visual immersion and tremor filtration. The advent of robotics has increased the use of minimally invasive surgery among laparoscopically naïve surgeons and expanded the repertoire of experienced surgeons to include more advanced and complex reconstructions.

  13. Radappertization of minimally processed carrots

    Energy Technology Data Exchange (ETDEWEB)

    Walder, Juliana F.A.; Walder, Julio M.M. [Centro de Energia Nuclear na Agricultura (CENA/USP), Piracicaba, SP (Brazil)]. E-mails: juwalder@gmail.com; jmwalder@cena.usp.br; Souza, Miriam C. de [Universidade Metodista de Piracicaba (UNIMEP), SP (Brazil)]. E-mail: mcsouza@unimep.br; Spoto, Marta H.F. [Universidade de Sao Paulo (USP), Piracicaba, SP (Brazil). Escola Superior de Agricultura Luiz de Queiroz (ESALQ)]. E-mail: mhfspoto@esalq.usp.br

    2007-07-01

    Full text: The goal of this work was to obtain shelf-stable irradiated carrots. The effect of high doses (radappertization) of gamma radiation (Cobalt-60) on minimally processed carrots cv. Nantes was evaluated. Before irradiation the carrots were blanched, vacuum packaged in polyethylene film (52 µm) and frozen (-80 deg C) prior to and during radiation processing. The doses used were 10, 20 and 30 kGy. After irradiation the carrot bags were kept at room conditions (25 - 28 deg C and RH 60-80 %) for a 90-day period. Physical-chemical characteristics and microorganism populations were determined at 1, 30, 60 and 90 days after the radiation process. Radappertization decreased total soluble solids (TSS), hardness and color. Radiation was responsible for a reduction of 15.5% in total carotenoid content, while the storage period was responsible for 35% losses. pH was affected neither by radiation nor by the storage period. Complete sterilization was achieved with doses of 20 kGy and 30 kGy. Radappertization negatively affected the sensory characteristics of flavor, color and general appearance. Sensory analysis showed that the polyethylene packaging was inadequate for this purpose because it allowed photochemical reactions in the carrots during the storage period. The metallized film kept the best appearance of the irradiated carrots after 90 days of storage. (author)

  14. Renormalization of minimally doubled fermions

    Science.gov (United States)

    Capitani, Stefano; Creutz, Michael; Weber, Johannes; Wittig, Hartmut

    2010-09-01

    We investigate the renormalization properties of minimally doubled fermions, at one loop in perturbation theory. Our study is based on the two particular realizations of Boriçi-Creutz and Karsten-Wilczek. A common feature of both formulations is the breaking of hyper-cubic symmetry, which requires that the lattice actions are supplemented by suitable counterterms. We show that three counterterms are required in each case and determine their coefficients to one loop in perturbation theory. For both actions we compute the vacuum polarization of the gluon. It is shown that no power divergences appear and that all contributions which arise from the breaking of Lorentz symmetry are cancelled by the counterterms. We also derive the conserved vector and axial-vector currents for Karsten-Wilczek fermions. Like in the case of the previously studied Boriçi-Creutz action, one obtains simple expressions, involving only nearest-neighbour sites. We suggest methods how to fix the coefficients of the counterterms non-perturbatively and discuss the implications of our findings for practical simulations.

  15. Error and its meaning in forensic science.

    Science.gov (United States)

    Christensen, Angi M; Crowder, Christian M; Ousley, Stephen D; Houck, Max M

    2014-01-01

    The discussion of "error" has gained momentum in forensic science in the wake of the Daubert guidelines and has intensified with the National Academy of Sciences' Report. Error has many different meanings, and too often, forensic practitioners themselves as well as the courts misunderstand scientific error and statistical error rates, often confusing them with practitioner error (or mistakes). Here, we present an overview of these concepts as they pertain to forensic science applications, discussing the difference between practitioner error (including mistakes), instrument error, statistical error, and method error. We urge forensic practitioners to ensure that potential sources of error and method limitations are understood and clearly communicated and advocate that the legal community be informed regarding the differences between interobserver errors, uncertainty, variation, and mistakes. © 2013 American Academy of Forensic Sciences.

  16. Absolute vs. relative error characterization of electromagnetic tracking accuracy

    Science.gov (United States)

    Matinfar, Mohammad; Narayanasamy, Ganesh; Gutierrez, Luis; Chan, Raymond; Jain, Ameet

    2010-02-01

    Electromagnetic (EM) tracking systems are often used for real time navigation of medical tools in an Image Guided Therapy (IGT) system. They are specifically advantageous when the medical device requires tracking within the body of a patient where line of sight constraints prevent the use of conventional optical tracking. EM tracking systems are however very sensitive to electromagnetic field distortions. These distortions, arising from changes in the electromagnetic environment due to the presence of conductive ferromagnetic surgical tools or other medical equipment, limit the accuracy of EM tracking, in some cases potentially rendering tracking data unusable. We present a mapping method for the operating region over which EM tracking sensors are used, allowing for characterization of measurement errors, in turn providing physicians with visual feedback about measurement confidence or reliability of localization estimates. In this instance, we employ a calibration phantom to assess distortion within the operating field of the EM tracker and to display in real time the distribution of measurement errors, as well as the location and extent of the field associated with minimal spatial distortion. The accuracy is assessed relative to successive measurements. Error is computed for a reference point and consecutive measurement errors are displayed relative to the reference in order to characterize the accuracy in near-real-time. In an initial set-up phase, the phantom geometry is calibrated by registering the data from a multitude of EM sensors in a non-ferromagnetic ("clean") EM environment. The registration results in the locations of sensors with respect to each other and defines the geometry of the sensors in the phantom. In a measurement phase, the position and orientation data from all sensors are compared with the known geometry of the sensor spacing, and localization errors (displacement and orientation) are computed. Based on error thresholds provided by the
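
    The record does not include the phantom geometry or the software used, so the following is only a hedged numerical sketch of the measurement-phase computation described above: rigidly register the measured sensor positions to the calibrated ("clean") phantom geometry and report the residual displacement error per sensor. All names and numbers are invented.

        # Hedged sketch of the error computation only, not the paper's implementation.
        import numpy as np

        def rigid_register(P, Q):
            """Least-squares rigid transform (Kabsch) mapping points P onto Q, both (N, 3)."""
            cp, cq = P.mean(axis=0), Q.mean(axis=0)
            H = (P - cp).T @ (Q - cq)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = cq - R @ cp
            return R, t

        def localization_errors(calibrated_geometry, measured_positions):
            """Per-sensor displacement error after best-fit alignment to the phantom geometry."""
            R, t = rigid_register(measured_positions, calibrated_geometry)
            aligned = measured_positions @ R.T + t
            return np.linalg.norm(aligned - calibrated_geometry, axis=1)

        # Invented phantom geometry (mm) and noisy measurements, purely for illustration.
        geometry = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 50.0, 0.0], [0.0, 0.0, 50.0]])
        measured = geometry + np.random.normal(scale=0.5, size=geometry.shape)
        print(localization_errors(geometry, measured))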

  17. NHI program for introducing thoracoscopic minimally invasive mitral and tricuspid valve surgery

    Directory of Open Access Journals (Sweden)

    Tamer El Banna

    2014-03-01

    Conclusions: Thoracoscopic minimally invasive mitral valve surgery can be performed safely but definitely requires a learning curve. Good results and high patient satisfaction are guaranteed. We now utilize this approach for isolated atrioventricular valve disease, and our plan is to make it exclusive by the end of this year for all patients except redo cases.

  18. Minimally invasive surgery for the lumbar spine.

    Science.gov (United States)

    Gandhi, S D; Anderson, D G

    2012-03-01

    Minimally invasive spine surgery is a rapidly developing field that has the potential to decrease surgical morbidity and improve recovery compared to traditional spinal approaches. Minimally invasive approaches have been developed for all regions of the spine, but have been best documented for degenerative conditions of the lumbar spine. Lumbar decompression and lumbar interbody fusion are two of the most well-studied minimally invasive surgical approaches. This article will review both the rationale and technique for minimally invasive lumbar decompression and for a minimally invasive transforaminal lumbar interbody fusion (TLIF).

  19. Minimally invasive surgery in pelvic floor repair.

    Science.gov (United States)

    Zwain, Omar; Aoun, Joelle; Eisenstein, David

    2017-08-01

    To review the use and efficacy of minimally invasive surgery in pelvic organ prolapse (POP) repair. This review summarizes surgical options for management of POP with special emphasis on minimally invasive surgical approach and discusses the recent experience and feasibility of integrating robot-assisted technology. Minimally invasive approaches have equal efficacy and less morbidity than laparotomy for POP repair, particularly apical prolapse. Robotics may facilitate the rate of minimally invasive surgery for POP repair with greater cost and as yet no proven superiority for conventional laparoscopy. Minimally invasive surgery is the preferred approach to POP repair. Conventional laparoscopic or robotic sacral colpopexy is recommended for apical defect and procidentia.

  20. Frequency of inappropriate medical exceptions to quality measures.

    Science.gov (United States)

    Persell, Stephen D; Dolan, Nancy C; Friesema, Elisha M; Thompson, Jason A; Kaiser, Darren; Baker, David W

    2010-02-16

    Quality improvement programs that allow physicians to document medical reasons for deviating from guidelines preserve clinicians' judgment while enabling them to strive for high performance. However, physician misconceptions or gaming potentially limit programs. To implement computerized decision support with mechanisms to document medical exceptions to quality measures and to perform peer review of exceptions and provide feedback when appropriate. Observational study. Large internal medicine practice. Patients eligible for 1 or more quality measures. A peer-review panel judged medical exceptions to 16 chronic disease and prevention quality measures as appropriate, inappropriate, or of uncertain appropriateness. Medical records were reviewed after feedback was given to determine whether care changed. Physicians recorded 650 standardized medical exceptions during 7 months. The reporting tool was used without any medical reason 36 times (5.5%). Of the remaining 614 exceptions, 93.6% were medically appropriate, 3.1% were inappropriate, and 3.3% were of uncertain appropriateness. Frequencies of inappropriate exceptions were 7 (6.9%) for coronary heart disease, 0 (0%) for heart failure, 10 (10.8%) for diabetes, and 2 (0.6%) for preventive services. After physicians received direct feedback about inappropriate exceptions, 8 of 19 (42%) changed management. The peer-review process took less than 5 minutes per case, but for each change in clinical care, 65 reviews were required. The findings could differ at other sites or if financial incentives were in place. Physician-recorded medical exceptions were correct most of the time. Peer review of medical exceptions can identify myths and misconceptions, but the process needs to be more efficient to be sustainable. Agency for Healthcare Research and Quality.

  1. Conflict monitoring in speech processing : An fMRI study of error detection in speech production and perception

    NARCIS (Netherlands)

    Gauvin, Hanna; De Baene, W.; Brass, Marcel; Hartsuiker, Robert

    2016-01-01

    To minimize the number of errors in speech, and thereby facilitate communication, speech is monitored before articulation. It is, however, unclear at which level during speech production monitoring takes place, and what mechanisms are used to detect and correct errors. The present study investigated

  2. In vivo comparison of hip mechanics for minimally invasive versus traditional total hip arthroplasty.

    Science.gov (United States)

    Glaser, Diana; Dennis, Douglas A; Komistek, Richard D; Miner, Todd M

    2008-02-01

    Minimally invasive surgery has been developed to reduce incision length, muscle damage, and rehabilitation time. However, reduced exposure of anatomical landmarks may result in technical errors and inferior implant survivorship. The objective of this study was to compare in vivo motions and hip joint contact forces during gait in total hip arthroplasty subjects, performed with either minimally invasive surgery or standard surgical approaches. Fifteen subjects implanted using either minimally invasive surgery anterolateral, minimally invasive surgery posterolateral, or traditional posterolateral total hip arthroplasty were evaluated using fluoroscopy while performing gait on a treadmill. Kinematics, obtained using a 3D-to-2D image registration technique, were input as temporal functions in a 3D inverse dynamic mathematical model that determines in vivo soft tissue and hip contact forces. The subjects implanted with posterolateral and anterolateral minimally invasive surgery demonstrated significantly less separation than those implanted with the traditional approach (P<0.01). The minimally invasive surgery subjects also experienced lower average maximum peak forces, with 3.2 body weight for the anterolateral minimally invasive surgery and 2.9 body weight for the posterolateral minimally invasive surgery subjects, compared to 3.5 body weight for the traditional subjects (P=0.02 and P=0.03, respectively). This is the first study to compare in vivo weight-bearing kinematics, separation and kinetics for traditional, anterolateral minimally invasive surgery and posterolateral minimally invasive surgery total hip arthroplasty subject groups. Our data indicated differences between the minimally invasive surgery and the traditional groups in all analyzed parameters, with favorable results for the minimally invasive surgery subjects. This may be related to a reduction in stabilizing soft tissues after a minimally invasive surgery procedure, leading to lower bearing surface

  3. Managing the financial cost of exception to contracting standards

    DEFF Research Database (Denmark)

    Henschel, Rene Franz

    2008-01-01

    In managing the financial cost of exceptions to contracting standards, the first step is to put up an intelligent contract standards exception monitoring system. The next step is to maintain tailor-made, fair and transparent contracting standards. The third step is to eliminate unnecessary information ... and repetitiveness in contracting standards. The fourth step is to enable your organization and the customers or suppliers to handle the necessary exceptions themselves. Finally, you should consider the use of independent contracting standards and the elimination of your own standards as a tool in managing the cost...

  4. Minimal complexity control law synthesis

    Science.gov (United States)

    Bernstein, Dennis S.; Haddad, Wassim M.; Nett, Carl N.

    1989-01-01

    A paradigm for control law design for modern engineering systems is proposed: Minimize control law complexity subject to the achievement of a specified accuracy in the face of a specified level of uncertainty. Correspondingly, the overall goal is to make progress towards the development of a control law design methodology which supports this paradigm. Researchers achieve this goal by developing a general theory of optimal constrained-structure dynamic output feedback compensation, where here constrained-structure means that the dynamic-structure (e.g., dynamic order, pole locations, zero locations, etc.) of the output feedback compensation is constrained in some way. By applying this theory in an innovative fashion, where here the indicated iteration occurs over the choice of the compensator dynamic-structure, the paradigm stated above can, in principle, be realized. The optimal constrained-structure dynamic output feedback problem is formulated in general terms. An elegant method for reducing optimal constrained-structure dynamic output feedback problems to optimal static output feedback problems is then developed. This reduction procedure makes use of star products, linear fractional transformations, and linear fractional decompositions, and yields as a byproduct a complete characterization of the class of optimal constrained-structure dynamic output feedback problems which can be reduced to optimal static output feedback problems. Issues such as operational/physical constraints, operating-point variations, and processor throughput/memory limitations are considered, and it is shown how anti-windup/bumpless transfer, gain-scheduling, and digital processor implementation can be facilitated by constraining the controller dynamic-structure in an appropriate fashion.

  5. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
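
    The following is only an illustrative sketch of the functional form described above (a logistic function whose argument contains first- and second-order terms of the stimulus), fitted here by ordinary logistic regression on invented data; it is not the authors' estimation pipeline, and the maximum-entropy derivation itself is not reproduced.

        # Hedged sketch: second-order logistic response model on synthetic stimuli and spikes.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n_samples, n_dim = 5000, 4
        stimuli = rng.normal(size=(n_samples, n_dim))

        def second_order_features(x):
            """[x_i] plus the upper triangle of [x_i * x_j] for each stimulus."""
            iu = np.triu_indices(x.shape[1])
            quad = np.einsum("ni,nj->nij", x, x)[:, iu[0], iu[1]]
            return np.hstack([x, quad])

        # Synthetic "neuron": spike probability depends on a linear and a quadratic term.
        logit = 1.5 * stimuli[:, 0] - 2.0 * stimuli[:, 1] * stimuli[:, 2] - 0.5
        spikes = (rng.random(n_samples) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        model = LogisticRegression(max_iter=1000).fit(second_order_features(stimuli), spikes)
        print("training accuracy:", model.score(second_order_features(stimuli), spikes))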

  6. Robotic assisted minimally invasive surgery

    Science.gov (United States)

    Palep, Jaydeep H

    2009-01-01

    The term “robot” was coined by the Czech playwright Karel Capek in 1921 in his play Rossum's Universal Robots. The word “robot” is from the Czech word robota which means forced labor. The era of robots in surgery commenced in 1994 when the first AESOP (voice controlled camera holder) prototype robot was used clinically in 1993 and then marketed as the first surgical robot ever in 1994 by the US FDA. Since then many robot prototypes like the Endoassist (Armstrong Healthcare Ltd., High Wycombe, Buck, UK), FIPS endoarm (Karlsruhe Research Center, Karlsruhe, Germany) have been developed to add to the functions of the robot and try and increase its utility. Integrated Surgical Systems (now Intuitive Surgery, Inc.) redesigned the SRI Green Telepresence Surgery system and created the daVinci Surgical System® classified as a master-slave surgical system. It uses true 3-D visualization and EndoWrist®. It was approved by FDA in July 2000 for general laparoscopic surgery, in November 2002 for mitral valve repair surgery. The da Vinci robot is currently being used in various fields such as urology, general surgery, gynecology, cardio-thoracic, pediatric and ENT surgery. It provides several advantages over conventional laparoscopy such as 3D vision, motion scaling, intuitive movements, visual immersion and tremor filtration. The advent of robotics has increased the use of minimally invasive surgery among laparoscopically naïve surgeons and expanded the repertoire of experienced surgeons to include more advanced and complex reconstructions. PMID:19547687

  7. The Sources of Error in Spanish Writing.

    Science.gov (United States)

    Justicia, Fernando; Defior, Sylvia; Pelegrina, Santiago; Martos, Francisco J.

    1999-01-01

    Determines the pattern of errors in Spanish spelling. Analyzes and proposes a classification system for the errors made by children in the initial stages of the acquisition of spelling skills. Finds that the diverse forms of only 20 Spanish words produce 36% of the spelling errors in Spanish, and that substitution is the most frequent type of error. (RS)

  8. Error Analysis of Band Matrix Method

    OpenAIRE

    Taniguchi, Takeo; Soga, Akira

    1984-01-01

    Numerical error in the solution of the band matrix method based on the elimination method in single precision is investigated theoretically and experimentally, and the behaviour of the truncation error and the roundoff error is clarified. Some important suggestions for the useful application of the band solver are proposed using the results of the above error analysis.

  9. Error Correction in Oral Classroom English Teaching

    Science.gov (United States)

    Jing, Huang; Xiaodong, Hao; Yu, Liu

    2016-01-01

    As is known to all, errors are inevitable in the process of language learning for Chinese students. Should we ignore students' errors in learning English? In common with other questions, different people hold different opinions. All teachers agree that errors students make in written English are not allowed. For the errors students make in oral…

  10. Pathologists' Perspectives on Disclosing Harmful Pathology Error.

    Science.gov (United States)

    Dintzis, Suzanne M; Clennon, Emily K; Prouty, Carolyn D; Reich, Lisa M; Elmore, Joann G; Gallagher, Thomas H

    2017-06-01

    - Medical errors are unfortunately common. The US Institute of Medicine proposed guidelines for mitigating and disclosing errors. Implementing these recommendations in pathology will require a better understanding of how errors occur in pathology, the relationship between pathologists and treating clinicians in reducing error, and pathologists' experiences with and attitudes toward disclosure of medical error. - To understand pathologists' attitudes toward disclosing pathology error to treating clinicians and patients. - We conducted 5 structured focus groups in Washington State and Missouri with 45 pathologists in academic and community practice. Participants were questioned about pathology errors, how clinicians respond to pathology errors, and what roles pathologists should play in error disclosure to patients. - These pathologists believe that neither treating physicians nor patients understand the subtleties and limitations of pathologic diagnoses, which complicates discussions about pathology errors. Pathologists' lack of confidence in communication skills and fear of being misrepresented or misunderstood are major barriers to their participation in disclosure discussions. Pathologists see potential for their future involvement in disclosing error to patients, but at present advocate reliance on treating clinicians to disclose pathology errors to patients. Most group members believed that going forward pathologists should offer to participate more actively in error disclosure to patients. - Pathologists lack confidence in error disclosure communication skills with both treating physicians and patients. Improved communication between pathologists and treating physicians could enhance transparency and promote disclosure of pathology errors. Consensus guidelines for best practices in pathology error disclosure may be useful.

  11. Correction of errors in power measurements

    DEFF Research Database (Denmark)

    Pedersen, Knud Ole Helgesen

    1998-01-01

    Small errors in voltage and current measuring transformers cause inaccuracies in power measurements. In this report correction factors are derived to compensate for such errors.

  12. Error-Related Psychophysiology and Negative Affect

    Science.gov (United States)

    Hajcak, G.; McDonald, N.; Simons, R.F.

    2004-01-01

    The error-related negativity (ERN/Ne) and error positivity (Pe) have been associated with error detection and response monitoring. More recently, heart rate (HR) and skin conductance (SC) have also been shown to be sensitive to the internal detection of errors. An enhanced ERN has consistently been observed in anxious subjects and there is some…

  13. Total Survey Error for Longitudinal Surveys

    NARCIS (Netherlands)

    Lynn, Peter; Lugtig, P.J.

    2016-01-01

    This article describes the application of the total survey error paradigm to longitudinal surveys. Several aspects of survey error, and of the interactions between different types of error, are distinct in the longitudinal survey context. Furthermore, error trade-off decisions in survey design and

  14. An exceptional collision tumor: gastric calcified stromal tumor and ...

    African Journals Online (AJOL)

    An exceptional collision tumor: gastric calcified stromal tumor and pancreatic adenocarcinoma. Hicham Baba, Mohamed Elfahssi, Mohamed Said Belhamidi, Abderrahman Elhjouji, Ahmed Bounaim, Abdelmounaim Ait Ali, Khalid Sair, Aziz Zentar ...

  15. Accessing the exceptional points of parity-time symmetric acoustics

    Science.gov (United States)

    Shi, Chengzhi; Dubois, Marc; Chen, Yun; Cheng, Lei; Ramezani, Hamidreza; Wang, Yuan; Zhang, Xiang

    2016-01-01

    Parity-time (PT) symmetric systems experience a phase transition between the PT exact and broken phases at an exceptional point. These PT phase transitions contribute significantly to the design of single-mode lasers, coherent perfect absorbers, isolators, and diodes. However, such exceptional points are extremely difficult to access in practice because of the dispersive behaviour of most loss and gain materials required in PT symmetric systems. Here we introduce a method to systematically tame these exceptional points and control PT phases. Our experimental demonstration hinges on an active acoustic element that realizes a complex-valued potential and simultaneously controls the multiple interference in the structure. The manipulation of exceptional points offers new routes to broaden applications for PT symmetric physics in acoustics, optics, microwaves and electronics, which are essential for sensing, communication and imaging. PMID:27025443

  16. 50 CFR 14.64 - Exceptions to export declaration requirements.

    Science.gov (United States)

    2010-10-01

    ..., preserved, dried, or embedded scientific specimens or parts thereof, exported by accredited scientists or... any specimens or parts thereof taken as a result of sport hunting. (c) Except for wildlife requiring a...

  17. 19 CFR 134.33 - J-List exceptions.

    Science.gov (United States)

    2010-04-01

    ... sheets; shafting; slabs; and metal in similar forms. Mica not further manufactured than cut or stamped to.... Scrap and waste. Screws. Shims, track. Shingles (wood), bundles of (except bundles of red-cedar shingles...

  18. 49 CFR 173.151 - Exceptions for Class 4.

    Science.gov (United States)

    2010-10-01

    ... not exceeding 30 kg (66 pounds) gross weight, may be renamed “Consumer commodity” and reclassed as ORM... addition to the exceptions provided by paragraph (b) of this section, shipments of ORM-D materials are not...

  19. Properties of the Exceptional (Xl) Laguerre and Jacobi Polynomials

    National Research Council Canada - National Science Library

    Choon-Lin Ho; Satoru Odake; Ryu Sasaki

    2011-01-01

    We present various results on the properties of the four infinite sets of the exceptional Xl polynomials discovered recently by Odake and Sasaki [Phys. Lett. B 679 (2009), 414-417; Phys. Lett. B 684 (2010), 173-176...

  20. Entropic error-disturbance relations

    Science.gov (United States)

    Coles, Patrick; Furrer, Fabian

    2014-03-01

    We derive an entropic error-disturbance relation for a sequential measurement scenario as originally considered by Heisenberg, and we discuss how our relation could be tested using existing experimental setups. Our relation is valid for discrete observables, such as spin, as well as continuous observables, such as position and momentum. The novel aspect of our relation compared to earlier versions is its clear operational interpretation and the quantification of error and disturbance using entropic quantities. This directly relates the measurement uncertainty, a fundamental property of quantum mechanics, to information theoretical limitations and offers potential applications in, for instance, quantum cryptography. PC is funded by National Research Foundation Singapore and Ministry of Education Tier 3 Grant ``Random numbers from quantum processes'' (MOE2012-T3-1-009). FF is funded by Japan Society for the Promotion of Science, KAKENHI grant No. 24-02793.

  1. Exceptional epidemics: AIDS still deserves a global response

    Directory of Open Access Journals (Sweden)

    Smith Julia

    2009-11-01

    Full Text Available There has been a renewed debate over whether AIDS deserves an exceptional response. We argue that AIDS is having differentiated impacts depending on the scale of the epidemic and the population groups affected, and so responses must be tailored accordingly. AIDS is exceptional, but not everywhere. Exceptionalism developed as a Western reaction to a once poorly understood epidemic, but remains relevant in the current multi-dimensional global response. The attack on AIDS exceptionalism has arisen because of the amount of funding targeted to the disease and the belief that AIDS activists prioritize it above other health issues. The strongest detractors of exceptionalism claim that the AIDS response has undermined health systems in developing countries. We agree that in countries with low prevalence, AIDS should be normalised and treated as a public health issue--but responses must forcefully address human rights and tackle the stigma and discrimination faced by marginalized groups. Similarly, AIDS should be normalized in countries with mid-level prevalence, except when life-long treatment is dependent on outside resources--as is the case with most African countries--because treatment dependency creates unique sustainability challenges. AIDS always requires an exceptional response in countries with high prevalence (over 10 percent). In these settings there is substantial morbidity, filling hospitals and increasing care burdens, and increased mortality, which most visibly reduces life expectancy. The idea that exceptionalism is somehow wrong is an oversimplification. The AIDS response cannot be mounted in isolation; it is part of the development agenda. It must be based on human rights principles, and it must aim to improve the health and well-being of societies as a whole.

  2. Half-maximal consistent truncations using exceptional field theory

    Science.gov (United States)

    Malek, E.

    We show how to construct half-maximal consistent truncations of 10- and 11-dimensional supergravity to seven dimensions using exceptional field theory. This procedure gives rise to a seven-dimensional half-maximal gauged supergravity coupled to n vector multiplets, with n ≠ 3 in general. We also show how these techniques can be used to reduce exceptional field theory to heterotic double field theory.

  3. Gifted students with a coexisting disability: The twice exceptional

    OpenAIRE

    Pfeiffer, Steven I.

    2015-01-01

    The twice exceptional are students who have both high ability and a disability or disorder. The ability can be in any culturally-valued domain, including high intelligence, academics, the visual or performing arts, and athletics. The co-existing disability can be physical, medical, or psychological. There is a growing literature of scholarly opinion about twice exceptionality; however, there are few well-designed empirical investigations of gifted students with anxiety, depression, bipolar di...

  4. Error studies of Halbach Magnets

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, S. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2017-03-02

    These error studies were done on the Halbach magnets for the CBETA “First Girder” as described in note [CBETA001]. The CBETA magnets have since changed slightly to the lattice in [CBETA009]. However, this is not a large enough change to significantly affect the results here. The QF and BD arc FFAG magnets are considered. For each assumed set of error distributions and each ideal magnet, 100 random magnets with errors are generated. These are then run through an automated version of the iron wire multipole cancellation algorithm. The maximum wire diameter allowed is 0.063” as in the proof-of-principle magnets. Initially, 32 wires (2 per Halbach wedge) are tried, then if this does not achieve 1e-4 level accuracy in the simulation, 48 and then 64 wires. By “1e-4 accuracy”, it is meant that the FOM defined by √(Σ_{n ≥ sextupole} (a_n² + b_n²)) is less than 1 unit, where the multipoles are taken at the maximum nominal beam radius, R = 23 mm for these magnets. The algorithm initially uses 20 convergence iterations. If 64 wires do not achieve 1e-4 accuracy, this is increased to 50 iterations to check for slow-converging cases. There are also classifications for magnets that do not achieve 1e-4 but do achieve 1e-3 (FOM ≤ 10 units). This is technically within the spec discussed in the Jan 30, 2017 review; however, there will be errors in practical shimming not dealt with in the simulation, so it is preferable to do much better than the spec in the simulation.
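
    As a minimal sketch of the figure of merit quoted above, assuming the skew (a_n) and normal (b_n) multipole coefficients are already expressed in units at the nominal beam radius R = 23 mm, the quadrature sum from the sextupole term upward can be computed as follows (coefficient values are invented):

        # Minimal sketch of the FOM described above; coefficients assumed normalised to
        # "units" at R = 23 mm, keyed by harmonic number (dipole = 1, quad = 2, sext = 3, ...).
        import math

        def fom(a, b, first_index=3):
            """Quadrature sum of skew (a) and normal (b) multipoles from sextupole upward."""
            highest = max(list(a) + list(b), default=first_index)
            terms = [a.get(n, 0.0) ** 2 + b.get(n, 0.0) ** 2
                     for n in range(first_index, highest + 1)]
            return math.sqrt(sum(terms))

        # Invented example coefficients; a magnet "passes" at FOM <= 1 unit.
        a_n = {3: 0.2, 4: 0.1}
        b_n = {3: 0.5, 5: 0.3}
        print(fom(a_n, b_n))   # -> about 0.62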

  5. Reflection of medical error highlighted on media in Turkey: A retrospective study.

    Science.gov (United States)

    Isik, Oguz; Bayin, Gamze; Ugurluoglu, Ozgur

    2016-01-01

    This study was performed with the aim of identifying how news on medical errors has been transmitted, and how the types, causes, and outcomes of medical errors have been reflected by the media in Turkey. A content analysis method was used: the data for the study were acquired by scanning the five newspapers with the largest national circulation between 2012 and 2015 for news about medical errors. Specific selection criteria were used to screen the resulting news, and 116 news items were retained after all eliminations. According to the results of the study, the vast majority of the reported medical errors (40.5%) resulted from negligence of the medical staff. The medical errors were caused by physicians in 74.1% of cases, and they most commonly occurred in state hospitals (31.9%). Another important result of the research was that medical errors largely resulted either in patient death (51.7%) or in permanent damage and disability to patients (25.0%). The news concerning medical errors provided information about the types, causes, and results of these medical errors, and also reflected the media's point of view on the issue. Examining the content of the medical errors reported by the media is important and calls for appropriate interventions to avoid and minimize the occurrence of medical errors by improving the healthcare delivery system.

  6. [To know, understand and combating medication errors related to computerized physician order entry].

    Science.gov (United States)

    Vialle, V; Tiphine, T; Poirier, Y; Raingeard, E; Feldman, D; Freville, J-C

    2011-05-01

    The aim of the study is to identify medication errors related to computerized physician order entry in our hospital. At the end of this 1-year study (2008 to 2009), 378 beds were computerized with a commercial software package. Medication errors were identified from notifications sent to the publisher of the software, feedback from health professionals, and the analysis of pharmacists' interventions formulated following prescription errors due to computerization. They were classified according to the French dictionary of medication errors of the French Society of Clinical Pharmacy. Thirty-five categories of medication errors were found. Most of them appear during prescription. Dosage and concentration errors, dose errors, omission errors and drug errors are the most frequent. Three main causes were found: human factors, closely related to the software settings and the quality of user training; communication problems, related to the ergonomics; and design problems, related to the intuitiveness and intricacy of the software. These results confirm the existence of medication errors induced by computerized physician order entry systems. They highlight the need for initial and ongoing training of users, for a relevant and scalable setup, and for the use of mature and certified software in order to minimize such errors. Copyright © 2011 Elsevier Masson SAS. All rights reserved.

  7. Strategy for Syntax Error Recovering

    Directory of Open Access Journals (Sweden)

    Henry F Báez

    2003-07-01

    Full Text Available This paper describes a new strategy for syntax error recovery for a compiler for a language that does not have instruction separators like ";" or opening and closing brackets like "{" and "}". The strategy is based on 4 steps: 1. find a set of tokens (called the ACEPTA set) for each non-terminal symbol of the grammar; 2. during the syntax analysis of each non-terminal symbol, eliminate the tokens that are not in its ACEPTA set; 3. eliminate repeated tokens that are not accepted by the grammar; and 4. complete symbols in the syntax analysis in the hope that a token that has not been erased will later match a terminal symbol expected by the syntax analyser; otherwise the symbol is eliminated in some particular productions. The strategy can be used with any non-ambiguous context-free grammar, including those that use instruction separators like ";". It is implemented as an algorithm and is much easier to implement than other strategies for syntax error recovery, such as those based on stacks.
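
    As a hedged illustration of step 2 only (the paper's grammar and ACEPTA sets are not reproduced in this record), the sketch below drops look-ahead tokens that are not in the ACEPTA set of the non-terminal currently being parsed; the token names and sets are hypothetical.

        # Hedged sketch of step 2: discard tokens outside the current non-terminal's ACEPTA set.
        # The grammar, token names and ACEPTA sets below are hypothetical, not the paper's.
        ACEPTA = {
            "statement": {"ID", "ASSIGN", "NUMBER", "IF", "WHILE"},
            "expression": {"ID", "NUMBER", "PLUS", "TIMES", "LPAREN", "RPAREN"},
        }

        def filter_tokens(tokens, nonterminal):
            """Keep only tokens acceptable inside `nonterminal`; report what was skipped."""
            kept, skipped = [], []
            for kind, text in tokens:
                (kept if kind in ACEPTA[nonterminal] else skipped).append((kind, text))
            if skipped:
                print(f"recovery in <{nonterminal}>: skipped {[t for _, t in skipped]}")
            return kept

        # "x = @ 3": the stray "@" is not in ACEPTA["statement"] and is dropped before parsing.
        stream = [("ID", "x"), ("ASSIGN", "="), ("AT", "@"), ("NUMBER", "3")]
        print(filter_tokens(stream, "statement"))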

  8. Isolating Graphical Failure-Inducing Input for Privacy Protection in Error Reporting Systems

    Directory of Open Access Journals (Sweden)

    Matos João

    2016-04-01

    Full Text Available This work proposes a new privacy-enhancing system that minimizes the disclosure of information in error reports. Error reporting mechanisms are of the utmost importance to correct software bugs but, unfortunately, the transmission of an error report may reveal users’ private information. Some privacy-enhancing systems for error reporting have been presented in recent years, yet they rely on path condition analysis, which we show in this paper to be ineffective when it comes to graphical input. Knowing that numerous applications have graphical user interfaces (GUIs), it is very important to overcome this limitation. This work describes a new privacy-enhancing error reporting system, based on a new input minimization algorithm called GUIᴍɪɴ that is geared towards GUIs, to remove input that is unnecessary to reproduce the observed failure. Before deciding whether to submit the error report, the user is provided with a step-by-step graphical replay of the minimized input, to evaluate whether it still yields sensitive information. We also provide an open source implementation of the proposed system and evaluate it with well-known applications.
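
    GUIᴍɪɴ itself is not described in enough detail in this record to reproduce, so the sketch below is only a generic greedy input-minimization loop in the spirit of delta debugging over a recorded GUI event sequence; still_fails is a hypothetical stand-in for replaying the events against the application and checking that the same failure occurs, and the recording is invented.

        # Hedged sketch of input minimization, not GUImin's actual algorithm.
        def minimize_events(events, still_fails):
            """Return a subsequence of `events` that still reproduces the failure."""
            assert still_fails(events), "the full recording must reproduce the failure"
            chunk = len(events) // 2
            while chunk >= 1:
                i = 0
                while i < len(events):
                    candidate = events[:i] + events[i + chunk:]   # try removing one chunk
                    if candidate and still_fails(candidate):
                        events = candidate                        # keep the smaller input
                    else:
                        i += chunk                                # chunk is needed, move on
                chunk //= 2
            return events

        # Toy oracle: the failure only needs the two "click save" events (invented example).
        recording = ["type name", "click save", "scroll", "click save", "resize"]
        oracle = lambda ev: ev.count("click save") >= 2
        print(minimize_events(recording, oracle))   # -> ['click save', 'click save']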

  9. [Minimally invasive approach for cervical spondylotic radiculopathy].

    Science.gov (United States)

    Ding, Liang; Sun, Taicun; Huang, Yonghui

    2010-01-01

    To summarize the recent minimally invasive approaches for cervical spondylotic radiculopathy (CSR). The recent literature at home and abroad concerning minimally invasive approaches for CSR was reviewed and summarized. There are currently two groups of minimally invasive techniques for CSR: percutaneous puncture techniques and endoscopic techniques. With percutaneous puncture techniques, the degenerated intervertebral disc is resected or subjected to nucleolysis when CSR is caused by mild or moderate intervertebral disc herniation. Cervical microendoscopic discectomy and foraminotomy is an effective minimally invasive approach that provides a clear view. Endoscopic techniques are suitable for treating CSR caused by foraminal osteophytes, lateral disc herniations, local ligamentum flavum thickening and spondylotic foraminal stenosis. The minimally invasive procedures have the advantages of simple handling, minimal invasiveness and a low incidence of complications, but the scope of indications is relatively narrow at present.

  10. Minimal Cells-Real and Imagined.

    Science.gov (United States)

    Glass, John I; Merryman, Chuck; Wise, Kim S; Hutchison, Clyde A; Smith, Hamilton O

    2017-12-01

    A minimal cell is one whose genome only encodes the minimal set of genes necessary for the cell to survive. Scientific reductionism postulates the best way to learn the first principles of cellular biology would be to use a minimal cell in which the functions of all genes and components are understood. The genes in a minimal cell are, by definition, essential. In 2016, synthesis of a genome comprised of only the set of essential and quasi-essential genes encoded by the bacterium Mycoplasma mycoides created a near-minimal bacterial cell. This organism performs the cellular functions common to all organisms. It replicates DNA, transcribes RNA, translates proteins, undergoes cell division, and little else. In this review, we examine this organism and contrast it with other bacteria that have been used as surrogates for a minimal cell. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  11. Reducing Soft-error Vulnerability of Caches using Data Compression

    Energy Technology Data Exchange (ETDEWEB)

    Mittal, Sparsh [ORNL; Vetter, Jeffrey S [ORNL

    2016-01-01

    With ongoing chip miniaturization and voltage scaling, particle strike-induced soft errors present an increasingly severe threat to the reliability of on-chip caches. In this paper, we present a technique to reduce the vulnerability of caches to soft errors. Our technique uses data compression to reduce the number of vulnerable data bits in the cache and performs selective duplication of more critical data bits to provide extra protection to them. Microarchitectural simulations have shown that our technique is effective in reducing the architectural vulnerability factor (AVF) of the cache and outperforms another technique. For single- and dual-core system configurations, the average reduction in AVF is 5.59X and 8.44X, respectively. Also, the implementation and performance overheads of our technique are minimal, and it is useful for a broad range of workloads.

  12. Heart bypass surgery - minimally invasive - discharge

    Science.gov (United States)

    Minimally invasive direct coronary artery bypass - discharge; MIDCAB - discharge; Robot assisted coronary artery bypass - discharge; RACAB - discharge; Keyhole heart surgery - discharge; Coronary artery disease - MIDCAB discharge; CAD - ...

  13. Minimally invasive approaches to the cervical spine.

    Science.gov (United States)

    Celestre, Paul C; Pazmiño, Pablo R; Mikhael, Mark M; Wolf, Christopher F; Feldman, Lacey A; Lauryssen, Carl; Wang, Jeffrey C

    2012-01-01

    Minimally invasive approaches and operative techniques are becoming increasingly popular for the treatment of cervical spine disorders. Minimally invasive spine surgery attempts to decrease iatrogenic muscle injury, decrease pain, and speed postoperative recovery with the use of smaller incisions and specialized instruments. This article explains in detail minimally invasive approaches to the posterior spine, the techniques for posterior cervical foraminotomy and arthrodesis via lateral mass screw placement, and anterior cervical foraminotomy. Complications are also discussed. Additionally, illustrated cases are presented detailing the use of minimally invasive surgical techniques. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Righting errors in writing errors: the Wing and Baddeley (1980) spelling error corpus revisited.

    Science.gov (United States)

    Wing, Alan M; Baddeley, Alan D

    2009-03-01

    We present a new analysis of our previously published corpus of handwriting errors (slips) using the proportional allocation algorithm of Machtynger and Shallice (2009). As previously, the proportion of slips is greater in the middle of the word than at the ends; however, in contrast to before, the proportion is greater at the end than at the beginning of the word. The findings are consistent with the hypothesis of memory effects in a graphemic output buffer.

  15. Evaluating student progress through error reduction in intraoral radiographic technique.

    Science.gov (United States)

    Patel, J R; Greer, D F

    1986-10-01

    A simple system was developed to collate the errors made when radiographic surveys are taken of the complete mouth. A radiographic critique form was used to evaluate each radiograph. This study used 1150 complete-mouth radiographic surveys made by junior dental students. From a total of 24,150 radiographs, 2238 were clinically unacceptable as a result of one or more errors in technique. No retakes caused by processing or mechanical errors were considered. The four major errors that were found in the study included cone cutting (11.17%), incorrect vertical angulation (11.75%), incorrect horizontal angulation (4.6%), and incorrect film placement (64.9%). Although expected, perhaps the most noteworthy finding was that there was a statistically significant difference between the performance of students during the first quarter as opposed to the third quarter of clinical training; this indicated the need for a minimum of twenty to twenty-five complete mouth radiographic surveys to achieve minimal technical proficiency.

  16. Temporal prediction errors in visual and auditory cortices.

    Science.gov (United States)

    Lee, Hweeling; Noppeney, Uta

    2014-04-14

    To form a coherent percept of the environment, the brain needs to bind sensory signals emanating from a common source, but to segregate those from different sources [1]. Temporal correlations and synchrony act as prominent cues for multisensory integration [2-4], but the neural mechanisms by which such cues are identified remain unclear. Predictive coding suggests that the brain iteratively optimizes an internal model of its environment by minimizing the errors between its predictions and the sensory inputs [5,6]. This model enables the brain to predict the temporal evolution of natural audiovisual inputs and their statistical (for example, temporal) relationship. A prediction of this theory is that asynchronous audiovisual signals violating the model's predictions induce an error signal that depends on the directionality of the audiovisual asynchrony. As the visual system generates the dominant temporal predictions for visual leading asynchrony, the delayed auditory inputs are expected to generate a prediction error signal in the auditory system (and vice versa for auditory leading asynchrony). Using functional magnetic resonance imaging (fMRI), we measured participants' brain responses to synchronous, visual leading and auditory leading movies of speech, sinewave speech or music. In line with predictive coding, auditory leading asynchrony elicited a prediction error in visual cortices and visual leading asynchrony in auditory cortices. Our results reveal predictive coding as a generic mechanism to temporally bind signals from multiple senses into a coherent percept. Copyright © 2014 Elsevier Ltd. All rights reserved.
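
    As a purely illustrative, textbook-style formulation of the prediction-error idea described above (not the specific computational model tested in the study), an error unit compares the sensory input s_t with the top-down prediction, and the internal model parameters are adjusted to reduce the precision-weighted squared error:

        % Assumed generic predictive-coding sketch, not the study's model.
        \varepsilon_t = s_t - \hat{s}_t(\theta), \qquad
        \theta \leftarrow \theta - \eta \, \nabla_{\theta} \tfrac{1}{2} \sum_t \pi_t \, \varepsilon_t^{2}

    where π_t is a precision weight and η a learning rate.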

  17. Error tracking in a clinical biochemistry laboratory

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Ødum, Lars

    2009-01-01

    BACKGROUND: We report our results for the systematic recording of all errors in a standard clinical laboratory over a 1-year period. METHODS: Recording was performed using a commercial database program. All individuals in the laboratory were allowed to report errors. The testing processes were ... classified according to function, and errors were classified as pre-analytical, analytical, post-analytical, or service-related, and then further divided into descriptive subgroups. Samples were taken from hospital wards (38.6%), outpatient clinics (25.7%), general practitioners (29.4%), and other hospitals ... RESULTS: A total of 1189 errors were reported in 1151 reports during the first year, corresponding to an error rate of 1 error for every 142 patients, or 1 per 1223 tests. The majority of events were due to human errors (82.6%), and only a few (4.3%) were the result of technical errors. Most of the errors ...

  18. BANKRUPTCY PREDICTION MODEL WITH ZETAc OPTIMAL CUT-OFF SCORE TO CORRECT TYPE I ERRORS

    Directory of Open Access Journals (Sweden)

    Mohamad Iwan

    2005-06-01

    This research has successfully attained the following results: (1) type I error is in fact 59.83 times more costly than type II error, (2) 22 ratios distinguish between bankrupt and non-bankrupt groups, (3) 2 financial ratios proved to be effective in predicting bankruptcy, (4) prediction using the ZETAc optimal cut-off score predicts more companies filing for bankruptcy within one year compared to prediction using the Hair et al. optimum cutting score, and (5) although prediction using the Hair et al. optimum cutting score is more accurate, prediction using the ZETAc optimal cut-off score proved to be able to minimize the cost incurred from classification errors.
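
    Neither the ZETA-style score nor the sample is reproduced in this record, so the following is only a hedged sketch of the cut-off selection step: scan candidate thresholds on a bankruptcy score and keep the one minimizing the expected misclassification cost when a type I error (a soon-to-be-bankrupt firm classified as healthy) is weighted 59.83 times a type II error. Scores and labels are invented.

        # Hedged sketch of cost-sensitive cut-off selection, not the ZETA model itself.
        import numpy as np

        def optimal_cutoff(scores, is_bankrupt, cost_type1=59.83, cost_type2=1.0):
            """Higher score = healthier firm; firms scoring below the cut-off are flagged."""
            best_cut, best_cost = None, float("inf")
            for cut in np.unique(scores):
                flagged = scores < cut
                type1 = np.sum(is_bankrupt & ~flagged)      # bankrupt firm not flagged
                type2 = np.sum(~is_bankrupt & flagged)      # healthy firm flagged
                cost = cost_type1 * type1 + cost_type2 * type2
                if cost < best_cost:
                    best_cut, best_cost = cut, cost
            return best_cut, best_cost

        # Invented scores and bankruptcy labels, purely for illustration.
        scores = np.array([0.1, 0.4, 0.5, 1.2, 1.5, 2.0, 2.3, 3.1])
        is_bankrupt = np.array([True, True, False, True, False, False, False, False])
        print(optimal_cutoff(scores, is_bankrupt))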

  19. The possible benefits of reduced errors in the motor skills acquisition of children

    Directory of Open Access Journals (Sweden)

    Capio Catherine M

    2012-01-01

    Full Text Available Abstract An implicit approach to motor learning suggests that relatively complex movement skills may be better acquired in environments that constrain errors during the initial stages of practice. This current concept paper proposes that reducing the number of errors committed during motor learning leads to stable performance when attention demands are increased by concurrent cognitive tasks. While it appears that this approach to practice may be beneficial for motor learning, further studies are needed to both confirm this advantage and better understand the underlying mechanisms. An approach involving error minimization during early learning may have important applications in paediatric rehabilitation.

  20. Active and passive compensation of APPLE II-introduced multipole errors through beam-based measurement

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Ting-Yi; Huang, Szu-Jung; Fu, Huang-Wen; Chang, Ho-Ping; Chang, Cheng-Hsiang [National Synchrotron Radiation Research Center, Hsinchu Science Park, Hsinchu 30076, Taiwan (China); Hwang, Ching-Shiang [National Synchrotron Radiation Research Center, Hsinchu Science Park, Hsinchu 30076, Taiwan (China); Department of Electrophysics, National Chiao Tung University, Hsinchu 30050, Taiwan (China)

    2016-08-01

    The effect of an APPLE II-type elliptically polarized undulator (EPU) on the beam dynamics was investigated using active and passive methods. To reduce the tune shift and improve the injection efficiency, dynamic multipole errors were compensated using L-shaped iron shims, which resulted in stable top-up operation for a minimum gap. The skew quadrupole error was compensated using a multipole corrector, located downstream of the EPU, to minimize betatron coupling, thereby enhancing the synchrotron radiation brightness. The investigation methods, a numerical simulation algorithm, a multipole error correction method, and the beam-based measurement results are discussed.