Students’ Errors in Geometry Viewed from Spatial Intelligence
Riastuti, N.; Mardiyana, M.; Pramudya, I.
2017-09-01
Geometry is one of the more difficult materials because students must be able to visualize, describe images, draw shapes, and recognize kinds of shapes. The aim of this study is to describe students' errors, based on Newman's Error Analysis, in solving geometry problems, viewed from spatial intelligence. This research uses a descriptive qualitative method with a purposive sampling technique. The data in this research are the results of a geometry test and interviews with 8th graders of a junior high school in Indonesia. The results of this study show that each category of spatial intelligence exhibits a different type of error in solving geometry problems. Errors are mostly made by students with low spatial intelligence because of their deficient visual abilities. Analysis of student errors viewed from spatial intelligence is expected to help students reflect on their solutions to geometry problems.
Entropy Error Model of Planar Geometry Features in GIS
Institute of Scientific and Technical Information of China (English)
LI Dajun; GUAN Yunlan; GONG Jianya; DU Daosheng
2003-01-01
Positional error of line segments is usually described using the "g-band"; however, its band width depends on the choice of confidence level. In fact, given different confidence levels, a series of concentric bands is obtained. To remove the effect of the confidence level on the error indicator, we introduce union entropy theory and propose an entropy error ellipse index for a point, then extend it to line segments and polygons, establishing an entropy error band for line segments and an entropy error donut for polygons. The research shows that the entropy error index is determined uniquely, is not influenced by the confidence level, and is suitable for describing the positional uncertainty of planar geometry features.
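The index rests on the entropy of a bivariate Gaussian positional error, which depends only on the determinant of the error covariance and hence not on any confidence-level choice. A minimal sketch of that entropy formula (an illustration of the invariance argument only, not the paper's full band/donut construction):

```python
import math

def bivariate_gaussian_entropy(sxx, syy, sxy):
    """Shannon entropy H = ln(2*pi*e*sqrt(det(Sigma))) of a 2-D Gaussian
    positional error with covariance [[sxx, sxy], [sxy, syy]]."""
    det = sxx * syy - sxy * sxy
    if det <= 0:
        raise ValueError("covariance must be positive definite")
    return math.log(2 * math.pi * math.e * math.sqrt(det))
```

Unlike a confidence band, this value is a single number per point: scaling the error covariance up strictly increases it, and no confidence level enters the computation.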
Euclidean Geometry Codes, minimum weight words and decodable error-patterns using bit-flipping
DEFF Research Database (Denmark)
Høholdt, Tom; Justesen, Jørn; Jonsson, Bergtor
2005-01-01
We determine the number of minimum weight words in a class of Euclidean Geometry codes and link the performance of the bit-flipping decoding algorithm to the geometry of the error patterns.
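Bit-flipping decoding itself is simple to state; as a sketch, here it is run on the (7,4) Hamming code (a stand-in choice for illustration; the paper's codes are Euclidean Geometry codes):

```python
def bit_flip_decode(H, word, max_iters=20):
    """Gallager-style bit flipping: repeatedly flip the bit involved in the
    most unsatisfied parity checks until all checks are satisfied."""
    w = list(word)
    n = len(w)
    for _ in range(max_iters):
        syndrome = [sum(H[r][c] * w[c] for c in range(n)) % 2 for r in range(len(H))]
        if not any(syndrome):
            return w  # all parity checks satisfied
        # count the unsatisfied checks touching each bit, flip the worst bit
        counts = [sum(H[r][c] for r in range(len(H)) if syndrome[r]) for c in range(n)]
        w[counts.index(max(counts))] ^= 1
    return w

# parity-check matrix of the (7,4) Hamming code
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
```

Whether such a decoder succeeds depends not just on the number of errors but on which checks the error pattern touches, which is the geometric connection the abstract refers to.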
Errors Analysis of Students in Mathematics Department to Learn Plane Geometry
Mirna, M.
2018-04-01
This article describes the results of a qualitative descriptive study that reveals the locations, types, and causes of student errors in answering plane geometry problems at the problem-solving level. Answers from 59 students on three test items showed errors ranging from understanding the concepts and principles of geometry itself to errors in applying them to problem solving. The types of error consist of concept errors, principle errors, and operational errors. Reflection with four subjects revealed the causes of the errors: 1) students' learning motivation is very low, 2) in their high school experience, geometry was treated as unimportant, 3) students have very little experience using their own reasoning in problem solving, and 4) students' reasoning ability is still very low.
Probabilistic error bounds for reduced order modeling
Energy Technology Data Exchange (ETDEWEB)
Abdo, M.G.; Wang, C.; Abdel-Khalik, H.S., E-mail: abdo@purdue.edu, E-mail: wang1730@purdue.edu, E-mail: abdelkhalik@purdue.edu [Purdue Univ., School of Nuclear Engineering, West Lafayette, IN (United States)
2015-07-01
Reduced order modeling (ROM) has proven to be an effective tool when repeated execution of reactor analysis codes is required. ROM operates on the assumption that the intrinsic dimensionality of the associated reactor physics models is sufficiently small compared to the nominal dimensionality of the input and output data streams. By employing a truncation technique with roots in linear algebra matrix decomposition theory, ROM effectively discards all components of the input and output data that have negligible impact on the reactor attributes of interest. This manuscript introduces a mathematical approach to quantify the errors resulting from the discarded ROM components. As supported by numerical experiments, the introduced analysis proves that the contribution of the discarded components can be upper-bounded with an overwhelmingly high probability. Conversely, this implies that the ROM algorithm can self-adapt to determine the level of reduction needed such that the maximum resulting reduction error is below a tolerance limit set by the user. (author)
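The truncation-error idea can be illustrated with a toy model whose singular values are given directly: the spectral-norm error of discarding components is the largest discarded singular value, and the Frobenius-norm error is the root-sum-square of the discarded tail. A pure-Python sketch (illustrative of deterministic truncation bounds only, not the paper's probabilistic bound):

```python
def truncate(singular_values, k):
    """Keep the k largest singular values and zero the rest; return the kept
    values plus the spectral-norm and Frobenius-norm truncation errors."""
    s = sorted(singular_values, reverse=True)
    kept, discarded = s[:k], s[k:]
    spectral_error = max(discarded, default=0.0)          # largest discarded sigma
    frobenius_error = sum(x * x for x in discarded) ** 0.5  # root-sum-square of tail
    return kept, spectral_error, frobenius_error
```

A self-adapting reduction in the spirit of the abstract would increase k until the tail bound falls below the user's tolerance.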
Huo, Ming-Xia; Li, Ying
2017-12-01
Quantum error correction is important for quantum information processing because it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction; no adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on past error correction data. We find that, using these estimated error rates, the probability of error correction failure can be reduced by a factor that increases with the code distance.
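The paper's estimator is a Gaussian process over past error correction data; as a deliberately simplified stand-in (my substitution, not the authors' algorithm), a Beta-Binomial posterior already gives a running error-rate estimate from counts of detected errors in syndrome data:

```python
def beta_posterior_rate(errors, trials, a=1.0, b=1.0):
    """Posterior mean of an error rate under a Beta(a, b) prior after
    observing `errors` detected failures in `trials` correction rounds."""
    if errors < 0 or trials < errors:
        raise ValueError("need 0 <= errors <= trials")
    return (a + errors) / (a + b + trials)
```

A Gaussian process adds what this lacks: smooth tracking of a rate that drifts over time, with predictive uncertainty, which is what makes the real-time monitoring useful.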
Fast Erasure and Error decoding of Algebraic Geometry Codes up to the Feng-Rao Bound
DEFF Research Database (Denmark)
Jensen, Helge Elbrønd; Sakata, S.; Leonard, D.
1996-01-01
This paper gives an errata- (that is, erasure-and-error-) decoding algorithm for one-point algebraic geometry codes up to the Feng-Rao designed minimum distance, using Sakata's multidimensional generalization of the Berlekamp-Massey algorithm and the voting procedure of Feng and Rao.
Grauer, Jared A.; Morelli, Eugene A.
2013-01-01
A nonlinear simulation of the NASA Generic Transport Model was used to investigate the effects of errors in sensor measurements, mass properties, and aircraft geometry on the accuracy of dynamic models identified from flight data. Measurements from a typical system identification maneuver were systematically and progressively deteriorated and then used to estimate stability and control derivatives within a Monte Carlo analysis. Based on the results, recommendations were provided for maximum allowable errors in sensor measurements, mass properties, and aircraft geometry to achieve desired levels of dynamic modeling accuracy. Results using other flight conditions, parameter estimation methods, and a full-scale F-16 nonlinear aircraft simulation were compared with these recommendations.
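The Monte Carlo recipe (corrupt the measurements, re-estimate the parameters, watch the estimate spread grow) can be sketched with a one-parameter model. This is a toy stand-in; the study estimates full stability and control derivatives from simulated flight data:

```python
import random

def estimate_spread(noise_std, n_trials=200, n_samples=50, true_slope=2.0, seed=0):
    """Monte Carlo sensitivity study: least-squares slope estimates of
    y = a*x under additive sensor noise; returns the standard deviation
    of the estimates across trials."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_trials):
        xs = [i / n_samples for i in range(1, n_samples + 1)]
        ys = [true_slope * x + rng.gauss(0.0, noise_std) for x in xs]
        # closed-form least-squares slope through the origin
        a_hat = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
        estimates.append(a_hat)
    mean = sum(estimates) / n_trials
    return (sum((e - mean) ** 2 for e in estimates) / n_trials) ** 0.5
```

Sweeping `noise_std` upward and recording where the spread exceeds a target accuracy is, in miniature, how maximum allowable sensor errors can be recommended.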
FMEA: a model for reducing medical errors.
Chiozza, Maria Laura; Ponzetti, Clemente
2009-06-01
Patient safety is a management issue, in view of the fact that clinical risk management has become an important part of hospital management. Failure Mode and Effect Analysis (FMEA) is a proactive technique for error detection and reduction, firstly introduced within the aerospace industry in the 1960s. Early applications in the health care industry dating back to the 1990s included critical systems in the development and manufacture of drugs and in the prevention of medication errors in hospitals. In 2008, the Technical Committee of the International Organization for Standardization (ISO), licensed a technical specification for medical laboratories suggesting FMEA as a method for prospective risk analysis of high-risk processes. Here we describe the main steps of the FMEA process and review data available on the application of this technique to laboratory medicine. A significant reduction of the risk priority number (RPN) was obtained when applying FMEA to blood cross-matching, to clinical chemistry analytes, as well as to point-of-care testing (POCT).
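FMEA scores each failure mode on severity, occurrence, and detectability (conventionally 1-10 each) and multiplies them into the risk priority number (RPN); re-scoring after corrective action quantifies the risk reduction. A minimal sketch:

```python
def rpn(severity, occurrence, detection):
    """Risk priority number: product of the three 1-10 FMEA scores."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are conventionally 1-10")
    return severity * occurrence * detection

def rank_failure_modes(modes):
    """Sort (name, severity, occurrence, detection) tuples by descending RPN,
    so the highest-risk process steps are addressed first."""
    return sorted(modes, key=lambda m: rpn(m[1], m[2], m[3]), reverse=True)
```

The failure-mode names below are hypothetical laboratory examples, not cases from the review:

```python
modes = [("mislabeled sample", 9, 2, 5), ("delayed report", 4, 6, 2)]
```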
Reducing diagnostic errors in medicine: what's the goal?
Graber, Mark; Gordon, Ruthanna; Franklin, Nancy
2002-10-01
This review considers the feasibility of reducing or eliminating the three major categories of diagnostic errors in medicine: "No-fault errors" occur when the disease is silent, presents atypically, or mimics something more common. These errors will inevitably decline as medical science advances, new syndromes are identified, and diseases can be detected more accurately or at earlier stages. These errors can never be eradicated, unfortunately, because new diseases emerge, tests are never perfect, patients are sometimes noncompliant, and physicians will inevitably, at times, choose the most likely diagnosis over the correct one, illustrating the concept of necessary fallibility and the probabilistic nature of choosing a diagnosis. "System errors" play a role when diagnosis is delayed or missed because of latent imperfections in the health care system. These errors can be reduced by system improvements, but can never be eliminated because these improvements lag behind and degrade over time, and each new fix creates the opportunity for novel errors. Tradeoffs also guarantee system errors will persist, when resources are just shifted. "Cognitive errors" reflect misdiagnosis from faulty data collection or interpretation, flawed reasoning, or incomplete knowledge. The limitations of human processing and the inherent biases in using heuristics guarantee that these errors will persist. Opportunities exist, however, for improving the cognitive aspect of diagnosis by adopting system-level changes (e.g., second opinions, decision-support systems, enhanced access to specialists) and by training designed to improve cognition or cognitive awareness. Diagnostic error can be substantially reduced, but never eradicated.
Minimizing pulling geometry errors in atomic force microscope single molecule force spectroscopy.
Rivera, Monica; Lee, Whasil; Ke, Changhong; Marszalek, Piotr E; Cole, Daniel G; Clark, Robert L
2008-10-01
In atomic force microscopy-based single molecule force spectroscopy (AFM-SMFS), it is assumed that the pulling angle is negligible and that the force applied to the molecule is equivalent to the force measured by the instrument. Recent studies, however, have indicated that the pulling geometry errors can drastically alter the measured force-extension relationship of molecules. Here we describe a software-based alignment method that repositions the cantilever such that it is located directly above the molecule's substrate attachment site. By aligning the applied force with the measurement axis, the molecule is no longer undergoing combined loading, and the full force can be measured by the cantilever. Simulations and experimental results verify the ability of the alignment program to minimize pulling geometry errors in AFM-SMFS studies.
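The combined-loading error has a simple geometric core: the cantilever reads only the force component along its measurement axis. A sketch of that projection (the plain cosine relation is assumed here for illustration; the paper's loading model is more detailed):

```python
import math

def measured_force(applied_force, pulling_angle_deg):
    """Force component along the cantilever's vertical measurement axis
    when the molecule is pulled at an angle away from that axis."""
    return applied_force * math.cos(math.radians(pulling_angle_deg))

def force_error_percent(pulling_angle_deg):
    """Relative underestimate of the applied force, in percent."""
    return 100.0 * (1.0 - math.cos(math.radians(pulling_angle_deg)))
```

Driving the pulling angle toward zero, which is what the alignment procedure does by repositioning the cantilever over the attachment site, drives this error to zero.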
International Nuclear Information System (INIS)
Barros, R.C. de; Larsen, E.W.
1991-01-01
A generalization of the one-group Spectral Green's Function (SGF) method is developed for multigroup, slab-geometry discrete ordinates (S_N) problems. The multigroup SGF method is free from spatial truncation errors; it generates numerical values for the cell-edge and cell-average angular fluxes that agree with the analytic solution of the multigroup S_N equations. Numerical results are given to illustrate the method's accuracy.
Perceptual learning eases crowding by reducing recognition errors but not position errors.
Xiong, Ying-Zi; Yu, Cong; Zhang, Jun-Yun
2015-08-01
When an observer reports a letter flanked by additional letters in the visual periphery, the response errors (the crowding effect) may result from failure to recognize the target letter (recognition errors), from mislocating a correctly recognized target letter at a flanker location (target misplacement errors), or from reporting a flanker as the target letter (flanker substitution errors). Crowding can be reduced through perceptual learning. However, it is not known how perceptual learning operates to reduce crowding. In this study we trained observers with a partial-report task (Experiment 1), in which they reported the central target letter of a three-letter string presented in the visual periphery, or a whole-report task (Experiment 2), in which they reported all three letters in order. We then assessed the impact of training on recognition of both unflanked and flanked targets, with particular attention to how perceptual learning affected the types of errors. Our results show that training improved target recognition but not single-letter recognition, indicating that training indeed affected crowding. However, training did not reduce target misplacement errors or flanker substitution errors. This dissociation between target recognition and flanker substitution errors supports the view that flanker substitution may be more likely a by-product (due to response bias), rather than a cause, of crowding. Moreover, the dissociation is not consistent with hypothesized mechanisms of crowding that would predict reduced positional errors.
Electronic prescribing reduces prescribing error in public hospitals.
Shawahna, Ramzi; Rahman, Nisar-Ur; Ahmad, Mahmood; Debray, Marcel; Yliperttula, Marjo; Declèves, Xavier
2011-11-01
To examine the incidence of prescribing errors in a main public hospital in Pakistan and to assess the impact of introducing an electronic prescribing system on reducing their incidence. Medication errors are persistent in today's healthcare system, and the impact of electronic prescribing on reducing errors had not been tested in the developing world. Prospective review of medication and discharge medication charts before and after the introduction of an electronic inpatient record and prescribing system. Inpatient records (n = 3300) and 1100 discharge medication sheets were reviewed for prescribing errors before and after the installation of the electronic prescribing system in 11 wards. Medications (13,328 and 14,064) were prescribed for inpatients, among which 3008 and 1147 prescribing errors were identified, giving overall error rates of 22.6% and 8.2% during paper-based and electronic prescribing, respectively. Medications (2480 and 2790) were prescribed for discharge patients, among which 418 and 123 errors were detected, giving overall error rates of 16.9% and 4.4% during paper-based and electronic prescribing, respectively. Electronic prescribing has a significant effect on the reduction of prescribing errors, which are commonplace in Pakistan's public hospitals. © 2011 Blackwell Publishing Ltd.
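The reported rates follow directly from the counts quoted above; a quick check of the arithmetic:

```python
def error_rate_percent(errors, prescriptions):
    """Prescribing errors per 100 prescribed medications, as a percentage."""
    return 100.0 * errors / prescriptions

# counts from the abstract: inpatient and discharge, paper vs. electronic
rates = {
    "inpatient_paper": error_rate_percent(3008, 13328),
    "inpatient_electronic": error_rate_percent(1147, 14064),
    "discharge_paper": error_rate_percent(418, 2480),
    "discharge_electronic": error_rate_percent(123, 2790),
}
```

All four reproduce the abstract's figures (22.6%, 8.2%, 16.9%, 4.4%) to one decimal place.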
Reduced phase error through optimized control of a superconducting qubit
International Nuclear Information System (INIS)
Lucero, Erik; Kelly, Julian; Bialczak, Radoslaw C.; Lenander, Mike; Mariantoni, Matteo; Neeley, Matthew; O'Connell, A. D.; Sank, Daniel; Wang, H.; Weides, Martin; Wenner, James; Cleland, A. N.; Martinis, John M.; Yamamoto, Tsuyoshi
2010-01-01
Minimizing phase and other errors in experimental quantum gates allows higher-fidelity quantum processing. To quantify and correct for phase errors in particular, we have developed an experimental metrology - amplified phase error (APE) pulses - that amplifies and helps identify phase errors in general multilevel qubit architectures. In order to correct for both phase and amplitude errors specific to virtual transitions and leakage outside of the qubit manifold, we implement 'half derivative', an experimental simplification of derivative reduction by adiabatic gate (DRAG) control theory. The phase errors are lowered by about a factor of five using this method, to ∼1.6 deg. per gate, and can be tuned to zero. Leakage outside the qubit manifold, to the qubit |2> state, is also reduced to ∼10^-4 for 20% faster gates.
Sossinsky, A B
2012-01-01
The book is an innovative modern exposition of geometry, or rather, of geometries; it is the first textbook in which Felix Klein's Erlangen Program (the action of transformation groups) is systematically used as the basis for defining various geometries. The course of study presented is dedicated to the proposition that all geometries are created equal--although some, of course, remain more equal than others. The author concentrates on several of the more distinguished and beautiful ones, which include what he terms "toy geometries", the geometries of Platonic bodies, discrete geometries, and classical continuous geometries. The text is based on first-year semester course lectures delivered at the Independent University of Moscow in 2003 and 2006. It is by no means a formal algebraic or analytic treatment of geometric topics, but rather, a highly visual exposition containing upwards of 200 illustrations. The reader is expected to possess a familiarity with elementary Euclidean geometry, albeit those lacking t...
Indian Academy of Sciences (India)
In the previous article we looked at the origins of synthetic and analytic geometry. More practical-minded people, the builders and navigators, were studying two other aspects of geometry: trigonometry and integral calculus. These are actually ...
Reduced error signalling in medication-naive children with ADHD
DEFF Research Database (Denmark)
Plessen, Kerstin J; Allen, Elena A; Eichele, Heike
2016-01-01
BACKGROUND: We examined the blood-oxygen level-dependent (BOLD) activation in brain regions that signal errors and their association with intraindividual behavioural variability and adaptation to errors in children with attention-deficit/hyperactivity disorder (ADHD). METHODS: We acquired ... RESULTS: ... reduced in children with ADHD. This adaptation was inversely related to activation of the right-lateralized ventral attention network (VAN) on error trials and to task-driven connectivity between the cingulo-opercular system and the VAN. LIMITATIONS: Our study was limited by the modest sample size ...
Errors as a Means of Reducing Impulsive Food Choice.
Sellitto, Manuela; di Pellegrino, Giuseppe
2016-06-05
Nowadays, the increasing incidence of eating disorders due to poor self-control has given rise to increased obesity and other chronic weight problems, and ultimately, to reduced life expectancy. The capacity to refrain from automatic responses is usually high in situations in which making errors is highly likely. The protocol described here aims at reducing imprudent preference in women during hypothetical intertemporal choices about appetitive food by associating it with errors. First, participants undergo an error task where two different edible stimuli are associated with two different error likelihoods (high and low). Second, they make intertemporal choices about the two edible stimuli, separately. As a result, this method decreases the discount rate for future amounts of the edible reward that cued higher error likelihood, selectively. This effect is under the influence of the self-reported hunger level. The present protocol demonstrates that errors, well known as motivationally salient events, can induce the recruitment of cognitive control, thus being ultimately useful in reducing impatient choices for edible commodities.
Prasolov, V V
2015-01-01
This book provides a systematic introduction to various geometries, including Euclidean, affine, projective, spherical, and hyperbolic geometries. Also included is a chapter on infinite-dimensional generalizations of Euclidean and affine geometries. A uniform approach to different geometries, based on Klein's Erlangen Program is suggested, and similarities of various phenomena in all geometries are traced. An important notion of duality of geometric objects is highlighted throughout the book. The authors also include a detailed presentation of the theory of conics and quadrics, including the theory of conics for non-Euclidean geometries. The book contains many beautiful geometric facts and has plenty of problems, most of them with solutions, which nicely supplement the main text. With more than 150 figures illustrating the arguments, the book can be recommended as a textbook for undergraduate and graduate-level courses in geometry.
Reduced error signalling in medication-naive children with ADHD
DEFF Research Database (Denmark)
Plessen, Kerstin J; Allen, Elena A; Eichele, Heike
2016-01-01
BACKGROUND: We examined the blood-oxygen level-dependent (BOLD) activation in brain regions that signal errors and their association with intraindividual behavioural variability and adaptation to errors in children with attention-deficit/hyperactivity disorder (ADHD). METHODS: We acquired functional MRI data during a Flanker task in medication-naive children with ADHD and healthy controls aged 8-12 years and analyzed the data using independent component analysis. For components corresponding to performance monitoring networks, we compared activations across groups and conditions and correlated them with reaction times (RT). Additionally, we analyzed post-error adaptations in behaviour and motor component activations. RESULTS: We included 25 children with ADHD and 29 controls in our analysis. Children with ADHD displayed reduced activation to errors in cingulo-opercular regions ...
Reducing Error, Fraud and Corruption (EFC) in Social Protection Programs
Tesliuc, Emil Daniel; Milazzo, Annamaria
2007-01-01
Because Social Protection (SP) and Social Safety Net (SSN) programs channel a large amount of public resources, it is important to make sure that these resources reach the intended beneficiaries. Error, fraud, or corruption (EFC) reduces the economic efficiency of these interventions by decreasing the amount of money that goes to the intended beneficiaries, and erodes the political support for the program. ...
Reducing Approximation Error in the Fourier Flexible Functional Form
Directory of Open Access Journals (Sweden)
Tristan D. Skolrud
2017-12-01
The Fourier Flexible form provides a global approximation to an unknown data generating process. In terms of limiting function specification error, this form is preferable to functional forms based on second-order Taylor series expansions. The Fourier Flexible form is a truncated Fourier series expansion appended to a second-order expansion in logarithms. By replacing the logarithmic expansion with a Box-Cox transformation, we show that the Fourier Flexible form can reduce approximation error by 25% on average in the tails of the data distribution. The new functional form allows for nested testing of a larger set of commonly implemented functional forms.
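The key substitution replaces the logarithmic expansion with the Box-Cox transform, which recovers the logarithm as its λ → 0 limit. A sketch of the transform and that limiting behaviour:

```python
import math

def box_cox(x, lam):
    """Box-Cox transform: (x**lam - 1)/lam for lam != 0, ln(x) at lam = 0."""
    if x <= 0:
        raise ValueError("Box-Cox requires positive x")
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1.0) / lam
```

At λ = 1 the transform is a simple shift of x, so the family nests both the logarithmic and (up to affine terms) linear specifications, which is what enables the nested testing mentioned in the abstract.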
Stereotype threat can reduce older adults' memory errors.
Barber, Sarah J; Mather, Mara
2013-01-01
Stereotype threat often incurs the cost of reducing the amount of information that older adults accurately recall. In the current research, we tested whether stereotype threat can also benefit memory. According to the regulatory focus account of stereotype threat, threat induces a prevention focus in which people become concerned with avoiding errors of commission and are sensitive to the presence or absence of losses within their environment. Because of this, we predicted that stereotype threat might reduce older adults' memory errors. Results were consistent with this prediction. Older adults under stereotype threat had lower intrusion rates during free-recall tests (Experiments 1 and 2). They also reduced their false alarms and adopted more conservative response criteria during a recognition test (Experiment 2). Thus, stereotype threat can decrease older adults' false memories, albeit at the cost of fewer veridical memories, as well.
Reducing systematic errors in measurements made by a SQUID magnetometer
International Nuclear Information System (INIS)
Kiss, L.F.; Kaptás, D.; Balogh, J.
2014-01-01
A simple method is described which reduces those systematic errors of a superconducting quantum interference device (SQUID) magnetometer that arise from possible radial displacements of the sample in the second-order gradiometer superconducting pickup coil. By rotating the sample rod (and hence the sample) around its axis into a position where the best fit is obtained to the output voltage of the SQUID as the sample is moved through the pickup coil, the accuracy of measuring magnetic moments can be increased significantly. In the cases of an examined Co1.9Fe1.1Si Heusler alloy, pure iron and nickel samples, the accuracy could be increased over the value given in the specification of the device. The suggested method is only meaningful if the measurement uncertainty is dominated by systematic errors – radial displacement in particular – and not by instrumental or environmental noise. - Highlights: • A simple method is described which reduces systematic errors of a SQUID. • The errors arise from a radial displacement of the sample in the gradiometer coil. • The procedure is to rotate the sample rod (with the sample) around its axis. • The best fit to the SQUID voltage has to be attained moving the sample through the coil. • The accuracy of measuring magnetic moment can be increased significantly
Pedoe, Dan
1988-01-01
"A lucid and masterly survey." - Mathematics Gazette. Professor Pedoe is widely known as a fine teacher and a fine geometer. His abilities in both areas are clearly evident in this self-contained, well-written, and lucid introduction to the scope and methods of elementary geometry. It covers the geometry usually included in undergraduate courses in mathematics, except for the theory of convex sets. Based on a course given by the author for several years at the University of Minnesota, the main purpose of the book is to increase geometrical, and therefore mathematical, understanding and to he...
Reducing errors in the GRACE gravity solutions using regularization
Save, Himanshu; Bettadpur, Srinivas; Tapley, Byron D.
2012-09-01
The nature of the gravity field inverse problem amplifies the noise in the GRACE data, which creeps into the mid and high degree and order harmonic coefficients of the Earth's monthly gravity fields provided by GRACE. Due to the use of imperfect background models and data noise, these errors are manifested as north-south striping in the monthly global maps of equivalent water heights. In order to reduce these errors, this study investigates the use of the L-curve method with Tikhonov regularization. L-curve is a popular aid for determining a suitable value of the regularization parameter when solving linear discrete ill-posed problems using Tikhonov regularization. However, the computational effort required to determine the L-curve is prohibitively high for a large-scale problem like GRACE. This study implements a parameter-choice method, using Lanczos bidiagonalization which is a computationally inexpensive approximation to L-curve. Lanczos bidiagonalization is implemented with orthogonal transformation in a parallel computing environment and projects a large estimation problem on a problem of the size of about 2 orders of magnitude smaller for computing the regularization parameter. Errors in the GRACE solution time series have certain characteristics that vary depending on the ground track coverage of the solutions. These errors increase with increasing degree and order. In addition, certain resonant and near-resonant harmonic coefficients have higher errors as compared with the other coefficients. Using the knowledge of these characteristics, this study designs a regularization matrix that provides a constraint on the geopotential coefficients as a function of its degree and order. This regularization matrix is then used to compute the appropriate regularization parameter for each monthly solution. A 7-year time-series of the candidate regularized solutions (Mar 2003-Feb 2010) show markedly reduced error stripes compared with the unconstrained GRACE release 4
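Tikhonov regularization trades data misfit against solution size, and the L-curve picks the regularization parameter at the corner of that tradeoff. A scalar sketch of the tradeoff (one unknown, pure Python; the GRACE problem is vastly larger and uses a degree- and order-dependent regularization matrix):

```python
def tikhonov_1d(a, b, lam):
    """Minimize ||a*x - b||^2 + lam*x^2 for a column vector a and data b:
    the closed-form solution is x = (a'a + lam)^(-1) a'b."""
    ata = sum(ai * ai for ai in a)
    atb = sum(ai * bi for ai, bi in zip(a, b))
    x = atb / (ata + lam)
    residual = sum((ai * x - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return x, residual

# illustrative data: noisy observations of the model y = 1 * x
a = [1.0, 2.0, 3.0]
b = [1.1, 1.9, 3.2]
```

Sweeping `lam` and plotting solution norm against residual norm traces the L-curve; the corner balances the two, which is the parameter-choice problem the Lanczos bidiagonalization approximation makes affordable at GRACE scale.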
Fast Erasure-and error decoding of algebraic geometry codes up to the Feng-Rao bound
DEFF Research Database (Denmark)
Høholdt, Tom; Jensen, Helge Elbrønd; Sakata, Shojiro
1998-01-01
This correspondence gives an errata- (that is, erasure-and-error-) decoding algorithm for one-point algebraic-geometry codes up to the Feng-Rao designed minimum distance, using Sakata's multidimensional generalization of the Berlekamp-Massey algorithm and the voting procedure of Feng and Rao.
Approaches to reducing photon dose calculation errors near metal implants
Energy Technology Data Exchange (ETDEWEB)
Huang, Jessie Y.; Followill, David S.; Howell, Rebecca M.; Mirkovic, Dragan; Kry, Stephen F., E-mail: sfkry@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States); Liu, Xinming [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States); Stingo, Francesco C. [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States)
2016-09-15
Purpose: Dose calculation errors near metal implants are caused by limitations of the dose calculation algorithm in modeling tissue/metal interface effects as well as density assignment errors caused by imaging artifacts. The purpose of this study was to investigate two strategies for reducing dose calculation errors near metal implants: implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) dose calculation method and use of metal artifact reduction methods for computed tomography (CT) imaging. Methods: Both error reduction strategies were investigated using a simple geometric slab phantom with a rectangular metal insert (composed of titanium or Cerrobend), as well as two anthropomorphic phantoms (one with spinal hardware and one with dental fillings), designed to mimic relevant clinical scenarios. To assess the dosimetric impact of metal kernels, the authors implemented titanium and silver kernels in a commercial collapsed cone C/S algorithm. To assess the impact of CT metal artifact reduction methods, the authors performed dose calculations using baseline imaging techniques (uncorrected 120 kVp imaging) and three commercial metal artifact reduction methods: Philips Healthcare’s O-MAR, GE Healthcare’s monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI with metal artifact reduction software (MARS) applied. For the simple geometric phantom, radiochromic film was used to measure dose upstream and downstream of metal inserts. For the anthropomorphic phantoms, ion chambers and radiochromic film were used to quantify the benefit of the error reduction strategies. Results: Metal kernels did not universally improve accuracy but rather resulted in better accuracy upstream of metal implants and decreased accuracy directly downstream. For the clinical cases (spinal hardware and dental fillings), metal kernels had very little impact on the dose calculation accuracy (<1.0%). Of the commercial CT artifact
Approaches to reducing photon dose calculation errors near metal implants
International Nuclear Information System (INIS)
Huang, Jessie Y.; Followill, David S.; Howell, Rebecca M.; Mirkovic, Dragan; Kry, Stephen F.; Liu, Xinming; Stingo, Francesco C.
2016-01-01
Purpose: Dose calculation errors near metal implants are caused by limitations of the dose calculation algorithm in modeling tissue/metal interface effects as well as density assignment errors caused by imaging artifacts. The purpose of this study was to investigate two strategies for reducing dose calculation errors near metal implants: implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) dose calculation method and use of metal artifact reduction methods for computed tomography (CT) imaging. Methods: Both error reduction strategies were investigated using a simple geometric slab phantom with a rectangular metal insert (composed of titanium or Cerrobend), as well as two anthropomorphic phantoms (one with spinal hardware and one with dental fillings), designed to mimic relevant clinical scenarios. To assess the dosimetric impact of metal kernels, the authors implemented titanium and silver kernels in a commercial collapsed cone C/S algorithm. To assess the impact of CT metal artifact reduction methods, the authors performed dose calculations using baseline imaging techniques (uncorrected 120 kVp imaging) and three commercial metal artifact reduction methods: Philips Healthcare’s O-MAR, GE Healthcare’s monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI with metal artifact reduction software (MARS) applied. For the simple geometric phantom, radiochromic film was used to measure dose upstream and downstream of metal inserts. For the anthropomorphic phantoms, ion chambers and radiochromic film were used to quantify the benefit of the error reduction strategies. Results: Metal kernels did not universally improve accuracy but rather resulted in better accuracy upstream of metal implants and decreased accuracy directly downstream. For the clinical cases (spinal hardware and dental fillings), metal kernels had very little impact on the dose calculation accuracy (<1.0%). Of the commercial CT artifact
Directory of Open Access Journals (Sweden)
Laura Marchal-Crespo
2017-06-01
Research on motor learning suggests that training with haptic guidance enhances learning of the timing components of motor tasks, whereas error amplification is better for learning the spatial components. We present a novel mixed guidance controller that combines haptic guidance and error amplification to simultaneously promote learning of the timing and spatial components of complex motor tasks. The controller is realized using a force field around the desired position. This force field has a stable manifold tangential to the trajectory that guides subjects in velocity-related aspects. The force field has an unstable manifold perpendicular to the trajectory, which amplifies the perpendicular (spatial) error. We also designed a controller that applies randomly varying, unpredictable disturbing forces to enhance the subjects’ active participation by pushing them away from their “comfort zone.” We conducted an experiment with thirty-two healthy subjects to evaluate the impact of four different training strategies on motor skill learning and self-reported motivation: (i) no haptics, (ii) mixed guidance, (iii) perpendicular error amplification and tangential haptic guidance provided in sequential order, and (iv) randomly varying disturbing forces. Subjects trained two motor tasks using ARMin IV, a robotic exoskeleton for upper limb rehabilitation: follow circles with an ellipsoidal speed profile, and move along a 3D line following a complex speed profile. Mixed guidance showed no detectable learning advantages over the other groups. Results suggest that the effectiveness of the training strategies depends on the subjects’ initial skill level. Mixed guidance seemed to benefit subjects who performed the circle task with smaller errors during baseline (i.e., initially more skilled subjects), while training with no haptics was more beneficial for subjects who made larger errors (i.e., less skilled subjects). Therefore, perhaps the high functional
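The error decomposition behind the mixed guidance controller described above can be sketched in a few lines: split the position error into a component along the trajectory tangent (attracted, for timing guidance) and a perpendicular component (repelled, for spatial error amplification). The gains `k_guide` and `k_amp` and the function name are illustrative assumptions, not values from the paper:

```python
import numpy as np

def mixed_guidance_force(pos, desired_pos, tangent, k_guide=50.0, k_amp=20.0):
    """Sketch of a mixed guidance force field (gains are assumed).

    The stable manifold along the trajectory tangent pulls the subject
    toward the desired (timing-related) position; the unstable manifold
    perpendicular to the tangent pushes the subject away, amplifying
    spatial error.
    """
    t = tangent / np.linalg.norm(tangent)
    err = pos - desired_pos
    err_tan = np.dot(err, t) * t   # timing (along-trajectory) error: attracted
    err_perp = err - err_tan       # spatial (perpendicular) error: amplified
    return -k_guide * err_tan + k_amp * err_perp
```

With the subject ahead of the desired position and off the path, the returned force pulls back along the path while pushing further off it, which is the intended combination of guidance and amplification.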
Sensitivity of subject-specific models to errors in musculo-skeletal geometry.
Carbone, V; van der Krogt, M M; Koopman, H F J M; Verdonschot, N
2012-09-21
Subject-specific musculo-skeletal models of the lower extremity are an important tool for investigating various biomechanical problems, for instance the results of surgery such as joint replacements and tendon transfers. The aim of this study was to assess the potential effects of errors in musculo-skeletal geometry on subject-specific model results. We performed an extensive sensitivity analysis to quantify the effect of the perturbation of origin, insertion and via points of each of the 56 musculo-tendon parts contained in the model. We used two metrics, namely a Local Sensitivity Index (LSI) and an Overall Sensitivity Index (OSI), to distinguish the effect of the perturbation on the predicted force produced by only the perturbed musculo-tendon parts and by all the remaining musculo-tendon parts, respectively, during a simulated gait cycle. Results indicated that, for each musculo-tendon part, only two points show a significant sensitivity: its origin, or pseudo-origin, point and its insertion, or pseudo-insertion, point. The most sensitive points belong to those musculo-tendon parts that act as prime movers in the walking movement (insertion point of the Achilles Tendon: LSI=15.56%, OSI=7.17%; origin points of the Rectus Femoris: LSI=13.89%, OSI=2.44%) and as hip stabilizers (insertion points of the Gluteus Medius Anterior: LSI=17.92%, OSI=2.79%; insertion point of the Gluteus Minimus: LSI=21.71%, OSI=2.41%). The proposed priority list provides quantitative information to improve the predictive accuracy of subject-specific musculo-skeletal models. Copyright © 2012 Elsevier Ltd. All rights reserved.
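The two sensitivity metrics above can be illustrated with a simplified computation: compare nominal and perturbed muscle-force trajectories, locally for the perturbed musculo-tendon part and overall for the remaining parts. The exact definitions in the paper may differ; this is only an assumed, illustrative form:

```python
import numpy as np

def sensitivity_indices(force_nominal, force_perturbed, part_idx):
    """Illustrative LSI/OSI-style indices (simplified; assumed definitions).

    Forces are arrays of shape (n_parts, n_time_samples) over a
    simulated gait cycle.
    """
    diff = np.abs(force_perturbed - force_nominal)
    # Local: mean relative force change of the perturbed musculo-tendon part.
    lsi = 100 * diff[part_idx].mean() / (np.abs(force_nominal[part_idx]).mean() + 1e-12)
    # Overall: mean relative force change of all remaining parts.
    rest = [i for i in range(force_nominal.shape[0]) if i != part_idx]
    osi = 100 * diff[rest].mean() / (np.abs(force_nominal[rest]).mean() + 1e-12)
    return lsi, osi
```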
Reducing WCET Overestimations by Correcting Errors in Loop Bound Constraints
Directory of Open Access Journals (Sweden)
Fanqi Meng
2017-12-01
In order to reduce overestimations of worst-case execution time (WCET), in this article we first report a kind of specific WCET overestimation caused by non-orthogonal nested loops. Then, we propose a novel correction approach which has three basic steps. The first step is to locate the worst-case execution path (WCEP) in the control flow graph and then map it onto source code. The second step is to identify non-orthogonal nested loops from the WCEP by means of an abstract syntax tree. The last step is to recursively calculate the WCET errors caused by the loose loop bound constraints, and then subtract the total errors from the overestimations. The novelty lies in the fact that the WCET correction is only conducted on the non-branching part of the WCEP, thus avoiding potential safety risks caused by possible WCEP switches. Experimental results show that our approach reduces the specific WCET overestimation by an average of more than 82%, and 100% of corrected WCET is no less than the actual WCET. Thus, our approach is not only effective but also safe. It will help developers to design energy-efficient and safe real-time systems.
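The kind of overestimation targeted above is easy to reproduce with a toy non-orthogonal nested loop (`for i in 0..n: for j in 0..i`): a loop-bound constraint that treats the inner loop as independent of `i` charges n iterations per outer pass, while the true count is triangular. A minimal sketch of the bound, the error, and the correction (function names and the unit cost `t_body` are illustrative, not from the paper):

```python
def naive_wcet(n, t_body):
    # Loose constraint: inner body assumed to run n times per outer iteration.
    return n * n * t_body

def actual_executions(n):
    # Non-orthogonal nesting: for i in range(n): for j in range(i): ...
    return n * (n - 1) // 2

def corrected_wcet(n, t_body):
    # Subtract the error introduced by the loose loop-bound constraint,
    # mirroring the paper's idea of deducting errors from the overestimation.
    error = (n * n - actual_executions(n)) * t_body
    return naive_wcet(n, t_body) - error
```

For n = 10 the loose bound charges 100 inner-body executions against an actual 45, so the correction removes more than half of the estimated inner-loop cost.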
Reducing image interpretation errors – Do communication strategies undermine this?
International Nuclear Information System (INIS)
Snaith, B.; Hardy, M.; Lewis, E.F.
2014-01-01
Introduction: Errors in the interpretation of diagnostic images in the emergency department are a persistent problem internationally. To address this issue, a number of risk reduction strategies have been suggested but only radiographer abnormality detection schemes (RADS) have been widely implemented in the UK. This study considers the variation in RADS operation and communication in light of technological advances and changes in service operation. Methods: A postal survey of all NHS hospitals operating either an Emergency Department or Minor Injury Unit and a diagnostic imaging (radiology) department (n = 510) was undertaken between July and August 2011. The questionnaire was designed to elicit information on emergency service provision and details of RADS. Results: 325 questionnaires were returned (n = 325/510; 63.7%). The majority of sites (n = 288/325; 88.6%) operated a RADS with the majority (n = 227/288; 78.8%) employing a visual ‘flagging’ system as the only method of communication although symbols used were inconsistent and contradictory across sites. 61 sites communicated radiographer findings through a written proforma (paper or electronic) but this was run in conjunction with a flagging system at 50 sites. The majority of sites did not have guidance on the scope or operation of the ‘flagging’ or written communication system in use. Conclusions: RADS is an established clinical intervention to reduce errors in diagnostic image interpretation within the emergency setting. The lack of standardisation in communication processes and practices alongside the rapid adoption of technology has increased the potential for error and miscommunication
Bagheri, Zahra S; Melancon, David; Liu, Lu; Johnston, R Burnett; Pasini, Damiano
2017-06-01
The accuracy of Additive Manufacturing processes in fabricating porous biomaterials is currently limited by their capacity to render pore morphology that precisely matches its design. In a porous biomaterial, a geometric mismatch can result in pore occlusion and strut thinning, drawbacks that can inherently compromise bone ingrowth and severely impact mechanical performance. This paper focuses on Selective Laser Melting of porous microarchitecture and proposes a compensation scheme that reduces the morphology mismatch between as-designed and as-manufactured geometry, in particular that of the pore. A spider web analog is introduced, built out of Ti-6Al-4V powder via SLM, and morphologically characterized. Results from error analysis of strut thickness are used to generate thickness compensation relations expressed as a function of the angle each strut forms with the build plane. The scheme is applied to fabricate a set of three-dimensional porous biomaterials, which are morphologically characterized via micro Computed Tomography, mechanically tested and numerically analyzed. For strut thickness, the results show that the largest mismatch (60% from the design), which occurs for horizontal members, is reduced to 3.1% upon application of the compensation. Similar improvement is also observed for the mechanical properties, a factor that further corroborates the merit of the design-oriented scheme introduced here. Copyright © 2016 Elsevier Ltd. All rights reserved.
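A compensation relation of the kind described above can be sketched as an interpolation over measured thickness error versus strut build angle, subtracted from the CAD value. The calibration numbers below are hypothetical placeholders (the paper derives its own relations from the spider-web artifact):

```python
import numpy as np

# Hypothetical measured thickness errors (mm) vs. strut angle with the
# build plane (deg); horizontal struts show the largest as-built excess.
angles_deg = np.array([0.0, 30.0, 60.0, 90.0])
thickness_error_mm = np.array([0.15, 0.08, 0.03, 0.01])  # as-built minus as-designed

def compensated_thickness(design_mm, strut_angle_deg):
    """Subtract the angle-dependent manufacturing error from the CAD
    thickness so the as-built strut approaches the design value."""
    err = np.interp(strut_angle_deg, angles_deg, thickness_error_mm)
    return design_mm - err
```

A horizontal 0.5 mm strut would thus be drawn thinner in CAD than a vertical one, anticipating the larger over-melting at low build angles.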
Reducing waste and errors: piloting lean principles at Intermountain Healthcare.
Jimmerson, Cindy; Weber, Dorothy; Sobek, Durward K
2005-05-01
The Toyota Production System (TPS), based on industrial engineering principles and operational innovations, is used to achieve waste reduction and efficiency while increasing product quality. Several key tools and principles, adapted to health care, have proved effective in improving hospital operations. Value Stream Maps (VSMs), which represent the key people, material, and information flows required to deliver a product or service, distinguish between value-adding and non-value-adding steps. The one-page Problem-Solving A3 Report guides staff through a rigorous and systematic problem-solving process. PILOT PROJECT at INTERMOUNTAIN HEALTHCARE: In a pilot project, participants made many improvements, ranging from simple changes implemented immediately (for example, heart monitor paper not available when a patient presented with a dysrhythmia) to larger projects involving patient or information flow issues across multiple departments. Most of the improvements required little or no investment and reduced significant amounts of wasted time for front-line workers. In one unit, turnaround time for pathologist reports from an anatomical pathology lab was reduced from five to two days. TPS principles and tools are applicable to an endless variety of processes and work settings in health care and can be used to address critical challenges such as medical errors, escalating costs, and staffing shortages.
Reducing number entry errors: solving a widespread, serious problem.
Thimbleby, Harold; Cairns, Paul
2010-10-06
Number entry is ubiquitous: it is required in many fields including science, healthcare, education, government, mathematics and finance. People entering numbers can be expected to make errors, but shockingly few systems make any effort to detect, block or otherwise manage errors. Worse, errors may be ignored but processed in arbitrary ways, with unintended results. A standard class of error (defined in the paper) is an 'out by 10 error', which is easily made by miskeying a decimal point or a zero. In safety-critical domains, such as drug delivery, out by 10 errors generally have adverse consequences. Here, we expose the extent of the problem of numeric errors in a very wide range of systems. An analysis of better error management is presented: under reasonable assumptions, we show that the probability of out by 10 errors can be halved by better user interface design. We provide a demonstration user interface to show that the approach is practical. To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact. (Charles Darwin 1879 [2008], p. 229).
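One way a user interface can manage the 'out by 10' class of error defined above is to flag entries that sit near a factor of 10 or 0.1 of an expected value (for example, a usual drug dose). The function name and relative tolerance below are illustrative assumptions, not the paper's design:

```python
def out_by_ten(entered, expected, tolerance=0.25):
    """Flag a likely 'out by 10' keying error: the entered value is
    within `tolerance` (relative) of 10x or 0.1x the expected value,
    as happens when a decimal point or zero is miskeyed."""
    if expected <= 0 or entered <= 0:
        return False
    ratio = entered / expected
    return any(abs(ratio - f) / f < tolerance for f in (10.0, 0.1))
```

A system using such a check could block the entry or ask for confirmation rather than silently processing the value in an arbitrary way.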
Interventions for reducing medication errors in children in hospital
Maaskant, Jolanda M; Vermeulen, Hester; Apampa, Bugewa; Fernando, Bernard; Ghaleb, Maisoon A; Neubert, Antje; Thayyil, Sudhin; Soe, Aung
2015-01-01
BACKGROUND: Many hospitalised patients are affected by medication errors (MEs) that may cause discomfort, harm and even death. Children are at especially high risk of harm as the result of MEs because such errors are potentially more hazardous to them than to adults. Until now, interventions to
Reducing Technology-Induced Errors: Organizational and Health Systems Approaches.
Borycki, Elizabeth M; Senthriajah, Yalini; Kushniruk, Andre W; Palojoki, Sari; Saranto, Kaija; Takeda, Hiroshi
2016-01-01
Technology-induced errors are a growing concern for health care organizations. Such errors arise from the interaction between healthcare and information technology deployed in complex settings and contexts. As the number of health information technologies that are used to provide patient care rises, so will the need to develop ways to improve the quality and safety of the technology that we use. The objective of the panel is to describe varying approaches to improving software safety from an organizational and health systems perspective. We define what a technology-induced error is. Then, we discuss how software design and testing can be used to improve health information technologies. This discussion is followed by work in the area of monitoring and reporting at a health district and national level. Lastly, we draw on the quality, safety and resilience literature. The target audience for this work are nursing and health informatics researchers, practitioners, administrators, policy makers and students.
Physical predictions from lattice QCD. Reducing systematic errors
International Nuclear Information System (INIS)
Pittori, C.
1994-01-01
Some recent developments in the theoretical understanding of lattice quantum chromodynamics and of its possible sources of systematic errors are reported, and a review of some of the latest Monte Carlo results for light quark phenomenology is presented. A very general introduction to a quantum field theory on a discrete spacetime lattice is given, and the Monte Carlo methods, which make it possible to compute many interesting physical quantities in the non-perturbative domain of strong interactions, are illustrated. (author). 17 refs., 3 figs., 3 tabs
Maintenance strategies to reduce downtime due to machine positional errors
Shagluf, Abubaker; Longstaff, A.P.; Fletcher, S.
2014-01-01
Proceedings of Maintenance Performance Measurement and Management (MPMM) Conference 2014. Manufacturing strives to reduce waste and increase Overall Equipment Effectiveness (OEE). When managing machine tool maintenance, a manufacturer must apply an appropriate decision technique in order to reveal hidden costs associated with production losses, reduce equipment downtime competently and similarly identify the machine's performance. Total productive maintenance (TPM) is a maintenance progr...
Cognitive strategies: a method to reduce diagnostic errors in ER
Directory of Open Access Journals (Sweden)
Carolina Prevaldi
2009-02-01
I wonder why we are sometimes able to rapidly recognize patterns of disease presentation, formulate a speedy diagnostic closure, and go on with a treatment plan, while at other times we proceed by studying our patient in depth in an analytic, slow and rational way of decision making. Why can decisions sometimes be intuitive, while at other times we have to proceed in a rigorous way? What is the “background noise” and the “signal to noise ratio” of presenting symptoms? What is the risk of premature labeling or “closure” of a patient? When is the “cook-book” approach useful in clinical decision making? “The Emergency Department is a natural laboratory for the study of error,” one author stated. Many studies have focused on the occurrence of errors in medicine, and in hospital practice, but the ED, with its unique operating characteristics, seems to be a uniquely error-prone environment. That is why it is useful to understand the underlying patterns of thinking that can lead us to misdiagnosis. General knowledge of thought processes gives the physician awareness and the ability to apply different techniques in clinical decision making and to recognize and avoid pitfalls.
Twice cutting method reduces tibial cutting error in unicompartmental knee arthroplasty.
Inui, Hiroshi; Taketomi, Shuji; Yamagami, Ryota; Sanada, Takaki; Tanaka, Sakae
2016-01-01
Bone cutting error can be one of the causes of malalignment in unicompartmental knee arthroplasty (UKA). The amount of cutting error in total knee arthroplasty has been reported. However, none have investigated cutting error in UKA. The purpose of this study was to reveal the amount of cutting error in UKA when an open cutting guide was used and to clarify whether cutting the tibia horizontally twice using the same cutting guide reduced the cutting errors in UKA. We measured the alignment of the tibial cutting guides, the first-cut cutting surfaces and the second-cut cutting surfaces using the navigation system in 50 UKAs. Cutting error was defined as the angular difference between the cutting guide and the cutting surface. The mean absolute first-cut cutting error was 1.9° (1.1° varus) in the coronal plane and 1.1° (0.6° anterior slope) in the sagittal plane, whereas the mean absolute second-cut cutting error was 1.1° (0.6° varus) in the coronal plane and 1.1° (0.4° anterior slope) in the sagittal plane. Cutting the tibia horizontally twice significantly reduced the cutting errors in the coronal plane; these results indicate that cutting the tibia horizontally twice using the same cutting guide reduced cutting error in the coronal plane. Copyright © 2014 Elsevier B.V. All rights reserved.
Utilizing AFIS searching tools to reduce errors in fingerprint casework.
Langenburg, Glenn; Hall, Carey; Rosemarie, Quincy
2015-12-01
Fifty-six (56) adjudicated, property crime cases involving fingerprint evidence were reviewed using a case-specific AFIS database tool. This tool allowed fingerprint experts to search latent prints in the cases against a database of friction ridge exemplars limited to only the individuals specific to that particular case. We utilized three different methods to encode and search the latent prints: automatic feature extraction, manual encoding performed by a student intern, and manual encoding performed by a fingerprint expert. Performance in the study was strongest when the encoding was conducted by the fingerprint expert. The results of the study showed that while the AFIS tools failed to locate all of the identifications originally reported by the initial fingerprint expert that worked the case, the AFIS tools helped to identify 7 additional latent prints that were not reported by the initial fingerprint expert. We conclude that this technology, when combined with fingerprint expertise, will reduce the number of instances where an erroneous exclusion could occur, increase the efficiency of a fingerprint unit, and be a useful tool for reviewing active or cold cases for missed opportunities to report identifications. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
O'Connell, Emer; Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J; Tabirca, Sabin; O'Driscoll, Aoife; Corrigan, Mark
2016-01-01
Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with the paper system to a mean of 0.80 errors per round using NFC. An NFC-based medication system may be used to effectively reduce medication errors in a simulated ward environment.
Automated drug dispensing system reduces medication errors in an intensive care setting.
Chapuis, Claire; Roustit, Matthieu; Bal, Gaëlle; Schwebel, Carole; Pansu, Pascal; David-Tchouda, Sandra; Foroni, Luc; Calop, Jean; Timsit, Jean-François; Allenet, Benoît; Bosson, Jean-Luc; Bedouch, Pierrick
2010-12-01
We aimed to assess the impact of an automated dispensing system on the incidence of medication errors related to picking, preparation, and administration of drugs in a medical intensive care unit. We also evaluated the clinical significance of such errors and user satisfaction. Preintervention and postintervention study involving a control and an intervention medical intensive care unit. Two medical intensive care units in the same department of a 2,000-bed university hospital. Adult medical intensive care patients. After a 2-month observation period, we implemented an automated dispensing system in one of the units (study unit) chosen randomly, with the other unit being the control. The overall error rate was expressed as a percentage of total opportunities for error. The severity of errors was classified according to National Coordinating Council for Medication Error Reporting and Prevention categories by an expert committee. User satisfaction was assessed through self-administered questionnaires completed by nurses. A total of 1,476 medications for 115 patients were observed. After automated dispensing system implementation, we observed a reduced percentage of total opportunities for error in the study unit compared to the control unit (13.5% and 18.6%, respectively; 20.4% and 13.5% before the intervention). Analysis by stage of the medication process showed a significant impact of the automated dispensing system in reducing preparation errors. Most errors caused no harm (National Coordinating Council for Medication Error Reporting and Prevention category C). The automated dispensing system did not reduce errors causing harm. Finally, the mean score for working conditions improved from 1.0±0.8 to 2.5±0.8 on the four-point Likert scale. The implementation of an automated dispensing system reduced overall medication errors related to picking, preparation, and administration of drugs in the intensive care unit. Furthermore, most nurses favored the new drug dispensation organization.
Reducing errors benefits the field-based learning of a fundamental movement skill in children.
Capio, C M; Poolton, J M; Sit, C H P; Holmstrom, M; Masters, R S W
2013-03-01
Proficient fundamental movement skills (FMS) are believed to form the basis of more complex movement patterns in sports. This study examined the development of the FMS of overhand throwing in children through either an error-reduced (ER) or error-strewn (ES) training program. Students (n = 216), aged 8-12 years (M = 9.16, SD = 0.96), practiced overhand throwing in either a program that reduced errors during practice (ER) or one that was ES. The ER program reduced errors by incrementally raising task difficulty, while the ES program incrementally lowered task difficulty. Process-oriented assessment of throwing movement form (Test of Gross Motor Development-2) and product-oriented assessment of throwing accuracy (absolute error) were performed. Changes in performance were examined among children in the upper and lower quartiles of the pretest throwing accuracy scores. ER training participants showed greater gains in movement form and accuracy, and performed throwing more effectively with a concurrent secondary cognitive task. Movement form improved among girls, while throwing accuracy improved among children with low ability. Reducing performance errors in FMS training resulted in greater learning than a program that did not restrict errors. The reduced cognitive processing costs (effective dual-task performance) associated with such an approach suggest its potential benefits for children with developmental conditions. © 2011 John Wiley & Sons A/S.
Automation of Commanding at NASA: Reducing Human Error in Space Flight
Dorn, Sarah J.
2010-01-01
Automation has been implemented in many different industries to improve efficiency and reduce human error. Reducing or eliminating the human interaction in tasks has been proven to increase productivity in manufacturing and lessen the risk of mistakes by humans in the airline industry. Human space flight requires the flight controllers to monitor multiple systems and react quickly when failures occur, so NASA is interested in implementing techniques that can assist in these tasks. Using automation to control some of these responsibilities could reduce the number of errors the flight controllers encounter due to standard human error characteristics. This paper will investigate the possibility of reducing human error in the critical area of manned space flight at NASA.
Sub-Doppler cooling in reduced-period optical lattice geometries
International Nuclear Information System (INIS)
Berman, P.R.; Raithel, G.; Zhang, R.; Malinovsky, V.S.
2005-01-01
It is shown that sub-Doppler cooling occurs in an atom-field geometry that can lead to reduced-period optical lattices. Four optical fields are combined to produce a 'standing wave' Raman field that drives transitions between two ground state sublevels. In contrast to conventional Sisyphus cooling, sub-Doppler cooling to zero velocity occurs when all fields are polarized in the same direction. Solutions are obtained using both semiclassical and quantum Monte Carlo methods in the case of exact two-photon resonance. The connection of the results with conventional Sisyphus cooling is established using a dressed state basis
Reducing patient identification errors related to glucose point-of-care testing
Directory of Open Access Journals (Sweden)
Gaurav Alreja
2011-01-01
Background: Patient identification (ID) errors in point-of-care testing (POCT) can cause test results to be transferred to the wrong patient's chart or prevent results from being transmitted and reported. Despite the implementation of patient barcoding and ongoing operator training at our institution, patient ID errors still occur with glucose POCT. The aim of this study was to develop a solution to reduce identification errors with POCT. Materials and Methods: Glucose POCT was performed by approximately 2,400 clinical operators throughout our health system. Patients are identified by scanning in wristband barcodes or by manual data entry using portable glucose meters. Meters are docked to upload data to a database server, which then transmits data to any medical record matching the financial number of the test result. With the new model, meters connect to an interface manager where the patient ID (a nine-digit account number) is checked against patient registration data from admission, discharge, and transfer (ADT) feeds, and only matched results are transferred to the patient's electronic medical record. With the new process, the patient ID is checked prior to testing, and testing is prevented until ID errors are resolved. Results: When averaged over a period of a month, ID errors were reduced to 3 errors/month (0.015%), in comparison with 61.5 errors/month (0.319%) before implementing the new meters. Conclusion: Patient ID errors may occur with glucose POCT despite patient barcoding. The verification of patient identification should ideally take place at the bedside before testing occurs so that errors can be addressed in real time. The introduction of an ADT feed directly to glucose meters reduced patient ID errors in POCT.
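The pre-test check described above (a nine-digit account number validated against ADT registration data before testing is allowed) can be sketched as follows. The registry structure and function name are hypothetical, for illustration only:

```python
def verify_patient_id(scanned_id, adt_registry):
    """Check a scanned nine-digit account number against an ADT
    (admission/discharge/transfer) registry before allowing a glucose
    test; unmatched IDs block testing instead of producing a result
    that cannot be filed to the correct chart.
    (Hypothetical registry structure for illustration.)
    """
    if len(scanned_id) != 9 or not scanned_id.isdigit():
        return False, "malformed account number"
    if scanned_id not in adt_registry:
        return False, "no matching ADT registration"
    return True, adt_registry[scanned_id]
```

The key design point mirrors the study: the match happens at the bedside, before the measurement, so an error is resolved in real time rather than discovered at docking.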
Song, Chang; Du, Liqun; Zhao, Wenjun; Zhu, Heqing; Zhao, Wen; Wang, Weitai
2018-04-01
Micro electroforming, as a mature micromachining technology, is widely used to fabricate metal microdevices in micro electro mechanical systems (MEMS). However, large residual stress at local positions of the micro electroforming layer often leads to non-uniform residual stress distributions, dimensional accuracy defects and reliability issues during fabrication of metal microdevices. To solve this problem, a novel design method of presetting stress release geometries in the topological structure of the metal microstructure is proposed in this paper. First, the effect of stress release geometries (circular shape, annular groove shape and rivet shape) on the residual stress in the metal microstructure was investigated by finite element modeling (FEM) analysis. Two evaluation parameters, the stress concentration factor K T and the stress non-uniformity factor δ, were calculated. The simulation results show that presetting stress release geometries can effectively reduce and homogenize the residual stress in the metal microstructure. With the combined use of annular groove shape and rivet shape stress release geometries, the stress concentration factor K T and the stress non-uniformity factor δ decreased by a maximum of 49% and 53%, respectively. Meanwhile, the average residual stress σ avg decreased by a maximum of 20%, from -292.4 MPa to -232.6 MPa. Then, micro electroforming experiments were carried out corresponding to the simulation models. The residual stresses in the metal microstructures were measured by the micro Raman spectroscopy (MRS) method. The experimental results confirmed that the stress non-uniformity factor δ and the average residual stress σ avg also decreased the most with the combined use of annular groove shape and rivet shape stress release geometries, in agreement with the FEM analysis. The stress non-uniformity factor δ has a maximum decrease of 49% and the
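The two evaluation parameters above can be illustrated with simple, assumed definitions: a concentration factor as peak stress over a nominal value, and a non-uniformity factor as the spread of the stress field relative to its mean magnitude. The paper's exact formulas may differ; this is only a sketch:

```python
import numpy as np

def stress_uniformity_metrics(stress_field, nominal_stress):
    """Illustrative versions of the two evaluation parameters (assumed
    definitions): stress concentration factor K_T = peak/nominal, and
    non-uniformity factor delta = std/mean of the stress magnitudes."""
    s = np.abs(np.asarray(stress_field, dtype=float))
    k_t = s.max() / abs(nominal_stress)
    delta = s.std() / s.mean()
    return k_t, delta
```

Applied to FEM stress samples before and after adding release geometries, a drop in both `k_t` and `delta` would correspond to the reduction and homogenization reported in the abstract.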
A continuous quality improvement project to reduce medication error in the emergency department.
Lee, Sara Bc; Lee, Larry Ly; Yeung, Richard Sd; Chan, Jimmy Ts
2013-01-01
Medication errors are a common source of adverse healthcare incidents, particularly in the emergency department (ED), which has a number of factors that make it prone to medication errors. This project aimed to reduce medication errors and improve the health and economic outcomes of clinical care in a Hong Kong ED. In 2009, a task group was formed to identify problems that potentially endanger medication safety and to develop strategies to eliminate these problems. Responsible officers were assigned to look after seven error-prone areas. Strategies were proposed, discussed, endorsed and promulgated to eliminate the problems identified. Medication incidents (MI) fell from 16 before to 6 after the improvement work. This project successfully established a concrete organizational structure to safeguard error-prone areas of medication safety in a sustainable manner.
The possible benefits of reduced errors in the motor skills acquisition of children
Directory of Open Access Journals (Sweden)
Capio Catherine M
2012-01-01
An implicit approach to motor learning suggests that relatively complex movement skills may be better acquired in environments that constrain errors during the initial stages of practice. This current concept paper proposes that reducing the number of errors committed during motor learning leads to stable performance when attention demands are increased by concurrent cognitive tasks. While it appears that this approach to practice may be beneficial for motor learning, further studies are needed to both confirm this advantage and better understand the underlying mechanisms. An approach involving error minimization during early learning may have important applications in paediatric rehabilitation.
Current pulse: can a production system reduce medical errors in health care?
Printezis, Antonios; Gopalakrishnan, Mohan
2007-01-01
One of the reasons for rising health care costs is medical errors, a majority of which result from faulty systems and processes. Health care has in the past used process-based initiatives such as Total Quality Management, Continuous Quality Improvement, and Six Sigma to reduce errors. These initiatives to redesign health care, reduce errors, and improve overall efficiency and customer satisfaction have had moderate success. The current trend is to apply the successful Toyota Production System (TPS) to health care, since its organizing principles have led to tremendous improvements in productivity and quality for Toyota and other businesses that have adapted them. This article presents insights on the effectiveness of TPS principles in health care and the challenges that lie ahead in successfully integrating this approach with other quality initiatives.
2010-04-12
... packaging designs. Among these measures, FDA agreed that by the end of FY 2010, after public consultation... product names and designing product labels and packaging to reduce medication errors. Four panel... of product packaging design, and costs associated with designing product packaging. Panel 3 will...
Applying lessons learned to enhance human performance and reduce human error for ISS operations
Energy Technology Data Exchange (ETDEWEB)
Nelson, W.R.
1998-09-01
A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) has developed a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.
Mulej Bratec, Satja; Xie, Xiyao; Schmid, Gabriele; Doll, Anselm; Schilbach, Leonhard; Zimmer, Claus; Wohlschläger, Afra; Riedl, Valentin; Sorg, Christian
2015-12-01
Cognitive emotion regulation is a powerful way of modulating emotional responses. However, despite the vital role of emotions in learning, it is unknown whether the effect of cognitive emotion regulation also extends to the modulation of learning. Computational models indicate prediction error activity, typically observed in the striatum and ventral tegmental area, as a critical neural mechanism involved in associative learning. We used model-based fMRI during aversive conditioning with and without cognitive emotion regulation to test the hypothesis that emotion regulation would affect prediction error-related neural activity in the striatum and ventral tegmental area, reflecting an emotion regulation-related modulation of learning. Our results show that cognitive emotion regulation reduced emotion-related brain activity, but increased prediction error-related activity in a network involving ventral tegmental area, hippocampus, insula and ventral striatum. While the reduction of response activity was related to behavioral measures of emotion regulation success, the enhancement of prediction error-related neural activity was related to learning performance. Furthermore, functional connectivity between the ventral tegmental area and ventrolateral prefrontal cortex, an area involved in regulation, was specifically increased during emotion regulation and likewise related to learning performance. Our data, therefore, provide first-time evidence that beyond reducing emotional responses, cognitive emotion regulation affects learning by enhancing prediction error-related activity, potentially via tegmental dopaminergic pathways. Copyright © 2015 Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Hardcastle, Nicholas; Bender, Edward T.; Tomé, Wolfgang A.
2014-01-01
It has previously been shown that deformable image registrations (DIRs) often result in deformation maps that are neither inverse-consistent nor transitive, and that the dose accumulation based on these deformation maps can be inconsistent if different image pathways are used for dose accumulation. A method presented to reduce inverse consistency and transitivity errors has been shown to result in more consistent dose accumulation, regardless of the image pathway selected for dose accumulation. The present study investigates the effect on the dose accumulation accuracy of deformation maps processed to reduce inverse consistency and transitivity errors. A set of lung 4DCT phases were analysed, consisting of four images on which a dose grid was created. Dose to 75 corresponding anatomical locations was manually tracked. Dose accumulation was performed between all image sets with Demons derived deformation maps as well as deformation maps processed to reduce inverse consistency and transitivity errors. The ground truth accumulated dose was then compared with the accumulated dose derived from DIR. Two dose accumulation image pathways were considered. The post-processing method to reduce inverse consistency and transitivity errors had minimal effect on the dose accumulation accuracy. There was a statistically significant improvement in dose accumulation accuracy for one pathway, but for the other pathway there was no statistically significant difference. A post-processing technique to reduce inverse consistency and transitivity errors has a positive, yet minimal effect on the dose accumulation accuracy. Thus the post-processing technique improves consistency of dose accumulation with minimal effect on dose accumulation accuracy.
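The inverse consistency discussed above can be made concrete with a small sketch. This is not the authors' post-processing method; it is a toy 1D example with invented deformation fields, where linear interpolation stands in for resampling, showing how the residual of the composed maps T_ab(T_ba(x)) - x quantifies inverse inconsistency of a registration pair.

```python
import numpy as np

# Hypothetical 1D illustration of the inverse consistency error (ICE) of a
# pair of deformation maps T_ab (image A -> B) and T_ba (B -> A).
# A perfectly inverse-consistent pair satisfies T_ab(T_ba(x)) == x.

x = np.linspace(0.0, 10.0, 101)          # voxel coordinates
t_ab = x + 0.5 * np.sin(x)               # forward map, toy deformation
t_ba = x - 0.5 * np.sin(x)               # approximate (not exact) inverse

# Compose: follow T_ba, then evaluate T_ab at the mapped position
# (linear interpolation stands in for resampling the deformation field).
composed = np.interp(t_ba, x, t_ab)
ice = np.abs(composed - x)               # per-voxel inverse consistency error

print(f"max ICE = {ice.max():.4f}, mean ICE = {ice.mean():.4f}")
```

A post-processing scheme of the kind evaluated in the abstract would adjust the two fields to drive this residual (and the analogous transitivity residual over three images) toward zero.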
The use of adaptive radiation therapy to reduce setup error: a prospective clinical study
International Nuclear Information System (INIS)
Yan Di; Wong, John; Vicini, Frank; Robertson, John; Horwitz, Eric; Brabbins, Donald; Cook, Carla; Gustafson, Gary; Stromberg, Jannifer; Martinez, Alvaro
1996-01-01
Eight patients had completed the study. Their mean systematic setup error was 4 mm, with a range of 2 mm to 6 mm, before adjustment; it was reduced to 0.8 mm, with a range of 0.2 mm to 1.8 mm, after adjustment. There was no significant difference in their random setup errors before and after adjustment. Analysis of the block overlap distributions shows that the fractions of the prescribed field areas covered by the daily treatment increased after setup adjustment. The block overlap distributions also show that the magnitudes of random setup errors at different field edges were different; 50% were small enough to allow the treatment margin to be reduced to 4 mm or less. Results from the on-going treatments of the remaining 12 patients show similar trends and magnitudes, and are not expected to differ. Conclusion: Our prospective study demonstrates that the ART process provides an effective and reliable approach to compensate for the systematic setup error of the individual patient. Adjusting the MLC field allows accurate setup adjustment as small as 2 mm, minimizes the possibility of 'unsettling' the patient and reduces the workload of the therapists. The ART process can be extended to correct for random setup errors by further modification of the MLC field shape and prescribed dose. Most importantly, ART integrates the use of advanced technologies to maximize treatment benefits, and can be important in the implementation of dose-escalated conformal therapy.
Reducing wrong patient selection errors: exploring the design space of user interface techniques.
Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben
2014-01-01
Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients' identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty-seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed.
Energy Technology Data Exchange (ETDEWEB)
Olama, Mohammed M [ORNL; Matalgah, Mustafa M [ORNL; Bobrek, Miljko [ORNL
2015-01-01
Traditional encryption techniques require packet overhead, produce processing time delay, and suffer from severe quality of service deterioration due to fades and interference in wireless channels. These issues reduce the effective transmission data rate (throughput) considerably in wireless communications, where data rate with limited bandwidth is the main constraint. In this paper, performance evaluation analyses are conducted for an integrated signaling-encryption mechanism that is secure and enables improved throughput and probability of bit-error in wireless channels. This mechanism eliminates the drawbacks stated herein by encrypting only a small portion of an entire transmitted frame, while the rest is not subject to traditional encryption but goes through a signaling process (designed transformation) with the plaintext of the portion selected for encryption. We also propose to incorporate error correction coding solely on the small encrypted portion of the data to drastically improve the overall bit-error rate performance while not noticeably increasing the required bit-rate. We focus on validating the signaling-encryption mechanism utilizing Hamming and convolutional error correction coding by conducting an end-to-end system-level simulation-based study. The average probability of bit-error and throughput of the encryption mechanism are evaluated over standard Gaussian and Rayleigh fading-type channels and compared to the ones of the conventional advanced encryption standard (AES).
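The idea of applying error correction coding only to the small encrypted portion of a frame can be sketched with a classic Hamming(7,4) code, one of the two code families the abstract mentions. The sketch below illustrates the general technique rather than the specific ORNL mechanism; the frame contents and split are invented.

```python
# Illustrative sketch: protect only a small "encrypted" portion of a frame
# with Hamming(7,4) single-error correction, leaving the rest uncoded.

def hamming74_encode(nibble):
    """nibble: list of 4 bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(cw):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(cw)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 * 1 + s2 * 2 + s3 * 4    # 1-based position of the error
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

# A toy frame: the first 4 bits stand in for the encrypted portion.
encrypted_part = [1, 0, 1, 1]
codeword = hamming74_encode(encrypted_part)
codeword[3] ^= 1                           # single-bit channel error
recovered = hamming74_decode(codeword)
print("recovered:", recovered)             # matches encrypted_part
```

Because only the short encrypted portion carries the coding overhead, the bit-rate cost stays small while the most security-critical bits gain error resilience, which is the trade-off the abstract evaluates.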
An improved approach to reduce partial volume errors in brain SPET
International Nuclear Information System (INIS)
Hatton, R.L.; Hatton, B.F.; Michael, G.; Barnden, L.; QUT, Brisbane, QLD; The Queen Elizabeth Hospital, Adelaide, SA
1999-01-01
Limitations in SPET resolution give rise to significant partial volume error (PVE) in small brain structures. We have investigated a previously published method (Muller-Gartner et al., J Cereb Blood Flow Metab 1992;16:650-658) to correct PVE in grey matter using MRI. An MRI is registered and segmented to obtain a grey matter tissue volume, which is then smoothed to obtain resolution matched to the corresponding SPET. By dividing the original SPET by this correction map, structures can be corrected for PVE on a pixel-by-pixel basis. Since this approach is limited by space-invariant filtering, a modification was made by estimating projections for the segmented MRI and reconstructing these using parameters identical to those of the SPET. The methods were tested on simulated brain scans reconstructed with the ordered subsets EM algorithm (8, 16, 32, 64 equivalent EM iterations). The new method provided better recovery visually. For 32 EM iterations, recovery coefficients were calculated for grey matter regions. The effects of potential errors in the method were examined. Mean recovery was unchanged with a one-pixel registration error, the maximum error found in most registration programs. Segmentation errors > 2 pixels result in loss of accuracy for small structures. The method promises to be useful for reducing PVE in brain SPET.
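The division-based correction described in this abstract can be sketched in one dimension. The mask, uptake value and kernel width below are invented, and a separable Gaussian stands in for the scanner point spread function; the point is only to show why dividing by a resolution-matched smoothed mask restores the true grey matter signal.

```python
import numpy as np

# 1D sketch of a Muller-Gartner-style partial volume correction: divide the
# blurred measurement by a resolution-matched smoothed grey-matter mask.

def gaussian_kernel(sigma, radius=10):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

psf = gaussian_kernel(sigma=3.0)            # stand-in for SPET resolution

grey = np.zeros(100)
grey[40:55] = 1.0                           # small grey-matter structure
true_activity = 8.0 * grey                  # uniform tracer uptake

observed = np.convolve(true_activity, psf, mode="same")   # PVE-blurred "SPET"
correction = np.convolve(grey, psf, mode="same")          # smoothed mask

# Divide only where the correction map is meaningfully non-zero.
corrected = np.where(correction > 0.1,
                     observed / np.maximum(correction, 1e-9), 0.0)

centre = slice(45, 50)
print("observed centre:", observed[centre].mean())    # underestimates uptake
print("corrected centre:", corrected[centre].mean())  # recovers true uptake
```

In the real method the smoothed mask comes from a registered, segmented MRI, and the paper's refinement replaces the space-invariant smoothing here with projection and reconstruction of the segmented MRI using the SPET parameters.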
Blaya, J A; Shin, S S; Yale, G; Suarez, C; Asencios, L; Contreras, C; Rodriguez, P; Kim, J; Cegielski, P; Fraser, H S F
2010-08-01
To evaluate the impact of the e-Chasqui laboratory information system in reducing reporting errors compared to the current paper system. Cluster randomized controlled trial in 76 health centers (HCs) between 2004 and 2008. Baseline data were collected every 4 months for 12 months. HCs were then randomly assigned to intervention (e-Chasqui) or control (paper). Further data were collected for the same months the following year. Comparisons were made between intervention and control HCs, and before and after the intervention. Intervention HCs had respectively 82% and 87% fewer errors in reporting results for drug susceptibility tests (2.1% vs. 11.9%, P = 0.001, OR 0.17, 95%CI 0.09-0.31) and cultures (2.0% vs. 15.1%). e-Chasqui users sent on average three electronic error reports per week to the laboratories. e-Chasqui reduced the number of missing laboratory results at point-of-care health centers. Clinical users confirmed viewing electronic results not available on paper. Reporting errors to the laboratory using e-Chasqui promoted continuous quality improvement. The e-Chasqui laboratory information system is an important part of laboratory infrastructure improvements to support multidrug-resistant tuberculosis care in Peru.
Benefits and risks of using smart pumps to reduce medication error rates: a systematic review.
Ohashi, Kumiko; Dalleur, Olivia; Dykes, Patricia C; Bates, David W
2014-12-01
Smart infusion pumps have been introduced to prevent medication errors and have been widely adopted nationally in the USA, though they are not always used in Europe or other regions. Despite widespread usage of smart pumps, intravenous medication errors have not been fully eliminated. Through a systematic review of recent studies and reports regarding smart pump implementation and use, we aimed to identify the impact of smart pumps on error reduction and on the complex process of medication administration, and strategies to maximize the benefits of smart pumps. The medical literature related to the effects of smart pumps for improving patient safety was searched in PUBMED, EMBASE, and the Cochrane Central Register of Controlled Trials (CENTRAL) (2000-2014) and relevant papers were selected by two researchers. After the literature search, 231 papers were identified and the full texts of 138 articles were assessed for eligibility. Of these, 22 were included after removal of papers that did not meet the inclusion criteria. We assessed both the benefits and negative effects of smart pumps from these studies. One of the benefits of using smart pumps was intercepting errors such as the wrong rate, wrong dose, and pump setting errors. Other benefits include reduction of adverse drug event rates, practice improvements, and cost effectiveness. Meanwhile, the current issues or negative effects related to using smart pumps were lower compliance rates of using smart pumps, the overriding of soft alerts, non-intercepted errors, or the possibility of using the wrong drug library. The literature suggests that smart pumps reduce but do not eliminate programming errors. Although the hard limits of a drug library play a main role in intercepting medication errors, soft limits were still not as effective as hard limits because of high override rates. Compliance in using smart pumps is key towards effectively preventing errors. Opportunities for improvement include upgrading drug
Human factors interventions to reduce human errors and improve productivity in maintenance tasks
International Nuclear Information System (INIS)
Isoda, Hachiro; Yasutake, J.Y.
1992-01-01
This paper describes work in progress to develop interventions to reduce human errors and increase maintenance productivity in nuclear power plants. The effort is part of a two-phased Human Factors research program being conducted jointly by the Central Research Institute of Electric Power Industry (CRIEPI) in Japan and the Electric Power Research Institute (EPRI) in the United States. The overall objective of this joint research program is to identify critical maintenance tasks and to develop, implement and evaluate interventions which have high potential for reducing human errors or increasing maintenance productivity. As a result of the Phase 1 effort, ten critical maintenance tasks were identified. For these tasks, over 25 candidate interventions were identified for potential development. After careful analysis, seven interventions were selected for development during Phase 2. This paper describes the methodology used to analyze and identify the most critical tasks, the process of identifying and developing selected interventions and some of the initial results. (author)
The effect of TWD estimation error on the geometry of machined surfaces in micro-EDM milling
DEFF Research Database (Denmark)
Puthumana, Govindan; Bissacco, Giuliano; Hansen, Hans Nørgaard
In micro EDM (electrical discharge machining) milling, tool electrode wear must be effectively compensated in order to achieve high accuracy of machined features [1]. Tool wear compensation in micro-EDM milling can be based on off-line techniques with limited accuracy such as estimation...... and statistical characterization of the discharge population [3]. The TWD based approach permits the direct control of the position of the tool electrode front surface. However, TWD estimation errors will generate a self-amplifying error on the tool electrode axial depth during micro-EDM milling. Therefore....... The error propagation effect is demonstrated through a software simulation tool developed by the authors for determination of the correct TWD for subsequent use in compensation of electrode wear in EDM milling. The implemented model uses an initial arbitrary estimation of TWD and a single experiment...
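The self-amplifying depth error mentioned in this abstract can be illustrated with a toy recursion. This is our own invented model, not the authors' simulation tool: it simply assumes the controller feeds the electrode extra material per layer based on an estimated tool wear per depth (TWD), and that the per-layer bias acts on an already-offset front surface. All numbers are arbitrary.

```python
# Toy illustration of self-amplifying axial depth error when the estimated
# tool wear per depth (TWD) used for compensation differs from the true value.

true_twd = 0.10   # actual tool wear per unit of machined depth
est_twd = 0.12    # estimated TWD used for wear compensation (20% too high)
layer = 1.0       # nominal layer depth (arbitrary units)

depth_error = 0.0
errors = []
for _ in range(10):
    # the per-layer bias (est - true) is applied on top of the already
    # offset front surface, so the error compounds instead of adding linearly
    depth_error = (depth_error * (1.0 + est_twd - true_twd)
                   + layer * (est_twd - true_twd))
    errors.append(depth_error)

print("accumulated depth error per layer:",
      [round(e, 4) for e in errors])      # grows monotonically
```

Even this crude model shows the qualitative behaviour the abstract describes: a fixed TWD estimation error does not stay bounded but accumulates layer by layer, which is why correct TWD determination matters for wear compensation.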
Strategies for reducing basis set superposition error (BSSE) in O/AU and O/Ni
Shuttleworth, I.G.
2015-01-01
© 2015 Elsevier Ltd. All rights reserved. The effect of basis set superposition error (BSSE) and effective strategies for its minimisation have been investigated using the SIESTA-LCAO DFT package. Variation of the energy shift parameter ΔEPAO has been shown to reduce BSSE for bulk Au and Ni and across their oxygenated surfaces. Alternative strategies based on either the expansion or contraction of the basis set have been shown to be ineffective in reducing BSSE. Binding energies for the surface systems obtained using LCAO were compared with BSSE-free plane wave energies.
A method for optical ground stations to reduce alignment error in satellite-ground quantum experiments
He, Dong; Wang, Qiang; Zhou, Jian-Wei; Song, Zhi-Jun; Zhong, Dai-Jun; Jiang, Yu; Liu, Wan-Sheng; Huang, Yong-Mei
2018-03-01
A satellite dedicated to quantum science experiments was developed and successfully launched from Jiuquan, China, on August 16, 2016. Two new optical ground stations (OGSs) were built to cooperate with the satellite in satellite-ground quantum experiments. An OGS corrects its pointing direction by feeding the satellite trajectory error to the coarse tracking system and the uplink beacon sight; the alignment accuracy between the fine tracking CCD and the uplink beacon optical axis therefore determines whether the beacon can cover the quantum satellite throughout its passes over the OGS. Unfortunately, when we tested the specifications of the OGSs, because the coarse tracking optical system was a commercial telescope, the position of the target in the coarse CCD shifted by up to 600 μrad as the elevation angle changed. In this paper, a method to reduce the alignment error between the beacon beam and the fine tracking CCD is proposed. First, the OGS fits a curve of target position in the coarse CCD as a function of elevation angle. Second, the OGS fits a curve of hexapod secondary mirror position as a function of elevation angle. Third, when tracking the satellite, the fine tracking error is unloaded onto the real-time zero point of the coarse CCD computed from the first calibration curve, while the position of the hexapod secondary mirror is simultaneously adjusted using the second calibration curve. Finally, experimental results are presented: the alignment error is less than 50 μrad.
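The calibration step described above (fit the elevation-dependent drift once, then use the fitted curve as a real-time zero point) can be sketched with an ordinary polynomial fit. The drift model, noise level and cubic degree below are invented stand-ins, not the OGS data.

```python
import numpy as np

# Sketch: fit the drift of the target position on the coarse CCD as a
# function of telescope elevation angle, then evaluate the fitted curve as
# the real-time zero point while tracking.

elev = np.linspace(10.0, 80.0, 15)                  # elevation angles (deg)
rng = np.random.default_rng(0)
drift = 600.0 * np.sin(np.radians(elev - 10.0))     # toy elevation-dependent drift (urad)
measured = drift + rng.normal(0.0, 5.0, elev.size)  # noisy calibration measurements

coeffs = np.polyfit(elev, measured, deg=3)          # cubic calibration curve

def zero_point(angle_deg):
    """Predicted CCD zero-point offset (urad) at a given elevation."""
    return np.polyval(coeffs, angle_deg)

residual = measured - zero_point(elev)
print(f"raw drift span: {measured.max() - measured.min():.0f} urad")
print(f"max residual after calibration: {np.abs(residual).max():.1f} urad")
```

The point of the sketch is the two-phase structure: an offline fit absorbs the repeatable, elevation-dependent part of the error, so the online tracking loop only has to handle the small residual.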
Novel error propagation approach for reducing H2S/O2 reaction mechanism
International Nuclear Information System (INIS)
Selim, H.; Gupta, A.K.; Sassi, M.
2012-01-01
A reduction strategy for the hydrogen sulfide/oxygen reaction mechanism is presented to simplify the detailed mechanism. The direct relation graph with error propagation (DRGEP) methodology has been used. A novel approach of direct elementary reaction error (DERE) has been developed in this study, allowing further reduction of the reaction mechanism. The reduced mechanism has been compared with the detailed mechanism under different conditions to establish its validity. The results obtained from the reduced mechanism showed good agreement with those from the detailed mechanism, although some discrepancies were found for some species; hydrogen and oxygen mole fractions showed the largest discrepancy of all combustion products. The reduced mechanism was also found to be capable of tracking the changes in chemical kinetics that occur as reaction conditions change. Ignition delay times obtained from the reduced mechanism showed good agreement with previous experimental data. The reduced mechanism was used to track changes in the mechanistic pathways of Claus reactions with the reaction progress.
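The core DRGEP idea referenced above can be shown in miniature: species are graph nodes, edge weights are normalized interaction coefficients, and a species' overall importance to a target is the maximum product of coefficients along any path; species below a threshold are dropped. The little graph below is an invented toy, not the actual H2S/O2 mechanism or its coefficients.

```python
# Miniature of DRGEP-style error propagation on a species graph.

edges = {                       # edges[A][B]: direct coupling of A to B
    "H2S": {"SH": 0.9, "S2": 0.3},
    "SH":  {"SO2": 0.8, "S2": 0.1},
    "S2":  {"SO2": 0.2},
    "SO2": {},
}

def importance(target, species, graph):
    """Max product of edge coefficients over all paths target -> species."""
    best = {target: 1.0}
    frontier = [target]
    while frontier:
        node = frontier.pop()
        for nxt, r in graph.get(node, {}).items():
            cand = best[node] * r        # error propagates multiplicatively
            if cand > best.get(nxt, 0.0):
                best[nxt] = cand
                frontier.append(nxt)
    return best.get(species, 0.0)

threshold = 0.35
kept = [s for s in edges if importance("H2S", s, edges) >= threshold]
print("retained species:", kept)   # weakly coupled species are pruned
```

Here SO2 survives despite having no strong direct link to the target, because the path H2S -> SH -> SO2 carries a product of 0.9 x 0.8 = 0.72, while S2 falls below the threshold and would be removed from the mechanism.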
Customization of user interfaces to reduce errors and enhance user acceptance.
Burkolter, Dina; Weyers, Benjamin; Kluge, Annette; Luther, Wolfram
2014-03-01
Customization is assumed to reduce error and increase user acceptance in the human-machine relation. Reconfiguration gives the operator the option to customize a user interface according to his or her own preferences. An experimental study with 72 computer science students using a simulated process control task was conducted. The reconfiguration group (RG) interactively reconfigured their user interfaces and used the reconfigured user interface in the subsequent test whereas the control group (CG) used a default user interface. Results showed significantly lower error rates and higher acceptance of the RG compared to the CG while there were no significant differences between the groups regarding situation awareness and mental workload. Reconfiguration seems to be promising and therefore warrants further exploration. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Checklist Usage as a Guidance on Read-Back Reducing the Potential Risk of Medication Error
Directory of Open Access Journals (Sweden)
Ida Bagus N. Maharjana
2014-06-01
Hospitals, as the last line of health services, shall provide quality services oriented to patient safety, one responsibility being the prevention of medication errors. Effective collaboration and communication between the professions are needed to achieve patient safety, and read-back is one way of carrying out effective communication. This was a before-after study using the PDCA TQM approach. The samples were medication charts in patient medical records from the 3rd week of May (before) and the 3rd week of July (after) in 2013. The treatment used a checklist, with doctors and nurses asked to take 2 minutes for read-back after their joint visit. There were 57 samples (before) and 64 samples (after). Incomplete entries in the medication charts of patient medical records, which carry a potential risk of medication error, fell from 45.54% to 10.17% after treatment with the read-back checklist for 10 weeks, with 77.78% achievement based on the PDCA TQM approach. Use of a checklist as guidance for read-back as effective communication can thus reduce incomplete drug records in medical records, and with them the potential risk of medication errors, from 45.54% to 10.17%.
Cheng, Dunlei; Branscum, Adam J; Stamey, James D
2010-07-01
To quantify the impact of ignoring misclassification of a response variable and measurement error in a covariate on statistical power, and to develop software for sample size and power analysis that accounts for these flaws in epidemiologic data. A Monte Carlo simulation-based procedure is developed to illustrate the differences in design requirements and inferences between analytic methods that properly account for misclassification and measurement error to those that do not in regression models for cross-sectional and cohort data. We found that failure to account for these flaws in epidemiologic data can lead to a substantial reduction in statistical power, over 25% in some cases. The proposed method substantially reduced bias by up to a ten-fold margin compared to naive estimates obtained by ignoring misclassification and mismeasurement. We recommend as routine practice that researchers account for errors in measurement of both response and covariate data when determining sample size, performing power calculations, or analyzing data from epidemiological studies. 2010 Elsevier Inc. All rights reserved.
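The power erosion this abstract quantifies can be demonstrated with a small Monte Carlo sketch in the spirit of (but much simpler than) the simulation-based procedure it describes. The two-group design, effect sizes and sensitivity/specificity below are invented for illustration; the test is a plain two-proportion z-test.

```python
import random

# Monte Carlo sketch: response misclassification erodes statistical power.
# We estimate the detection rate for a two-group difference in proportions,
# with and without misclassification of the binary response.

def simulate(n, p0, p1, sens, spec, sims=500, z_crit=1.96, seed=42):
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        def draw(p):
            y = [rng.random() < p for _ in range(n)]
            # misclassify: a true positive is observed with prob sens,
            # a true negative is falsely observed with prob (1 - spec)
            obs = [(rng.random() < sens) if yi else (rng.random() > spec)
                   for yi in y]
            return sum(obs) / n
        q0, q1 = draw(p0), draw(p1)
        q = (q0 + q1) / 2                       # pooled proportion
        se = (2 * q * (1 - q) / n) ** 0.5
        if se > 0 and abs(q1 - q0) / se > z_crit:
            hits += 1
    return hits / sims

power_perfect = simulate(200, 0.20, 0.35, sens=1.0, spec=1.0)
power_noisy = simulate(200, 0.20, 0.35, sens=0.8, spec=0.9)
print(f"power, perfect measurement:    {power_perfect:.2f}")
print(f"power, misclassified response: {power_noisy:.2f}")
```

With these invented settings the misclassified response attenuates the observed group difference (0.24 vs. 0.345 instead of 0.20 vs. 0.35), and the detection rate drops accordingly, which is exactly why sample size calculations that ignore misclassification overstate power.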
Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors: A quality initiative.
Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam
2017-08-01
Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, an improvement of 75.86%. These educational activities, directed primarily towards hospital nursing staff, had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors.
Carette, Yannick; Vanhove, Hans; Duflou, Joost
2018-05-01
Single Point Incremental Forming is a flexible process that is well-suited for small batch production and rapid prototyping of complex sheet metal parts. The distributed nature of the deformation process and the unsupported sheet imply that controlling the final accuracy of the workpiece is challenging. To improve the process limits and the accuracy of SPIF, the use of multiple forming passes has been proposed and discussed by a number of authors. Most methods use multiple intermediate models, where the previous one is strictly smaller than the next one, while gradually increasing the workpieces' wall angles. Another method that can be used is the manufacture of a smoothed-out "base geometry" in the first pass, after which more detailed features can be added in subsequent passes. In both methods, the selection of these intermediate shapes is freely decided by the user. However, their practical implementation in the production of complex freeform parts is not straightforward. The original CAD model can be manually adjusted or completely new CAD models can be created. This paper discusses an automatic method that is able to extract the base geometry from a full STL-based CAD model in an analytical way. Harmonic decomposition is used to express the final geometry as the sum of individual surface harmonics. It is then possible to filter these harmonic contributions to obtain a new CAD model with a desired level of geometric detail. This paper explains the technique and its implementation, as well as its use in the automatic generation of multi-step geometries.
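The harmonic filtering described above can be sketched in a simplified setting: express a geometry as a sum of harmonics, keep only the low-frequency ones, and the result is a smoothed base geometry onto which detail is formed in later passes. A 2D height map with an FFT stands in for the STL surface-harmonic decomposition used in the paper; the shapes and cutoff are invented.

```python
import numpy as np

# Extract a "base geometry" by low-pass filtering a height map's harmonics.

n = 64
y, x = np.mgrid[0:n, 0:n] / n
base = np.sin(np.pi * x) * np.sin(np.pi * y)      # smooth global shape
detail = 0.05 * np.sin(24 * np.pi * x)            # fine surface feature
height = base + detail                            # full "CAD" geometry

spectrum = np.fft.fft2(height)
freq = np.sqrt(np.add.outer(np.fft.fftfreq(n) ** 2,
                            np.fft.fftfreq(n) ** 2))
cutoff = 4.0 / n                                  # keep only low harmonics
base_geometry = np.real(
    np.fft.ifft2(np.where(freq <= cutoff, spectrum, 0)))

residual_detail = height - base_geometry          # what later passes add back
print("std of removed detail:", residual_detail.std())
```

Raising or lowering the cutoff is the "filter the harmonic contributions" step: a low cutoff yields a very smooth first-pass shape, while higher cutoffs retain progressively more geometric detail for intermediate models.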
Hagger-Johnson, Gareth; Harron, Katie; Goldstein, Harvey; Aldridge, Robert; Gilbert, Ruth
2017-06-30
BACKGROUND: The pseudonymisation algorithm used to link together episodes of care belonging to the same patients in England (HESID) has never undergone formal evaluation to determine the extent of data linkage error. We aimed to quantify improvements in linkage accuracy from adding probabilistic linkage to the existing deterministic HESID algorithm. We examined inpatient admissions to NHS hospitals in England (Hospital Episode Statistics, HES) over 17 years (1998 to 2015) for a sample of patients (born on the 13th or 28th of a month in 1992, 1998, 2005 or 2012). We compared the existing deterministic algorithm with one that included an additional probabilistic step, in relation to a reference standard created using enhanced probabilistic matching with additional clinical and demographic information. Missed and false matches were quantified and the impact on estimates of hospital readmission within one year was determined. HESID produced a high missed match rate, improving over time (8.6% in 1998 to 0.4% in 2015). Missed matches were more common for ethnic minorities, those living in areas of high socio-economic deprivation, foreign patients and those with 'no fixed abode'. Estimates of the readmission rate were biased for several patient groups owing to missed matches; this bias was reduced for nearly all groups by the probabilistic step. CONCLUSION: Probabilistic linkage of HES reduced missed matches and bias in estimated readmission rates, with clear implications for commissioning, service evaluation and performance monitoring of hospitals. The existing algorithm should be modified to address data linkage error, and a retrospective update of the existing data would address existing linkage errors and their implications.
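A probabilistic linkage step of the kind added here typically scores candidate record pairs with field-wise likelihood ratios (the Fellegi-Sunter approach): agreement on a field adds evidence for a match, disagreement adds evidence against it. The sketch below is illustrative only; the field names and m/u probabilities are hypothetical and not those of the HESID algorithm:

```python
import math

# Illustrative Fellegi-Sunter match weights. For each field:
# m = P(fields agree | records are a true match)
# u = P(fields agree | records are a non-match)
FIELDS = {
    "nhs_number": (0.99, 0.0001),
    "birth_date": (0.97, 0.003),
    "sex":        (0.99, 0.5),
    "postcode":   (0.90, 0.01),
}

def match_score(rec_a, rec_b):
    """Sum of log2 likelihood ratios over the compared fields."""
    score = 0.0
    for field, (m, u) in FIELDS.items():
        if rec_a.get(field) == rec_b.get(field):
            score += math.log2(m / u)        # agreement weight
        else:
            score += math.log2((1 - m) / (1 - u))  # disagreement weight
    return score
```

Pairs scoring above a chosen threshold are linked; the threshold trades missed matches against false matches, which is exactly the balance evaluated in this study.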
Hepatic glucose output in humans measured with labeled glucose to reduce negative errors
International Nuclear Information System (INIS)
Levy, J.C.; Brown, G.; Matthews, D.R.; Turner, R.C.
1989-01-01
Steele and others have suggested that minimizing changes in glucose specific activity when estimating hepatic glucose output (HGO) during glucose infusions could reduce non-steady-state errors. This approach was assessed in nondiabetic and type II diabetic subjects during constant low dose [27 μmol·kg ideal body wt (IBW)⁻¹·min⁻¹] glucose infusion followed by a 12 mmol/l hyperglycemic clamp. Eight subjects had paired tests with and without labeled infusions. Labeled infusion was used to compare HGO in 11 nondiabetic and 15 diabetic subjects. Whereas unlabeled infusions produced negative values for endogenous glucose output, labeled infusions largely eliminated this error and reduced the dependence of the Steele model on the pool fraction in the paired tests. By use of labeled infusions, 11 nondiabetic subjects suppressed HGO from 10.2 ± 0.6 (SE) fasting to 0.8 ± 0.9 μmol·kg IBW⁻¹·min⁻¹ after 90 min of glucose infusion and to -1.9 ± 0.5 μmol·kg IBW⁻¹·min⁻¹ after 90 min of a 12 mmol/l glucose clamp, but 15 diabetic subjects suppressed only partially, from 13.0 ± 0.9 fasting to 5.7 ± 1.2 at the end of the glucose infusion and 5.6 ± 1.0 μmol·kg IBW⁻¹·min⁻¹ in the clamp (P = 0.02, 0.002, and less than 0.001, respectively).
A channel-by-channel method of reducing the errors associated with peak area integration
International Nuclear Information System (INIS)
Luedeke, T.P.; Tripard, G.E.
1996-01-01
A new method of reducing the errors associated with peak area integration has been developed. This method utilizes the signal content of each channel as an estimate of the overall peak area. These individual estimates can then be weighted according to the precision with which each estimate is known, producing an overall area estimate. Experimental measurements were performed on a small peak sitting on a large background, and the results compared to those obtained from a commercial software program. Results showed a marked decrease in the spread of results around the true value (obtained by counting for a long period of time), and a reduction in the statistical uncertainty associated with the peak area. (orig.)
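The channel-by-channel estimator can be sketched as follows, assuming a known Gaussian peak shape: each channel's net count, divided by the fraction of the peak expected in that channel, gives an independent estimate of the total area, and the estimates are combined by inverse-variance weighting. This is a generic reconstruction of the idea, not the authors' exact formulation:

```python
import numpy as np

def weighted_peak_area(counts, background, mu, sigma, channels=None):
    """Estimate a peak area channel by channel and combine the
    per-channel estimates by inverse-variance weighting.

    Assumes a Gaussian peak centred at `mu` with width `sigma`;
    `counts` are raw channel counts, `background` the estimated
    background per channel.
    """
    counts = np.asarray(counts, float)
    background = np.asarray(background, float)
    if channels is None:
        channels = np.arange(len(counts))
    # fraction of the total peak area expected in each channel
    frac = np.exp(-0.5 * ((channels - mu) / sigma) ** 2)
    frac /= frac.sum()
    net = counts - background
    est = net / frac                          # per-channel area estimate
    var = np.maximum(counts, 1.0) / frac**2   # Poisson variance, propagated
    weights = 1.0 / var
    area = np.sum(weights * est) / np.sum(weights)
    sigma_area = np.sqrt(1.0 / np.sum(weights))
    return area, sigma_area
```

Channels where only a small fraction of the peak is expected carry large propagated variance and are automatically down-weighted, which is what reduces the spread of results for a small peak on a large background.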
Reducing Individual Variation for fMRI Studies in Children by Minimizing Template Related Errors.
Directory of Open Access Journals (Sweden)
Jian Weng
Spatial normalization is an essential process for group comparisons in functional MRI studies. In practice, there is a risk of normalization errors, particularly in studies involving children, seniors or diseased populations and in regions with high individual variation. One way to minimize normalization errors is to create a study-specific template based on a large sample size. However, studies with a large sample size are not always feasible, particularly for studies of children. The performance of templates with a small sample size has not been evaluated in fMRI studies in children. In the current study, this issue was encountered in a working memory task with 29 children in two groups. We compared the performance of different templates: a study-specific template created by the experimental population, a Chinese children template and the widely used adult MNI template. We observed distinct differences in the right orbitofrontal region among the three templates in between-group comparisons. The study-specific template and the Chinese children template were more sensitive for the detection of between-group differences in the orbitofrontal cortex than the MNI template. Proper templates could effectively reduce individual variation. Further analysis revealed a correlation between the BOLD contrast size and the norm index of the affine transformation matrix, i.e., the SFN, which characterizes the difference between a template and a native image and differs significantly across subjects. We therefore proposed and tested another method to reduce individual variation that included the SFN as a covariate in group-wise statistics. This correction exhibits outstanding performance in enhancing detection power in group-level tests. A training effect of abacus-based mental calculation was also demonstrated, with significantly elevated activation in the right orbitofrontal region that correlated with behavioral response time across subjects in the trained group.
Rose, Julian A. R.; Tong, Jenna R.; Allain, Damien J.; Mitchell, Cathryn N.
2011-01-01
Signals from Global Positioning System (GPS) satellites at the horizon or at low elevations are often excluded from a GPS solution because they experience considerable ionospheric delays and multipath effects. Their exclusion can degrade the overall satellite geometry for the calculations, resulting in greater errors; an effect known as the Dilution of Precision (DOP). In contrast, signals from high elevation satellites experience less ionospheric delays and multipath effects. The aim is to find a balance in the choice of elevation mask, to reduce the propagation delays and multipath whilst maintaining good satellite geometry, and to use tomography to correct for the ionosphere and thus improve single-frequency GPS timing accuracy. GPS data, collected from a global network of dual-frequency GPS receivers, have been used to produce four GPS timing solutions, each with a different ionospheric compensation technique. One solution uses a 4D tomographic algorithm, Multi-Instrument Data Analysis System (MIDAS), to compensate for the ionospheric delay. Maps of ionospheric electron density are produced and used to correct the single-frequency pseudorange observations. This method is compared to a dual-frequency solution and two other single-frequency solutions: one does not include any ionospheric compensation and the other uses the broadcast Klobuchar model. Data from the solar maximum year 2002 and October 2003 have been investigated to display results when the ionospheric delays are large and variable. The study focuses on Europe and results are produced for the chosen test site, VILL (Villafranca, Spain). The effects of excluding all of the GPS satellites below various elevation masks, ranging from 5° to 40°, on timing solutions for fixed (static) and mobile (moving) situations are presented. The greatest timing accuracies when using the fixed GPS receiver technique are obtained by using a 40° mask, rather than a 5° mask. The mobile GPS timing solutions are most
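The geometry effect described above can be quantified with the standard Dilution of Precision computation: GDOP is derived from the design matrix of unit line-of-sight vectors, and removing low-elevation satellites visibly inflates it. A textbook sketch (not code from this study):

```python
import numpy as np

def gdop(sat_positions, receiver=np.zeros(3)):
    """Geometric Dilution of Precision from satellite positions.

    H has one row per satellite: the negated unit line-of-sight
    vector plus a receiver-clock column; GDOP = sqrt(tr((H^T H)^-1)).
    """
    rows = []
    for sat in sat_positions:
        los = np.asarray(sat, float) - receiver
        u = los / np.linalg.norm(los)
        rows.append([-u[0], -u[1], -u[2], 1.0])
    H = np.array(rows)
    cov = np.linalg.inv(H.T @ H)
    return float(np.sqrt(np.trace(cov)))
```

A constellation spread across the sky yields a small GDOP, while one clustered near the zenith (as after applying a very high elevation mask) yields a large GDOP, which is the trade-off against ionospheric delay and multipath that the study investigates.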
Khalil, Hanan; Bell, Brian; Chambers, Helen; Sheikh, Aziz; Avery, Anthony J
2017-10-04
Medication-related adverse events in primary care represent an important cause of hospital admissions and mortality. Adverse events could result from people experiencing adverse drug reactions (not usually preventable) or could be due to medication errors (usually preventable). To determine the effectiveness of professional, organisational and structural interventions compared to standard care to reduce preventable medication errors by primary healthcare professionals that lead to hospital admissions, emergency department visits, and mortality in adults. We searched CENTRAL, MEDLINE, Embase, three other databases, and two trial registries on 4 October 2016, together with reference checking, citation searching and contact with study authors to identify additional studies. We also searched several sources of grey literature. We included randomised trials in which healthcare professionals provided community-based medical services. We also included interventions in outpatient clinics attached to a hospital where people are seen by healthcare professionals but are not admitted to hospital. We only included interventions that aimed to reduce medication errors leading to hospital admissions, emergency department visits, or mortality. We included all participants, irrespective of age, who were prescribed medication by a primary healthcare professional. Three review authors independently extracted data. Each of the outcomes (hospital admissions, emergency department visits, and mortality), are reported in natural units (i.e. number of participants with an event per total number of participants at follow-up). We presented all outcomes as risk ratios (RRs) with 95% confidence intervals (CIs). We used the GRADE tool to assess the certainty of evidence. We included 30 studies (169,969 participants) in the review addressing various interventions to prevent medication errors; four studies addressed professional interventions (8266 participants) and 26 studies described
Making Residents Part of the Safety Culture: Improving Error Reporting and Reducing Harms.
Fox, Michael D; Bump, Gregory M; Butler, Gabriella A; Chen, Ling-Wan; Buchert, Andrew R
2017-01-30
Reporting medical errors is a focus of the patient safety movement. As frontline physicians, residents are optimally positioned to recognize errors and flaws in systems of care. Previous work highlights the difficulty of engaging residents in the identification and/or reduction of medical errors and in integrating these trainees into their institutions' cultures of safety. The authors describe the implementation of a longitudinal, discipline-based, multifaceted curriculum to enhance the reporting of errors by pediatric residents at Children's Hospital of Pittsburgh of University of Pittsburgh Medical Center. The key elements of this curriculum included providing the necessary education to identify medical errors with an emphasis on systems-based causes, modeling of error reporting by faculty, and integrating error reporting and discussion into the residents' daily activities. The authors tracked monthly error reporting rates by residents and other health care professionals, in addition to serious harm event rates at the institution. The interventions resulted in significant increases in error reports filed by residents, from 3.6 to 37.8 per month over 4 years. This increase in error reporting correlated with a decline in serious harm events, from 15.0 to 8.1 per month over 4 years (P = 0.01). Integrating patient safety into everyday resident responsibilities encourages frequent reporting and discussion of medical errors and leads to improvements in patient care. Multiple simultaneous interventions are essential to making residents part of the safety culture of their training hospitals.
Naik, Aanand Dinkar; Rao, Raghuram; Petersen, Laura Ann
2008-01-01
Diagnostic errors are poorly understood despite being a frequent cause of medical errors. Recent efforts have aimed to advance the "basic science" of diagnostic error prevention by tracing errors to their most basic origins. Although a refined theory of diagnostic error prevention will take years to formulate, we focus on communication breakdown, a major contributor to diagnostic errors and an increasingly recognized preventable factor in medical mishaps. We describe a comprehensive framework that integrates the potential sources of communication breakdowns within the diagnostic process and identifies vulnerable steps in the diagnostic process where various types of communication breakdowns can precipitate error. We then discuss potential information technology-based interventions that may have efficacy in preventing one or more forms of these breakdowns. These possible intervention strategies include using new technologies to enhance communication between health providers and health systems, improve patient involvement, and facilitate management of information in the medical record. PMID:18373151
The optimisation of limiter geometry to reduce impurity influx in tokamaks
International Nuclear Information System (INIS)
Matthews, G.F.; McCracken, G.M.; Sewell, P.; Goodall, D.H.J.; Stangeby, P.C.; Pitcher, C.S.
1987-01-01
Conventional limiters are designed to withstand large power loadings and hence are constructed with surfaces at grazing angles to the toroidal magnetic field. As a result any impurities released from the limiter surface are projected towards the centre of the plasma and are poorly screened from it. The impurity control limiter (ICL), an alternative concept which has an inverted geometry is discussed. The ICL shape is designed to direct the impurities towards the wall. Results are presented from a two-dimensional neutral particle code which maps the ionisation of carbon physically sputtered by deuterons from a carbon limiter. This ionisation source is coupled to a one-dimensional impurity transport code which calculates the implied central impurity density. The results demonstrate that the ICL achieves impurity control in two ways. Firstly, many of the sputtered impurities directed towards the wall are not ionised and return to the wall as neutrals. Secondly, much of the ionisation which does occur is located in the scrape-off layer. Here there is a strong ion sink which may also be enhanced by the flow of hydrogenic ions entraining impurity ions created close to the limiter surface. We conclude that a reduction in central impurity density of a factor of 10 is possible in a Tokamak such as DITE provided that the limiter is the main source of impurities. (author)
Estrada, T; Zhang, B; Cicotti, P; Armen, R S; Taufer, M
2012-07-01
We present a scalable and accurate method for classifying protein-ligand binding geometries in molecular docking. Our method is a three-step process: the first step encodes the geometry of a three-dimensional (3D) ligand conformation into a single 3D point in the space; the second step builds an octree by assigning an octant identifier to every single point in the space under consideration; and the third step performs an octree-based clustering on the reduced conformation space and identifies the most dense octant. We adapt our method for MapReduce and implement it in Hadoop. The load-balancing, fault-tolerance, and scalability in MapReduce allow screening of very large conformation spaces not approachable with traditional clustering methods. We analyze results for docking trials for 23 protein-ligand complexes for HIV protease, 21 protein-ligand complexes for Trypsin, and 12 protein-ligand complexes for P38alpha kinase. We also analyze cross docking trials for 24 ligands, each docking into 24 protein conformations of the HIV protease, and receptor ensemble docking trials for 24 ligands, each docking in a pool of HIV protease receptors. Our method demonstrates significant improvement over energy-only scoring for the accurate identification of native ligand geometries in all these docking assessments. The advantages of our clustering approach make it attractive for complex applications in real-world drug design efforts. We demonstrate that our method is particularly useful for clustering docking results using a minimal ensemble of representative protein conformational states (receptor ensemble docking), which is now a common strategy to address protein flexibility in molecular docking. Copyright © 2012 Elsevier Ltd. All rights reserved.
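The octant-identifier clustering in steps two and three can be sketched in a single-machine form (a simplified stand-in for the Hadoop/MapReduce implementation; function names and the depth parameter are illustrative):

```python
from collections import Counter

def octant_id(point, bounds, depth):
    """Return the octree path of a 3D point inside an axis-aligned
    bounding box: one octant index (0-7) per subdivision level."""
    lo, hi = bounds
    lo, hi = list(lo), list(hi)
    path = []
    for _ in range(depth):
        idx = 0
        for axis in range(3):
            mid = (lo[axis] + hi[axis]) / 2.0
            if point[axis] >= mid:
                idx |= 1 << axis   # upper half along this axis
                lo[axis] = mid
            else:
                hi[axis] = mid
        path.append(idx)
    return tuple(path)

def densest_octant(points, bounds, depth=3):
    """Group points by octant id and return the densest octant
    together with its member points."""
    ids = Counter(octant_id(p, bounds, depth) for p in points)
    best, _count = ids.most_common(1)[0]
    return best, [p for p in points if octant_id(p, bounds, depth) == best]
```

Because each point maps independently to an octant identifier, the counting step is embarrassingly parallel, which is what makes the method amenable to MapReduce for very large conformation spaces.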
Trauma Quality Improvement: Reducing Triage Errors by Automating the Level Assignment Process.
Stonko, David P; O Neill, Dillon C; Dennis, Bradley M; Smith, Melissa; Gray, Jeffrey; Guillamondegui, Oscar D
2018-04-12
Trauma patients are triaged by the severity of their injury or need for intervention while en route to the trauma center according to trauma activation protocols that are institution specific. Significant research has been aimed at improving these protocols in order to optimize patient outcomes while striving for efficiency in care. However, it is known that patients are often undertriaged or overtriaged because protocol adherence remains imperfect. The goal of this quality improvement (QI) project was to improve this adherence, and thereby reduce the triage error. It was conducted as part of the formal undergraduate medical education curriculum at this institution. A QI team was assembled and baseline data were collected, then 2 Plan-Do-Study-Act (PDSA) cycles were implemented sequentially. During the first cycle, a novel web tool was developed and implemented in order to automate the level assignment process (it takes EMS-provided data and automatically determines the level); the tool was based on the existing trauma activation protocol. The second PDSA cycle focused on improving triage accuracy in isolated, less than 10% total body surface area burns, which we identified to be a point of common error. Traumas were reviewed and tabulated at the end of each PDSA cycle, and triage accuracy was followed with a run chart. This study was performed at Vanderbilt University Medical Center and Medical School, which has a large level 1 trauma center covering over 75,000 square miles, and which sees urban, suburban, and rural trauma. The baseline assessment period and each PDSA cycle lasted 2 weeks. During this time, all activated, adult, direct traumas were reviewed. There were 180 patients during the baseline period, 189 after the first test of change, and 150 after the second test of change. All were included in analysis. Of 180 patients, 30 were inappropriately triaged during baseline analysis (3 undertriaged and 27 overtriaged) versus 16 of 189 (3 undertriaged and 13
Martis, Walston R; Hannam, Jacqueline A; Lee, Tracey; Merry, Alan F; Mitchell, Simon J
2016-09-09
A new approach to administering the surgical safety checklist (SSC) at our institution using wall-mounted charts for each SSC domain coupled with migrated leadership among operating room (OR) sub-teams, led to improved compliance with the Sign Out domain. Since surgical specimens are reviewed at Sign Out, we aimed to quantify any related change in surgical specimen labelling errors. Prospectively maintained error logs for surgical specimens sent to pathology were examined for the six months before and after introduction of the new SSC administration paradigm. We recorded errors made in the labelling or completion of the specimen pot and on the specimen laboratory request form. Total error rates were calculated from the number of errors divided by total number of specimens. Rates from the two periods were compared using a chi square test. There were 19 errors in 4,760 specimens (rate 3.99/1,000) and eight errors in 5,065 specimens (rate 1.58/1,000) before and after the change in SSC administration paradigm (P=0.0225). Improved compliance with administering the Sign Out domain of the SSC can reduce surgical specimen errors. This finding provides further evidence that OR teams should optimise compliance with the SSC.
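The rate comparison reported above can be reproduced with a plain Pearson chi-square on the 2x2 table (no continuity correction); with the abstract's figures (19 errors in 4,760 specimens vs. 8 in 5,065) it yields a p-value close to the reported P = 0.0225. A minimal sketch:

```python
import math

def chi2_two_rates(err1, n1, err2, n2):
    """Pearson chi-square (1 df, no continuity correction) comparing
    two error proportions; returns (statistic, p_value)."""
    table = [[err1, n1 - err1], [err2, n2 - err2]]
    total = n1 + n2
    col_tot = [err1 + err2, total - err1 - err2]
    row_tot = [n1, n2]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_tot[i] * col_tot[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    # survival function of the chi-square distribution with 1 df
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p
```

`chi2_two_rates(19, 4760, 8, 5065)` gives a p-value of about 0.022; the published 0.0225 may reflect a slightly different test variant or rounding.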
Nurses' Behaviors and Visual Scanning Patterns May Reduce Patient Identification Errors
Marquard, Jenna L.; Henneman, Philip L.; He, Ze; Jo, Junghee; Fisher, Donald L.; Henneman, Elizabeth A.
2011-01-01
Patient identification (ID) errors occurring during the medication administration process can be fatal. The aim of this study is to determine whether differences in nurses' behaviors and visual scanning patterns during the medication administration process influence their capacities to identify patient ID errors. Nurse participants (n = 20)…
International Nuclear Information System (INIS)
Lydia, Emilio J.; Barros, Ricardo C.
2011-01-01
In this paper we describe a response matrix method for one-speed slab-geometry discrete ordinates (SN) neutral particle transport problems that is completely free from spatial truncation errors. The unknowns in the method are the cell-edge angular fluxes of particles. The numerical results generated for these quantities are exactly those obtained from the analytic solution of the SN problem apart from finite arithmetic considerations. Our method is based on a spectral analysis that we perform in the SN equations with scattering inside a discretization cell of the spatial grid set up on the slab. As a result of this spectral analysis, we are able to obtain an expression for the local general solution of the SN equations. With this local general solution, we determine the response matrix and use the prescribed boundary conditions and continuity conditions to sweep across the discretization cells from left to right and from right to left across the slab, until a prescribed convergence criterion is satisfied. (author)
Toward reduced transport errors in a high resolution urban CO2 inversion system
Directory of Open Access Journals (Sweden)
Aijun Deng
2017-05-01
We present a high-resolution atmospheric inversion system combining a Lagrangian Particle Dispersion Model (LPDM) and the Weather Research and Forecasting model (WRF), and test the impact of assimilating meteorological observations on transport accuracy. A Four-Dimensional Data Assimilation (FDDA) technique continuously assimilates meteorological observations from various observing systems into the transport modeling system, and is coupled to the high-resolution CO2 emission product Hestia to simulate the atmospheric mole fractions of CO2. For the Indianapolis Flux Experiment (INFLUX) project, we evaluated the impact of assimilating different meteorological observation systems on the linearized adjoint solutions and the CO2 inverse fluxes estimated using observed CO2 mole fractions from 11 out of 12 communication towers over Indianapolis for the Sep.-Nov. 2013 period. While assimilating WMO surface measurements improved the simulated wind speed and direction, their impact on the planetary boundary layer (PBL) was limited. Simulated PBL wind statistics improved significantly when assimilating upper-air observations from the commercial airline program Aircraft Communications Addressing and Reporting System (ACARS) and continuous ground-based Doppler lidar wind observations. Wind direction mean absolute error (MAE) decreased from 26 to 14 degrees and the wind speed MAE decreased from 2.0 to 1.2 m s⁻¹, while the bias remains small in all configurations (< 6 degrees and 0.2 m s⁻¹). Wind speed MAE and ME are larger in daytime than in nighttime. PBL depth MAE is reduced by ~10%, with little bias reduction. The inverse results indicate that the spatial distribution of CO2 inverse fluxes was affected by the model performance, while the overall flux estimates changed little across WRF simulations when aggregated over the entire domain. Our results show that PBL wind observations are a potent tool for increasing the precision of urban meteorological reanalyses.
Effectiveness of Toyota process redesign in reducing thyroid gland fine-needle aspiration error.
Raab, Stephen S; Grzybicki, Dana Marie; Sudilovsky, Daniel; Balassanian, Ronald; Janosky, Janine E; Vrbin, Colleen M
2006-10-01
Our objective was to determine whether the Toyota Production System process redesign resulted in diagnostic error reduction for patients who underwent cytologic evaluation of thyroid nodules. In this longitudinal, nonconcurrent cohort study, we compared the diagnostic error frequency of a thyroid aspiration service before and after implementation of error reduction initiatives consisting of adoption of a standardized diagnostic terminology scheme and an immediate interpretation service. A total of 2,424 patients underwent aspiration. Following terminology standardization, the false-negative rate decreased from 41.8% to 19.1% (P = .006) and the specimen nondiagnostic rate increased from 5.8% to 19.8%. The Toyota process change led to significantly fewer diagnostic errors for patients who underwent thyroid fine-needle aspiration.
Spinning geometry = Twisted geometry
International Nuclear Information System (INIS)
Freidel, Laurent; Ziprick, Jonathan
2014-01-01
It is well known that the SU(2)-gauge invariant phase space of loop gravity can be represented in terms of twisted geometries. These are piecewise-linear-flat geometries obtained by gluing together polyhedra, but the resulting geometries are not continuous across the faces. Here we show that this phase space can also be represented by continuous, piecewise-flat three-geometries called spinning geometries. These are composed of metric-flat three-cells glued together consistently. The geometry of each cell and the manner in which they are glued is compatible with the choice of fluxes and holonomies. We first remark that the fluxes provide each edge with an angular momentum. By studying the piecewise-flat geometries which minimize edge lengths, we show that these angular momenta can be literally interpreted as the spin of the edges: the geometries of all edges are necessarily helices. We also show that the compatibility of the gluing maps with the holonomy data results in the same conclusion. This shows that a spinning geometry represents a way to glue together the three-cells of a twisted geometry to form a continuous geometry which represents a point in the loop gravity phase space. (paper)
About errors, inaccuracies and stereotypes: Mistakes in media coverage - and how to reduce them
Scherzler, D.
2010-12-01
The main complaint made by scientists about the work of journalists is that there are mistakes and inaccuracies in TV programmes, radio or the print media. This seems to be an important reason why too few researchers want to deal with journalists. Such scientists regularly discover omissions, errors, exaggerations, distortions, stereotypes and sensationalism in the media. Surveys carried out in so-called accuracy research seem to concede this point as well. Errors frequently occur in journalism, and it is the task of the editorial offices to work very hard to keep the number of errors as low as possible. On closer inspection, however, some errors turn out to be simplifications and omissions. Both are unavoidable in journalism and do not automatically cause factual errors. This paper examines the different kinds of mistakes and misleading information that scientists observe in the mass media. By giving a view from inside the mass media, it tries to explain how errors arise in journalists' working routines. It outlines how the criteria of journalistic quality applied by scientists and by science journalists differ substantially. The expectation of many scientists is that good science journalism passes on their results to the public in as "unadulterated" a form as possible. The author suggests, however, that quality criteria for journalism cannot be derived from how true to detail and how comprehensively it reports on science, nor from the extent to which the journalistic presentation is "correct" in the eyes of the researcher. In its main part, the paper suggests that scientists who are contacted or interviewed by the mass media should not accept that errors just happen. On the contrary, they can do a lot to help prevent mistakes that might occur in the journalistic product. The author proposes several strategies by which scientists and press information officers can identify possible errors, stereotypes and exaggeration by journalists in advance and
Training situational awareness to reduce surgical errors in the operating room
Graafland, M.; Schraagen, J.M.C.; Boermeester, M.A.; Bemelman, W.A.; Schijven, M.P.
2015-01-01
Background: Surgical errors result from faulty decision-making, misperceptions and the application of suboptimal problem-solving strategies, just as often as they result from technical failure. To date, surgical training curricula have focused mainly on the acquisition of technical skills. The aim
On failure of the pruning technique in "error repair in shift-reduce parsers"
Bertsch, E; Nederhof, MJ
A previous article presented a technique to compute the least-cost error repair by incrementally generating configurations that result from inserting and deleting tokens in a syntactically incorrect input. An additional mechanism to improve the run-time efficiency of this algorithm by pruning some
Reducing Check-in Errors at Brigham Young University through Statistical Process Control
Spackman, N. Andrew
2005-01-01
The relationship between the library and its patrons is damaged and the library's reputation suffers when returned items are not checked in. An informal survey reveals librarians' concern for this problem and their efforts to combat it, although few libraries collect objective measurements of errors or the effects of improvement efforts. Brigham…
Gonçalves da Silva, Anders; Barendse, William; Kijas, James W; Barris, Wes C; McWilliam, Sean; Bunch, Rowan J; McCullough, Russell; Harrison, Blair; Hoelzel, A Rus; England, Phillip R
2015-07-01
Single nucleotide polymorphisms (SNPs) have become the marker of choice for genetic studies in organisms of conservation, commercial or biological interest. Most SNP discovery projects in nonmodel organisms apply a strategy for identifying putative SNPs based on filtering rules that account for random sequencing errors. Here, we analyse data used to develop 4723 novel SNPs for the commercially important deep-sea fish, orange roughy (Hoplostethus atlanticus), to assess the impact of not accounting for systematic sequencing errors when filtering identified polymorphisms when discovering SNPs. We used SAMtools to identify polymorphisms in a velvet assembly of genomic DNA sequence data from seven individuals. The resulting set of polymorphisms were filtered to minimize 'bycatch': polymorphisms caused by sequencing or assembly error. An Illumina Infinium SNP chip was used to genotype a final set of 7714 polymorphisms across 1734 individuals. Five predictors were examined for their effect on the probability of obtaining an assayable SNP: depth of coverage, number of reads that support a variant, polymorphism type (e.g. A/C), strand-bias and Illumina SNP probe design score. Our results indicate that filtering out systematic sequencing errors could substantially improve the efficiency of SNP discovery. We show that BLASTX can be used as an efficient tool to identify single-copy genomic regions in the absence of a reference genome. The results have implications for research aiming to identify assayable SNPs and build SNP genotyping assays for nonmodel organisms. © 2014 John Wiley & Sons Ltd.
Optimal control strategy to reduce the temporal wavefront error in AO systems
Doelman, N.J.; Hinnen, K.J.G.; Stoffelen, F.J.G.; Verhaegen, M.H.
2004-01-01
An Adaptive Optics (AO) system for astronomy is analysed from a control point of view. The focus is put on the temporal error. The AO controller is identified as a feedback regulator system, operating in closed-loop with the aim of rejecting wavefront disturbances. Limitations on the performance of
Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality
Bishara, Anthony J.; Hittner, James B.
2015-01-01
It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared…
Good ergonomics and team diversity reduce absenteeism and errors in car manufacturing.
Fritzsche, Lars; Wegge, Jürgen; Schmauder, Martin; Kliegel, Matthias; Schmidt, Klaus-Helmut
2014-01-01
Prior research suggests that ergonomic work design and mixed teams (in age and gender) may compensate for declines in certain abilities of ageing employees. This study investigates simultaneous effects of both team-level factors on absenteeism and performance (error rates) over one year in a sample of 56 car assembly teams (N = 623). Results show that age was related to prolonged absenteeism and more mistakes in work planning, but not to overall performance. In comparison, high physical workload was strongly associated with longer absenteeism and increased error rates. Furthermore, controlling for physical workload, age diversity was related to shorter absenteeism, and the presence of females in the team was associated with shorter absenteeism and better performance. In summary, this study suggests that both ergonomic work design and mixed team composition may compensate for age-related productivity risks in manufacturing by maintaining the work ability of older employees and improving job quality.
2016-06-10
…use different terminology depending on which sister service they are from. Every service has various medical capabilities for each role of medical care… Keywords: Medical Errors, Combat Casualty Care, Culture of Safety. Acronyms: AE, adverse event; AHRQ, Agency for Healthcare Research and Quality; AHS, Army Health System; AMEDD, Army Medical Department.
1983-08-01
[Abstract not recoverable: the scanned fragment contains only tables of standard errors for item-parameter estimates under bell-shaped and rectangular score distributions (n = 45-90, N = 1500-6000), plus distribution-list residue.]
In-hospital fellow coverage reduces communication errors in the surgical intensive care unit.
Williams, Mallory; Alban, Rodrigo F; Hardy, James P; Oxman, David A; Garcia, Edward R; Hevelone, Nathanael; Frendl, Gyorgy; Rogers, Selwyn O
2014-06-01
Staff coverage strategies of intensive care units (ICUs) impact clinical outcomes. High-intensity staff coverage strategies are associated with lower morbidity and mortality. Accessible clinical expertise, teamwork, and effective communication have all been attributed to the success of this coverage strategy. We evaluated the impact of in-hospital fellow coverage (IHFC) on improving communication of cardiorespiratory events. A prospective observational study was performed in an academic tertiary care center with high-intensity staff coverage. The main outcome measure was resident-to-fellow communication of cardiorespiratory events during IHFC vs home coverage (HC) periods. Three hundred twelve cardiorespiratory events were collected in 114 surgical ICU patients over 134 study days. Complete data were available for 306 events. One hundred three communication errors occurred. IHFC was associated with significantly better communication of events compared to HC. Residents communicated 89% of events during IHFC vs 51% of events during HC. Communication patterns of junior and midlevel residents were similar. Midlevel residents communicated 68% of all on-call events (87% IHFC vs 50% HC); junior residents communicated 66% of events (94% IHFC vs 52% HC). Communication errors were lower in all ICUs during IHFC. IHFC reduced communication errors. Copyright © 2014 Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Hoogeman, Mischa S.; Herk, Marcel van; Bois, Josien de; Lebesque, Joos V.
2005-01-01
Background and purpose: The goal of this work is to develop and evaluate strategies to reduce the uncertainty in the prostate position and rectum shape that arises in the preparation stage of the radiation treatment of prostate cancer. Patients and methods: Nineteen prostate cancer patients, who were treated with 3-dimensional conformal radiotherapy, each received a planning CT scan and 8-13 repeat CT scans during the treatment period. We quantified prostate motion relative to the pelvic bone by first matching the repeat CT scans on the planning CT scan using the bony anatomy. Subsequently, each contoured prostate, including seminal vesicles, was matched on the prostate in the planning CT scan to obtain the translations and rotations. The variation in prostate position was determined in terms of the systematic, random and group mean error. We tested the performance of two correction strategies to reduce the systematic error due to prostate motion. The first strategy, the pre-treatment strategy, used only the initial rectum volume in the planning CT scan to adjust the angle of the prostate with respect to the left-right (LR) axis and the shape and position of the rectum. The second strategy, the adaptive strategy, used the data of repeat CT scans to improve the estimate of the prostate position and rectum shape during the treatment. Results: The largest component of prostate motion was a rotation around the LR axis. The systematic error was 5.1 deg (1 SD) and the random error was 3.6 deg (1 SD). The average LR-axis rotation between the planning and the repeat CT scans correlated significantly with the rectum volume in the planning CT scan (r=0.86, P<0.0001). Correction of the rotational position on the basis of the planning rectum volume alone reduced the systematic error by 28%. A correction based on the data of the planning CT scan and 4 repeat CT scans reduced the systematic error over the complete treatment period by a factor of 2. When the correction was…
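The population decomposition used above (systematic error as the spread of per-patient mean rotations, random error as the day-to-day spread within patients) can be sketched in a few lines. The data and function name are hypothetical, and the sketch assumes the common convention of taking the SD of per-patient means and the RMS of per-patient SDs:

```python
from statistics import mean, stdev

def motion_errors(rotations_by_patient):
    """Decompose repeat-scan rotations (degrees) into the group mean,
    the systematic error (SD of per-patient means) and the random
    error (RMS of per-patient SDs)."""
    patient_means = [mean(v) for v in rotations_by_patient.values()]
    group_mean = mean(patient_means)
    systematic = stdev(patient_means)  # 1 SD across patients
    random_err = mean(stdev(v) ** 2 for v in rotations_by_patient.values()) ** 0.5
    return group_mean, systematic, random_err

# Hypothetical LR-axis rotations (deg) measured on three patients' repeat CT scans
data = {"p1": [4.0, 6.0, 8.0], "p2": [-2.0, 0.0, 2.0], "p3": [1.0, 1.0, 1.0]}
gm, sys_err, rnd_err = motion_errors(data)
```

A correction strategy like the adaptive one above aims to shrink the systematic component, since it repeats identically at every fraction, while the random component sets a floor that per-fraction imaging would be needed to remove.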
Raab, Stephen S; Andrew-Jaja, Carey; Condel, Jennifer L; Dabbs, David J
2006-01-01
The objective of the study was to determine whether the Toyota production system process improves Papanicolaou test quality and patient safety. An 8-month nonconcurrent cohort study that included 464 case and 639 control women who had a Papanicolaou test was performed. Office workflow was redesigned using Toyota production system methods by introducing a 1-by-1 continuous flow process. We measured the frequency of Papanicolaou tests without a transformation zone component, follow-up and Bethesda System diagnostic frequency of atypical squamous cells of undetermined significance, and diagnostic error frequency. After the intervention, the percentage of Papanicolaou tests lacking a transformation zone component decreased from 9.9% to 4.7% (P = .001). The percentage of Papanicolaou tests with a diagnosis of atypical squamous cells of undetermined significance decreased from 7.8% to 3.9% (P = .007). The frequency of error per correlating cytologic-histologic specimen pair decreased from 9.52% to 7.84%. The introduction of the Toyota production system process resulted in improved Papanicolaou test quality.
Concha Larrauri, P.
2015-12-01
Orange production in Florida has experienced a decline over the past decade. Hurricanes in 2004 and 2005 greatly affected production, almost to the same degree as the strong freezes of the 1980s. The spread of citrus greening disease after the hurricanes has also contributed to the reduction in Florida orange production. The occurrence of hurricanes and diseases cannot easily be predicted, but the additional effects of climate on orange yield can be studied and incorporated into existing production forecasts that are based on physical surveys, such as the October citrus forecast issued every year by the USDA. Specific climate variables occurring before and after the October forecast is issued can affect flowering, orange drop rates, growth, and maturation, and can contribute to the forecast error. Here we present a methodology to incorporate local climate variables to predict the error of the USDA's orange production forecast, and we study the local effects of climate on yield in different counties in Florida. This information can help farmers gain insight into what to expect during the orange production cycle, and can help supply chain managers better plan their strategy.
Methods to reduce medication errors in a clinical trial of an investigational parenteral medication
Directory of Open Access Journals (Sweden)
Gillian L. Fell
2016-12-01
There are few evidence-based guidelines to inform optimal design of complex clinical trials, such as those assessing the safety and efficacy of intravenous drugs administered daily with infusion times over many hours per day and treatment durations that may span years. This study is a retrospective review of inpatient administration deviation reports for an investigational drug that is administered daily with infusion times of 8–24 h, and variable treatment durations for each patient. We report study design modifications made in 2007–2008 aimed at minimizing deviations from an investigational drug infusion protocol approved by an institutional review board and the United States Food and Drug Administration. Modifications were specifically aimed at minimizing errors of infusion rate, incorrect dose, incorrect patient, or wrong drug administered. We found that the rate of these types of administration errors of the study drug was significantly decreased following adoption of the specific study design changes. This report provides guidance in the design of clinical trials testing the safety and efficacy of study drugs administered via intravenous infusion in an inpatient setting so as to minimize drug administration protocol deviations and optimize patient safety.
System care improves trauma outcome: patient care errors dominate reduced preventable death rate.
Thoburn, E; Norris, P; Flores, R; Goode, S; Rodriguez, E; Adams, V; Campbell, S; Albrink, M; Rosemurgy, A
1993-01-01
A review of 452 trauma deaths in Hillsborough County, Florida, in 1984 documented that 23% of non-CNS trauma deaths were preventable and occurred because of inadequate resuscitation or delay in proper surgical care. In late 1988 Hillsborough County organized the Hillsborough County Trauma Agency (HCTA) to coordinate trauma care among prehospital providers and state-designated trauma centers. The purpose of this study was to review county trauma deaths after the inception of the HCTA to determine the frequency of preventable deaths. A total of 504 trauma deaths occurring between October 1989 and April 1991 were reviewed. Through committee review, 10 deaths were deemed preventable; 2 occurred outside the trauma system. Of the 10 deaths, 5 occurred late in severely injured patients. The preventable death rate has decreased to 7.0% with system care. The causes of preventable deaths have changed from delayed or inadequate intervention to postoperative care errors.
Multidisciplinary strategy to reduce errors with the use of medical gases.
Amor-García, Miguel Ángel; Ibáñez-García, Sara; Díaz-Redondo, Alicia; Herranz Alonso, Ana; Sanjurjo Sáez, María
2018-05-01
Lack of awareness of the risks associated with the use of medical gases amongst health professionals and health organizations is concerning. The objective of this study is to redefine the process for using medical gases in a hospital setting. A sentinel event, the incorrect administration of a medical gas to an inpatient, took place in a clinical unit. A multidisciplinary root-cause analysis of the sentinel event was carried out. Improvement points were identified for each error detected, and from these a strategy to ensure the safe use of these drugs was defined. Nine errors were identified and the following improvement actions were defined: storage (gases for clinical use were separated from those for industrial use and proper identification signs were placed), prescription (6 protocols were included in the hospital's Computerized Physician Order Entry software), validation (pharmacist validation of the prescription to ensure appropriate use), dispensation (a new protocol for medical gas dispensation and transportation was designed and implemented) and administration (information on the pressure gauges used for each type of gas was collected and reviewed). Seventy-two signs with recommendations for medical gas identification and administration were placed in all the clinical units. Specific training on the safe use of medical gases and general safety training was provided. The implementation of a process that integrates all phases of medical gas use and applies to all professionals involved is presented here as a strategy to increase safety in the use of these medicines. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Maintenance Strategies to Reduce Downtime Due to Machine Positional Errors
Shagluf, Abubaker; Longstaff, Andrew P.; Fletcher, Simon
2014-01-01
Manufacturing strives to reduce waste and increase Overall Equipment Effectiveness (OEE). When managing machine tool maintenance a manufacturer must apply an appropriate decision technique in order to reveal hidden costs associated with production losses, reduce equipment downtime competently and similarly identify the machines' performance. Total productive maintenance (TPM) is a maintenance program that involves concepts for maintaining plant and equipment effectively. OEE is a pow...
International Nuclear Information System (INIS)
William H. Miller; Dr. Jatinder Palta
2007-01-01
The objective of this research is to implement an inexpensive, quick and simple monitor that provides an accurate indication of proper patient position during the treatment of cancer by external beam X-ray radiation and also checks for any significant changes in patient anatomy. It is believed that this system will significantly reduce the treatment margin, provide an additional, independent quality assurance check of positioning accuracy prior to all treatments and reduce the probability of misadministration of therapeutic dose
International Nuclear Information System (INIS)
Fernández-Ahumada, E; Gómez, A; Vallesquino, P; Guerrero, J E; Pérez-Marín, D; Garrido-Varo, A; Fearn, T
2008-01-01
According to the current demands of the authorities, the manufacturers and the consumers, controls and assessments of the feed compound manufacturing process have become a key concern. Among others, it must be assured that a given compound feed is well manufactured and labelled in terms of ingredient composition. When near-infrared spectroscopy (NIRS) together with linear models was used for the prediction of ingredient composition, the results were not always acceptable. Therefore, the performance of nonlinear methods has been investigated. Artificial neural networks (ANN) and least squares support vector machines (LS-SVM) have been applied to a large (N = 20 320) and heterogeneous population of non-milled feed compounds for the NIR prediction of the inclusion percentage of wheat and sunflower meal, as representatives of two different classes of ingredients. Compared to partial least squares regression, results showed considerable reductions of standard error of prediction values for both methods and ingredients: reductions of 45% with ANN and 49% with LS-SVM for wheat, and reductions of 44% with ANN and 46% with LS-SVM for sunflower meal. These improvements, together with the ease with which NIRS technology can be implemented in the process, make it ideal for meeting the requirements of the animal feed industry.
Vélez-Díaz-Pallarés, Manuel; Delgado-Silveira, Eva; Carretero-Accame, María Emilia; Bermejo-Vicedo, Teresa
2013-01-01
To identify actions to reduce medication errors in the process of drug prescription, validation and dispensing, and to evaluate the impact of their implementation. A Health Care Failure Mode and Effect Analysis (HFMEA) was supported by a before-and-after medication error study to measure the actual impact on error rate after the implementation of corrective actions in the process of drug prescription, validation and dispensing in wards equipped with computerised physician order entry (CPOE) and unit-dose distribution system (788 beds out of 1080) in a Spanish university hospital. The error study was carried out by two observers who reviewed medication orders on a daily basis to register prescription errors by physicians and validation errors by pharmacists. Drugs dispensed in the unit-dose trolleys were reviewed for dispensing errors. Error rates were expressed as the number of errors for each process divided by the total opportunities for error in that process times 100. A reduction in prescription errors was achieved by providing training for prescribers on CPOE, updating prescription procedures, improving clinical decision support and automating the software connection to the hospital census (relative risk reduction (RRR), 22.0%; 95% CI 12.1% to 31.8%). Validation errors were reduced after optimising time spent in educating pharmacy residents on patient safety, developing standardised validation procedures and improving aspects of the software's database (RRR, 19.4%; 95% CI 2.3% to 36.5%). Two actions reduced dispensing errors: reorganising the process of filling trolleys and drawing up a protocol for drug pharmacy checking before delivery (RRR, 38.5%; 95% CI 14.1% to 62.9%). HFMEA facilitated the identification of actions aimed at reducing medication errors in a healthcare setting, as the implementation of several of these led to a reduction in errors in the process of drug prescription, validation and dispensing.
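The study's error-rate definition (errors for a process divided by the opportunities for error in that process, times 100) and the relative risk reduction it reports can be sketched as a minimal calculation; the counts below are hypothetical, not the study's data:

```python
def error_rate(errors, opportunities):
    """Errors per 100 opportunities for error, as defined in the study."""
    return 100.0 * errors / opportunities

def relative_risk_reduction(rate_before, rate_after):
    """Percentage reduction of the error rate after an intervention."""
    return 100.0 * (rate_before - rate_after) / rate_before

# Hypothetical prescription-error counts before and after corrective actions
before = error_rate(100, 1000)   # 10.0 errors per 100 orders
after = error_rate(78, 1000)     # 7.8 errors per 100 orders
rrr = relative_risk_reduction(before, after)
```

With these toy counts the relative risk reduction comes out near the 22% the study reports for prescription errors; the confidence intervals quoted above additionally require the error counts and a variance estimate.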
Merry, Alan F; Webster, Craig S; Hannam, Jacqueline; Mitchell, Simon J; Henderson, Robert; Reid, Papaarangi; Edwards, Kylie-Ellen; Jardim, Anisoara; Pak, Nick; Cooper, Jeremy; Hopley, Lara; Frampton, Chris; Short, Timothy G
2011-09-22
To clinically evaluate a new patented multimodal system (SAFERSleep) designed to reduce errors in the recording and administration of drugs in anaesthesia. Prospective randomised open label clinical trial. Five designated operating theatres in a major tertiary referral hospital. Eighty nine consenting anaesthetists managing 1075 cases in which there were 10,764 drug administrations. Use of the new system (which includes customised drug trays and purpose designed drug trolley drawers to promote a well organised anaesthetic workspace and aseptic technique; pre-filled syringes for commonly used anaesthetic drugs; large legible colour coded drug labels; a barcode reader linked to a computer, speakers, and touch screen to provide automatic auditory and visual verification of selected drugs immediately before each administration; automatic compilation of an anaesthetic record; an on-screen and audible warning if an antibiotic has not been administered within 15 minutes of the start of anaesthesia; and certain procedural rules, notably scanning the label before each drug administration) versus conventional practice in drug administration with a manually compiled anaesthetic record. Primary: composite of errors in the recording and administration of intravenous drugs detected by direct observation and by detailed reconciliation of the contents of used drug vials against recorded administrations; and lapses in responding to an intermittent visual stimulus (vigilance latency task). Secondary: outcomes in patients; analyses of anaesthetists' tasks and assessments of workload; evaluation of the legibility of anaesthetic records; evaluation of compliance with the procedural rules of the new system; and questionnaire based ratings of the respective systems by participants. The overall mean rate of drug errors per 100 administrations was 9.1 (95% confidence interval 6.9 to 11.4) with the new system (one in 11 administrations) and 11.6 (9.3 to 13.9) with conventional methods (one in 9 administrations).
Schoenberg, Mike R; Osborn, Katie E; Mahone, E Mark; Feigon, Maia; Roth, Robert M; Pliskin, Neil H
2017-11-08
Errors in communication are a leading cause of medical errors. A potential source of error in communicating neuropsychological results is confusion in the qualitative descriptors used to describe standardized neuropsychological data. This study sought to evaluate the extent to which medical consumers of neuropsychological assessments believed that results/findings were not clearly communicated. In addition, preference data for a variety of qualitative descriptors commonly used to communicate normative neuropsychological test scores were obtained. Preference data were obtained for five qualitative descriptor systems as part of a larger 36-item internet-based survey of physician satisfaction with neuropsychological services. A new qualitative descriptor system termed the Simplified Qualitative Classification System (Q-Simple) was proposed to reduce the potential for communication errors using seven terms: very superior, superior, high average, average, low average, borderline, and abnormal/impaired. A non-random convenience sample of 605 clinicians identified from four United States academic medical centers from January 1, 2015 through January 7, 2016 were invited to participate. A total of 182 surveys were completed. A minority of clinicians (12.5%) indicated that neuropsychological study results were not clearly communicated. When communicating neuropsychological standardized scores, the two most preferred qualitative descriptor systems were that of Heaton and colleagues (Comprehensive norms for an extended Halstead-Reitan battery: demographic corrections, research findings, and clinical applications. Odessa, TX: Psychological Assessment Resources) (26%) and the newly proposed Q-Simple system (22%). Initial findings highlight the need to improve and standardize communication of neuropsychological results. These data offer initial guidance for preferred terms to communicate test results and form a foundation for more…
Reducing Error Rates for Iris Image using higher Contrast in Normalization process
Aminu Ghali, Abdulrahman; Jamel, Sapiee; Abubakar Pindar, Zahraddeen; Hasssan Disina, Abdulkadir; Mat Daris, Mustafa
2017-08-01
Iris recognition is among the most secure and fastest means of identification and authentication. However, iris recognition systems suffer from blurring, low contrast and poor illumination due to low-quality images, which compromises the accuracy of the system. The acceptance or rejection rate of a verified user depends solely on the quality of the image. In many cases, an iris recognition system with low image contrast could falsely accept or reject a user. Therefore, this paper adopts a histogram equalization technique to address the problem of the False Rejection Rate (FRR) and False Acceptance Rate (FAR) by enhancing the contrast of the iris image. A histogram equalization technique enhances the image quality and neutralizes the low contrast of the image at the normalization stage. The experimental results show that the histogram equalization technique reduced FRR and FAR compared to the existing techniques.
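Histogram equalization, the contrast-enhancement step named above, maps grey levels through the normalised cumulative histogram so that a low-contrast image spreads over the full intensity range. The following is a generic sketch on a toy one-dimensional pixel strip, not the authors' implementation:

```python
def equalize_histogram(pixels, levels=256):
    """Map grey levels through the normalised cumulative histogram so
    that a low-contrast image uses the full [0, levels-1] range."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:                 # cumulative histogram
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(pixels)
    # Standard equalisation formula: rescale the CDF to the full grey range
    return [round((cdf[p] - cdf_min) * (levels - 1) / (n - cdf_min))
            for p in pixels]

# Hypothetical low-contrast strip: values crowded into 100..103
strip = [100, 100, 101, 101, 102, 102, 103, 103]
equalized = equalize_histogram(strip)
```

After equalization the four crowded grey levels are spread to 0, 85, 170 and 255, which is exactly the stretching of contrast that the normalization stage relies on.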
Directory of Open Access Journals (Sweden)
Ahmed Al Kuwaiti
2016-06-01
Medication errors affect patient safety and the quality of healthcare. The aim of this study is to analyze the effect of the Six Sigma (DMAIC) methodology in reducing medication errors in the outpatient pharmacy of King Fahd Hospital of the University, Saudi Arabia. It was conducted through the five phases of the Define, Measure, Analyze, Improve, Control (DMAIC) model using various quality tools. The goal was set as reducing medication errors in the outpatient pharmacy by 20%. After implementation of improvement strategies, there was a marked reduction of defects and also an improvement of their sigma ratings. In particular, parts per million (PPM) of prescription/data-entry errors were reduced from 56,000 to 5,000 and the sigma rating improved from 3.09 to 4.08. This study concluded that the Six Sigma (DMAIC) methodology is significant in reducing medication errors and ensuring patient safety.
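The reported PPM-to-sigma conversion is consistent with the conventional Six Sigma formula: the standard normal quantile of the process yield plus a 1.5-sigma allowance for long-term drift. A sketch, assuming that convention:

```python
from statistics import NormalDist

def sigma_rating(defects_ppm, shift=1.5):
    """Convert defects per million opportunities (DPMO) to a process
    sigma rating, using the conventional 1.5-sigma long-term shift."""
    yield_fraction = 1 - defects_ppm / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + shift

before = sigma_rating(56_000)   # prescription/data-entry errors before DMAIC
after = sigma_rating(5_000)     # after the improvement phase
```

With these inputs the formula reproduces the figures in the abstract: 56,000 PPM corresponds to a rating of about 3.09, and 5,000 PPM to about 4.08.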
Applying the Toyota Production System: using a patient safety alert system to reduce error.
Furman, Cathie; Caplan, Robert
2007-07-01
In 2002, Virginia Mason Medical Center (VMMC) adapted the Toyota Production System, also known as lean manufacturing. To translate the techniques of zero defects and stopping the line into health care, the Patient Safety Alert (PSA) system requires any employee who encounters a situation that is likely to harm a patient to make an immediate report and to cease any activity that could cause further harm (stopping the line). IMPLEMENTING THE PSA SYSTEM--STOPPING THE LINE: If any VMMC employee's practice or conduct is deemed capable of causing harm to a patient, a PSA can cause that person to be stopped from working until the problem is resolved. A policy statement, senior executive commitment, dedicated resources, a 24-hour hotline, and communication were all key features of implementation. As of December 2006, 6,112 PSA reports were received: 20% from managers, 8% from physicians, 44% from nurses, and 23% from nonclinical support personnel, for example. The number of reports received per month increased from an average of 3 in 2002 to 285 in 2006. Most reports were processed within 24 hours and were resolved within 2 to 3 weeks. Implementing the PSA system has drastically increased the number of safety concerns that are resolved at VMMC, while drastically reducing the time it takes to resolve them. Transparent discussion and feedback have helped promote staff acceptance and participation.
[A project to reduce the incidence of intubation care errors among foreign health aides].
Chen, Mei-Ju; Lu, Yu-Hua; Chen, Chiu-Chun; Li, Ai-Cheng
2014-08-01
Foreign health aides are the main providers of care for the elderly and the physically disabled in Taiwan. Correct care skills improve patient safety. In 2010, the incidence of mistakes among foreign health aides in our hospital unit was 58% for nasogastric tube care and 57% for tracheostomy tube care. A survey of foreign health aides and nurses in the unit identified the main causes of these mistakes as: communication difficulties, inaccurate instructions given to patients, and a lack of standard operating procedures given to the foreign health aides. This project was designed to reduce the rates of improper nasogastric tube care and improper tracheostomy tube care to 20% each. This project implemented several appropriate measures. We produced patient instruction hand-outs in Bahasa Indonesia, established a dedicated file holder for Bahasa Indonesia tube care reference information, produced Bahasa Indonesia tube-care-related posters, produced a short film about tube care in Bahasa Indonesia, and established a standardized operating procedure for tube care in our unit. Between December 15th and 31st, 2011, we audited the performance of a total of 32 foreign health aides for proper execution of nasogastric tube care (21 aides) and of tracheostomy tube care (11 aides). Patients with concurrent nasogastric and tracheostomy tubes were inspected separately for each care group. The incidence of improper care decreased from 58% to 18% for nasogastric intubation and from 57% to 18% for tracheostomy intubation. This project significantly decreased the incidence of improper tube care by the foreign health aides in our unit. Furthermore, the foreign health aides improved their tube nursing care skills. Therefore, this project improved the quality of patient care.
Salmerón, Diego; Cano, Juan A; Chirlaque, María D
2015-08-30
In cohort studies, binary outcomes are very often analyzed by logistic regression. However, it is well known that when the goal is to estimate a risk ratio, logistic regression is inappropriate if the outcome is common. In these cases, a log-binomial regression model is preferable. On the other hand, the estimation of the regression coefficients of the log-binomial model is difficult owing to the constraints that must be imposed on these coefficients. Bayesian methods allow a straightforward approach for log-binomial regression models and produce smaller mean squared errors in the estimation of risk ratios than the frequentist methods, and the posterior inferences can be obtained using the software WinBUGS. However, Markov chain Monte Carlo methods implemented in WinBUGS can lead to large Monte Carlo errors in the approximations to the posterior inferences because they produce correlated simulations, and the accuracy of the approximations is inversely related to this correlation. To reduce correlation and to improve accuracy, we propose a reparameterization based on a Poisson model and a sampling algorithm coded in R. Copyright © 2015 John Wiley & Sons, Ltd.
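The motivation for preferring a log-binomial (or Poisson-based) model over logistic regression for common outcomes can be illustrated with a toy 2x2 cohort: the odds ratio, which is what logistic regression estimates, overstates the risk ratio when the outcome is frequent. This is a background illustration with hypothetical counts, not the authors' MCMC reparameterization:

```python
def risk_ratio(exposed_events, exposed_n, unexposed_events, unexposed_n):
    """Risk ratio: ratio of the outcome probabilities in the two groups."""
    return (exposed_events / exposed_n) / (unexposed_events / unexposed_n)

def odds_ratio(exposed_events, exposed_n, unexposed_events, unexposed_n):
    """Odds ratio from the same 2x2 table (what logistic regression targets)."""
    a, b = exposed_events, exposed_n - exposed_events
    c, d = unexposed_events, unexposed_n - unexposed_events
    return (a * d) / (b * c)

# Hypothetical cohort with a common outcome: 60% vs 40% event probability
rr = risk_ratio(60, 100, 40, 100)
orat = odds_ratio(60, 100, 40, 100)
```

Here the risk ratio is 1.5 while the odds ratio is 2.25; only for rare outcomes do the two approximately coincide, which is why the log-binomial (log link on the probability itself) is preferred in this setting.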
Interruption Practice Reduces Errors
2014-01-01
…miscalculations (Koppel et al., 2005). There are cases where the user (medical staff, MD, nurse, etc.) forgets to complete the PCS, which is to log off or… Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 58, pp. 265-269, 2014.
Energy Technology Data Exchange (ETDEWEB)
Jeong, Kwan Seong; Moon, Jei Kwon; Choi, Byung Seon; Hyun, Dong jun; Lee, Jong Hwan; Kim, Ik June; Kang, Shin Young [KAERI, Daejeon (Korea, Republic of)
2016-05-15
Decommissioning of nuclear facilities must be accomplished while assuring the safety of workers. Therefore, before decommissioning, the exposure dose to workers has to be analyzed and assessed under the principle of ALARA (as low as reasonably achievable). Furthermore, methods and systems need to be developed to improve workers' proficiency in decommissioning environments. To establish a plan for worker exposure dose before decommissioning activities begin, an assessment system is necessary. Such a system has been successfully developed, so that the exposure dose to workers can be measured and assessed in real time in virtual decommissioning environments. This system helps protect against accidents and enables workers to become familiar with their working environments. It is expected to reduce human errors, because virtual training that resembles real decommissioning situations improves workers' proficiency in hazardous working environments.
International Nuclear Information System (INIS)
He Wei; Liu Jianyu; Li Xuan; Li Jianying; Liao Jingmin
2009-01-01
Objective: To evaluate the effect of a breath-motion-correction (BMC) technique in reducing measurement error of the time-density curve (TDC) in hepatic CT perfusion imaging. Methods: Twenty-five patients with suspected liver diseases underwent hepatic CT perfusion scans. The right branch of the portal vein was selected as the anatomy of interest, and BMC was performed to realign image slices for the TDC according to the rule of minimizing the temporal changes of overall structures. Ten ROIs were selected on the right branch of the portal vein to generate 10 TDCs each with and without BMC. The values of peak enhancement and time-to-peak enhancement for each TDC were measured. The coefficients of variation (CV) of peak enhancement and time-to-peak enhancement were calculated for each patient with and without BMC. The Wilcoxon signed ranks test was used to evaluate the difference between the CVs of the two parameters obtained with and without BMC. The independent-samples t test was used to evaluate the difference between the values of peak enhancement obtained with and without BMC. Results: The median (quartiles) of the CV of peak enhancement with BMC [2.84% (2.10%, 4.57%)] was significantly lower than that without BMC [5.19% (3.90%, 7.27%)] (Z=-3.108, P<0.01). The median (quartiles) of the CV of time-to-peak enhancement with BMC [2.64% (0.76%, 4.41%)] was significantly lower than that without BMC [5.23% (3.81%, 7.43%)] (Z=-3.924, P<0.01). In 8 cases, the TDC demonstrated statistically significantly higher peak enhancement with BMC (P<0.05). Conclusion: By applying the BMC technique we can effectively reduce measurement error for parameters of the TDC in hepatic CT perfusion imaging. (authors)
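The coefficient of variation used as the error measure above is simply the standard deviation divided by the mean, expressed as a percentage, computed over the ten ROI measurements. A minimal sketch with hypothetical peak-enhancement values:

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """CV (%) = SD / mean * 100: the spread of a TDC parameter
    across ROIs, relative to its average."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical peak-enhancement values (HU) for ten ROIs after BMC;
# realignment should make these cluster tightly, giving a small CV
peaks_bmc = [102, 99, 101, 100, 98, 103, 100, 99, 101, 97]
cv = coefficient_of_variation(peaks_bmc)
```

A lower CV across the ten ROIs with BMC than without it is precisely what the Wilcoxon comparison in the study quantifies.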
DEFF Research Database (Denmark)
Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan
2017-01-01
Background Medication errors have received extensive attention in recent decades and are of significant concern to healthcare organisations globally. Medication errors occur frequently, and adverse events associated with medications are one of the largest causes of harm to hospitalised patients...... cabinet, automated dispensing and barcode medication administration; (2) non-patient specific automated dispensing and barcode medication administration. The occurrence of administration errors was observed in three 3 week periods. The error rates were calculated by dividing the number of doses with one...
Cook, Nancy R; Rosner, Bernard A; Chen, Wei; Srinivasan, Sathanur R; Berenson, Gerald S
2004-11-30
Tracking correlations of blood pressure, particularly childhood measures, may be attenuated by within-person variability. Combining multiple measurements can reduce this error substantially. The area under the curve (AUC) computed from longitudinal growth curve models can be used to improve the prediction of young adult blood pressure from childhood measures. Quadratic random-effects models over unequally spaced repeated measures were used to compute the area under the curve separately within the age periods 5-14 and 20-34 years in the Bogalusa Heart Study. This method adjusts for the uneven age distribution and captures the underlying or average blood pressure, leading to improved estimates of correlation and risk prediction. Tracking correlations were computed by race and gender, and were approximately 0.6 for systolic, 0.5-0.6 for K4 diastolic, and 0.4-0.6 for K5 diastolic blood pressure. The AUC can also be used to regress young adult blood pressure on childhood blood pressure and childhood and young adult body mass index (BMI). In these data, while childhood blood pressure and young adult BMI were generally directly predictive of young adult blood pressure, childhood BMI was negatively correlated with young adult blood pressure when childhood blood pressure was in the model. In addition, racial differences in young adult blood pressure were reduced, but not eliminated, after controlling for childhood blood pressure, childhood BMI, and young adult BMI, suggesting that other genetic or lifestyle factors contribute to this difference. 2004 John Wiley & Sons, Ltd.
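As an illustration of the AUC idea, the area under a fitted quadratic growth curve bp(t) = a + b*t + c*t^2 over an age window has a closed form, and dividing by the window length gives the subject's "underlying" average blood pressure. The coefficients below are hypothetical, not from the Bogalusa data:

```python
def quadratic_auc(a, b, c, t0, t1):
    """Mean level over [t0, t1] of bp(t) = a + b*t + c*t**2,
    i.e. the area under the curve divided by the window length."""
    area = a * (t1 - t0) + b * (t1**2 - t0**2) / 2 + c * (t1**3 - t0**3) / 3
    return area / (t1 - t0)

# Hypothetical subject-level coefficients from a random-effects fit,
# averaged over the childhood window (ages 5-14)
print(round(quadratic_auc(90.0, 1.5, -0.02, 5, 14), 2))  # → 102.31
```

Averaging over the window rather than using any single visit is what attenuates within-person measurement error in the tracking correlations.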
Pitkänen, T. P.; Käyhkö, N.
2017-08-01
Mapping structural changes in vegetation dynamics has long been carried out using satellite images, orthophotos and, more recently, airborne lidar acquisitions. Lidar has established its position as providing accurate material for structure-based analyses; however, its limited availability, relatively short history, and lack of spectral information generally impede the use of lidar data for change detection purposes. A potential solution for detecting both contemporary vegetation structures and their previous trajectories is to combine lidar acquisitions with optical remote sensing data, which can substantially extend the coverage, span and spectral range needed for vegetation mapping. In this study, we tested the simultaneous use of a single low-density lidar data set, a series of Landsat satellite frames and two high-resolution orthophotos to detect vegetation succession related to grassland overgrowth, i.e. encroachment of woody plants into semi-natural grasslands. We built several alternative Random Forest models with different sets of variables and tested the applicability of the respective data sources for change detection purposes, aiming to distinguish unchanged grassland and woodland areas from overgrown grasslands. Our results show that while lidar alone provides a solid basis for indicating structural differences between grassland and woodland vegetation, and orthophoto-generated variables alone are better at detecting successional changes, their combination works considerably better than either part alone. More specifically, a model combining all the data sets used reduces the total error from 17.0% to 11.0% and the omission error of detecting overgrown grasslands from 56.9% to 31.2%, compared to a model constructed solely on lidar data. This pinpoints the efficiency of the approach, in which lidar-generated structural metrics are combined with optical and multitemporal observations, providing a workable framework to
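The reported total and omission errors follow directly from a confusion matrix of true versus predicted classes. The counts below are hypothetical, chosen only to show the mechanics:

```python
def error_rates(confusion):
    """Total error and per-class omission error from a confusion matrix.

    confusion[true_class][predicted_class] holds counts."""
    classes = list(confusion)
    total = sum(sum(row.values()) for row in confusion.values())
    correct = sum(confusion[c][c] for c in classes)
    total_error = 1.0 - correct / total
    # omission error: fraction of a true class not mapped to itself
    omission = {c: 1.0 - confusion[c][c] / sum(confusion[c].values())
                for c in classes}
    return total_error, omission

# Hypothetical counts for the three target classes
conf = {
    "grassland": {"grassland": 80, "woodland": 5,  "overgrown": 15},
    "woodland":  {"grassland": 3,  "woodland": 90, "overgrown": 7},
    "overgrown": {"grassland": 20, "woodland": 11, "overgrown": 69},
}
total_err, omission = error_rates(conf)
print(round(total_err, 3), round(omission["overgrown"], 3))  # → 0.203 0.31
```

The study's headline numbers (total error 17.0% → 11.0%, omission error for overgrown grasslands 56.9% → 31.2%) are exactly these two quantities evaluated for two competing models.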
Yin, X X; Ng, B W-H; Ramamohanarao, K; Baghai-Wadji, A; Abbott, D
2012-09-01
It has been shown that magnetic resonance images (MRIs) with a sparse representation in a transformed domain, e.g. spatial finite differences (FD) or the discrete cosine transform (DCT), can be restored from undersampled k-space by applying current compressive sampling theory. This paper presents a model-based method for the restoration of MRIs. The reduced-order model, in which the full system response is projected onto a subspace of lower dimensionality, has been used to accelerate image reconstruction by reducing the size of the involved linear system. In this paper, the singular value threshold (SVT) technique is applied as a denoising scheme to reduce and select the model order of the inverse Fourier transform image, and to restore multi-slice breast MRIs that have been compressively sampled in k-space. The restored MRIs with SVT denoising show reduced sampling errors compared to direct MRI restoration via spatial FD or DCT. Compressive sampling is a technique for finding sparse solutions to underdetermined linear systems. The sparsity implicit in MRIs makes it possible to recover the image from significantly undersampled k-space after transformation. The challenge, however, is that random undersampling introduces incoherent artifacts, adding noise-like interference to the sparse representation of the image. The recovery algorithms in the literature are not capable of fully removing these artifacts, so a denoising procedure is needed to improve the quality of image recovery. This paper applies a singular value threshold algorithm to reduce the model order of the image basis functions, which allows further improvement of the quality of image reconstruction with removal of noise artifacts. The principle of the denoising scheme is to reconstruct the sparse MRI matrices optimally with a lower rank by selecting a smaller number of dominant singular values. The singular value threshold algorithm is performed
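A minimal sketch of the core idea — keeping only dominant singular values — is the rank-1 special case, computed here with plain power iteration rather than a full SVD library. The matrix and noise values are invented for illustration and stand in for an image matrix:

```python
def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def rank1_approx(A, iters=100):
    """Best rank-1 approximation of A via power iteration on A^T A.

    Keeping only the dominant singular component is the rank-1 special
    case of truncating to a small number of dominant singular values."""
    m, n = len(A), len(A[0])
    v = [1.0] * n
    for _ in range(iters):
        w = matvec(A, v)                                   # w = A v
        v = [sum(A[i][j] * w[i] for i in range(m)) for j in range(n)]  # v = A^T w
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]                          # normalised right vector
    w = matvec(A, v)
    sigma = sum(x * x for x in w) ** 0.5                   # dominant singular value
    u = [x / sigma for x in w]
    return [[sigma * u[i] * v[j] for j in range(n)] for i in range(m)]

# A clean rank-1 "image" plus deterministic noise-like interference
clean = [[r * c for c in (4.0, 5.0, 6.0)] for r in (1.0, 2.0, 3.0)]
noise = [[0.1, -0.2, 0.15], [-0.1, 0.2, -0.05], [0.05, -0.15, 0.1]]
noisy = [[clean[i][j] + noise[i][j] for j in range(3)] for i in range(3)]

denoised = rank1_approx(noisy)
err = lambda X: sum((X[i][j] - clean[i][j]) ** 2 for i in range(3) for j in range(3))
print(round(err(noisy), 3), round(err(denoised), 3))
```

By the Eckart-Young theorem the rank-1 reconstruction is the closest rank-1 matrix to the noisy input, so the noise components orthogonal to the dominant singular subspace are discarded; the SVT scheme in the paper generalizes this to a data-driven choice of rank.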
Iversen, Birger
1992-01-01
Although it arose from purely theoretical considerations of the underlying axioms of geometry, the work of Einstein and Dirac has demonstrated that hyperbolic geometry is a fundamental aspect of modern physics
Energy Technology Data Exchange (ETDEWEB)
Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul [Korea Atomic Research Institute, Daejeon (Korea, Republic of)
2016-10-15
In this study, we identify emerging types of team errors, especially in digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Even though individual errors can be detected and recovered by qualified others and/or a well-trained team, it is rather seldom that errors made by the team itself are easily detected and properly recovered by that team. Note that a team is defined as two or more people who interact appropriately with each other; the team is a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We incorporate crew resource management as a representative approach to dealing with the team factors of human errors. We suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team error for conditions in which a computer-based procedure system is used in a digitalized main control room; the computer-based procedure system is a representative feature of a digitalized control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors, and we are carrying out an effectiveness test to validate whether the new interface can reduce team errors while operating with a computer-based procedure system in a digitalized control room.
van den Broek, P.M.
1984-01-01
The aim of this paper is to give a detailed exposition of the relation between the geometry of twistor space and the geometry of Minkowski space. The paper has a didactical purpose; no use has been made of differential geometry and cohomology.
Mira, José Joaquín; Carrillo, Irene; Guilabert, Mercedes; Lorenzo, Susana; Pérez-Pérez, Pastora; Silvestre, Carmen; Ferrús, Lena
2017-06-08
Adverse events (incidents that harm a patient) can also produce emotional hardship for the professionals involved (second victims). Although a few international pioneering programs exist that aim to facilitate the recovery of the second victim, there are no known initiatives that aim to raise awareness in the professional community about this issue and prevent the situation from worsening. The aim of this study was to design and evaluate an online program directed at frontline hospital and primary care health professionals that raises awareness and provides information about the second victim phenomenon. The design of the Mitigating Impact in Second Victims (MISE) online program was based on a literature review, and its contents were selected by a group of 15 experts on patient safety with experience in both clinical and academic settings. The website hosting MISE was subjected to an accreditation process by an external quality agency that specializes in evaluating health websites. The MISE structure and content were evaluated by 26 patient safety managers at hospitals and within primary care, in addition to 266 frontline health care professionals who followed the program, taking into account its comprehensibility, the usefulness of the information, and its general adequacy. Finally, the amount of knowledge gained from the program was assessed with three objective measures (pre- and posttest design). The website earned Advanced Accreditation for health websites after fulfilling required standards. The comprehensibility and practical value of the MISE content were positively assessed by 88% (23/26) and 92% (24/26) of patient safety managers, respectively. MISE was positively evaluated by health care professionals, who awarded it 8.8 points out of a maximum of 10. Users who finished MISE improved their knowledge on patient safety terminology, prevalence and impact of adverse events and clinical errors, second victim support models, and recommended actions following a severe adverse
International Nuclear Information System (INIS)
Unkelbach, Jan; Bortfeld, Thomas; Martin, Benjamin C.; Soukup, Martin
2009-01-01
Treatment plans optimized for intensity modulated proton therapy (IMPT) may be very sensitive to setup errors and range uncertainties. If these errors are not accounted for during treatment planning, the dose distribution realized in the patient may be strongly degraded compared to the planned dose distribution. The authors implemented the probabilistic approach to incorporate uncertainties directly into the optimization of an intensity modulated treatment plan. Following this approach, the dose distribution depends on a set of random variables which parameterize the uncertainty, as does the objective function used to optimize the treatment plan. The authors optimize the expected value of the objective function. They investigate IMPT treatment planning regarding range uncertainties and setup errors. They demonstrate that incorporating these uncertainties into the optimization yields qualitatively different treatment plans compared to conventional plans which do not account for uncertainty. The sensitivity of an IMPT plan depends on the dose contributions of individual beam directions. Roughly speaking, steep dose gradients in the beam direction make treatment plans sensitive to range errors, while steep lateral dose gradients make plans sensitive to setup errors. More robust treatment plans are obtained by redistributing dose among different beam directions, which can be achieved by the probabilistic approach. In contrast, the safety margin approach as widely applied in photon therapy fails in IMPT and is suitable for handling neither range variations nor setup errors.
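The probabilistic approach can be sketched as minimizing the scenario-averaged objective. The toy dose-influence matrices below are invented for illustration and stand in for a real IMPT dose calculation; the "range-shifted" scenario rebalances where beam 1 deposits dose:

```python
def expected_objective(x, scenarios, presc):
    """Expected squared deviation from the prescription over error scenarios."""
    val = 0.0
    for prob, D in scenarios:
        for i, p in enumerate(presc):
            dose = sum(D[i][j] * x[j] for j in range(len(x)))
            val += prob * (dose - p) ** 2
    return val

def optimise(scenarios, presc, steps=5000, lr=0.05):
    """Gradient descent on the expected objective (probabilistic planning)."""
    n = len(scenarios[0][1][0])
    x = [0.0] * n
    for _ in range(steps):
        grad = [0.0] * n
        for prob, D in scenarios:
            for i, p in enumerate(presc):
                dose = sum(D[i][j] * x[j] for j in range(n))
                for j in range(n):
                    grad[j] += 2 * prob * (dose - p) * D[i][j]
        x = [xj - lr * g for xj, g in zip(x, grad)]
    return x

# Two pencil beams, two voxels; each scenario is (probability, dose matrix)
scenarios = [
    (0.5, [[1.0, 0.0], [0.0, 1.0]]),   # nominal dose-influence matrix
    (0.5, [[0.6, 0.0], [0.4, 1.0]]),   # range-shifted scenario
]
presc = [1.0, 1.0]

x_robust = optimise(scenarios, presc)
print(round(expected_objective(x_robust, scenarios, presc), 4))
print(round(expected_objective([1.0, 1.0], scenarios, presc), 4))  # nominal-only plan
```

The robust weights trade a small degradation in the nominal scenario for a lower expected objective, which is the qualitative redistribution effect the abstract describes.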
Rodger, Alison
1995-01-01
Molecular Geometry discusses topics relevant to the arrangement of atoms. The book is comprised of seven chapters that tackle several areas of molecular geometry. Chapter 1 reviews the definition and determination of molecular geometry, while Chapter 2 discusses the unified view of stereochemistry and stereochemical changes. Chapter 3 covers the geometry of molecules of second row atoms, and Chapter 4 deals with the main group elements beyond the second row. The book also talks about the complexes of transition metals and f-block elements, and then covers the organometallic compounds and trans
van de Plas, Afke; Slikkerveer, Mariëlle; Hoen, Saskia; Schrijnemakers, Rick; Driessen, Johanna; de Vries, Frank; van den Bemt, Patricia
2017-01-01
In this controlled before-after study the effect of improvements, derived from Lean Six Sigma strategy, on parenteral medication administration errors and the potential risk of harm was determined. During baseline measurement, on control versus intervention ward, at least one administration error occurred in 14 (74%) and 6 (46%) administrations with potential risk of harm in 6 (32%) and 1 (8%) administrations. Most administration errors with high potential risk of harm occurred in bolus injections: 8 (57%) versus 2 (67%) bolus injections were injected too fast with a potential risk of harm in 6 (43%) and 1 (33%) bolus injections on control and intervention ward. Implemented improvement strategies, based on major causes of too fast administration of bolus injections, were: Substitution of bolus injections by infusions, education, availability of administration information and drug round tabards. Post intervention, on the control ward in 76 (76%) administrations at least one error was made (RR 1.03; CI95:0.77-1.38), with a potential risk of harm in 14 (14%) administrations (RR 0.45; CI95:0.20-1.02). In 40 (68%) administrations on the intervention ward at least one error occurred (RR 1.47; CI95:0.80-2.71) but no administrations were associated with a potential risk of harm. A shift in wrong duration administration errors from bolus injections to infusions, with a reduction of potential risk of harm, seems to have occurred on the intervention ward. Although data are insufficient to prove an effect, Lean Six Sigma was experienced as a suitable strategy to select tailored improvements. Further studies are required to prove the effect of the strategy on parenteral medication administration errors.
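The reported RR of 1.03 (CI95: 0.77-1.38) on the control ward can be reproduced with the standard log-normal approximation for a risk ratio, assuming the baseline denominator implied by "14 (74%)" is 19 administrations:

```python
from math import exp, log, sqrt

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio (a/n1)/(b/n2) with a 95% CI from the log-normal approximation."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1/a - 1/n1 + 1/b - 1/n2)   # SE of log(RR)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Control ward: 76/100 administrations with >=1 error post intervention,
# versus 14/19 at baseline (19 inferred from "14 (74%)")
rr, lo_ci, hi_ci = risk_ratio_ci(76, 100, 14, 19)
print(round(rr, 2), round(lo_ci, 2), round(hi_ci, 2))  # → 1.03 0.77 1.38
```

The same function with the intervention-ward counts reproduces the RR 1.47 (CI95: 0.80-2.71) figure, supporting the inferred denominators.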
International Nuclear Information System (INIS)
Robinson, I.; Trautman, A.
1988-01-01
The geometry of classical physics is Lorentzian; but weaker geometries are often more appropriate: null geodesics and electromagnetic fields, for example, are well known to be objects of conformal geometry. To deal with a single null congruence, or with the radiative electromagnetic fields associated with it, even less is needed: flag geometry for the first, optical geometry, with which this paper is chiefly concerned, for the second. The authors establish a natural one-to-one correspondence between optical geometries, considered locally, and three-dimensional Cauchy-Riemann structures. A number of Lorentzian geometries are shown to be equivalent from the optical point of view. For example the Goedel universe, the Taub-NUT metric and Hauser's twisting null solution have an optical geometry isomorphic to the one underlying the Robinson congruence in Minkowski space. The authors present general results on the problem of lifting a CR structure to a Lorentz manifold and, in particular, to Minkowski space; and exhibit the relevance of the deviation form to this problem
Alastruey, Jordi; Siggers, Jennifer H.; Peiffer, Véronique; Doorly, Denis J.; Sherwin, Spencer J.
2012-03-01
Three-dimensional simulations of blood flow usually produce such large quantities of data that they are unlikely to be of clinical use unless methods are available to simplify our understanding of the flow dynamics. We present a new method to investigate the mechanisms by which vascular curvature and torsion affect blood flow, and we apply it to the steady-state flow in single bends, helices, double bends, and a rabbit thoracic aorta based on image data. By calculating forces and accelerations in an orthogonal coordinate system following the centreline of each vessel, we obtain the inertial forces (centrifugal, Coriolis, and torsional) explicitly, which directly depend on vascular curvature and torsion. We then analyse the individual roles of the inertial, pressure gradient, and viscous forces on the patterns of primary and secondary velocities, vortical structures, and wall stresses in each cross section. We also consider cross-sectional averages of the in-plane components of these forces, which can be thought of as reducing the dynamics of secondary flows onto the vessel centreline. At Reynolds numbers between 50 and 500, secondary motions in the directions of the local normals and binormals behave as two underdamped oscillators. These oscillate around the fully developed state and are coupled by torsional forces that break the symmetry of the flow. Secondary flows are driven by the centrifugal and torsional forces, and these are counterbalanced by the in-plane pressure gradients generated by the wall reaction. The viscous force primarily opposes the pressure gradient, rather than the inertial forces. In the axial direction, and depending on the secondary motion, the curvature-dependent Coriolis force can either enhance or oppose the bulk of the axial flow, and this shapes the velocity profile. For bends with little or no torsion, the Coriolis force tends to restore flow axisymmetry. The maximum circumferential and axial wall shear stresses along the centreline
Borgia, Andrea; Spera, Frank J.
1990-01-01
This work discusses the propagation of errors for the recovery of the shear rate from wide-gap concentric cylinder viscometric measurements of non-Newtonian fluids. A least-squares regression of stress on angular velocity data to a system of arbitrary functions is used to propagate the errors for the series solution to the viscometric flow developed by Krieger and Elrod (1953) and Pawlowski (1953) ('power-law' approximation) and for the first term of the series developed by Krieger (1968). A numerical experiment shows that, for measurements affected by significant errors, the first term of the Krieger-Elrod-Pawlowski series ('infinite radius' approximation) and the power-law approximation may recover the shear rate with accuracy equal to that of the full Krieger-Elrod-Pawlowski solution. An experiment on a clay slurry indicates that the clay has a larger yield stress at rest than during shearing, and that, for the range of shear rates investigated, a four-parameter constitutive equation approximates its rheology reasonably well. The error analysis presented is useful for studying the rheology of fluids such as particle suspensions, slurries, foams, and magma.
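For the 'power-law' approximation mentioned above, the inner-cylinder shear rate has a standard closed form in terms of the radius ratio and the torque-speed slope n. The sketch below propagates a hypothetical 2% angular-velocity error by Monte Carlo; all parameter values are illustrative, not from the paper's data:

```python
from math import sqrt
from random import gauss, seed

def shear_rate_power_law(omega, n, kappa):
    """Inner-cylinder shear rate for a power-law fluid in a wide-gap
    concentric-cylinder viscometer: 2*omega / (n * (1 - kappa**(2/n))),
    with kappa = R_inner / R_outer and n = d ln(torque) / d ln(omega)."""
    return 2.0 * omega / (n * (1.0 - kappa ** (2.0 / n)))

# Monte Carlo propagation of a hypothetical 2% error in angular velocity
seed(0)
omega, n, kappa = 10.0, 0.8, 0.5
samples = [shear_rate_power_law(gauss(omega, 0.02 * omega), n, kappa)
           for _ in range(20000)]
m = sum(samples) / len(samples)
sd = sqrt(sum((s - m) ** 2 for s in samples) / (len(samples) - 1))
print(round(m, 2), round(sd, 2))
```

Because the formula is linear in omega, the relative error passes through unchanged here; for errors in n (estimated by regression, as in the paper) the propagation is nonlinear and Monte Carlo becomes genuinely useful.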
Weaver, Sallie J.; Newman-Toker, David E.; Rosen, Michael A.
2012-01-01
Missed, delayed, or wrong diagnoses can have a severe impact on patients, providers, and the entire health care system. One mechanism implicated in such diagnostic errors is the deterioration of cognitive diagnostic skills that are used rarely or not at all over a prolonged period of time. Existing evidence regarding maintenance of effective…
Smith, Gennifer T.; Dwork, Nicholas; Khan, Saara A.; Millet, Matthew; Magar, Kiran; Javanmard, Mehdi; Bowden, Audrey K.
2017-03-01
Urinalysis dipsticks were designed to revolutionize urine-based medical diagnosis. They are cheap, extremely portable, and have multiple assays patterned on a single platform. They were also meant to be incredibly easy to use. Unfortunately, there are many aspects in both the preparation and the analysis of the dipsticks that are plagued by user error. This high error is one reason that dipsticks have failed to flourish in both the at-home market and in low-resource settings. Sources of error include: inaccurate volume deposition, varying lighting conditions, inconsistent timing measurements, and misinterpreted color comparisons. We introduce a novel manifold and companion software for dipstick urinalysis that eliminates the aforementioned error sources. A micro-volume slipping manifold ensures precise sample delivery, an opaque acrylic box guarantees consistent lighting conditions, a simple sticker-based timing mechanism maintains accurate timing, and custom software that processes video data captured by a mobile phone ensures proper color comparisons. We show that the results obtained with the proposed device are as accurate and consistent as a properly executed dip-and-wipe method, the industry gold-standard, suggesting the potential for this strategy to enable confident urinalysis testing. Furthermore, the proposed all-acrylic slipping manifold is reusable and low in cost, making it a potential solution for at-home users and low-resource settings.
Pottmann, Helmut; Eigensatz, Michael; Vaxman, Amir; Wallner, Johannes
2014-01-01
Around 2005 it became apparent in the geometry processing community that freeform architecture contains many problems of a geometric nature to be solved, and many opportunities for optimization which however require geometric understanding. This area of research, which has been called architectural geometry, meanwhile contains a great wealth of individual contributions which are relevant in various fields. For mathematicians, the relation to discrete differential geometry is significant, in particular the integrable system viewpoint. Besides, new application contexts have become available for quite some old-established concepts. Regarding graphics and geometry processing, architectural geometry yields interesting new questions but also new objects, e.g. replacing meshes by other combinatorial arrangements. Numerical optimization plays a major role but in itself would be powerless without geometric understanding. Summing up, architectural geometry has become a rewarding field of study. We here survey the main directions which have been pursued, we show real projects where geometric considerations have played a role, and we outline open problems which we think are significant for the future development of both theory and practice of architectural geometry.
Maor, Eli
2014-01-01
If you've ever thought that mathematics and art don't mix, this stunning visual history of geometry will change your mind. As much a work of art as a book about mathematics, Beautiful Geometry presents more than sixty exquisite color plates illustrating a wide range of geometric patterns and theorems, accompanied by brief accounts of the fascinating history and people behind each. With artwork by Swiss artist Eugen Jost and text by acclaimed math historian Eli Maor, this unique celebration of geometry covers numerous subjects, from straightedge-and-compass constructions to intriguing configur
Energy Technology Data Exchange (ETDEWEB)
Grotz, Andreas
2011-10-07
In this thesis, a formulation of a Lorentzian quantum geometry based on the framework of causal fermion systems is proposed. After giving the general definition of causal fermion systems, we deduce space-time as a topological space with an underlying causal structure. Restricting attention to systems of spin dimension two, we derive the objects of our quantum geometry: the spin space, the tangent space endowed with a Lorentzian metric, connection and curvature. In order to get the correspondence to classical differential geometry, we construct examples of causal fermion systems by regularizing Dirac sea configurations in Minkowski space and on a globally hyperbolic Lorentzian manifold. When removing the regularization, the objects of our quantum geometry reduce to the common objects of spin geometry on Lorentzian manifolds, up to higher order curvature corrections.
Introduction to combinatorial geometry
International Nuclear Information System (INIS)
Gabriel, T.A.; Emmett, M.B.
1985-01-01
The combinatorial geometry package as used in many three-dimensional multimedia Monte Carlo radiation transport codes, such as HETC, MORSE, and EGS, is becoming the preferred way to describe simple and complicated systems. Just about any system can be modeled using the package with relatively few input statements. This can be contrasted with the older style geometry packages, in which the number of required input statements could be large even for relatively simple systems. However, with advancements come some difficulties. Users of combinatorial geometry must be able to visualize more of the system at a time, and in some instances all of it. Errors can be introduced into the modeling which, though slight and at times hard to detect, can have devastating effects on the calculated results. As with all modeling packages, the best way to learn combinatorial geometry is to use it, first on a simple system and then on more complicated systems. The basic technique for the description of the geometry consists of defining the location and shape of the various zones in terms of the intersections and unions of geometric bodies. The geometric bodies generally included in most combinatorial geometry packages are: (1) box, (2) right parallelepiped, (3) sphere, (4) right circular cylinder, (5) right elliptic cylinder, (6) ellipsoid, (7) truncated right cone, (8) right angle wedge, and (9) arbitrary polyhedron. The data necessary to describe each of these bodies are given. As can be easily noted, there are some subsets included for simplicity.
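The zone-definition technique described above (zones as intersections and unions of bodies) can be sketched with a point-membership test. This is an illustrative toy, not the input syntax of HETC, MORSE, or EGS:

```python
class Body:
    """A body is anything with a point-membership test."""
    def __init__(self, inside):
        self.inside = inside
    def __and__(self, other):   # intersection of two bodies
        return Body(lambda p: self.inside(p) and other.inside(p))
    def __or__(self, other):    # union of two bodies
        return Body(lambda p: self.inside(p) or other.inside(p))
    def __invert__(self):       # complement (everything outside the body)
        return Body(lambda p: not self.inside(p))

def sphere(cx, cy, cz, r):
    return Body(lambda p: (p[0]-cx)**2 + (p[1]-cy)**2 + (p[2]-cz)**2 <= r*r)

def box(x0, x1, y0, y1, z0, z1):
    return Body(lambda p: x0 <= p[0] <= x1 and y0 <= p[1] <= y1 and z0 <= p[2] <= z1)

# Zone: a box with a spherical hole, written as box ∩ (not sphere)
zone = box(0, 4, 0, 4, 0, 4) & ~sphere(2, 2, 2, 1)
print(zone.inside((0.5, 0.5, 0.5)))  # True
print(zone.inside((2, 2, 2)))        # False
```

Monte Carlo transport codes use exactly this kind of zone query (plus ray-surface intersections) to decide which material a particle is in, which is why a slight error in a zone definition can silently corrupt the results.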
Kemnitz, Arnfried
The basic idea of analytic geometry is that geometric investigations are carried out by computational means. Geometric objects are described by equations and studied with algebraic methods.
Lefschetz, Solomon
2005-01-01
An introduction to algebraic geometry and a bridge between its analytical-topological and algebraical aspects, this text for advanced undergraduate students is particularly relevant to those more familiar with analysis than algebra. 1953 edition.
Ay, Nihat; Lê, Hông Vân; Schwachhöfer, Lorenz
2017-01-01
The book provides a comprehensive introduction and a novel mathematical foundation of the field of information geometry with complete proofs and detailed background material on measure theory, Riemannian geometry and Banach space theory. Parametrised measure models are defined as fundamental geometric objects, which can be both finite or infinite dimensional. Based on these models, canonical tensor fields are introduced and further studied, including the Fisher metric and the Amari-Chentsov tensor, and embeddings of statistical manifolds are investigated. This novel foundation then leads to application highlights, such as generalizations and extensions of the classical uniqueness result of Chentsov or the Cramér-Rao inequality. Additionally, several new application fields of information geometry are highlighted, for instance hierarchical and graphical models, complexity theory, population genetics, or Markov Chain Monte Carlo. The book will be of interest to mathematicians who are interested in geometry, inf...
Latorre-Arteaga, Sergio; Gil-González, Diana; Enciso, Olga; Phelan, Aoife; García-Muñoz, Ángel; Kohler, Johannes
2014-01-01
Background Refractive error is defined as the inability of the eye to bring parallel rays of light into focus on the retina, resulting in nearsightedness (myopia), farsightedness (hyperopia) or astigmatism. Uncorrected refractive error in children is associated with increased morbidity and reduced educational opportunities. Vision screening (VS) is a method for identifying children with visual impairment or eye conditions likely to lead to visual impairment. Objective To analyze the utility of vision screening conducted by teachers and to contribute to a better estimation of the prevalence of childhood refractive errors in Apurimac, Peru. Design A pilot vision screening program in preschool (Group I) and elementary school children (Group II) was conducted with the participation of 26 trained teachers. Children whose visual acuity was <6/9 [20/30] (Group I) and ≤6/9 (Group II) in one or both eyes, measured with the Snellen Tumbling E chart at 6 m, were referred for a comprehensive eye exam. Specificity and positive predictive value to detect refractive error were calculated against clinical examination. Program assessment with participants was conducted to evaluate outcomes and procedures. Results A total sample of 364 children aged 3–11 were screened; 45 children were examined at Centro Oftalmológico Monseñor Enrique Pelach (COMEP) Eye Hospital. Prevalence of refractive error was 6.2% (Group I) and 6.9% (Group II); specificity of teacher vision screening was 95.8% and 93.0%, while positive predictive value was 59.1% and 47.8% for each group, respectively. Aspects highlighted to improve the program included extending training, increasing parental involvement, and helping referred children to attend the hospital. Conclusion Prevalence of refractive error in children is significant in the region. Vision screening performed by trained teachers is a valid intervention for early detection of refractive error, including screening of preschool children. Program
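The validity measures reported above follow directly from confusion-matrix counts against the gold-standard clinical examination. A generic sketch; the counts below are hypothetical, not the study's data:

```python
def screening_metrics(tp, fp, tn, fn):
    """Validity of a screening test against a gold-standard examination."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among diseased
        "specificity": tn / (tn + fp),  # true negatives among healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for a teacher-administered screen:
m = screening_metrics(tp=13, fp=9, tn=205, fn=3)
print(round(m["specificity"], 3), round(m["ppv"], 3))
```

Note that PPV depends on prevalence as well as test quality, which is why a screen with high specificity can still refer many false positives when the condition is rare.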
Keyworth, Chris; Hart, Jo; Thoong, Hong; Ferguson, Jane; Tully, Mary
2017-08-01
Although prescribing of medication in hospitals is rarely an error-free process, prescribers receive little feedback on their mistakes and ways to change future practices. Audit and feedback interventions may be an effective approach to modifying the clinical practice of health professionals, but these may pose logistical challenges when used in hospitals. Moreover, such interventions are often labor intensive. Consequently, there is a need to develop effective and innovative interventions to overcome these challenges and to improve the delivery of feedback on prescribing. Implementation intentions, which have been shown to be effective in changing behavior, link critical situations with an appropriate response; however, these have rarely been used in the context of improving prescribing practices. Semistructured qualitative interviews were conducted to evaluate the acceptability and feasibility of providing feedback on prescribing errors via MyPrescribe, a mobile-compatible website informed by implementation intentions. Data relating to 200 prescribing errors made by 52 junior doctors were collected by 11 hospital pharmacists. These errors were populated into MyPrescribe, where prescribers were able to construct their own personalized action plans. Qualitative interviews with a subsample of 15 junior doctors were used to explore issues regarding feasibility and acceptability of MyPrescribe and their experiences of using implementation intentions to construct prescribing action plans. Framework analysis was used to identify prominent themes, with findings mapped to the behavioral components of the COM-B model (capability, opportunity, motivation, and behavior) to inform the development of future interventions. MyPrescribe was perceived to be effective in providing opportunities for critical reflection on prescribing errors and to complement existing training (such as junior doctors' e-portfolio). The participants were able to provide examples of how they would use
Gourdji, S. M.; Yadav, V.; Karion, A.; Mueller, K. L.; Conley, S.; Ryerson, T.; Nehrkorn, T.; Kort, E. A.
2018-04-01
Urban greenhouse gas (GHG) flux estimation with atmospheric measurements and modeling, i.e. the ‘top-down’ approach, can potentially support GHG emission reduction policies by assessing trends in surface fluxes and detecting anomalies from bottom-up inventories. Aircraft-collected GHG observations also have the potential to help quantify point-source emissions that may not be adequately sampled by fixed surface tower-based atmospheric observing systems. Here, we estimate CH4 emissions from a known point source, the Aliso Canyon natural gas leak in Los Angeles, CA from October 2015–February 2016, using atmospheric inverse models with airborne CH4 observations from twelve flights ≈4 km downwind of the leak and surface sensitivities from a mesoscale atmospheric transport model. This leak event has been well-quantified previously using various methods by the California Air Resources Board, thereby providing high confidence in the mass-balance leak rate estimates of Conley et al (2016), used here for comparison to inversion results. Inversions with an optimal setup are shown to provide estimates of the leak magnitude, on average, within a third of the mass balance values, with remaining errors in estimated leak rates predominantly explained by modeled wind speed errors of up to 10 m s⁻¹, quantified by comparing airborne meteorological observations with modeled values along the flight track. An inversion setup using scaled observational wind speed errors in the model-data mismatch covariance matrix is shown to significantly reduce the influence of transport model errors on spatial patterns and estimated leak rates from the inversions. In sum, this study takes advantage of a natural tracer release experiment (i.e. the Aliso Canyon natural gas leak) to identify effective approaches for reducing the influence of transport model error on atmospheric inversions of point-source emissions, while suggesting future potential for integrating surface tower and
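The inversion setup described, with a model-data mismatch covariance scaled by observational wind-speed errors, can be sketched as a standard batch Bayesian inversion. Everything below (matrix sizes, footprint sensitivities, error scaling, prior values) is synthetic and illustrative only, not the study's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_flux = 60, 12                       # observations vs. flux periods (hypothetical)
H = rng.uniform(0.0, 1.0, (n_obs, n_flux))   # transport-model footprint sensitivities
s_true = np.full(n_flux, 50.0)               # "true" leak rate per period (arbitrary units)
wind_err = rng.uniform(1.0, 10.0, n_obs)     # per-observation modeled wind-speed error
sigma = 0.5 * wind_err                       # transport error assumed to scale with wind error
z = H @ s_true + rng.normal(0.0, sigma)      # synthetic observations

# Posterior mean of fluxes, with the model-data mismatch covariance R
# inflated per observation according to the wind-speed errors.
s_prior = np.full(n_flux, 20.0)
Q = (40.0**2) * np.eye(n_flux)               # prior flux covariance
R = np.diag(sigma**2)                        # scaled mismatch covariance
K = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)
s_hat = s_prior + K @ (z - H @ s_prior)

print(round(float(s_hat.mean()), 1))
```

Down-weighting observations taken under large wind errors (large diagonal entries of R) is what keeps transport error from being aliased into the retrieved fluxes.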
Burdette, A C
1971-01-01
Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometric st
Berger, Marcel
2010-01-01
Both classical geometry and modern differential geometry have been active subjects of research throughout the 20th century and lie at the heart of many recent advances in mathematics and physics. The underlying motivating concept for the present book is that it offers readers the elements of a modern geometric culture by means of a whole series of visually appealing unsolved (or recently solved) problems that require the creation of concepts and tools of varying abstraction. Starting with such natural, classical objects as lines, planes, circles, spheres, polygons, polyhedra, curves, surfaces,
Robinson, Gilbert de B
2011-01-01
This brief undergraduate-level text by a prominent Cambridge-educated mathematician explores the relationship between algebra and geometry. An elementary course in plane geometry is the sole requirement for Gilbert de B. Robinson's text, which is the result of several years of teaching and learning the most effective methods from discussions with students. Topics include lines and planes, determinants and linear equations, matrices, groups and linear transformations, and vectors and vector spaces. Additional subjects range from conics and quadrics to homogeneous coordinates and projective geom
Connes, Alain
1994-01-01
This English version of the path-breaking French book on this subject gives the definitive treatment of the revolutionary approach to measure theory, geometry, and mathematical physics developed by Alain Connes. Profusely illustrated and invitingly written, this book is ideal for anyone who wants to know what noncommutative geometry is, what it can do, or how it can be used in various areas of mathematics, quantization, and elementary particles and fields. Key features: first full treatment of the subject and its applications; written by the pioneer of this field; broad applications in mathemat
Maslanik, J. A.; Key, J.
1992-01-01
An expert system framework has been developed to classify sea ice types using satellite passive microwave data, an operational classification algorithm, spatial and temporal information, ice types estimated from a dynamic-thermodynamic model, output from a neural network that detects the onset of melt, and knowledge about season and region. The rule base imposes boundary conditions upon the ice classification, modifies parameters in the ice algorithm, determines a 'confidence' measure for the classified data, and under certain conditions, replaces the algorithm output with model output. Results demonstrate the potential power of such a system for minimizing overall error in the classification and for providing non-expert data users with a means of assessing the usefulness of the classification results for their applications.
Directory of Open Access Journals (Sweden)
Fatemeh Donboli Miandoab
2017-12-01
Background: Professionalism and adherence to ethics and professional standards are among the most important topics in medical ethics that can play a role in reducing medical errors. This paper examines and evaluates the effect of professional ethics on reducing medical errors from the viewpoint of faculty members in the medical school of the Tabriz University of Medical Sciences. Methods: In this cross-sectional descriptive study, faculty members of the Tabriz University of Medical Sciences were the statistical population, from whom 105 participants were selected through simple random sampling. A questionnaire was used to examine and compare the self-assessed opinions of faculty members in the internal, surgical, pediatric, gynecological, and psychiatric departments. The questionnaires were completed by a self-assessment method and the collected data were analyzed using SPSS 21. Results: Based on physicians' opinions, professional ethical considerations and their three domains have a significant role in reducing medical errors and crimes. The mean scores (standard deviations) of the managerial, knowledge and communication skills, and environmental variables were respectively 46.7 (5.64), 64.6 (8.14) and 16.2 (2.97) from the physicians' viewpoints. The significant factors with the highest scores on the reduction of medical errors and crimes in all three domains were as follows: in the managerial skills domain, trust, the physician's sense of responsibility toward the patient and his/her respect for patients' rights; in the knowledge and communication skills domain, general competence and eligibility as a physician and examination and diagnosis skills; and, last, in the environmental domain, the sufficiency of training in ethical issues during education and satisfaction of basic needs. Conclusion: Based on the findings of this research, attention to the improvement of communication, management and environment skills should
Indian Academy of Sciences (India)
mathematicians are trained to use very precise language, and so find it hard to simplify and state .... thing. If you take a plane on which there are two such triangles which enjoy the above ... within this geometry to simplify things if needed.
Geometry (Resonance)
Indian Academy of Sciences (India)
Parallel: A pair of lines in a plane is said to be parallel if they do not meet. Mathematicians were at war ... Subsequently, Poincare, Klein, Beltrami and others refined non-Euclidean geometry. ... plane divides the plane into two half planes and.
Energy Technology Data Exchange (ETDEWEB)
Falconer, David A.; Tiwari, Sanjiv K.; Moore, Ronald L. [NASA Marshall Space Flight Center, Huntsville, AL 35812 (United States); Khazanov, Igor, E-mail: David.a.Falconer@nasa.gov [Center for Space Plasma and Aeronomic Research, University of Alabama in Huntsville, Huntsville, AL 35899 (United States)
2016-12-20
Projection errors limit the use of vector magnetograms of active regions (ARs) far from the disk center. In this Letter, for ARs observed up to 60° from the disk center, we demonstrate a method for measuring and reducing the projection error in the magnitude of any whole-AR parameter that is derived from a vector magnetogram that has been deprojected to the disk center. The method assumes that the center-to-limb curve of the average of the parameter’s absolute values, measured from the disk passage of a large number of ARs and normalized to each AR’s absolute value of the parameter at central meridian, gives the average fractional projection error at each radial distance from the disk center. To demonstrate the method, we use a large set of large-flux ARs and apply the method to a whole-AR parameter that is among the simplest to measure: whole-AR magnetic flux. We measure 30,845 SDO/Helioseismic and Magnetic Imager vector magnetograms covering the disk passage of 272 large-flux ARs, each having whole-AR flux >10²² Mx. We obtain the center-to-limb radial-distance run of the average projection error in measured whole-AR flux from a Chebyshev fit to the radial-distance plot of the 30,845 normalized measured values. The average projection error in the measured whole-AR flux of an AR at a given radial distance is removed by multiplying the measured flux by the correction factor given by the fit. The correction is important for both the study of the evolution of ARs and for improving the accuracy of forecasts of an AR’s major flare/coronal mass ejection productivity.
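The correction procedure can be sketched with a synthetic center-to-limb curve: fit a Chebyshev polynomial to normalized measurements versus radial distance, then divide each measurement by the fitted value. The falloff curve, noise level, and sample size below are invented for illustration, not the study's data:

```python
import numpy as np
from numpy.polynomial import Chebyshev

rng = np.random.default_rng(1)

# Synthetic center-to-limb data: whole-AR flux normalized to its value at
# central meridian, vs. radial distance from disk center (0.87 ~ 60 deg).
r = rng.uniform(0.0, 0.87, 3000)
true_curve = 1.0 - 0.35 * r**2                    # hypothetical mean projection falloff
measured = true_curve * (1.0 + rng.normal(0.0, 0.05, r.size))

# Chebyshev fit gives the radial-distance run of the average projection error;
# dividing by the fitted value applies the correction factor.
fit = Chebyshev.fit(r, measured, deg=4)
corrected = measured / fit(r)

print(round(float(np.mean(corrected)), 2))
```

After correction the normalized values should scatter around 1 at every radial distance, i.e. the systematic center-to-limb trend is removed while the AR-to-AR scatter remains.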
Pérula de Torres, Luis Angel; Pulido Ortega, Laura; Pérula de Torres, Carlos; González Lama, Jesús; Olaya Caro, Inmaculada; Ruiz Moral, Roger
2014-10-21
To evaluate the effectiveness of an intervention based on motivational interviewing to reduce medication errors in chronic patients over 65 with polypharmacy. Cluster randomized trial that included doctors and nurses of 16 Primary Care centers and chronic patients with polypharmacy over 65 years. The professionals were assigned to the experimental or the control group using stratified randomization. Interventions consisted of training of professionals and revision of patient treatments, with the application of motivational interviewing in the experimental group and the usual approach in the control group. The primary endpoint (medication error) was analyzed at the individual level, and was estimated with the absolute risk reduction (ARR), relative risk reduction (RRR), number needed to treat (NNT) and by multiple logistic regression analysis. Thirty-two professionals were randomized (19 doctors and 13 nurses), 27 of them recruited 154 patients consecutively (13 professionals in the experimental group recruited 70 patients and 14 professionals recruited 84 patients in the control group) and completed 6 months of follow-up. The mean age of patients was 76 years (68.8% women). A decrease in the average number of medication errors was observed over the period. The reduction was greater in the experimental than in the control group (F=5.109, P=.035). ARR 29% (95% confidence interval [95% CI] 15.0-43.0%), RRR 0.59 (95% CI 0.31-0.76), and NNT 3.5 (95% CI 2.3-6.8). Motivational interviewing is more effective than the usual approach in reducing medication errors in patients over 65 with polypharmacy. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.
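The reported effect measures are related in the standard way: ARR is the difference between control and intervention event rates, RRR is the ARR divided by the control rate, and NNT is the reciprocal of the ARR. A quick check with hypothetical event rates chosen only to match the reported magnitudes, not taken from the trial data:

```python
def effect_measures(p_control, p_treated):
    """Absolute risk reduction, relative risk reduction, number needed to treat."""
    arr = p_control - p_treated
    return {"ARR": arr, "RRR": arr / p_control, "NNT": 1.0 / arr}

# Hypothetical medication-error rates in control vs. intervention arms:
m = effect_measures(p_control=0.49, p_treated=0.20)
print(round(m["ARR"], 2), round(m["RRR"], 2), round(m["NNT"], 1))
```

With a control rate of 0.49 and an intervention rate of 0.20, the measures come out near the reported ARR 29%, RRR 0.59, and NNT of roughly 3.5.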
Siebert, Johan N; Ehrler, Frederic; Combescure, Christophe; Lacroix, Laurence; Haddad, Kevin; Sanchez, Oliver; Gervaix, Alain; Lovis, Christian; Manzano, Sergio
2017-02-01
During pediatric cardiopulmonary resuscitation (CPR), vasoactive drug preparation for continuous infusion is both complex and time-consuming, placing children at higher risk than adults for medication errors. Following an evidence-based, ergonomic-driven approach, we developed a mobile device app called Pediatric Accurate Medication in Emergency Situations (PedAMINES), intended to guide caregivers step-by-step from preparation to delivery of drugs requiring continuous infusion. The aim of our study was to determine whether the use of PedAMINES reduces drug preparation time (TDP) and time to delivery (TDD; primary outcome), as well as medication errors (secondary outcomes) when compared with conventional preparation methods. The study was a randomized controlled crossover trial with 2 parallel groups comparing PedAMINES with a conventional and internationally used drugs infusion rate table in the preparation of continuous drug infusion. We used a simulation-based pediatric CPR cardiac arrest scenario with a high-fidelity manikin in the shock room of a tertiary care pediatric emergency department. After epinephrine-induced return of spontaneous circulation, pediatric emergency nurses were first asked to prepare a continuous infusion of dopamine, using either PedAMINES (intervention group) or the infusion table (control group), and second, a continuous infusion of norepinephrine by crossing over the procedure. The primary outcome was the elapsed time in seconds, in each allocation group, from the oral prescription by the physician to TDD by the nurse. TDD included TDP. The secondary outcome was the medication dosage error rate during the sequence from drug preparation to drug injection. A total of 20 nurses were randomized into 2 groups. During the first study period, mean TDP while using PedAMINES and conventional preparation methods was 128.1 s (95% CI 102-154) and 308.1 s (95% CI 216-400), respectively (180 s reduction, P=.002). Mean TDD was 214 s (95% CI 171-256) and
Petersen, Peter
2016-01-01
Intended for a one year course, this text serves as a single source, introducing readers to the important techniques and theorems, while also containing enough background on advanced topics to appeal to those students wishing to specialize in Riemannian geometry. This is one of the few works to combine both the geometric parts of Riemannian geometry and the analytic aspects of the theory. The book will appeal to a readership that have a basic knowledge of standard manifold theory, including tensors, forms, and Lie groups. Important revisions to the third edition include: a substantial addition of unique and enriching exercises scattered throughout the text; inclusion of an increased number of coordinate calculations of connection and curvature; addition of general formulas for curvature on Lie Groups and submersions; integration of variational calculus into the text allowing for an early treatment of the Sphere theorem using a proof by Berger; incorporation of several recent results about manifolds with posit...
International Nuclear Information System (INIS)
Strominger, A.
1990-01-01
A special manifold is an allowed target manifold for the vector multiplets of D=4, N=2 supergravity. These manifolds are of interest for string theory because the moduli spaces of Calabi-Yau threefolds and c=9, (2,2) conformal field theories are special. Previous work has given a local, coordinate-dependent characterization of special geometry. A global description of special geometries is given herein, and their properties are studied. A special manifold M of complex dimension n is characterized by the existence of a holomorphic Sp(2n+2,R)xGL(1,C) vector bundle over M with a nowhere-vanishing holomorphic section Ω. The Kaehler potential on M is the logarithm of the Sp(2n+2,R) invariant norm of Ω. (orig.)
Gilgien, Matthias; Spörri, Jörg; Kröll, Josef; Müller, Erich
2016-01-01
Background Injuries in downhill (DH) are often related to high speed and, therefore, to high energy and forces which are involved in injury situations. Yet to date, no study has investigated the effect of ski geometry and standing height on kinetic energy (EKIN) in DH. This knowledge would be essential to define appropriate equipment rules that have the potential to protect the athletes’ health. Methods During a field experiment on an official World Cup DH course, 2 recently retired world class skiers skied on 5 different pairs of skis varying in width, length and standing height. Course characteristics, terrain and the skiers’ centre of mass position were captured by a differential Global Navigational Satellite System-based methodology. EKIN, speed, ski–snow friction force (FF), ground reaction force (FGRF) and ski–snow friction coefficient (CoeffF) were calculated and analysed in dependency of the used skis. Results In the steep terrain, longer skis with reduced width and standing height significantly decreased average EKIN by ∼3%. Locally, even larger reductions of EKIN were observed (up to 7%). These local decreases in EKIN were mainly explainable by higher FF. Moreover, CoeffF differences seem of greater importance for explaining local FF differences than the differences in FGRF. Conclusions Knowing that increased speed and EKIN likely lead to increased forces in fall/crash situations, the observed equipment-induced reduction in EKIN can be considered a reasonable measure to improve athlete safety, even though the achieved preventative gains are rather small and limited to steep terrain. PMID:26702013
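The key quantities analysed in this study follow from basic mechanics: kinetic energy from the skier's mass and GNSS-derived speed, and the ski-snow friction coefficient as the ratio of friction force to ground reaction force. A minimal sketch with invented numbers, not values from the experiment:

```python
def kinetic_energy(mass_kg, speed_ms):
    """E_KIN = 1/2 m v^2, in joules."""
    return 0.5 * mass_kg * speed_ms**2

def friction_coefficient(f_friction, f_ground_reaction):
    """Coeff_F = F_F / F_GRF (dimensionless)."""
    return f_friction / f_ground_reaction

# Hypothetical DH values: 90 kg skier (incl. equipment) at 30 m/s.
e = kinetic_energy(90.0, 30.0)
print(e, 0.97 * e)   # and the effect of a ~3% E_KIN reduction, as reported for steep terrain
```

Because energy grows with the square of speed, even a few-percent reduction in E_KIN at DH speeds corresponds to a meaningful absolute energy difference in a fall or crash.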
General Geometry and Geometry of Electromagnetism
Shahverdiyev, Shervgi S.
2002-01-01
It is shown that Electromagnetism creates a geometry different from Riemannian geometry. General Geometry, which includes Riemannian geometry as a special case, is constructed. It is proven that the simplest special case of General Geometry is the geometry underlying Electromagnetism. The action for the electromagnetic field and the Maxwell equations are derived from the curvature function of the geometry underlying Electromagnetism. And it is shown that the equation of motion for a particle interacting with the electromagnetic...
Multilevel geometry optimization
Rodgers, Jocelyn M.; Fast, Patton L.; Truhlar, Donald G.
2000-02-01
Geometry optimization has been carried out for three test molecules using six multilevel electronic structure methods, in particular Gaussian-2, Gaussian-3, multicoefficient G2, multicoefficient G3, and two multicoefficient correlation methods based on correlation-consistent basis sets. In the Gaussian-2 and Gaussian-3 methods, various levels are added and subtracted with unit coefficients, whereas the multicoefficient Gaussian-x methods involve noninteger parameters as coefficients. The multilevel optimizations drop the average error in the geometry (averaged over the 18 cases) by a factor of about two when compared to the single most expensive component of a given multilevel calculation, and in all 18 cases the accuracy of the atomization energy for the three test molecules improves, with an average improvement of 16.7 kcal/mol.
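The distinction drawn above lies entirely in the coefficients used to combine component energies: Gaussian-n methods add and subtract levels with unit coefficients, whereas multicoefficient methods use fitted noninteger coefficients. A schematic sketch; the component energies and coefficient values below are invented for illustration and do not correspond to any published parameterization:

```python
def combine(components, coefficients):
    """Multilevel energy as a linear combination of component energies."""
    return sum(c * e for c, e in zip(coefficients, components))

# Hypothetical component energies (hartree) at increasing levels of theory:
E = [-100.20, -100.35, -100.41]

e_unit = combine(E, [1.0, -1.0, 1.0])   # Gaussian-n style: unit coefficients
e_mc   = combine(E, [0.8, -0.9, 1.1])   # multicoefficient style: fitted values
print(round(e_unit, 2), round(e_mc, 2))
```

In the multicoefficient methods the coefficients are optimized against reference data, which is what lets the cheap and expensive components be weighted unequally.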
Ciarlet, Philippe G
2007-01-01
This book gives the basic notions of differential geometry, such as the metric tensor, the Riemann curvature tensor, the fundamental forms of a surface, covariant derivatives, and the fundamental theorem of surface theory in a selfcontained and accessible manner. Although the field is often considered a classical one, it has recently been rejuvenated, thanks to the manifold applications where it plays an essential role. The book presents some important applications to shells, such as the theory of linearly and nonlinearly elastic shells, the implementation of numerical methods for shells, and
Lago, Paola; Bizzarri, Giancarlo; Scalzotto, Francesca; Parpaiola, Antonella; Amigoni, Angela; Putoto, Giovanni; Perilongo, Giorgio
2012-01-01
Administering medication to hospitalised infants and children is a complex process at high risk of error. Failure mode and effect analysis (FMEA) is a proactive tool used to analyse risks, identify failures before they happen and prioritise remedial measures. To examine the hazards associated with the process of drug delivery to children, we performed a proactive risk-assessment analysis. Five multidisciplinary teams, representing different divisions of the paediatric department at Padua University Hospital, were trained to analyse the drug-delivery process, to identify possible causes of failures and their potential effects, to calculate a risk priority number (RPN) for each failure and to plan changes in practices. The aims were to identify higher-priority potential failure modes, as defined by their RPNs, and to plan changes in clinical practice that reduce the risk of patient harm and improve safety in the medication-use process in children. In all, 37 higher-priority potential failure modes and 71 associated causes and effects were identified. The highest RPNs (>48) related mainly to errors in calculating drug doses and concentrations. Many of these failure modes were found in all five units, suggesting the presence of common targets for improvement, particularly in enhancing the safety of prescription and preparation of intravenous drugs. The introduction of new activities into the revised drug administration process reduced the high-risk failure modes by 60%. FMEA is an effective proactive risk-assessment tool, useful for helping multidisciplinary groups understand a care process, identify errors that may occur, prioritise remedial interventions and possibly enhance the safety of drug delivery in children.
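The RPN computation at the heart of FMEA is conventionally the product of severity, occurrence, and detectability ratings, with remediation prioritised for modes above a threshold (the study used RPN > 48). A sketch with hypothetical failure modes and ratings:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int       # 1-10 ratings, as commonly used in FMEA
    occurrence: int     # 1-10
    detectability: int  # 1-10 (higher = harder to detect)

    @property
    def rpn(self) -> int:
        # Risk priority number: product of the three ratings.
        return self.severity * self.occurrence * self.detectability

# Hypothetical failure modes in a paediatric drug-delivery process:
modes = [
    FailureMode("dose calculation error", 8, 4, 3),
    FailureMode("wrong dilution of IV drug", 7, 3, 3),
    FailureMode("illegible prescription", 5, 2, 2),
]

# Prioritise modes above the threshold, highest RPN first:
priority = sorted((m for m in modes if m.rpn > 48), key=lambda m: m.rpn, reverse=True)
print([(m.name, m.rpn) for m in priority])
```

Re-scoring the same modes after remedial measures (lower occurrence or better detectability) is how the reported 60% reduction in high-risk modes would be quantified.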
Directory of Open Access Journals (Sweden)
Cresswell Kathrin M
2012-06-01
Background: There is a need to shed light on the pathways through which complex interventions mediate their effects in order to enable critical reflection on their transferability. We sought to explore and understand key stakeholder accounts of the acceptability, likely impact and strategies for optimizing and rolling out a successful pharmacist-led, information technology-enabled (PINCER) intervention, which substantially reduced the risk of clinically important errors in medicines management in primary care. Methods: Data were collected at two geographical locations in central England through a combination of one-to-one longitudinal semi-structured telephone interviews (one at the beginning of the trial and another when the trial was well underway), relevant documents, and focus group discussions following delivery of the PINCER intervention. Participants included PINCER pharmacists, general practice staff, researchers involved in the running of the trial, and primary care trust staff. PINCER pharmacists were interviewed at three different time-points during the delivery of the PINCER intervention. Analysis was thematic, with diffusion of innovation theory providing a theoretical framework. Results: We conducted 52 semi-structured telephone interviews and six focus group discussions with 30 additional participants. In addition, documentary data were collected from six pharmacist diaries, along with notes from four meetings of the PINCER pharmacists and feedback meetings from 34 practices. Key findings that helped to explain the success of the PINCER intervention included the perceived importance of focusing on prescribing errors to all stakeholders, and the credibility and appropriateness of a pharmacist-led intervention to address these shortcomings. Central to this was the face-to-face contact and relationship building between pharmacists and a range of practice staff, and pharmacists’ explicitly designated role as a change agent
Energy Technology Data Exchange (ETDEWEB)
Han, Gi Yeong; Kim, Song Hyun; Kim, Do Hyun; Shin, Chang Ho; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of)
2014-05-15
In this study, we analyzed how the geometry splitting strategy affects calculation efficiency and proposed a geometry splitting method to increase the efficiency of Monte Carlo simulation. First, an analysis of the neutron distribution characteristics in a deep penetration problem was performed. Then, considering the neutron population distribution, a geometry splitting method was devised. Using the proposed method, the FOMs for benchmark problems were estimated and compared with the conventional geometry splitting strategy. The results show that the proposed method can considerably increase the calculation efficiency of the geometry splitting method. It is expected that the proposed method will contribute to optimizing the computational cost as well as reducing human errors in Monte Carlo simulation. Geometry splitting in Monte Carlo (MC) calculation is one of the most popular variance reduction techniques due to its simplicity, reliability and efficiency. To use geometry splitting, the user should determine the locations of the splitting surfaces and assign the relative importance of each region. Generally, the splitting parameters are decided by the user's experience; in this process, however, the splitting parameters can be selected ineffectively or erroneously. To prevent this, a common recommendation that helps the user eliminate guesswork is to split the geometry evenly, and then to estimate the importances by a few iterations that preserve the population of particles penetrating each region. However, even geometry splitting can make the calculation inefficient due to the change in the mean free path (MFP) of particles.
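The splitting mechanics described here can be sketched with a toy 1-D deep-penetration problem: particles crossing into a more important region are split, with their weight divided among the copies, and particles crossing into a less important region play Russian roulette. All geometry, importances, and interaction probabilities below are invented for illustration, and the transport is forward-only to keep the sketch short:

```python
import random

random.seed(7)

EDGES = [0.0, 2.0, 4.0, 6.0]   # region boundaries in mean free paths (hypothetical slab)
IMP = [1.0, 2.0, 4.0]          # importance doubles per region to preserve population
P_ABSORB = 0.5                 # absorption probability at each collision

def region(x):
    for i in range(len(IMP)):
        if x < EDGES[i + 1]:
            return i
    return None                # escaped past the far boundary

def run(n_source):
    """Forward-only 1-D deep-penetration estimate with splitting/roulette."""
    leakage = 0.0
    for _ in range(n_source):
        bank = [(0.0, 1.0, 0)]                 # (position, weight, region index)
        while bank:
            x, w, r = bank.pop()
            x += random.expovariate(1.0)       # flight to the next collision
            r_new = region(x)
            if r_new is None:                  # escaped: score the carried weight
                leakage += w
                continue
            if random.random() < P_ABSORB:     # absorbed at the collision site
                continue
            ratio = IMP[r_new] / IMP[r]
            if ratio >= 1.0:                   # more important region: split
                n = int(ratio)
                bank.extend((x, w / n, r_new) for _ in range(n))
            elif random.random() < ratio:      # less important: Russian roulette
                bank.append((x, w / ratio, r_new))
    return leakage / n_source

print(run(20000))
```

Because splitting and roulette both preserve expected weight, the estimator stays unbiased while the particle population deep in the slab, where leakage is scored, is kept large; for this forward-only setup the analytic leakage is exp(-3), about 0.05.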
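The population-preserving bookkeeping behind geometry splitting can be illustrated with a toy one-dimensional deep-penetration tally. The slab depth, per-cell survival probability and 2-for-1 splitting ratio below are invented for illustration; this is a sketch of the splitting-and-weighting mechanics, not the authors' proposed method:

```python
import random

def transmission(n_src, n_cells=10, p_survive=0.5, split=2):
    """Toy deep-penetration tally with geometry splitting.

    Every cell boundary acts as a splitting surface: a particle entering
    the next cell is split `split`-for-1 and its statistical weight is
    divided accordingly, which keeps the simulated population roughly
    constant with depth instead of decaying geometrically.
    """
    scores = []
    for _ in range(n_src):
        score = 0.0
        stack = [(0, 1.0)]                # (current cell, weight)
        while stack:
            cell, w = stack.pop()
            alive = True
            while alive and cell < n_cells:
                if random.random() > p_survive:
                    alive = False         # absorbed in this cell
                else:
                    cell += 1
                    if cell < n_cells and split > 1:
                        w /= split        # split at the boundary
                        stack.extend([(cell, w)] * (split - 1))
            if alive:
                score += w                # weight leaking out the far side
        scores.append(score)
    mean = sum(scores) / n_src
    var = sum((s - mean) ** 2 for s in scores) / (n_src - 1)
    rel = (var / n_src) ** 0.5 / mean if mean else float("inf")
    return mean, rel

random.seed(12345)
mean, rel = transmission(4000)
print(f"estimate {mean:.2e} +/- {rel:.1%}, exact answer {0.5**10:.2e}")
```

The estimator stays unbiased because each split divides weight by the number of copies; the figure of merit FOM = 1/(relative error² × time) is what the paper compares between splitting strategies.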
Siebert, Johan N; Ehrler, Frederic; Lovis, Christian; Combescure, Christophe; Haddad, Kevin; Gervaix, Alain; Manzano, Sergio
2017-08-22
During pediatric cardiopulmonary resuscitation (CPR), vasoactive drug preparation for continuous infusions is complex and time-consuming. The need for individual, weight-based drug dose calculation and preparation places children at higher risk than adults for medication errors. Following an evidence-based and ergonomics-driven approach, we developed a mobile device app called Pediatric Accurate Medication in Emergency Situations (PedAMINES), intended to guide caregivers step-by-step from preparation to delivery of drugs requiring continuous infusion. In a prior single-center randomized controlled trial, medication errors were reduced from 70% to 0% by using PedAMINES when compared with conventional preparation methods. The purpose of this study is to determine whether the use of PedAMINES in both university and smaller hospitals reduces medication dosage errors (primary outcome), time to drug preparation (TDP), and time to drug delivery (TDD) (secondary outcomes) during pediatric CPR when compared with conventional preparation methods. This is a multicenter, prospective, randomized controlled crossover trial with 2 parallel groups comparing PedAMINES with a conventional and internationally used drug infusion rate table in the preparation of continuous drug infusion. The evaluation setting uses a simulation-based pediatric CPR cardiac arrest scenario with a high-fidelity manikin. The study, involving 120 certified nurses (sample size), will take place in the resuscitation rooms of 3 tertiary pediatric emergency departments and 3 smaller hospitals. After epinephrine-induced return of spontaneous circulation, nurses will be asked to prepare a continuous infusion of dopamine using either PedAMINES (intervention group) or the infusion table (control group) and then prepare a continuous infusion of norepinephrine by crossing over to the other method. The primary outcome is the medication dosage error rate. The secondary outcome is the time in seconds elapsed since the oral
DEFF Research Database (Denmark)
Tian, Yanjun; Loh, Poh Chiang; Deng, Fujin
2016-01-01
A cascaded converter is formed by connecting two subconverters together, sharing a common intermediate dc-link voltage. Regulation of this dc-link voltage is frequently realized with a proportional-integral (PI) controller, whose high gain at dc helps to force a zero steady-state tracking error. Such precise tracking is, however, at the expense of increasing the system type, caused by the extra pole at the origin introduced by the PI controller. The overall system may, hence, be tougher to control. To reduce the system type while preserving precise dc-link voltage tracking, this paper proposes … The proposed scheme can be used with either unidirectional or bidirectional power flow, and has been verified by the simulation and experimental results presented in this paper.
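The role of the PI integrator in forcing a zero steady-state dc-link error can be sketched with a crude discrete-time capacitor model. All values (capacitance, gains, load current) are hypothetical and hand-tuned for this toy, not taken from the paper:

```python
# Crude discrete model of a dc-link capacitor regulated by a PI controller.
C, dt = 1e-3, 1e-4           # link capacitance [F], time step [s]
v_ref, i_load = 400.0, 5.0   # reference voltage [V], constant load [A]
kp, ki = 0.5, 200.0          # PI gains, hand-tuned for this toy model

v, integ = 0.0, 0.0
for _ in range(20000):       # 2 s of simulated time
    e = v_ref - v
    integ += e * dt          # integral action: infinite gain at dc
    i_in = kp * e + ki * integ
    v += dt * (i_in - i_load) / C   # capacitor dynamics

# The integral term absorbs the constant load current, so the tracking
# error decays to (numerically) zero despite the nonzero disturbance.
print(v_ref - v)
```

The same integrator is what adds the extra pole at the origin the abstract refers to: the closed loop here is second order even though the plant is a single capacitor.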
Kartush, J M
1996-11-01
Practicing medicine successfully requires that errors in diagnosis and treatment be minimized. Malpractice laws encourage litigators to ascribe all medical errors to incompetence and negligence. There are, however, many other causes of unintended outcomes. This article describes common causes of errors and suggests ways to minimize mistakes in otologic practice. Widespread dissemination of knowledge about common errors and their precursors can reduce the incidence of their occurrence. Consequently, laws should be passed to allow for a system of non-punitive, confidential reporting of errors and "near misses" that can be shared by physicians nationwide.
Medication errors: prescribing faults and prescription errors.
Velo, Giampaolo P; Minuz, Pietro
2009-06-01
1. Medication errors are common in general practice and in hospitals. Both errors in the act of writing (prescription errors) and prescribing faults due to erroneous medical decisions can result in harm to patients. 2. Any step in the prescribing process can generate errors. Slips, lapses, or mistakes are sources of errors, as in unintended omissions in the transcription of drugs. Faults in dose selection, omitted transcription, and poor handwriting are common. 3. Inadequate knowledge or competence and incomplete information about clinical characteristics and previous treatment of individual patients can result in prescribing faults, including the use of potentially inappropriate medications. 4. An unsafe working environment, complex or undefined procedures, and inadequate communication among health-care personnel, particularly between doctors and nurses, have been identified as important underlying factors that contribute to prescription errors and prescribing faults. 5. Active interventions aimed at reducing prescription errors and prescribing faults are strongly recommended. These should be focused on the education and training of prescribers and the use of on-line aids. The complexity of the prescribing procedure should be reduced by introducing automated systems or uniform prescribing charts, in order to avoid transcription and omission errors. Feedback control systems and immediate review of prescriptions, which can be performed with the assistance of a hospital pharmacist, are also helpful. Audits should be performed periodically.
Energy Technology Data Exchange (ETDEWEB)
Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C. [External Dosimetry Department, Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP-17, 92260 Fontenay-aux-Roses (France); Trianni, A. [Medical Physics Department, Udine University Hospital S. Maria della Misericordia (AOUD), p.le S. Maria della Misericordia, 15, 33100 Udine (Italy); Ciraj-Bjelac, O. [Vinca Institute of Nuclear Sciences (VINCA), P.O. Box 522, 11001 Belgrade (Serbia); De Angelis, C. [Department of Technology and Health, Istituto Superiore di Sanità (ISS), Viale Regina Elena 299, 00161 Rome (Italy); Delle Canne, S. [Fatebenefratelli San Giovanni Calibita Hospital (FBF), UOC Medical Physics - Isola Tiberina, 00186 Rome (Italy); Hadid, L.; Waryn, M. J. [Radiology Department, Hôpital Jean Verdier (HJV), Avenue du 14 Juillet, 93140 Bondy Cedex (France); Jarvinen, H.; Siiskonen, T. [Radiation and Nuclear Safety Authority (STUK), P.O. Box 14, 00881 Helsinki (Finland); Negri, A. [Veneto Institute of Oncology (IOV), Via Gattamelata 64, 35124 Padova (Italy); Novák, L. [National Radiation Protection Institute (NRPI), Bartoškova 28, 140 00 Prague 4 (Czech Republic); Pinto, M. [Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA-INMRI), C.R. Casaccia, Via Anguillarese 301, I-00123 Santa Maria di Galeria (RM) (Italy); Knežević, Ž. [Ruđer Bošković Institute (RBI), Bijenička c. 54, 10000 Zagreb (Croatia)
2015-07-15
Purpose: To investigate the optimal use of XR-RV3 GafChromic® films to assess patient skin dose in interventional radiology while addressing the means to reduce uncertainties in dose assessment. Methods: XR-Type R GafChromic films have been shown to represent the most efficient and suitable solution to determine patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainties include scanner, film, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models has shown that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainties. These could induce errors of up to 7% on the film readings unless regularly checked and corrected. Typically, scan uniformity correction matrices and reading normalization to the scanner-specific and daily background reading should be done. In addition, the analysis on multiple film batches has shown that XR-RV3 films have generally good uniformity within one batch (<1.5%), require 24 h to stabilize after the irradiation and their response is roughly independent of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically. In addition, yellow side film irradiations should be preferentially used since they showed a lower
Hoede, C.; Li, Z.
2001-01-01
In coding theory the problem of decoding focuses on error vectors. In the simplest situation code words are $(0,1)$-vectors, as are the received messages and the error vectors. Comparison of a received word with the code words yields a set of error vectors. In deciding on the original code word,
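The procedure described here, comparing the received word against every codeword and keeping the error vector of least weight, can be sketched directly. The four-word codebook is an invented toy, not a code from the paper:

```python
def hamming_weight(v):
    """Number of 1s in a (0,1)-vector."""
    return sum(v)

def error_vectors(received, codebook):
    """Error vector for each codeword: received XOR codeword."""
    return {cw: tuple(r ^ c for r, c in zip(received, cw)) for cw in codebook}

def decode(received, codebook):
    """Pick the codeword whose error vector has minimum weight,
    i.e. maximum-likelihood decoding on a binary symmetric channel."""
    errs = error_vectors(received, codebook)
    return min(errs, key=lambda cw: hamming_weight(errs[cw]))

# Toy length-7 codebook, purely illustrative
codebook = [(0,0,0,0,0,0,0), (1,1,1,0,0,0,0),
            (0,0,0,1,1,1,1), (1,1,1,1,1,1,1)]
r = (1,1,0,0,0,0,0)             # one bit flipped from the second codeword
assert decode(r, codebook) == (1,1,1,0,0,0,0)
```

Deciding on the original codeword is unambiguous only while the weight of the true error vector stays below half the minimum distance of the code.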
Numerically robust geometry engine for compound solid geometries
International Nuclear Information System (INIS)
Vlachoudis, V.; Sinuela-Pastor, D.
2013-01-01
Monte Carlo programs rely heavily on fast and numerically robust solid geometry engines. However, the success of solid modeling depends on facilities for specifying and editing parameterized models through a user-friendly graphical front-end. Such a user interface has to be fast enough to be interactive for 2D and/or 3D displays, but at the same time numerically robust, so that modeling errors that could be critical for the simulation are displayed in real time. The graphical user interface Flair for FLUKA currently employs such an engine, where special emphasis has been given to being fast and numerically robust. The numerical robustness is achieved by a novel method of estimating the floating-point precision of the operations, which dynamically adapts all the decision operations accordingly. Moreover, a predictive caching mechanism ensures that logical errors in the geometry description are found online, without compromising the processing time by checking all regions. (authors)
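The idea of adapting decision tolerances to the achievable floating-point precision can be sketched as follows. The `ulps` factor and the plane-classification routine are illustrative assumptions for a minimal sketch, not Flair's actual engine:

```python
import math
import sys

def adaptive_eps(*values, ulps=16):
    """Comparison tolerance scaled to the magnitude of the operands,
    mimicking an engine that adapts its decisions to the floating-point
    precision actually achievable at that magnitude."""
    m = max(1.0, *(abs(v) for v in values))
    return ulps * sys.float_info.epsilon * m

def side_of_plane(point, normal, d):
    """-1/0/+1 classification of a point against the plane n.x + d = 0,
    with the 'on surface' band widened adaptively."""
    s = sum(p * n for p, n in zip(point, normal)) + d
    eps = adaptive_eps(s, d, *point, *normal)
    if abs(s) <= eps:
        return 0            # on the surface, within achievable precision
    return int(math.copysign(1, s))

# A point at a large coordinate still classifies as "on surface" even
# though the exact residual is nonzero at machine precision.
p = (1e8 + 1e-7, 0.0, 0.0)
print(side_of_plane(p, (1.0, 0.0, 0.0), -1e8))
```

A fixed absolute epsilon would either misclassify this large-coordinate case or be far too loose near the origin; scaling the band to operand magnitude avoids both.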
International Nuclear Information System (INIS)
Knuefer; Lindauer
1980-01-01
At spectacular events, moreover, a combination of component failure and human error is often found. In particular, the Rasmussen Report and the German Risk Assessment Study show for pressurised water reactors that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if thorough training of personnel is combined with an adequate design of the plant against accidents. In contrast to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)
Silva, Alessandro
1993-01-01
The papers in this wide-ranging collection report on the results of investigations from a number of linked disciplines, including complex algebraic geometry, complex analytic geometry of manifolds and spaces, and complex differential geometry.
Eisenhart, Luther Pfahler
2005-01-01
This concise text by a prominent mathematician deals chiefly with manifolds dominated by the geometry of paths. Topics include asymmetric and symmetric connections, the projective geometry of paths, and the geometry of sub-spaces. 1927 edition.
International Nuclear Information System (INIS)
Gurevich, L.Eh.; Gliner, Eh.B.
1978-01-01
Problems of investigating the space-time geometry of the Universe are described at a popular level. The space-time geometries corresponding to three cosmological models are considered. The space-time geometry of the closed model is the spherical Riemann geometry; that of the open model is the Lobachevskij geometry; and that of the flat model is the Euclidean geometry. The real geometry of the Universe in the contemporary epoch of its development is inferred from data indicating that the Universe is expanding without limit.
International Nuclear Information System (INIS)
Hull, C.M.
1993-01-01
The geometric structure of theories with gauge fields of spins two and higher should involve a higher spin generalisation of Riemannian geometry. Such geometries are discussed and the case of W_∞-gravity is analysed in detail. While the gauge group for gravity in d dimensions is the diffeomorphism group of the space-time, the gauge group for a certain W-gravity theory (which is W_∞-gravity in the case d=2) is the group of symplectic diffeomorphisms of the cotangent bundle of the space-time. Gauge transformations for W-gravity gauge fields are given by requiring the invariance of a generalised line element. Densities exist and can be constructed from the line element (generalising √(det g_μν)) only if d=1 or d=2, so that only for d=1,2 can actions be constructed. These two cases and the corresponding W-gravity actions are considered in detail. In d=2, the gauge group is effectively only a subgroup of the symplectic diffeomorphism group. Some of the constraints that arise for d=2 are similar to equations arising in the study of self-dual four-dimensional geometries and can be analysed using twistor methods, allowing contact to be made with other formulations of W-gravity. While the twistor transform for self-dual spaces with one Killing vector reduces to a Legendre transform, that for two Killing vectors gives a generalisation of the Legendre transform. (orig.)
Team errors: definition and taxonomy
International Nuclear Information System (INIS)
Sasou, Kunihide; Reason, James
1999-01-01
In error analysis and error management, the focus is usually on individuals who have made errors. In large complex systems, however, most people work in teams or groups. Given this working environment, insufficient emphasis has been placed on 'team errors'. This paper discusses the definition of team errors and their taxonomy. These notions are also applied to events that have occurred in the nuclear power, aviation and shipping industries. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). As a result, the proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication, deficiencies in resource/task management, an excessive authority gradient and excessive professional courtesy can all cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors.
Evaluating a medical error taxonomy.
Brixey, Juliana; Johnson, Todd R.; Zhang, Jiajie
2002-01-01
Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a stand...
International Nuclear Information System (INIS)
Winterflood, A.H.
1980-01-01
In discussing Einstein's Special Relativity theory it is claimed that it violates the principle of relativity itself and that an anomalous sign in the mathematics is found in the factor which transforms one inertial observer's measurements into those of another inertial observer. The apparent source of this error is discussed. Having corrected the error a new theory, called Observational Kinematics, is introduced to replace Einstein's Special Relativity. (U.K.)
International Nuclear Information System (INIS)
Wu, Q; Snyder, K; Liu, C; Huang, Y; Li, H; Chetty, I; Wen, N
2015-01-01
Purpose: To develop an optimization algorithm to reduce normal brain dose by optimizing couch and collimator angles for single isocenter multiple targets treatment of stereotactic radiosurgery. Methods: Three metastatic brain lesions were retrospectively planned using single-isocenter volumetric modulated arc therapy (VMAT). Three matrices were developed to calculate the projection of each lesion on Beam’s Eye View (BEV) by the rotating couch, collimator and gantry respectively. The island blocking problem was addressed by computing the total area of open space between any two lesions with shared MLC leaf pairs. The couch and collimator angles resulting in the smallest open areas were the optimized angles for each treatment arc. Two treatment plans with and without couch and collimator angle optimization were developed using the same objective functions and to achieve 99% of each target volume receiving full prescription dose of 18Gy. Plan quality was evaluated by calculating each target’s Conformity Index (CI), Gradient Index (GI), and Homogeneity index (HI), and absolute volume of normal brain V8Gy, V10Gy, V12Gy, and V14Gy. Results: Using the new couch/collimator optimization strategy, dose to normal brain tissue was reduced substantially. V8, V10, V12, and V14 decreased by 2.3%, 3.6%, 3.5%, and 6%, respectively. There were no significant differences in the conformity index, gradient index, and homogeneity index between two treatment plans with and without the new optimization algorithm. Conclusion: We have developed a solution to the island blocking problem in delivering radiation to multiple brain metastases with shared isocenter. Significant reduction in dose to normal brain was achieved by using optimal couch and collimator angles that minimize total area of open space between any of the two lesions with shared MLC leaf pairs. This technique has been integrated into Eclipse treatment system using scripting API
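The island-blocking objective, the total open area between lesion pairs that share MLC leaf pairs after rotating the Beam's Eye View, can be sketched in 2D. The lesion positions, radii, circular-projection model and grid search below are invented for illustration; they are not the paper's algorithm or its three projection matrices:

```python
import math

# Hypothetical BEV lesion projections: (x, y) centre and radius, in mm
lesions = [(-30.0, 5.0, 8.0), (25.0, -4.0, 7.0), (0.0, 40.0, 6.0)]

def open_area(lesions, coll_deg):
    """Total open area between lesion pairs that share MLC leaf pairs
    after rotating the BEV by the collimator angle (leaves travel in x,
    so two lesions share leaf pairs when their y-intervals overlap)."""
    a = math.radians(coll_deg)
    rot = [(x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a), r) for x, y, r in lesions]
    total = 0.0
    for i in range(len(rot)):
        for j in range(i + 1, len(rot)):
            (x1, y1, r1), (x2, y2, r2) = rot[i], rot[j]
            shared = min(y1 + r1, y2 + r2) - max(y1 - r1, y2 - r2)
            if shared > 0:                      # lesions share leaf rows
                gap = abs(x1 - x2) - (r1 + r2)  # open space between them
                if gap > 0:
                    total += shared * gap       # area the leaves cannot block
    return total

# Grid search over collimator angle for the smallest total open area
best = min(range(0, 180, 2), key=lambda c: open_area(lesions, c))
print(best, open_area(lesions, best))
```

The same search simply gains an outer loop over couch angles (with a third rotation applied before the projection) in the full two-parameter version.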
Ragon, Théa; Sladen, Anthony; Simons, Mark
2018-05-01
The ill-posed nature of earthquake source estimation derives from several factors including the quality and quantity of available observations and the fidelity of our forward theory. Observational errors are usually accounted for in the inversion process. Epistemic errors, which stem from our simplified description of the forward problem, are rarely dealt with despite their potential to bias the estimate of a source model. In this study, we explore the impact of uncertainties related to the choice of a fault geometry in source inversion problems. The geometry of a fault structure is generally reduced to a set of parameters, such as position, strike and dip, for one or a few planar fault segments. While some of these parameters can be solved for, more often they are fixed to an uncertain value. We propose a practical framework to address this limitation by following a previously implemented method exploring the impact of uncertainties on the elastic properties of our models. We develop a sensitivity analysis to small perturbations of fault dip and position. The uncertainties in fault geometry are included in the inverse problem under the formulation of the misfit covariance matrix that combines both prediction and observation uncertainties. We validate this approach with the simplified case of a fault that extends infinitely along strike, using both Bayesian and optimization formulations of a static inversion. If epistemic errors are ignored, predictions are overconfident in the data and source parameters are not reliably estimated. In contrast, inclusion of uncertainties in fault geometry allows us to infer a robust posterior source model. Epistemic uncertainties can be many orders of magnitude larger than observational errors for great earthquakes (Mw > 8). Not accounting for uncertainties in fault geometry may partly explain observed shallow slip deficits for continental earthquakes. Similarly, ignoring the impact of epistemic errors can also bias estimates of
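The misfit-covariance construction described above, observational covariance plus a prediction covariance built from the sensitivity to the uncertain fault dip, can be sketched with a deliberately simple forward model. The model, numbers and finite-difference sensitivity are illustrative assumptions, not an elastic half-space solution:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 20)                  # station positions

def predict(slip, dip_deg):
    """Toy forward model: surface displacement from uniform slip on a
    fault of given dip (purely illustrative, not an elastic solution)."""
    return slip * np.sin(np.radians(dip_deg)) / x**2

# "Observed" data generated with a dip that differs from the assumed one
dip_assumed, sigma_dip = 45.0, 5.0              # fixed dip and 1-sigma [deg]
sigma_d = 1e-3                                  # observational noise
data = predict(2.0, 50.0) + rng.normal(0.0, sigma_d, x.size)

G = predict(1.0, dip_assumed)[:, None]          # model is linear in slip
Cd = sigma_d**2 * np.eye(x.size)                # observational covariance
slip_ref = 2.0                                  # prior slip for linearization
K = (predict(slip_ref, dip_assumed + 0.5)
     - predict(slip_ref, dip_assumed - 0.5))    # d(prediction)/d(dip), per deg
Cp = sigma_dip**2 * np.outer(K, K)              # epistemic (geometry) covariance

results = {}
for name, C in (("data only", Cd), ("data + geometry", Cd + Cp)):
    W = np.linalg.inv(C)
    cov = np.linalg.inv(G.T @ W @ G)            # posterior slip variance
    slip_hat = (cov @ G.T @ W @ data).item()
    results[name] = (slip_hat, float(np.sqrt(cov.item())))
print(results)
```

With `Cd` alone the posterior uncertainty is tiny and overconfident; adding `Cp` widens it to honestly reflect the uncertain dip, which is the behaviour the abstract describes.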
Idzinga, J. C.; de Jong, A. L.; van den Bemt, P. M. L. A.
2009-01-01
Background: Previous studies, both in hospitals and in institutions for clients with an intellectual disability (ID), have shown that medication errors at the administration stage are frequent, especially when medication has to be administered through an enteral feeding tube. In hospitals a specially designed intervention programme has proven to…
Klopotowska, Joanna E.; Kuiper, Rob; van Kan, Hendrikus J.; de Pont, Anne-Cornelie; Dijkgraaf, Marcel G.; Lie-A-Huen, Loraine; Vroom, Margreeth B.; Smorenburg, Susanne M.
2010-01-01
Patients admitted to an intensive care unit (ICU) are at high risk for prescribing errors and related adverse drug events (ADEs). An effective intervention to decrease this risk, based on studies conducted mainly in North America, is on-ward participation of a clinical pharmacist in an ICU team. As
Meyer, Walter J
2006-01-01
Meyer's Geometry and Its Applications, Second Edition, combines traditional geometry with current ideas to present a modern approach that is grounded in real-world applications. It balances the deductive approach with discovery learning, and introduces axiomatic, Euclidean geometry, non-Euclidean geometry, and transformational geometry. The text integrates applications and examples throughout and includes historical notes in many chapters. The Second Edition of Geometry and Its Applications is a significant text for any college or university that focuses on geometry's usefulness in other disciplines. It is especially appropriate for engineering and science majors, as well as future mathematics teachers. Realistic applications are integrated throughout the text, including (but not limited to) symmetries of artistic patterns, physics, robotics, computer vision, computer graphics, stability of architectural structures, molecular biology, medicine and pattern recognition; historical notes are included in many chapters.
Indian Academy of Sciences (India)
algebraic geometry but also in related fields like number theory. … every vector bundle on the affine space is trivial (equivalently …). … bundles on a compact Riemann surface to unitary representations … differential geometry and topology, and was generalised in …
International Nuclear Information System (INIS)
Sloane, Peter
2007-01-01
We adapt the spinorial geometry method introduced in [J. Gillard, U. Gran and G. Papadopoulos, 'The spinorial geometry of supersymmetric backgrounds,' Class. Quant. Grav. 22 (2005) 1033 (arXiv:hep-th/0410155)]
Liu, Hsiu-Chu; Li, Hsing; Chang, Hsin-Fei; Lu, Mei-Rou; Chen, Feng-Chuan
2015-01-01
Learning from the experience of another medical center in Taiwan, Kaohsiung Municipal Kai-Suan Psychiatric Hospital has changed its nursing informatics system step by step over the past year and a half. We considered ethics in the original idea of implementing barcodes on test tube labels to identify psychiatric patients. The main aims of this project are to keep patient information confidential and to transport samples efficiently. The primary nurses used different worksheets for this project to ensure acceptance of the new barcode system. In the past two years, errors in the blood testing process were as high as 11,000 in 14,000 events per year, resulting in wasted resources. The actions taken by the nurses and the implementation of the new barcode system can improve clinical nursing care quality, patient safety and efficiency, while decreasing costs due to human error.
An Error Analysis of Structured Light Scanning of Biological Tissue
DEFF Research Database (Denmark)
Jensen, Sebastian Hoppe Nesgaard; Wilm, Jakob; Aanæs, Henrik
2017-01-01
This paper presents an error analysis and correction model for four structured light methods applied to three common types of biological tissue: skin, fat and muscle. Despite its many advantages, structured light is based on the assumption of direct reflection at the object surface only. This assumption is violated by most biological material, e.g. human skin, which exhibits subsurface scattering. In this study, we find that in general, structured light scans of biological tissue deviate significantly from the ground truth. We show that a large portion of this error can be predicted with a simple, statistical linear model based on the scan geometry. As such, scans can be corrected without introducing any specially designed pattern strategy or hardware. We can effectively reduce the error in a structured light scanner applied to biological tissue by as much as a factor of two or three.
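The correction scheme, predicting the scan error with a linear model in the scan geometry and subtracting it, can be sketched on synthetic data. The geometry features (incidence angle, camera distance) and all coefficients below are invented for illustration, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for scans of tissue: depth error grows with viewing
# angle and camera distance, plus unpredictable measurement noise.
n = 500
angle = rng.uniform(0, 60, n)          # incidence angle [deg]
dist = rng.uniform(300, 600, n)        # camera distance [mm]
systematic = 0.05 + 0.004 * angle + 0.0002 * dist
measured_err = systematic + rng.normal(0, 0.02, n)

# Fit the simple statistical linear model error ~ geometry features
X = np.column_stack([np.ones(n), angle, dist])
beta, *_ = np.linalg.lstsq(X, measured_err, rcond=None)

# Correct scans by subtracting the predicted (systematic) part
corrected = measured_err - X @ beta
print(np.std(measured_err), np.std(corrected))
```

On this synthetic example the correction removes the geometry-dependent part and leaves only the noise floor, the same "factor of two or three" kind of reduction the abstract reports for real tissue scans.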
Geometry essentials for dummies
Ryan, Mark
2011-01-01
Just the critical concepts you need to score high in geometry This practical, friendly guide focuses on critical concepts taught in a typical geometry course, from the properties of triangles, parallelograms, circles, and cylinders, to the skills and strategies you need to write geometry proofs. Geometry Essentials For Dummies is perfect for cramming or doing homework, or as a reference for parents helping kids study for exams. Get down to the basics - get a handle on the basics of geometry, from lines, segments, and angles, to vertices, altitudes, and diagonals Conque
Arithmetic noncommutative geometry
Marcolli, Matilde
2005-01-01
Arithmetic noncommutative geometry denotes the use of ideas and tools from the field of noncommutative geometry, to address questions and reinterpret in a new perspective results and constructions from number theory and arithmetic algebraic geometry. This general philosophy is applied to the geometry and arithmetic of modular curves and to the fibers at archimedean places of arithmetic surfaces and varieties. The main reason why noncommutative geometry can be expected to say something about topics of arithmetic interest lies in the fact that it provides the right framework in which the tools of geometry continue to make sense on spaces that are very singular and apparently very far from the world of algebraic varieties. This provides a way of refining the boundary structure of certain classes of spaces that arise in the context of arithmetic geometry, such as moduli spaces (of which modular curves are the simplest case) or arithmetic varieties (completed by suitable "fibers at infinity"), by adding boundaries...
Energy Technology Data Exchange (ETDEWEB)
Bir, R [Commissariat a l' Energie Atomique, Saclay (France).Centre d' Etudes Nucleaires
1961-07-01
One of the most serious causes of systematic error in isotopic analyses of uranium from UF₆ is the tendency of this material to become fixed in various ways in the mass spectrometer. As a result, the value indicated by the instrument is influenced by the isotopic composition of the substances previously analysed. The resulting error is called a memory error. Making use of an elementary mathematical theory, the various methods used to reduce memory errors are analysed and compared. A new method is then suggested, which reduces the memory errors to an extent where they become negligible over a wide range of ²³⁵U concentration. The method is given in full, together with examples of its application. (author)
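A minimal mixing model makes the memory error concrete: suppose a fixed fraction of each reading comes from the previously analysed sample. The 3% memory fraction and the inversion below are illustrative assumptions for a sketch, not the method proposed in the paper:

```python
def observed(true_ratios, memory=0.03):
    """Instrument reading contaminated by the previously analysed
    sample: a fraction `memory` of the signal reflects the prior
    material (an illustrative model of the memory effect)."""
    out, prev = [], true_ratios[0]
    for x in true_ratios:
        out.append((1 - memory) * x + memory * prev)
        prev = x
    return out

def corrected(readings, memory=0.03):
    """Invert the mixing model to recover the true isotopic ratios,
    using the previously recovered value as the contaminant."""
    out, prev = [], readings[0]
    for y in readings:
        x = (y - memory * prev) / (1 - memory)
        out.append(x)
        prev = x
    return out

truth = [0.72, 0.72, 3.0, 3.0, 0.72]   # % 235U of successive samples
read = observed(truth)                  # enriched sample reads low after
fixed = corrected(read)                 # natural uranium, and vice versa
print(read)
print(fixed)
```

The bias is largest exactly where the abstract says it matters: whenever consecutive samples differ strongly in isotopic composition.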
Numerical determination of transmission probabilities in cylindrical geometry
International Nuclear Information System (INIS)
Queiroz Bogado Leite, S. de.
1989-11-01
Efficient methods for numerical calculation of transmission probabilities in cylindrical geometry are presented. Relative errors of the order of 10{sup -5} or smaller are obtained using analytical solutions and low order quadrature integration schemes. (author)
Bárány, Imre; Vilcu, Costin
2016-01-01
This volume presents easy-to-understand yet surprising properties obtained using topological, geometric and graph theoretic tools in the areas covered by the Geometry Conference that took place in Mulhouse, France from September 7–11, 2014 in honour of Tudor Zamfirescu on the occasion of his 70th anniversary. The contributions address subjects in convexity and discrete geometry, in distance geometry or with geometrical flavor in combinatorics, graph theory or non-linear analysis. Written by top experts, these papers highlight the close connections between these fields, as well as ties to other domains of geometry and their reciprocal influence. They offer an overview on recent developments in geometry and its border with discrete mathematics, and provide answers to several open questions. The volume addresses a large audience in mathematics, including researchers and graduate students interested in geometry and geometrical problems.
International Nuclear Information System (INIS)
HAALAND, DAVID M.; VAN BENTHEM, MARK H.; WEHLBURG, CHRISTINE M.; KOEHLER IV, FREDERICK W.
2002-01-01
Hyperspectral Fourier transform infrared images have been obtained from a neoprene sample aged in air at elevated temperatures. The massive number of spectra available from this heterogeneous sample provides the opportunity to perform quantitative analysis of the spectral data without the need for calibration standards. Multivariate curve resolution (MCR) methods with non-negativity constraints applied to the iterative alternating least squares analysis of the spectral data have been shown to achieve the goal of quantitative image analysis without the use of standards. However, the pure-component spectra and the relative concentration maps were heavily contaminated by the presence of system artifacts in the spectral data. We have demonstrated that the detrimental effects of these artifacts can be minimized by adding an estimate of the error covariance structure of the spectral image data to the MCR algorithm. The estimate is added by augmenting the concentration and pure-component spectra matrices with scores and eigenvectors obtained from the mean-centered repeat image differences of the sample. The augmentation is implemented by employing efficient equality constraints on the MCR analysis. Augmentation with the scores from the repeat images is found to primarily improve the pure-component spectral estimates, while augmentation with the corresponding eigenvectors primarily improves the concentration maps. Augmentation with both scores and eigenvectors yielded the best result, generating less noisy pure-component spectral estimates and relative concentration maps largely free of a striping artifact present due to system errors in the FT-IR images. The MCR methods presented are general and can also be applied productively to non-image spectral data.
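The alternating least squares core of an MCR analysis with non-negativity constraints can be sketched as follows. This is a minimal illustration under stated assumptions: the function name `mcr_als` and the clip-based projection are ours (proper non-negative least squares is normally used), and the score/eigenvector augmentation described in the abstract is omitted.

```python
import numpy as np

def mcr_als(D, n_components, n_iter=100, seed=0):
    """Minimal MCR-ALS sketch: factor D (pixels x wavelengths) into
    non-negative concentrations C (pixels x k) and pure-component
    spectra S (wavelengths x k), so that D ~= C @ S.T."""
    rng = np.random.default_rng(seed)
    n_pix, n_wav = D.shape
    S = rng.random((n_wav, n_components))  # initial spectra guess
    for _ in range(n_iter):
        # Solve D ~= C S^T for C, then clip to enforce non-negativity
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T
        C = np.clip(C, 0, None)
        # Solve for S given C, clip likewise
        S = np.linalg.lstsq(C, D, rcond=None)[0].T
        S = np.clip(S, 0, None)
    return C, S
```

On synthetic data built from known non-negative factors, the alternating updates typically recover a low-rank non-negative fit without any calibration standards, which is the point the abstract makes.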
Algorithms in Algebraic Geometry
Dickenstein, Alicia; Sommese, Andrew J
2008-01-01
In the last decade, there has been a burgeoning of activity in the design and implementation of algorithms for algebraic geometric computation. Some of these algorithms were originally designed for abstract algebraic geometry, but now are of interest for use in applications and some of these algorithms were originally designed for applications, but now are of interest for use in abstract algebraic geometry. The workshop on Algorithms in Algebraic Geometry that was held in the framework of the IMA Annual Program Year in Applications of Algebraic Geometry by the Institute for Mathematics and Its
O'Leary, Michael
2010-01-01
Guides readers through the development of geometry and basic proof writing using a historical approach to the topic. In an effort to fully appreciate the logic and structure of geometric proofs, Revolutions of Geometry places proofs into the context of geometry's history, helping readers to understand that proof writing is crucial to the job of a mathematician. Written for students and educators of mathematics alike, the book guides readers through the rich history and influential works, from ancient times to the present, behind the development of geometry. As a result, readers are successfull
Fundamental concepts of geometry
Meserve, Bruce E
1983-01-01
Demonstrates relationships between different types of geometry. Provides excellent overview of the foundations and historical evolution of geometrical concepts. Exercises (no solutions). Includes 98 illustrations.
Developments in special geometry
International Nuclear Information System (INIS)
Mohaupt, Thomas; Vaughan, Owen
2012-01-01
We review the special geometry of N = 2 supersymmetric vector and hypermultiplets with emphasis on recent developments and applications. A new formulation of the local c-map based on the Hesse potential and special real coordinates is presented. Other recent developments include the Euclidean version of special geometry, and generalizations of special geometry to non-supersymmetric theories. As applications we discuss the proof that the local r-map and c-map preserve geodesic completeness, and the construction of four- and five-dimensional static solutions through dimensional reduction over time. The shared features of the real, complex and quaternionic version of special geometry are stressed throughout.
Performance Analysis of a Decoding Algorithm for Algebraic Geometry Codes
DEFF Research Database (Denmark)
Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund; Høholdt, Tom
1998-01-01
We analyse the known decoding algorithms for algebraic geometry codes in the case where the number of errors is greater than or equal to [(dFR-1)/2]+1, where dFR is the Feng-Rao distance.
Recent results in the decoding of Algebraic geometry codes
DEFF Research Database (Denmark)
Høholdt, Tom; Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund
1998-01-01
We analyse the known decoding algorithms for algebraic geometry codes in the case where the number of errors is [(dFR-1)/2]+1, where dFR is the Feng-Rao distance.
Vinay BC; Nikhitha MK; Patel Sunil B
2015-01-01
This review article explains medication errors clearly and concisely: their definition, the scope of the medication error problem, types of medication errors, common causes, monitoring, consequences, prevention, and management, supported by easy-to-understand tables.
Systematics of IIB spinorial geometry
Gran, U.; Gutowski, J.; Papadopoulos, G.; Roest, D.
2005-01-01
We reduce the classification of all supersymmetric backgrounds of IIB supergravity to the evaluation of the Killing spinor equations and their integrability conditions, which contain the field equations, on five types of spinors. This extends the work of [hep-th/0503046] to IIB supergravity. We give the expressions of the Killing spinor equations on all five types of spinors. In this way, the Killing spinor equations become a linear system for the fluxes, geometry and spacetime derivatives of...
Geometry of multihadron production
Energy Technology Data Exchange (ETDEWEB)
Bjorken, J.D.
1994-10-01
This summary talk only reviews a small sample of topics featured at this symposium: Introduction; The Geometry and Geography of Phase space; Space-Time Geometry and HBT; Multiplicities, Intermittency, Correlations; Disoriented Chiral Condensate; Deep Inelastic Scattering at HERA; and Other Contributions.
1996-01-01
Designs and Finite Geometries brings together in one place important contributions and up-to-date research results in this important area of mathematics. Designs and Finite Geometries serves as an excellent reference, providing insight into some of the most important research issues in the field.
Morris, Barbara H.
2004-01-01
This article describes a geometry project that used the beauty of stained-glass-window designs to teach middle school students about geometric figures and concepts. Three honors prealgebra teachers and a middle school mathematics gifted intervention specialist created a geometry project that covered the curriculum and also assessed students'…
Social aspects of clinical errors.
Richman, Joel; Mason, Tom; Mason-Whitehead, Elizabeth; McIntosh, Annette; Mercer, Dave
2009-08-01
Clinical errors, whether committed by doctors, nurses or other professions allied to healthcare, remain a sensitive issue requiring open debate and policy formulation in order to reduce them. The literature suggests that the issues underpinning errors made by healthcare professionals involve concerns about patient safety, professional disclosure, apology, litigation, compensation, processes of recording and policy development to enhance quality service. Anecdotally, we are aware of narratives of minor errors, which may well have been covered up and remain officially undisclosed whilst the major errors resulting in damage and death to patients alarm both professionals and public with resultant litigation and compensation. This paper attempts to unravel some of these issues by highlighting the historical nature of clinical errors and drawing parallels to contemporary times by outlining the 'compensation culture'. We then provide an overview of what constitutes a clinical error and review the healthcare professional strategies for managing such errors.
Methods of information geometry
Amari, Shun-Ichi
2000-01-01
Information geometry provides the mathematical sciences with a new framework of analysis. It has emerged from the investigation of the natural differential geometric structure on manifolds of probability distributions, which consists of a Riemannian metric defined by the Fisher information and a one-parameter family of affine connections called the \\alpha-connections. The duality between the \\alpha-connection and the (-\\alpha)-connection together with the metric play an essential role in this geometry. This kind of duality, having emerged from manifolds of probability distributions, is ubiquitous, appearing in a variety of problems which might have no explicit relation to probability theory. Through the duality, it is possible to analyze various fundamental problems in a unified perspective. The first half of this book is devoted to a comprehensive introduction to the mathematical foundation of information geometry, including preliminaries from differential geometry, the geometry of manifolds or probability d...
Siracuse, Jeffrey J; Benoit, Eric; Burke, Janet; Carter, Steven; Schwaitzberg, Steven D
2014-03-01
The decision to perform an elective procedure often originates during an office visit between surgeon and patient. Several administrative tasks follow, including scheduling or "booking" of the case and obtaining informed consent. These processes require communicating accurate information regarding diagnosis, procedure, and other patient-specific details necessary for the safe and effective performance of an operation. Nonstandardized and paper-based consents pose difficulty with legibility, portability, and consistency, thereby representing a source of potential error and inefficiency. There are numerous barriers to efficiently booking elective surgical procedures and obtaining a legible, complete, and easily retrievable informed consent. An integrated Web-based booking and consent system was developed at a multisite university-affiliated community hospital system to improve the speed and quality of work flow, as well as communication with both patients and staff. A booking and consent system was developed and made available over the intranet. This customized system was created by leveraging existing information systems. The electronic consent system uses surgeon-specific templates and allows for a consistent approach to each procedure. A printed consent form can be generated at any time from any of the health care system's three campuses and is commonly stored in the electronic medical record. Integration into our perioperative system allows for coordination with the operating room staff, administrative personnel, financial coordinators, and central supply. Total systems expenditure for development was estimated at $40,000 (US). Organizations considering standardizing their own consent and operating room booking processes can review this experience in making their own "make or buy" decision for their own settings.
Geometry on the space of geometries
International Nuclear Information System (INIS)
Christodoulakis, T.; Zanelli, J.
1988-06-01
We discuss the geometric structure of the configuration space of pure gravity. This is an infinite dimensional manifold, M, where each point represents one spatial geometry g_{ij}(x). The metric on M is dictated by geometrodynamics, and from it the Christoffel symbols and Riemann tensor can be found. A "free geometry" tracing a geodesic on the manifold describes the time evolution of space in the strong gravity limit. In a regularization previously introduced by the authors, it is found that M does not have the same dimensionality, D, everywhere, and that D is not a scalar, although it is covariantly constant. In this regularization, it is seen that the path integral measure can be absorbed in a renormalization of the cosmological constant. (author). 19 refs
Energy Technology Data Exchange (ETDEWEB)
Vinyard, Natalia Sergeevna [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Perry, Theodore Sonne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Usov, Igor Olegovich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-10-04
We calculate opacity from k(hν) = -ln[T(hν)]/(ρL), where T(hν) is the transmission for photon energy hν, ρ is the sample density, and L is the path length through the sample. The density and path length are measured together by Rutherford backscatter. Δk = (∂k/∂T)ΔT + (∂k/∂(ρL))Δ(ρL). We can re-write this in terms of fractional error as Δk/k = Δln(T)/ln(T) + Δ(ρL)/(ρL). Transmission itself is calculated from T = (U-E)/(V-E) = B/B_{0}, where B is the transmitted backlighter (BL) signal and B_{0} is the unattenuated backlighter signal. Then ΔT/T = Δln(T) = ΔB/B + ΔB_{0}/B_{0}, and consequently Δk/k = (1/ln T)(ΔB/B + ΔB_{0}/B_{0}) + Δ(ρL)/(ρL). Transmission is measured in the range of 0.2
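The error-propagation chain above (signal errors, to transmission error, to fractional opacity error) can be sketched numerically. This is a hedged illustration with our own function and variable names, assuming k = -ln(T)/(ρL) and T = B/B0 as in the abstract:

```python
import math

def opacity_error(B, dB, B0, dB0, rhoL, drhoL):
    """Fractional opacity error dk/k for k = -ln(T)/(rho*L), T = B/B0.

    Combines backlighter-signal errors (dB, dB0) and areal-density
    error (drhoL) by linear error propagation:
        dk/k = (1/|ln T|) * (dB/B + dB0/B0) + d(rhoL)/(rhoL)
    """
    T = B / B0
    # dT/T = d(ln T) = dB/B + dB0/B0
    dT_over_T = dB / B + dB0 / B0
    return dT_over_T / abs(math.log(T)) + drhoL / rhoL

# Example: 40% transmission, 2% error on each signal, 3% areal-density error
frac_err = opacity_error(B=0.4, dB=0.008, B0=1.0, dB0=0.02, rhoL=1.0, drhoL=0.03)
```

Note the 1/|ln T| amplification: as T approaches 1 the opacity becomes very sensitive to signal noise, which is why the transmission is kept in a restricted measurement range.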
Planetary Image Geometry Library
Deen, Robert C.; Pariser, Oleg
2010-01-01
The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions ranged from two days to a few months, resulting in significant cost savings as compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. A
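The camera-model idea above (mapping image coordinates to and from view vectors in XYZ space) can be illustrated with a minimal pinhole model. This is a hedged sketch: the class name and parameters (fx, fy, cx, cy) are our own illustration, not PIG's actual C++ interface.

```python
import numpy as np

class PinholeCamera:
    """Toy camera model: maps pixel coordinates to unit view vectors
    in camera XYZ space and back, in the spirit of a mission-agnostic
    camera-model object."""
    def __init__(self, fx, fy, cx, cy):
        # Focal lengths (pixels) and principal point
        self.fx, self.fy, self.cx, self.cy = fx, fy, cx, cy

    def pixel_to_view(self, u, v):
        # Back-project a pixel to a unit direction vector
        d = np.array([(u - self.cx) / self.fx, (v - self.cy) / self.fy, 1.0])
        return d / np.linalg.norm(d)

    def view_to_pixel(self, d):
        # Project a direction vector (z > 0) back to pixel coordinates
        x, y, z = d
        return self.cx + self.fx * x / z, self.cy + self.fy * y / z

cam = PinholeCamera(fx=800, fy=800, cx=512, cy=512)
u, v = cam.view_to_pixel(cam.pixel_to_view(600, 400))  # round-trips to (600, 400)
```

A mission-specific subclass would override only the pointing-dependent pieces, which mirrors the library's design of subclassing just the truly mission-specific parts.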
Complex and symplectic geometry
Medori, Costantino; Tomassini, Adriano
2017-01-01
This book arises from the INdAM Meeting "Complex and Symplectic Geometry", which was held in Cortona in June 2016. Several leading specialists, including young researchers, in the field of complex and symplectic geometry, present the state of the art of their research on topics such as the cohomology of complex manifolds; analytic techniques in Kähler and non-Kähler geometry; almost-complex and symplectic structures; special structures on complex manifolds; and deformations of complex objects. The work is intended for researchers in these areas.
Kulczycki, Stefan
2008-01-01
This accessible approach features two varieties of proofs: stereometric and planimetric, as well as elementary proofs that employ only the simplest properties of the plane. A short history of geometry precedes a systematic exposition of the principles of non-Euclidean geometry.Starting with fundamental assumptions, the author examines the theorems of Hjelmslev, mapping a plane into a circle, the angle of parallelism and area of a polygon, regular polygons, straight lines and planes in space, and the horosphere. Further development of the theory covers hyperbolic functions, the geometry of suff
Errors in clinical laboratories or errors in laboratory medicine?
Plebani, Mario
2006-01-01
Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes
Roe, John
2003-01-01
Coarse geometry is the study of spaces (particularly metric spaces) from a 'large scale' point of view, so that two spaces that look the same from a great distance are actually equivalent. This point of view is effective because it is often true that the relevant geometric properties of metric spaces are determined by their coarse geometry. Two examples of important uses of coarse geometry are Gromov's beautiful notion of a hyperbolic group and Mostow's proof of his famous rigidity theorem. The first few chapters of the book provide a general perspective on coarse structures. Even when only metric coarse structures are in view, the abstract framework brings the same simplification as does the passage from epsilons and deltas to open sets when speaking of continuity. The middle section reviews notions of negative curvature and rigidity. Modern interest in large scale geometry derives in large part from Mostow's rigidity theorem and from Gromov's subsequent 'large scale' rendition of the crucial properties of n...
Lectures on Symplectic Geometry
Silva, Ana Cannas
2001-01-01
The goal of these notes is to provide a fast introduction to symplectic geometry for graduate students with some knowledge of differential geometry, de Rham theory and classical Lie groups. This text addresses symplectomorphisms, local forms, contact manifolds, compatible almost complex structures, Kaehler manifolds, hamiltonian mechanics, moment maps, symplectic reduction and symplectic toric manifolds. It contains guided problems, called homework, designed to complement the exposition or extend the reader's understanding. There are by now excellent references on symplectic geometry, a subset of which is in the bibliography of this book. However, the most efficient introduction to a subject is often a short elementary treatment, and these notes attempt to serve that purpose. This text provides a taste of areas of current research and will prepare the reader to explore recent papers and extensive books on symplectic geometry where the pace is much faster. For this reprint numerous corrections and cl...
Kollár, János
1997-01-01
This volume contains the lectures presented at the third Regional Geometry Institute at Park City in 1993. The lectures provide an introduction to the subject, complex algebraic geometry, making the book suitable as a text for second- and third-year graduate students. The book deals with topics in algebraic geometry where one can reach the level of current research while starting with the basics. Topics covered include the theory of surfaces from the viewpoint of recent higher-dimensional developments, providing an excellent introduction to more advanced topics such as the minimal model program. Also included is an introduction to Hodge theory and intersection homology based on the simple topological ideas of Lefschetz and an overview of the recent interactions between algebraic geometry and theoretical physics, which involve mirror symmetry and string theory.
DEFF Research Database (Denmark)
Kokkendorff, Simon Lyngby
2002-01-01
The subject of this Ph.D.-thesis is somewhere in between continuous and discrete geometry. Chapter 2 treats the geometry of finite point sets in semi-Riemannian hyperquadrics, using a matrix whose entries are a trigonometric function of relative distances in a given point set. The distance … to the geometry of a simplex in a semi-Riemannian hyperquadric. In chapter 3 we study which finite metric spaces are realizable in a hyperbolic space in the limit where curvature goes to -∞. We show that such spaces are the so-called leaf spaces, the set of degree 1 vertices of weighted trees. We also … establish results on the limiting geometry of such an isometrically realized leaf space simplex in hyperbolic space, when curvature goes to -∞. Chapter 4 discusses negative type of metric spaces. We give a measure theoretic treatment of this concept and related invariants. The theory developed …
Busemann, Herbert
2005-01-01
A comprehensive approach to qualitative problems in intrinsic differential geometry, this text examines Desarguesian spaces, perpendiculars and parallels, covering spaces, the influence of the sign of the curvature on geodesics, more. 1955 edition. Includes 66 figures.
Tabachnikov, Serge
2005-01-01
Mathematical billiards describe the motion of a mass point in a domain with elastic reflections off the boundary or, equivalently, the behavior of rays of light in a domain with ideally reflecting boundary. From the point of view of differential geometry, the billiard flow is the geodesic flow on a manifold with boundary. This book is devoted to billiards in their relation with differential geometry, classical mechanics, and geometrical optics. The topics covered include variational principles of billiard motion, symplectic geometry of rays of light and integral geometry, existence and nonexistence of caustics, optical properties of conics and quadrics and completely integrable billiards, periodic billiard trajectories, polygonal billiards, mechanisms of chaos in billiard dynamics, and the lesser-known subject of dual (or outer) billiards. The book is based on an advanced undergraduate topics course (but contains more material than can be realistically taught in one semester). Although the minimum prerequisit...
Introduction to tropical geometry
Maclagan, Diane
2015-01-01
Tropical geometry is a combinatorial shadow of algebraic geometry, offering new polyhedral tools to compute invariants of algebraic varieties. It is based on tropical algebra, where the sum of two numbers is their minimum and the product is their sum. This turns polynomials into piecewise-linear functions, and their zero sets into polyhedral complexes. These tropical varieties retain a surprising amount of information about their classical counterparts. Tropical geometry is a young subject that has undergone a rapid development since the beginning of the 21st century. While establishing itself as an area in its own right, deep connections have been made to many branches of pure and applied mathematics. This book offers a self-contained introduction to tropical geometry, suitable as a course text for beginning graduate students. Proofs are provided for the main results, such as the Fundamental Theorem and the Structure Theorem. Numerous examples and explicit computations illustrate the main concepts. Each of t...
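The min-plus arithmetic described here is easy to state in code. This is a toy illustration (function names are ours): tropical addition is the minimum, tropical multiplication is the ordinary sum, so a classical monomial c·x^i becomes the linear function c + i·x and a polynomial becomes the minimum of such lines, i.e. a piecewise-linear function.

```python
def trop_add(a, b):
    # Tropical "sum" is the minimum
    return min(a, b)

def trop_mul(a, b):
    # Tropical "product" is the ordinary sum
    return a + b

def trop_poly(coeffs, x):
    """Evaluate the tropical polynomial min_i (c_i + i*x):
    each monomial c_i * x^i turns into the line c_i + i*x,
    and the polynomial is the minimum of these lines."""
    return min(c + i * x for i, c in enumerate(coeffs))

# Coefficients [3, 1, 0] give the piecewise-linear map min(3, 1 + x, 2x)
value = trop_poly([3, 1, 0], 5)  # min(3, 1+5, 0+10) = 3
```

The "zero set" (tropical variety) of such a polynomial is where the minimum is attained by at least two of the lines simultaneously, which is why tropical varieties are polyhedral complexes.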
Rudiments of algebraic geometry
Jenner, WE
2017-01-01
Aimed at advanced undergraduate students of mathematics, this concise text covers the basics of algebraic geometry. Topics include affine spaces, projective spaces, rational curves, algebraic sets with group structure, more. 1963 edition.
Implosions and hypertoric geometry
DEFF Research Database (Denmark)
Dancer, A.; Kirwan, F.; Swann, A.
2013-01-01
The geometry of the universal hyperkahler implosion for SU(n) is explored. In particular, we show that the universal hyperkahler implosion naturally contains a hypertoric variety described in terms of quivers. Furthermore, we discuss a gauge theoretic approach to hyperkahler implosion.
Intermediate algebra & analytic geometry
Gondin, William R
1967-01-01
Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system
Ceresole, Anna; Gnecchi, Alessandra; Marrani, Alessio
2013-01-01
We analyze some properties of the four dimensional supergravity theories which originate from five dimensions upon reduction. They generalize to N>2 extended supersymmetries the d-geometries with cubic prepotentials, familiar from N=2 special Kähler geometry. We emphasize the role of a suitable parametrization of the scalar fields and the corresponding triangular symplectic basis. We also consider applications to the first order flow equations for non-BPS extremal black holes.
International Nuclear Information System (INIS)
Siva, Shankar; Devereux, Tomas; Kron, Tomas
2014-01-01
The purpose of this study is to assess the impact of a vacuum immobilisation system on reproducibility of patient set-up, interfraction stability and tumour motion amplitude. From February 2010 to February 2012, as part of a prospective clinical trial, 12 patients with solitary pulmonary metastases had consecutive four-dimensional computed tomography (4DCT) scans performed with and without vacuum immobilisation. The displacement of the tumour centroid position was recorded in each of the 10 phases of the 4DCT reconstruction. A further six patients with seven metastases underwent single-fraction stereotactic ablative body radiotherapy (SABR) during this period (a total of 19 targets) and were included in an analysis of positional reproducibility and intrafraction immobilisation. Couch shifts were recorded in the medio-lateral (X), cranio-caudal (Y) and ventro-dorsal (Z) planes. For the 19 treatments delivered, the median (0-90% range) shift required immediately pretreatment was 1 mm (0-3) in the X-plane, 2 mm (0-6) in the Y-plane and 4 mm (0-8) in the Z-plane, respectively. The mean (± standard deviation) mid-treatment shifts were 0.3 mm (± 0.7), 1.1 mm (± 2) and 0.8 mm (± 1.5) in the X, Y and Z planes, respectively. Mid-treatment shifts were <2 mm in all directions (P<0.001). The length of treatment time correlated with the required shifts in the Z plane (r²=0.377, P=0.005), but not in the X or Y planes (P=0.198 and P=0.653, respectively). In the subset of 12 patients who had two 4DCTs, the median (range) amplitudes of tumour displacement in the X, Y and Z planes when immobilised were 0.9 mm (0.3-2.9), 2.6 mm (0.2-10.6) and 1.6 mm (0.5-5.5), respectively. Immobilisation reduced the volume of tumour displacement during respiration by a median of 52.6% (P=0.021). Vacuum immobilisation reduces total tumour excursion, facilitates reproducible positioning and provides robust intrafractional immobilisation during SABR treatments for pulmonary metastases.
Soudarissanane, S.S.
2016-01-01
Over the past few decades, Terrestrial Laser Scanners are increasingly being used in a broad spectrum of applications, from surveying to civil engineering, medical modeling and forensics. Especially surveying applications require on one hand a quickly obtainable, high resolution point cloud but also
International Nuclear Information System (INIS)
Osborne, I; Brownson, E; Eulisse, G; Jones, C D; Sexton-Kennedy, E; Lange, D J
2014-01-01
CMS faces real challenges with the upgrade of the CMS detector through 2020 and beyond. One of these challenges, from the software point of view, is managing upgrade simulations with the same software release as the 2013 scenario. We present the CMS geometry description software model and its integration with the CMS event setup and core software. The CMS geometry configuration and selection is implemented in Python. The tools collect the Python configuration fragments into a script used in the CMS workflow. This flexible and automated geometry configuration allows choosing either a transient or a persistent version of a scenario, as well as a specific version of the same scenario. We describe how the geometries are integrated and validated, and how we define and handle different geometry scenarios in simulation and reconstruction. We discuss how to transparently manage multiple incompatible geometries in the same software release. Several examples based on the current implementation are shown, assuring a consistent choice of scenario conditions. The consequences and implications for multiple/different code algorithms are discussed.
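The scenario-selection pattern described above can be sketched in miniature. The following is a hypothetical illustration, not CMS software: all names (`Scenario`, `GEOMETRIES`, `build_geometry`) and the scenario contents are invented to show how Python configuration fragments can be collected and selected by scenario name and version.

```python
# Hypothetical geometry-configuration registry, loosely modelled on the
# scenario selection described in the abstract. Illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    name: str            # e.g. "2013" or "2020Upgrade"
    version: int         # a specific version of the same scenario
    persistent: bool     # persistent vs transient variant
    fragments: tuple     # configuration fragments to collect

GEOMETRIES = {}

def register(scenario):
    """Collect a scenario so workflows can select it by (name, version)."""
    GEOMETRIES[(scenario.name, scenario.version)] = scenario
    return scenario

register(Scenario("2013", 1, persistent=True, fragments=("tracker", "calo")))
register(Scenario("2020Upgrade", 2, persistent=False,
                  fragments=("tracker", "calo", "muon")))

def build_geometry(name, version):
    """Assemble the fragment list for the requested scenario."""
    s = GEOMETRIES[(name, version)]
    return [f"{s.name}/v{s.version}/{frag}" for frag in s.fragments]

print(build_geometry("2020Upgrade", 2))
```

A registry keyed by (name, version) is one simple way to keep incompatible geometries side by side in a single release: nothing is shared between scenarios except the selection mechanism.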
Software Geometry in Simulations
Alion, Tyler; Viren, Brett; Junk, Tom
2015-04-01
The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it is a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way that allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).
Geometric Monte Carlo and black Janus geometries
Energy Technology Data Exchange (ETDEWEB)
Bak, Dongsu, E-mail: dsbak@uos.ac.kr [Physics Department, University of Seoul, Seoul 02504 (Korea, Republic of); B.W. Lee Center for Fields, Gravity & Strings, Institute for Basic Sciences, Daejeon 34047 (Korea, Republic of); Kim, Chanju, E-mail: cjkim@ewha.ac.kr [Department of Physics, Ewha Womans University, Seoul 03760 (Korea, Republic of); Kim, Kyung Kiu, E-mail: kimkyungkiu@gmail.com [Department of Physics, Sejong University, Seoul 05006 (Korea, Republic of); Department of Physics, College of Science, Yonsei University, Seoul 03722 (Korea, Republic of); Min, Hyunsoo, E-mail: hsmin@uos.ac.kr [Physics Department, University of Seoul, Seoul 02504 (Korea, Republic of); Song, Jeong-Pil, E-mail: jeong_pil_song@brown.edu [Department of Chemistry, Brown University, Providence, RI 02912 (United States)
2017-04-10
We describe an application of the Monte Carlo method to the Janus deformation of the black brane background. We present numerical results for three- and five-dimensional black Janus geometries with planar and spherical interfaces. In particular, we argue that the 5D geometry with a spherical interface has an application in understanding the finite temperature bag-like QCD model via the AdS/CFT correspondence. The accuracy and convergence of the algorithm are evaluated with respect to the grid spacing. The systematic errors of the method are determined using an exact solution of 3D black Janus. This numerical approach for solving linear problems is unaffected by the initial guess of a trial solution and can handle an arbitrary geometry under various boundary conditions in the presence of source fields.
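The abstract does not give the solver itself; as a generic sketch of how accuracy can be evaluated against grid spacing, the following Jacobi relaxation for a 1D model problem with a known exact solution shows the expected second-order behaviour, and its result is independent of the initial guess. All details (the equation, grid sizes, tolerance) are illustrative, not the authors' method.

```python
import math

def solve_poisson(n, tol=1e-10, max_iter=200000):
    """Solve u'' = -pi^2 sin(pi x) on [0,1], u(0)=u(1)=0, by Jacobi
    relaxation on n+1 grid points; return max error vs exact u = sin(pi x)."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [-math.pi**2 * math.sin(math.pi * xi) for xi in x]
    u = [0.0] * (n + 1)                 # trial solution (any guess works)
    for _ in range(max_iter):
        new = u[:]
        for i in range(1, n):
            new[i] = 0.5 * (u[i-1] + u[i+1] - h * h * f[i])
        diff = max(abs(a - b) for a, b in zip(new, u))
        u = new
        if diff < tol:
            break
    exact = [math.sin(math.pi * xi) for xi in x]
    return max(abs(a - b) for a, b in zip(u, exact))

e1, e2 = solve_poisson(10), solve_poisson(20)
print(e1 / e2)   # roughly 4: halving the grid spacing quarters the error
```

The ratio near 4 is the signature of the O(h²) discretization error; the same grid-spacing study is what establishes the accuracy of a relaxation solver on a more complicated geometry.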
Global aspects of complex geometry
Catanese, Fabrizio; Huckleberry, Alan T
2006-01-01
This book presents an overview of developments in complex geometry, covering topics that range from curve and surface theory through special varieties in higher dimensions, moduli theory, Kähler geometry, and group actions to Hodge theory and characteristic-p geometry.
Statistical errors in Monte Carlo estimates of systematic errors
Roe, Byron P.
2007-01-01
For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to a MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case, the unisim method was better. Exact formulas and formulas for the simple toy models are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k². The specific terms unisim and multisim were coined by Peter Meyers and Steve Brice, respectively, for the MiniBooNE experiment. However, the concepts have been developed over time and have been in general use for some time.
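The two estimators described above can be made concrete with a toy linear model. The sketch below is illustrative only (the sensitivities and run count are invented, and there is no MC statistical noise, so it is not the MiniBooNE code); it shows unisim and multisim recovering the same total systematic variance in the linear regime.

```python
import random

random.seed(42)

# Toy model: the observable depends linearly on K systematic parameters s_k,
# each with unit prior standard deviation. Illustration only.
K = 3
coeff = [0.5, -1.2, 0.8]          # sensitivities d(observable)/d(s_k)

def observable(s):
    return sum(c * v for c, v in zip(coeff, s))

# Unisim: one MC run per parameter, varied by +1 sigma; the shift of the
# observable in each run estimates that parameter's systematic contribution.
unisim_var = sum(observable([1.0 if i == k else 0.0 for i in range(K)]) ** 2
                 for k in range(K))

# Multisim: every run varies all parameters at once, drawn from the prior;
# the spread of the observable estimates the total systematic variance.
runs = [observable([random.gauss(0, 1) for _ in range(K)]) for _ in range(2000)]
mean = sum(runs) / len(runs)
multisim_var = sum((r - mean) ** 2 for r in runs) / (len(runs) - 1)

print(unisim_var, multisim_var)  # both near 0.5^2 + 1.2^2 + 0.8^2 = 2.33
```

In this noiseless linear model the unisim sum is exact, while the multisim spread carries sampling fluctuations; the note's comparison concerns which of the two wins once MC statistical noise is added on top.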
Westbrook, J I; Li, L; Raban, M Z; Baysari, M T; Mumford, V; Prgomet, M; Georgiou, A; Kim, T; Lake, R; McCullagh, C; Dalla-Pozza, L; Karnon, J; O'Brien, T A; Ambler, G; Day, R; Cowell, C T; Gazarian, M; Worthington, R; Lehmann, C U; White, L; Barbaric, D; Gardo, A; Kelly, M; Kennedy, P
2016-10-21
Medication errors are the most frequent cause of preventable harm in hospitals. Medication management in paediatric patients is particularly complex, and consequently the potential for harm is greater than in adults. Electronic medication management (eMM) systems are heralded as a highly effective intervention to reduce adverse drug events (ADEs), yet internationally the evidence of their effectiveness in paediatric populations is limited. This study will assess the effectiveness of an eMM system in reducing medication errors, ADEs and length of stay (LOS). The study will also investigate the system's impact on clinical work processes. A stepped-wedge cluster randomised controlled trial (SWCRCT) will measure changes pre- and post-eMM system implementation in prescribing and medication administration error (MAE) rates, potential and actual ADEs, and average LOS. In stage 1, 8 wards within the first paediatric hospital will be randomised to receive the eMM system 1 week apart. In stage 2, the second paediatric hospital will randomise implementation of a modified eMM and outcomes will be assessed. Prescribing errors will be identified through record reviews, and MAEs through direct observation of nurses and record reviews. Actual and potential severity will be assigned. Outcomes will be assessed at the patient level using mixed models, taking into account correlation of admissions within wards and multiple admissions for the same patient, with adjustment for potential confounders. Interviews and direct observation of clinicians will investigate the effects of the system on workflow. Data from site 1 will be used to develop improvements in the eMM and implemented at site 2, where the SWCRCT design will be repeated (stage 2). The research has been approved by the Human Research Ethics Committee of the Sydney Children's Hospitals Network and Macquarie University. Results will be reported through academic journals and seminar and conference presentations. Australian New Zealand
Sources of hyperbolic geometry
Stillwell, John
1996-01-01
This book presents, for the first time in English, the papers of Beltrami, Klein, and Poincaré that brought hyperbolic geometry into the mainstream of mathematics. A recognition of Beltrami comparable to that given the pioneering works of Bolyai and Lobachevsky seems long overdue, not only because Beltrami rescued hyperbolic geometry from oblivion by proving it to be logically consistent, but because he gave it a concrete meaning (a model) that made hyperbolic geometry part of ordinary mathematics. The models subsequently discovered by Klein and Poincaré brought hyperbolic geometry even further down to earth and paved the way for the current explosion of activity in low-dimensional geometry and topology. By placing the works of these three mathematicians side by side and providing commentaries, this book gives the student, historian, or professional geometer a bird's-eye view of one of the great episodes in mathematics. The unified setting and historical context reveal the insights of Beltrami, Klein, and Po...
International Nuclear Information System (INIS)
Jonsson, Rickard; Westman, Hans
2006-01-01
We show that by employing the standard projected curvature as a measure of spatial curvature, we can make a certain generalization of optical geometry (Abramowicz M A and Lasota J-P 1997 Class. Quantum Grav. A 14 23-30). This generalization applies to any spacetime that admits a hypersurface orthogonal shearfree congruence of worldlines. This is a somewhat larger class of spacetimes than the conformally static spacetimes assumed in standard optical geometry. In the generalized optical geometry, which in the generic case is time dependent, photons move with unit speed along spatial geodesics, and the sideways force experienced by a particle following a spatially straight line is independent of the velocity. Also, gyroscopes moving along spatial geodesics do not precess (relative to the forward direction). Gyroscopes that follow a curved spatial trajectory precess according to a very simple law of three-rotation. We also present an inertial force formalism in coordinate representation for this generalization. Furthermore, we show that by employing a new sense of spatial curvature (Jonsson R 2006 Class. Quantum Grav. 23 1) closely connected to Fermat's principle, we can make a more extensive generalization of optical geometry that applies to arbitrary spacetimes. In general this optical geometry will be time dependent, but geodesic photons still move with unit speed and follow lines that are spatially straight in the new sense. Also, the sideways (comoving) force experienced by a test particle following a line that is straight in the new sense will be independent of the velocity.
Sources of Error in Satellite Navigation Positioning
Directory of Open Access Journals (Sweden)
Jacek Januszewski
2017-09-01
Uninterrupted information about the user's position can generally be obtained from a satellite navigation system (SNS). At the time of this writing (January 2017), two global SNSs, GPS and GLONASS, are fully operational; the next two, also global, Galileo and BeiDou, are under construction. In each SNS the accuracy of the user's position is affected by three main factors: the accuracy of each satellite position, the accuracy of pseudorange measurement and the satellite geometry. The user's position error is a function of both the pseudorange error, called UERE (User Equivalent Range Error), and the user/satellite geometry, expressed by the appropriate Dilution Of Precision (DOP) coefficient. This error is decomposed into two types of errors: the signal-in-space ranging error, called URE (User Range Error), and the user equipment error, UEE. Detailed analyses of URE, UEE, UERE and DOP coefficients, and of the changes of DOP coefficients on different days, are presented in this paper.
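The DOP coefficient depends on the user/satellite geometry alone: with one row [e_x, e_y, e_z, 1] per satellite in the geometry matrix G, the geometric DOP is sqrt(trace((GᵀG)⁻¹)). The sketch below computes it for an invented four-satellite constellation (three near the horizon, one at the zenith); the constellation and function names are illustrative, not from the paper.

```python
import math

def gdop(sat_dirs):
    """Geometric DOP from unit line-of-sight vectors (user at origin).
    Position error ~ UERE x DOP, so smaller DOP means better geometry."""
    # Geometry matrix: one row [ex, ey, ez, 1] per satellite.
    G = [[*d, 1.0] for d in sat_dirs]
    # Normal matrix N = G^T G (4x4).
    N = [[sum(row[i] * row[j] for row in G) for j in range(4)]
         for i in range(4)]
    # Invert N by Gauss-Jordan elimination with partial pivoting.
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(4)]
           for i, row in enumerate(N)]
    for col in range(4):
        pivot = max(range(col, 4), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [v / p for v in aug[col]]
        for r in range(4):
            if r != col:
                factor = aug[r][col]
                aug[r] = [v - factor * w for v, w in zip(aug[r], aug[col])]
    inv_diag = [aug[i][4 + i] for i in range(4)]   # diagonal of N^{-1}
    return math.sqrt(sum(inv_diag))

# Four satellites: three on the horizon 120 degrees apart plus one at zenith.
sats = [(1, 0, 0), (-0.5, math.sqrt(3) / 2, 0),
        (-0.5, -math.sqrt(3) / 2, 0), (0, 0, 1)]
print(round(gdop(sats), 2))   # 1.73 for this well-spread constellation
```

Clustering the satellites together inflates the trace of (GᵀG)⁻¹ and hence the DOP, which is exactly the geometric degradation of position accuracy discussed above.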
Computational synthetic geometry
Bokowski, Jürgen
1989-01-01
Computational synthetic geometry deals with methods for realizing abstract geometric objects in concrete vector spaces. This research monograph considers a large class of problems from convexity and discrete geometry, including constructing convex polytopes from simplicial complexes, vector geometries from incidence structures, and hyperplane arrangements from oriented matroids. It turns out that algorithms for these constructions exist if and only if arbitrary polynomial equations are decidable with respect to the underlying field. Besides such complexity theorems, a variety of symbolic algorithms are discussed, and the methods are applied to obtain new mathematical results on convex polytopes, projective configurations and the combinatorics of Grassmann varieties. Finally, algebraic varieties characterizing matroids and oriented matroids are introduced, providing a new basis for applying computer algebra methods in this field. The necessary background knowledge is reviewed briefly. The text is accessible to stud...
Discrete and computational geometry
Devadoss, Satyan L
2011-01-01
Discrete geometry is a relatively new development in pure mathematics, while computational geometry is an emerging area in applications-driven computer science. Their intermingling has yielded exciting advances in recent years, yet what has been lacking until now is an undergraduate textbook that bridges the gap between the two. Discrete and Computational Geometry offers a comprehensive yet accessible introduction to this cutting-edge frontier of mathematics and computer science. This book covers traditional topics such as convex hulls, triangulations, and Voronoi diagrams, as well as more recent subjects like pseudotriangulations, curve reconstruction, and locked chains. It also touches on more advanced material, including Dehn invariants, associahedra, quasigeodesics, Morse theory, and the recent resolution of the Poincaré conjecture. Connections to real-world applications are made throughout, and algorithms are presented independently of any programming language. This richly illustrated textbook also fe...
Ochiai, T.; Nacher, J. C.
2011-09-01
Recently, the application of geometry and conformal mappings to artificial materials (metamaterials) has attracted attention in various research communities. These materials, characterized by a unique man-made structure, have unusual optical properties that materials found in nature do not exhibit. By applying geometry and conformal mapping theory to metamaterial science, it may be possible to realize a so-called "Harry Potter cloaking device". Although such a device is still in the realm of science fiction, several works have shown that by using such metamaterials it may be possible to control the direction of the electromagnetic field at will. We could then hide an object inside a cloaking device. Here, we explain how to design an invisibility device using differential geometry and conformal mappings.
2002-01-01
Discrete geometry investigates combinatorial properties of configurations of geometric objects. To a working mathematician or computer scientist, it offers sophisticated results and techniques of great diversity and it is a foundation for fields such as computational geometry or combinatorial optimization. This book is primarily a textbook introduction to various areas of discrete geometry. In each area, it explains several key results and methods, in an accessible and concrete manner. It also contains more advanced material in separate sections and thus it can serve as a collection of surveys in several narrower subfields. The main topics include: basics on convex sets, convex polytopes, and hyperplane arrangements; combinatorial complexity of geometric configurations; intersection patterns and transversals of convex sets; geometric Ramsey-type results; polyhedral combinatorics and high-dimensional convexity; and lastly, embeddings of finite metric spaces into normed spaces. Jiri Matousek is Professor of Com...
Zheng, Fangyang
2002-01-01
The theory of complex manifolds overlaps with several branches of mathematics, including differential geometry, algebraic geometry, several complex variables, global analysis, topology, algebraic number theory, and mathematical physics. Complex manifolds provide a rich class of geometric objects, for example the (common) zero locus of any generic set of complex polynomials is always a complex manifold. Yet complex manifolds behave differently than generic smooth manifolds; they are more coherent and fragile. The rich yet restrictive character of complex manifolds makes them a special and interesting object of study. This book is a self-contained graduate textbook that discusses the differential geometric aspects of complex manifolds. The first part contains standard materials from general topology, differentiable manifolds, and basic Riemannian geometry. The second part discusses complex manifolds and analytic varieties, sheaves and holomorphic vector bundles, and gives a brief account of the surface classifi...
Heuristic errors in clinical reasoning.
Rylander, Melanie; Guerrasio, Jeannette
2016-08-01
Errors in clinical reasoning contribute to patient morbidity and mortality. The purpose of this study was to determine the types of heuristic errors made by third-year medical students and first-year residents. This study surveyed approximately 150 clinical educators, inquiring about the types of heuristic errors they observed in third-year medical students and first-year residents. Anchoring and premature closure were the two most common errors observed amongst third-year medical students and first-year residents. There was no difference in the types of errors observed in the two groups. Errors in clinical reasoning contribute to patient morbidity and mortality. Clinical educators perceived that both third-year medical students and first-year residents committed similar heuristic errors, implying that additional medical knowledge and clinical experience do not affect the types of heuristic errors made. Further work is needed to help identify methods that can be used to reduce heuristic errors early in a clinician's education. © 2015 John Wiley & Sons Ltd.
Yale, Paul B
2012-01-01
This book is an introduction to the geometry of Euclidean, affine, and projective spaces with special emphasis on the important groups of symmetries of these spaces. The two major objectives of the text are to introduce the main ideas of affine and projective spaces and to develop facility in handling transformations and groups of transformations. Since there are many good texts on affine and projective planes, the author has concentrated on the n-dimensional cases. Designed to be used in advanced undergraduate mathematics or physics courses, the book focuses on "practical geometry," emphasi
Laboratory errors and patient safety.
Miligy, Dawlat A
2015-01-01
Laboratory data are extensively used in medical practice; consequently, laboratory errors have a tremendous impact on patient safety. Therefore, programs designed to identify and reduce laboratory errors, as well as specific strategies, are required to minimize these errors and improve patient safety. The purpose of this paper is to identify some of the commonly encountered laboratory errors throughout our practice in laboratory work, their hazards for patient health care, and some measures and recommendations to minimize or eliminate these errors. The laboratory errors encountered during May 2008 were recorded and statistically evaluated (using simple percent distribution) in the laboratory department of one of the private hospitals in Egypt. Errors were classified according to the laboratory phases and according to their implication for patient health. Data obtained from 1,600 testing procedures revealed that the total number of encountered errors was 14 tests (0.87 percent of total testing procedures). Most of the encountered errors lay in the pre- and post-analytic phases of the testing cycle (representing 35.7 and 50 percent, respectively, of total errors), while the number of test errors encountered in the analytic phase represented only 14.3 percent of total errors. About 85.7 percent of total errors were of non-significant implication for patients' health, being detected before test reports had been submitted to the patients. On the other hand, the number of test errors that had already been submitted to patients and reached the physician represented 14.3 percent of total errors. Only 7.1 percent of the errors could have an impact on patient diagnosis. The findings of this study were concomitant with those published from the USA and other countries. This proves that laboratory problems are universal and need general standardization and benchmarking measures. Original being the first data published from Arabic countries that
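The "simple percent distribution" evaluation takes only a few lines. In the sketch below, the per-phase counts (5, 2 and 7 of 14 errors) are reconstructed from the percentages reported above and are otherwise illustrative:

```python
# Simple percent distribution of laboratory errors by testing phase.
# Counts reconstructed from the reported 35.7% / 14.3% / 50% split of 14 errors.
errors_by_phase = {"pre-analytic": 5, "analytic": 2, "post-analytic": 7}
total_tests = 1600
total_errors = sum(errors_by_phase.values())

print(f"error rate: {100 * total_errors / total_tests:.2f}%")   # 0.88%
for phase, n in errors_by_phase.items():
    print(f"{phase}: {100 * n / total_errors:.1f}% of errors")
```

Running this reproduces the reported breakdown (35.7%, 14.3% and 50.0% of errors; an overall rate of roughly 0.87-0.88 percent, depending on rounding).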
Modeling coherent errors in quantum error correction
Greenbaum, Daniel; Dutton, Zachary
2018-01-01
Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ε) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ε^{-(dn-1)} error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.
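The qualitative difference between coherent and stochastic (Pauli) errors is visible already for a single qubit: under n repeated identical small rotations the amplitudes add, so the failure probability grows quadratically in n, whereas independent stochastic flips add in probability and grow only linearly. A minimal sketch of this (not the paper's repetition-code analysis; the angle and step count are illustrative):

```python
import math

# n identical rotations by angle eps about X on a single qubit.
eps = 0.01   # rotation angle per step, radians
n = 50       # number of steps

# Coherent: the rotations compose to a rotation by n*eps, so the bit-flip
# probability is sin^2(n*eps/2) -- amplitudes add, probability grows ~ n^2.
coherent_flip = math.sin(n * eps / 2) ** 2

# Pauli approximation: each step independently flips with probability
# sin^2(eps/2); for small totals the flip probability grows ~ n.
pauli_flip = n * math.sin(eps / 2) ** 2

print(coherent_flip / pauli_flip)   # close to n: coherent build-up dominates
```

This factor-of-n gap is the single-qubit analogue of the paper's observation that, past a certain number of correction cycles, the persistent coherent part of the logical error overtakes the Pauli-model prediction.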
Indoor localization using unsupervised manifold alignment with geometry perturbation
Majeed, Khaqan
2014-04-01
The main limitation of deploying/updating Received Signal Strength (RSS) based indoor localization is the construction of fingerprinted radio map, which is quite a hectic and time-consuming process especially when the indoor area is enormous and/or dynamic. Different approaches have been undertaken to reduce such deployment/update efforts, but the performance degrades when the fingerprinting load is reduced below a certain level. In this paper, we propose an indoor localization scheme that requires as low as 1% fingerprinting load. This scheme employs unsupervised manifold alignment that takes crowd sourced RSS readings and localization requests as source data set and the environment's plan coordinates as destination data set. The 1% fingerprinting load is only used to perturb the local geometries in the destination data set. Our proposed algorithm was shown to achieve less than 5 m mean localization error with 1% fingerprinting load and a limited number of crowd sourced readings, when other learning based localization schemes pass the 10 m mean error with the same information.
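For context, the "mean localization error" metric can be made concrete with a generic weighted k-nearest-neighbour fingerprinting baseline, the kind of learning-based scheme the paper compares against. The radio map and query points below are invented, and this is not the manifold-alignment method itself:

```python
import math

# Toy fingerprint database: RSS from 3 access points (dBm) -> position (m).
radio_map = {
    (-40, -60, -70): (0.0, 0.0),
    (-60, -40, -70): (5.0, 0.0),
    (-70, -60, -40): (0.0, 5.0),
    (-55, -55, -55): (2.5, 2.5),
}

def locate(rss, k=2):
    """Weighted k-NN in RSS space: average the positions of the k closest
    fingerprints, weighted by inverse RSS distance."""
    nearest = sorted((math.dist(rss, fp), pos)
                     for fp, pos in radio_map.items())[:k]
    weights = [1.0 / (d + 1e-9) for d, _ in nearest]
    wsum = sum(weights)
    x = sum(w * p[0] for w, (_, p) in zip(weights, nearest)) / wsum
    y = sum(w * p[1] for w, (_, p) in zip(weights, nearest)) / wsum
    return x, y

# Mean localization error over invented queries with known true positions.
tests = [((-42, -58, -69), (0.5, 0.2)), ((-56, -54, -56), (2.6, 2.2))]
mean_err = sum(math.dist(locate(rss), true) for rss, true in tests) / len(tests)
print(round(mean_err, 2))
```

The deployment cost the paper attacks is the construction of `radio_map` itself: dense fingerprinting drives the error down but is exactly the labour that the 1% fingerprinting load is meant to avoid.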
Indoor localization using unsupervised manifold alignment with geometry perturbation
Majeed, Khaqan; Sorour, Sameh; Al-Naffouri, Tareq Y.; Valaee, Shahrokh
2014-01-01
The main limitation of deploying/updating Received Signal Strength (RSS) based indoor localization is the construction of fingerprinted radio map, which is quite a hectic and time-consuming process especially when the indoor area is enormous and/or dynamic. Different approaches have been undertaken to reduce such deployment/update efforts, but the performance degrades when the fingerprinting load is reduced below a certain level. In this paper, we propose an indoor localization scheme that requires as low as 1% fingerprinting load. This scheme employs unsupervised manifold alignment that takes crowd sourced RSS readings and localization requests as source data set and the environment's plan coordinates as destination data set. The 1% fingerprinting load is only used to perturb the local geometries in the destination data set. Our proposed algorithm was shown to achieve less than 5 m mean localization error with 1% fingerprinting load and a limited number of crowd sourced readings, when other learning based localization schemes pass the 10 m mean error with the same information.
Reducing Contact Resistance Errors In Measuring Thermal ...
African Journals Online (AJOL)
Values of thermal conductivity (k) of glass beads, quartz sand, stone dust and clay were determined using a thermal probe with and without heat sink compounds (arctic silver grease (ASG) and white grease (WG)) at different water contents, bulk densities and particle sizes. The heat sink compounds (HSC) increased k at ...
Cognitive aspect of diagnostic errors.
Phua, Dong Haur; Tan, Nigel C K
2013-01-01
Diagnostic errors can result in tangible harm to patients. Despite our advances in medicine, the mental processes required to make a diagnosis exhibit shortcomings, causing diagnostic errors. Cognitive factors are found to be an important cause of diagnostic errors. With new understanding from psychology and the social sciences, clinical medicine is now beginning to appreciate that our clinical reasoning can take the form of analytical reasoning or heuristics. Different factors like cognitive biases and affective influences can also impel unwary clinicians to make diagnostic errors. Various strategies have been proposed to reduce the effect of cognitive biases and affective influences when clinicians make diagnoses; however, evidence for the efficacy of these methods is still sparse. This paper aims to introduce the reader to the cognitive aspect of diagnostic errors, in the hope that clinicians can use this knowledge to improve diagnostic accuracy and patient outcomes.
Towards relativistic quantum geometry
Energy Technology Data Exchange (ETDEWEB)
Ridao, Luis Santiago [Instituto de Investigaciones Físicas de Mar del Plata (IFIMAR), Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Mar del Plata (Argentina); Bellini, Mauricio, E-mail: mbellini@mdp.edu.ar [Departamento de Física, Facultad de Ciencias Exactas y Naturales, Universidad Nacional de Mar del Plata, Funes 3350, C.P. 7600, Mar del Plata (Argentina); Instituto de Investigaciones Físicas de Mar del Plata (IFIMAR), Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Mar del Plata (Argentina)
2015-12-17
We obtain a gauge-invariant relativistic quantum geometry by using a Weylian-like manifold with a geometric scalar field, which provides a gauge-invariant relativistic quantum theory in which the algebra of the Weylian-like field depends on observers. An example for a Reissner–Nordström black hole is studied.
Multiplicity in difference geometry
Tomasic, Ivan
2011-01-01
We prove a first principle of preservation of multiplicity in difference geometry, paving the way for the development of a more general intersection theory. In particular, the fibres of a σ-finite morphism between difference curves are all of the same size, when counted with correct multiplicities.
Spacetime and Euclidean geometry
Brill, Dieter; Jacobson, Ted
2006-04-01
Using only the principle of relativity and Euclidean geometry we show in this pedagogical article that the square of proper time or length in a two-dimensional spacetime diagram is proportional to the Euclidean area of the corresponding causal domain. We use this relation to derive the Minkowski line element by two geometric proofs of the spacetime Pythagoras theorem.
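The relation between squared proper time and the Euclidean area of the causal domain can be sketched in null coordinates (a compressed version of the argument, in our notation, not necessarily the authors' presentation):

```latex
% Events O=(0,0) and P=(t,x) with |x| < t; null coordinates u = t - x, v = t + x.
% The causal diamond between O and P is a Euclidean rectangle in the spacetime
% diagram with sides u/\sqrt{2} and v/\sqrt{2}, so its area is
\[
  A \;=\; \frac{u}{\sqrt{2}}\cdot\frac{v}{\sqrt{2}} \;=\; \frac{uv}{2},
\]
% while the proper time between O and P satisfies
\[
  \tau^2 \;=\; t^2 - x^2 \;=\; (t-x)(t+x) \;=\; uv \;=\; 2A .
\]
```

In units with c = 1 the squared proper time is thus proportional to the Euclidean area of the causal domain, which is the relation the abstract describes.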
International Nuclear Information System (INIS)
Konopleva, N.P.
2009-01-01
The basic ideas behind methods for describing physical fields and elementary-particle interactions are discussed. One such idea is the conception of space-time geometry. In this connection, experimental measurement methods are analyzed. It is shown that measurement procedures are the origin of geometrical axioms. The connection between space symmetry properties and the conservation laws is also considered.
Wares, Arsalan; Elstak, Iwan
2017-01-01
The purpose of this paper is to describe the mathematics that emanates from the construction of an origami box. We first construct a simple origami box from a rectangular sheet and then discuss some of the mathematical questions that arise in the context of geometry and algebra. The activity can be used as a context for illustrating how algebra…
MacKeown, P. K.
1984-01-01
Clarifies two concepts of gravity--those of a fictitious force and those of how space and time may have geometry. Reviews the position of Newton's theory of gravity in the context of special relativity and considers why gravity (as distinct from electromagnetics) lends itself to Einstein's revolutionary interpretation. (JN)
DEFF Research Database (Denmark)
Booss-Bavnbek, Bernhelm
2011-01-01
This paper applies I.M. Gelfand's distinction between adequate and non-adequate use of mathematical language in different contexts to the newly opened window of model-based measurements of intracellular dynamics. The specifics of geometry and dynamics on the mesoscale of cell physiology are elabo...
Diophantine geometry an introduction
Hindry, Marc
2000-01-01
This is an introduction to diophantine geometry at the advanced graduate level. The book contains a proof of the Mordell conjecture which will make it quite attractive to graduate students and professional mathematicians. In each part of the book, the reader will find numerous exercises.
Sliding vane geometry turbines
Sun, Harold Huimin; Zhang, Jizhong; Hu, Liangjun; Hanna, Dave R
2014-12-30
Various systems and methods are described for a variable geometry turbine. In one example, a turbine nozzle comprises a central axis and a nozzle vane. The nozzle vane includes a stationary vane and a sliding vane. The sliding vane is positioned to slide in a direction substantially tangent to an inner circumference of the turbine nozzle and in contact with the stationary vane.
Boyer, Carl B
2012-01-01
Designed as an integrated survey of the development of analytic geometry, this study presents the concepts and contributions from before the Alexandrian Age through the eras of the great French mathematicians Fermat and Descartes, and on through Newton and Euler to the "Golden Age," from 1789 to 1850.
Coxeter, HSM
1965-01-01
This textbook introduces non-Euclidean geometry, and the third edition adds a new chapter, including a description of the two families of 'mid-lines' between two given lines and an elementary derivation of the basic formulae of spherical trigonometry and hyperbolic trigonometry, and other new material.
International Nuclear Information System (INIS)
Ezin, J.P.
1988-08-01
The lectures given at the '5th Symposium of Mathematics in Abidjan: Differential Geometry and Mechanics' are presented. They are divided into four chapters: Riemannian metric on a differential manifold, curvature tensor fields on a Riemannian manifold, some classical functionals on Riemannian manifolds and questions. 11 refs
Hartshorne, Robin
2000-01-01
In recent years, I have been teaching a junior-senior-level course on the classical geometries. This book has grown out of that teaching experience. I assume only high-school geometry and some abstract algebra. The course begins in Chapter 1 with a critical examination of Euclid's Elements. Students are expected to read concurrently Books I-IV of Euclid's text, which must be obtained separately. The remainder of the book is an exploration of questions that arise naturally from this reading, together with their modern answers. To shore up the foundations we use Hilbert's axioms. The Cartesian plane over a field provides an analytic model of the theory, and conversely, we see that one can introduce coordinates into an abstract geometry. The theory of area is analyzed by cutting figures into triangles. The algebra of field extensions provides a method for deciding which geometrical constructions are possible. The investigation of the parallel postulate leads to the various non-Euclidean geometries. And ...
Improved Landau gauge fixing and discretisation errors
International Nuclear Information System (INIS)
Bonnet, F.D.R.; Bowman, P.O.; Leinweber, D.B.; Richards, D.G.; Williams, A.G.
2000-01-01
Lattice discretisation errors in the Landau gauge condition are examined. An improved gauge fixing algorithm in which O(a²) errors are removed is presented. O(a²) improvement of the gauge fixing condition displays the secondary benefit of reducing the size of higher-order errors. These results emphasise the importance of implementing an improved gauge fixing condition
Examination of the program to avoid round-off error
International Nuclear Information System (INIS)
Shiota, Y.; Kusunoki, T.; Tabushi, K.; Shimomura, K.; Kitou, S.
2005-01-01
The MACRO programs that express simple shapes such as PLANE, SPHERE, CYLINDER and CONE are used to form the geometry in EGS4. Each MACRO calculates the values the main code needs to recognize the configured geometry. This calculation may be affected by round-off error. The SPHERE, CYLINDER and CONE MACROs include a function to avoid this effect, but the PLANE MACRO does not. The effect of round-off error is usually small for the PLANE MACRO; however, a slanted plane may amplify it. We therefore configured the DELPLANE program with a function to avoid the effect of round-off error. In this study, we examine the DELPLANE program using a simple geometry containing a slanted plane. As a result, the normal PLANE MACRO generates round-off error, whereas the DELPLANE program does not. (author)
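The failure mode described, a point that should lie on a slanted plane being misclassified because of round-off, can be illustrated outside EGS4 with a minimal sketch (the function and the tolerance value are our own illustration, not the DELPLANE implementation):

```python
def plane_side(normal, d, point, eps=0.0):
    """Classify a point against the plane n.x + d = 0, with tolerance eps."""
    s = sum(n * x for n, x in zip(normal, point)) + d
    if s > eps:
        return 1      # in front of the plane
    if s < -eps:
        return -1     # behind the plane
    return 0          # on the plane (within the tolerance)

# A slanted plane x + y + z = 0.3 and a point that lies on it exactly in real
# arithmetic, but not in binary floating point (0.1 + 0.1 + 0.1 != 0.3).
normal, d = (1.0, 1.0, 1.0), -0.3
p = (0.1, 0.1, 0.1)

print(plane_side(normal, d, p))             # -> 1: round-off misclassifies the point
print(plane_side(normal, d, p, eps=1e-12))  # -> 0: tolerant test places it on the plane
```

An axis-aligned plane such as z = 0.5 rarely triggers this, because the sum involves a single coordinate; the slanted plane accumulates round-off from several terms, which is the situation the paper singles out.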
Effects of Geometry Design Parameters on the Static Strength and Dynamics for Spiral Bevel Gear
Directory of Open Access Journals (Sweden)
Zhiheng Feng
2017-01-01
Considering the geometry design parameters, a quasi-static mesh model of spiral bevel gears was established and the mesh characteristics were computed. Considering the time-varying effects at the mesh points, mesh force, line-of-action vector, mesh stiffness, transmission error, friction force direction, and friction coefficient, a nonlinear lumped-parameter dynamic model was developed for the spiral bevel gear pair. Based on the mesh model and the nonlinear dynamic model, the effects of the main geometry parameters on contact and bending strength were analyzed, as were the effects on dynamic mesh force and dynamic transmission error. Results show that higher values of the pressure angle, root fillet radius, and tooth thickness ratio tend to improve contact and bending strength and to reduce the risk of tooth fracture. Improved gears have better vibration performance in the targeted frequency range. Finally, bench tests of both types of spiral bevel gears were performed. Results show that the main failure mode is tooth fracture, and service life increased considerably for spiral bevel gears with improved geometry parameters compared to the original design.
Learning from prescribing errors
Dean, B
2002-01-01
The importance of learning from medical error has recently received increasing emphasis. This paper focuses on prescribing errors and argues that, while learning from prescribing errors is a laudable goal, there are currently barriers that can prevent this occurring. Learning from errors can take place on an individual level, at a team level, and across an organisation. Barriers to learning from prescribing errors include the non-discovery of many prescribing errors, lack of feedback to th...
Multivariate calculus and geometry
Dineen, Seán
2014-01-01
Multivariate calculus can be understood best by combining geometric insight, intuitive arguments, detailed explanations and mathematical reasoning. This textbook has successfully followed this programme. It additionally provides a solid description of the basic concepts, via familiar examples, which are then tested in technically demanding situations. In this new edition the introductory chapter and two of the chapters on the geometry of surfaces have been revised. Some exercises have been replaced and others provided with expanded solutions. Familiarity with partial derivatives and a course in linear algebra are essential prerequisites for readers of this book. Multivariate Calculus and Geometry is aimed primarily at higher level undergraduates in the mathematical sciences. The inclusion of many practical examples involving problems of several variables will appeal to mathematics, science and engineering students.
Transformational plane geometry
Umble, Ronald N
2014-01-01
Axioms of Euclidean Plane Geometry The Existence and Incidence Postulates The Distance and Ruler Postulates The Plane Separation Postulate The Protractor Postulate The Side-Angle-Side Postulate and the Euclidean Parallel Postulate Theorems of Euclidean Plane Geometry The Exterior Angle Theorem Triangle Congruence Theorems The Alternate Interior Angles Theorem and the Angle Sum Theorem Similar Triangles Introduction to Transformations, Isometries, and Similarities Transformations Isometries and Similarities Appendix: Proof of Surjectivity Translations, Rotations, and Reflections Translations Rotations Reflections Appendix: Geometer's Sketchpad Commands Required by Exploratory Activities Compositions of Translations, Rotations, and Reflections The Three Points Theorem Rotations as Compositions of Two Reflections Translations as Compositions of Two Halfturns or Two Reflections The Angle Addition Theorem Glide Reflections Classification of Isometries The Fundamental Theorem and Congruence Classification of Isometr...
Krauss, Lawrence M.; Turner, Michael S.
1999-01-01
The recognition that the cosmological constant may be non-zero forces us to re-evaluate standard notions about the connection between geometry and the fate of our Universe. An open Universe can recollapse, and a closed Universe can expand forever. As a corollary, we point out that there is no set of cosmological observations we can perform that will unambiguously allow us to determine what the ultimate destiny of the Universe will be.
DEFF Research Database (Denmark)
Tamke, Martin; Ramsgaard Thomsen, Mette; Riiber Nielsen, Jacob
2009-01-01
The versatility of wood constructions and traditional wood joints for the production of non-standard elements was the focus of a design-based research project, in which we established a seamless process from digital design to fabrication. A first research phase centered on the development of a robust parametric model and a generic design language; a later phase explored the possibilities of constructing complex shaped geometries with self-registering joints on modern wood-crafting machines. The research was carried out in collaboration with industrial partners.
International Nuclear Information System (INIS)
Lepora, N.; Kibble, T.
1999-01-01
We analyse symmetry breaking in the Weinberg-Salam model paying particular attention to the underlying geometry of the theory. In this context we find two natural metrics upon the vacuum manifold: an isotropic metric associated with the scalar sector, and a squashed metric associated with the gauge sector. Physically, the interplay between these metrics gives rise to many of the non-perturbative features of Weinberg-Salam theory. (author)
Integral geometry and valuations
Solanes, Gil
2014-01-01
Valuations are finitely additive functionals on the space of convex bodies. Their study has become a central subject in convexity theory, with fundamental applications to integral geometry. In the last years there has been significant progress in the theory of valuations, which in turn has led to important achievements in integral geometry. This book originated from two courses delivered by the authors at the CRM and provides a self-contained introduction to these topics, covering most of the recent advances. The first part, by Semyon Alesker, is devoted to the theory of convex valuations, with emphasis on the latest developments. A special focus is put on the new fundamental structures of the space of valuations discovered after Alesker's irreducibility theorem. Moreover, the author describes the newly developed theory of valuations on manifolds. In the second part, Joseph H. G. Fu gives a modern introduction to integral geometry in the sense of Blaschke and Santaló, based on the notions and tools presented...
CBM RICH geometry optimization
Energy Technology Data Exchange (ETDEWEB)
Mahmoud, Tariq; Hoehne, Claudia [II. Physikalisches Institut, Giessen Univ. (Germany); Collaboration: CBM-Collaboration
2016-07-01
The Compressed Baryonic Matter (CBM) experiment at the future FAIR complex will investigate the phase diagram of strongly interacting matter at high baryon density and moderate temperatures in A+A collisions from 2-11 AGeV (SIS100) beam energy. The main electron identification detector in the CBM experiment will be a RICH detector with a CO₂ gas radiator, focusing spherical glass mirrors, and MAPMT photo-detectors placed on a PMT plane. The RICH detector is located directly behind the CBM dipole magnet. As the final magnet geometry is now available, some changes in the RICH geometry have become necessary. In order to guarantee a magnetic field of at most 1 mT in the PMT plane for effective operation of the MAPMTs, two measures have been taken: the PMT plane is moved out of the stray field by tilting the mirrors by 10 degrees, and shielding boxes have been designed. In this contribution the results of the geometry optimization procedure are presented.
Directory of Open Access Journals (Sweden)
Murray Scott A
2009-05-01
Abstract Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost-effectiveness and acceptability of a pharmacist-led, information-technology-based complex intervention compared with simple feedback in reducing the proportion of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods: Research subject group: "at-risk" patients registered with computerised general practices in two geographical regions in England. Design: parallel-group pragmatic cluster randomised trial. Interventions: practices will be randomised to either (i) computer-generated feedback, or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: the proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs; - with a computer-recorded diagnosis of asthma being prescribed beta-blockers; - aged 75 years and older receiving long-term prescriptions for angiotensin-converting-enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: these relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: an economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: a qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and investigate possible reasons why the interventions prove effective, or conversely prove
International Nuclear Information System (INIS)
Gervais, J.L.
1993-01-01
By analyzing the extrinsic geometry of two-dimensional surfaces chirally embedded in CP^n (the CP^n W-surface), we give exact treatments of various aspects of the classical W-geometry in the conformal gauge: First, bases of tangent and normal vectors are defined at regular points of the surface, such that their infinitesimal displacements are given by connections which coincide with the vector potentials of the (conformal) A_n-Toda Lax pair. Since the latter is known to be intrinsically related to the W symmetries, this gives the geometrical meaning of the A_n W-algebra. Second, W-surfaces are put in one-to-one correspondence with solutions of the conformally-reduced WZNW model, which is such that the Toda fields give the Cartan part in the Gauss decomposition of its solutions. Third, the additional variables of the Toda hierarchy are used as coordinates of CP^n. This allows us to show that W-transformations may be extended as particular diffeomorphisms of this target space. Higher-dimensional generalizations of the WZNW equations are derived and related to the Zakharov-Shabat equations of the Toda hierarchy. Fourth, singular points are studied from a global viewpoint, using our earlier observation that W-surfaces may be regarded as instantons. The global indices of the W-geometry, which are written in terms of the Toda fields, are shown to be the instanton numbers for associated mappings of W-surfaces into the Grassmannians. The relation with the singularities of W-surfaces is derived by combining the Toda equations with the Gauss-Bonnet theorem. (orig.)
Introducing geometry concept based on history of Islamic geometry
Maarif, S.; Wahyudin; Raditya, A.; Perbowo, K. S.
2018-01-01
Geometry is one of the areas of mathematics that is interesting to discuss, and it has a long history in mathematical development. It is therefore important to integrate the historical development of geometry into the classroom, to increase students' knowledge of how earlier mathematicians found and constructed geometric concepts. Introducing a geometrical concept can begin by introducing the Muslim mathematician who invented it, so that students can understand in detail how a concept of geometry was found. However, the history of mathematical development, and the history of Islamic geometry in particular, is currently not well known in Indonesian education. There are several concepts discovered by Muslim mathematicians that should be appreciated by students learning geometry. The great ideas of Muslim mathematicians can be used as study materials that supplement the religious character values they taught. Additionally, integrating the history of geometry into geometry teaching is expected to improve motivation and conceptual understanding.
Parameterized combinatorial geometry modeling in Moritz
International Nuclear Information System (INIS)
Van Riper, K.A.
2005-01-01
We describe the use of named variables as surface and solid body coefficients in the Moritz geometry editing program. Variables can also be used as material numbers, cell densities, and transformation values. A variable is defined as a constant or an arithmetic combination of constants and other variables. A variable reference, such as in a surface coefficient, can be a single variable or an expression containing variables and constants. Moritz can read and write geometry models in MCNP and ITS ACCEPT format; support for other codes will be added. The geometry can be saved with either the variables in place, for modifying the models in Moritz, or with the variables evaluated for use in the transport codes. A program window shows a list of variables and provides fields for editing them. Surface coefficients and other values that use a variable reference are shown in a distinctive style on object property dialogs; associated buttons show fields for editing the reference. We discuss our use of variables in defining geometry models for shielding studies in PET clinics. When a model is parameterized through the use of variables, changes such as room dimensions, shielding layer widths, and cell compositions can be quickly achieved by changing a few numbers without requiring knowledge of the input syntax for the transport code or the tedious and error prone work of recalculating many surface or solid body coefficients. (author)
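The scheme described, variables that are constants or arithmetic combinations of other variables, referenced from surface coefficients and evaluated before export to the transport code, can be sketched with a hypothetical toy resolver (the variable names and the substitution strategy are invented for illustration; this is not Moritz's implementation):

```python
# A variable is a constant or an arithmetic expression over other variables,
# e.g. a room model where changing room_w or wall_t updates derived coefficients.
variables = {
    "room_w": "6.0",                   # room width (m)
    "wall_t": "0.5",                   # shielding wall thickness (m)
    "outer_w": "room_w + 2 * wall_t",  # derived outer width
}

def evaluate(name, defs, seen=frozenset()):
    """Resolve a variable to a number, following references recursively.
    Simplification: variable names must not be substrings of one another."""
    if name in seen:
        raise ValueError(f"circular definition: {name}")
    expr = defs[name]
    for other in defs:
        if other in expr:
            expr = expr.replace(other, str(evaluate(other, defs, seen | {name})))
    return eval(expr)  # arithmetic expressions only in this sketch

print(evaluate("outer_w", variables))  # -> 7.0
```

Saving the model "with the variables evaluated" then amounts to writing these resolved numbers into the surface cards, while saving "with the variables in place" keeps the expressions for later editing.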
Two lectures on D-geometry and noncommutative geometry
International Nuclear Information System (INIS)
Douglas, M.R.
1999-01-01
This is a write-up of lectures given at the 1998 Spring School at the Abdus Salam ICTP. We give a conceptual introduction to D-geometry, the study of geometry as seen by D-branes in string theory, and to noncommutative geometry as it has appeared in D-brane and Matrix theory physics. (author)
Simplified discrete ordinates method in spherical geometry
International Nuclear Information System (INIS)
Elsawi, M.A.; Abdurrahman, N.M.; Yavuz, M.
1999-01-01
The authors extend the method of simplified discrete ordinates (SS_N) to spherical geometry. The motivation for such an extension is that the appearance of the angular derivative (redistribution) term in the spherical-geometry transport equation makes it difficult to decide which differencing scheme best approximates this term. In the present method, the angular derivative term is treated implicitly, which avoids the need to approximate it. The method can be considered analytic in nature, with the advantage of being free from the spatial truncation errors from which most existing transport codes suffer. It also handles scattering in a very general manner, with the advantage of spending almost the same computational effort for all scattering modes. Moreover, the method can easily be applied to higher-order S_N calculations
International Nuclear Information System (INIS)
Anon.
1991-01-01
This chapter addresses the extension of previous work in one-dimensional (linear) error theory to two-dimensional error analysis. The topics of the chapter include the definition of two-dimensional error, the probability ellipse, the probability circle, elliptical (circular) error evaluation, the application to position accuracy, and the use of control systems (points) in measurements.
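The probability ellipse named above is determined by the 2×2 covariance of the position error: its semi-axes are the square roots of the eigenvalues, and its orientation follows the principal eigenvector. A minimal sketch using the closed-form 2×2 eigensolution (the function name and conventions are ours, not the chapter's notation):

```python
import math

def error_ellipse(sxx, syy, sxy, k=1.0):
    """Semi-axes and major-axis orientation of the k-sigma probability ellipse
    for a 2x2 position covariance [[sxx, sxy], [sxy, syy]]."""
    t = 0.5 * (sxx + syy)
    d = math.sqrt((0.5 * (sxx - syy)) ** 2 + sxy ** 2)
    lam_max, lam_min = t + d, t - d                  # eigenvalues of the covariance
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)   # angle of the major axis (rad)
    return k * math.sqrt(lam_max), k * math.sqrt(lam_min), theta

# Uncorrelated errors sigma_x = 2, sigma_y = 1: axes 2 and 1, aligned with x and y.
a, b, theta = error_ellipse(4.0, 1.0, 0.0)
print(a, b, theta)  # -> 2.0 1.0 0.0
```

The probability circle is the special case sxx = syy with sxy = 0, where the ellipse degenerates and any orientation is equivalent.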
International Nuclear Information System (INIS)
Picard, R.R.
1989-01-01
Topics covered in this chapter include a discussion of exact results as related to nuclear materials management and accounting in nuclear facilities; propagation of error for a single measured value; propagation of error for several measured values; error propagation for materials balances; and an application of error propagation to an example of the uranium hexafluoride conversion process.
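The propagation rules for several measured values can be illustrated for the two simplest cases, a sum and a product of independent measurements (a generic first-order sketch in our own notation, not the chapter's worked examples):

```python
import math

def propagate_sum(values_and_sigmas):
    """Error propagation for a sum of independent measured values:
    the variances add, so sigma_total = sqrt(sum of sigma_i^2)."""
    total = sum(v for v, _ in values_and_sigmas)
    sigma = math.sqrt(sum(s ** 2 for _, s in values_and_sigmas))
    return total, sigma

def propagate_product(a, sa, b, sb):
    """First-order propagation for y = a * b (a, b nonzero, independent):
    the relative variances add: (sy/y)^2 = (sa/a)^2 + (sb/b)^2."""
    y = a * b
    sy = abs(y) * math.sqrt((sa / a) ** 2 + (sb / b) ** 2)
    return y, sy

total, sigma = propagate_sum([(10.0, 3.0), (20.0, 4.0)])
print(total, sigma)  # -> 30.0 5.0
```

A materials balance is essentially a signed sum of measured inventories and transfers, so its uncertainty follows directly from the sum rule.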
Martínez-Legaz, Juan Enrique; Soubeyran, Antoine
2003-01-01
We present a model of learning in which agents learn from errors. If an action turns out to be an error, the agent rejects not only that action but also neighboring actions. We find that, keeping memory of past errors, the agent asymptotically reaches an acceptable solution under mild assumptions. Moreover, one can take advantage of big errors for faster learning.
An improved injector bunching geometry for ATLAS
Indian Academy of Sciences (India)
This geometry improves the handling of space charge for high-current beams, significantly increases the capture fraction into the primary rf bucket and reduces the capture fraction of the unwanted parasitic rf bucket. Total capture and transport through the PII has been demonstrated as high as 80% of the injected dc beam ...
Comparative study of the gamma spectrometry method performance in different measurement geometries
International Nuclear Information System (INIS)
Diaconescu, C.; Ichim, C.; Bujoreanu, L.; Florea, I.
2013-01-01
This paper presents the results obtained by gamma spectrometry on an aqueous liquid waste sample using different measurement geometries. A liquid waste sample with known gamma-emitter content was measured in three different geometries in order to assess the influence of the geometry on the final results. To keep measurement errors low, the gamma spectrometer was calibrated using a calibration standard with the same physical and chemical characteristics as the sample to be measured. Since the calibration was performed with the source in contact with the HPGe detector, the waste sample was also measured at detector contact for all three geometries. The influence of the measurement geometry on the results was evaluated by computing the relative errors. The measurements performed using the three geometries (250 ml plastic vial, Sarpagan box and 24 ml Tricarb vial) showed that all of them may be used to quantify the activity of gamma emitters in different types of radioactive waste. (authors)
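The relative-error comparison described amounts to a percent deviation of each geometry's measured activity from the known value; a trivial sketch (the activities below are invented for illustration; the paper's measured values are not reproduced here):

```python
def relative_errors(reference, measured):
    """Percent relative error of each geometry's measured activity
    against the known reference activity (same units, e.g. Bq)."""
    return {geom: 100.0 * (a - reference) / reference
            for geom, a in measured.items()}

# Illustrative numbers only, for a nominal 1000 Bq sample.
errs = relative_errors(1000.0, {
    "250 ml vial": 985.0,
    "Sarpagan box": 1012.0,
    "24 ml Tricarb vial": 970.0,
})
for geom, e in errs.items():
    print(f"{geom}: {e:+.1f} %")
```

A geometry is then judged acceptable if its relative error stays within the laboratory's tolerance for the waste stream in question.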
Error field considerations for BPX
International Nuclear Information System (INIS)
LaHaye, R.J.
1992-01-01
Irregularities in the position of poloidal and/or toroidal field coils in tokamaks produce resonant toroidal asymmetries in the vacuum magnetic fields. Otherwise stable tokamak discharges become non-linearly unstable to disruptive locked modes when subjected to low-level error fields. Because of the field errors, magnetic islands are produced which would not otherwise occur in tearing-mode-stable configurations; a concomitant reduction of the total confinement can result. Poloidal and toroidal asymmetries arise in the heat flux to the divertor target. In this paper, the field errors from perturbed BPX coils are used in a field-line tracing code of the BPX equilibrium to study these deleterious effects. Limits on coil irregularities for device design and fabrication are computed, along with possible correcting coils for reducing such field errors
International Nuclear Information System (INIS)
Hook, D W
2008-01-01
A geometric framework for quantum mechanics arose during the mid 1970s when authors such as Cantoni explored the notion of generalized transition probabilities, and Kibble promoted the idea that the space of pure quantum states provides a natural quantum mechanical analogue for classical phase space. This central idea can be seen easily since the projection of Schroedinger's equation from a Hilbert space into the space of pure states is a set of Hamilton's equations. Over the intervening years considerable work has been carried out by a variety of authors and a mature description of quantum mechanics in geometric terms has emerged with many applications. This current offering would seem ideally placed to review the last thirty years of progress and relate this to the most recent work in quantum entanglement. Bengtsson and Zyczkowski's beautifully illustrated volume, Geometry of Quantum States (referred to as GQS from now on) attempts to cover considerable ground in its 466 pages. Its topics range from colour theory in Chapter 1 to quantum entanglement in Chapter 15; to say that this is a whirlwind tour is, perhaps, an understatement. The use of the word 'introduction' in the subtitle of GQS might suggest to the reader that this work be viewed as a textbook, and I think that this interpretation would be incorrect. The authors have chosen to present a survey of different topics with the specific aim to introduce entanglement in geometric terms; the book is not intended as a pedagogical introduction to the geometric approach to quantum mechanics. Each of the fifteen chapters is a short, and mostly self-contained, essay on a particular aspect or application of geometry in the context of quantum mechanics, with entanglement being addressed specifically in the final chapter. The chapters fall into three classifications: those concerned with the mathematical background, those which discuss quantum theory and the foundational aspects of the geometric framework, and
Generalized Gaussian Error Calculus
Grabe, Michael
2010-01-01
For the first time in 200 years, Generalized Gaussian Error Calculus addresses a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures scrutinizing the consequences of random errors alone turned out to be obsolete. As a matter of course, the error calculus to-be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. Beyond that, random errors are asked to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...
Functional integration over geometries
International Nuclear Information System (INIS)
Mottola, E.
1995-01-01
The geometric construction of the functional integral over coset spaces M/G is reviewed. The inner product on the cotangent space of infinitesimal deformations of M defines an invariant distance and volume form, or functional integration measure on the full configuration space. Then, by a simple change of coordinates parameterizing the gauge fiber G, the functional measure on the coset space M/G is deduced. This change of integration variables leads to a Jacobian which is entirely equivalent to the Faddeev--Popov determinant of the more traditional gauge fixed approach in non-abelian gauge theory. If the general construction is applied to the case where G is the group of coordinate reparameterizations of spacetime, the continuum functional integral over geometries, i.e. metrics modulo coordinate reparameterizations may be defined. The invariant functional integration measure is used to derive the trace anomaly and effective action for the conformal part of the metric in two and four dimensional spacetime. In two dimensions this approach generates the Polyakov--Liouville action of closed bosonic non-critical string theory. In four dimensions the corresponding effective action leads to novel conclusions on the importance of quantum effects in gravity in the far infrared, and in particular, a dramatic modification of the classical Einstein theory at cosmological distance scales, signaled first by the quantum instability of classical de Sitter spacetime. Finite volume scaling relations for the functional integral of quantum gravity in two and four dimensions are derived, and comparison with the discretized dynamical triangulation approach to the integration over geometries are discussed. Outstanding unsolved problems in both the continuum definition and the simplicial approach to the functional integral over geometries are highlighted
Dooner, David B
2012-01-01
Building on the first edition published in 1995, this new edition of Kinematic Geometry of Gearing has been extensively revised and updated with new and original material. This includes the methodology for general tooth forms, the 'radius of torsure', the 'cylinder of osculation', and the 'cylindroid of torsure'; the author has also completely reworked the '3 laws of gearing': the first law is re-written to better parallel the existing 'Law of Gearing' as pioneered by Leonhard Euler, expanded from Euler's original law to encompass non-circular gears and hypoid gears, and the 2nd law of gearing describes a unique relat
Flegg, H Graham
2001-01-01
This excellent introduction to topology eases first-year math students and general readers into the subject by surveying its concepts in a descriptive and intuitive way, attempting to build a bridge from the familiar concepts of geometry to the formalized study of topology. The first three chapters focus on congruence classes defined by transformations in real Euclidean space. As the number of permitted transformations increases, these classes become larger, and their common topological properties become intuitively clear. Chapters 4-12 give a largely intuitive presentation of selected topics.
Torsional heterotic geometries
International Nuclear Information System (INIS)
Becker, Katrin; Sethi, Savdeep
2009-01-01
We construct new examples of torsional heterotic backgrounds using duality with orientifold flux compactifications. We explain how duality provides a perturbative solution to the type I/heterotic string Bianchi identity. The choice of connection used in the Bianchi identity plays an important role in the construction. We propose the existence of a much larger landscape of compact torsional geometries using string duality. Finally, we present some quantum exact metrics that correspond to NS5-branes placed on an elliptic space. These metrics describe how torus isometries are broken by NS flux.
Geometrie verstehen: statisch - kinematisch
Kroll, Ekkehard
The general stands conceptually opposite the particular. In this sense, general reflections on understanding mathematics need to be complemented by investigations into the understanding of the individual mathematical disciplines, in particular geometry. Many students have difficulties here. These stem mainly from the fact that a finished geometric construction, in its static presentation on paper, no longer reveals the individual construction steps; to retrace them, the steps must therefore additionally be recorded in a construction description.
Kendig, Keith
2015-01-01
Designed to make learning introductory algebraic geometry as easy as possible, this text is intended for advanced undergraduates and graduate students who have taken a one-year course in algebra and are familiar with complex analysis. This newly updated second edition enhances the original treatment's extensive use of concrete examples and exercises with numerous figures that have been specially redrawn in Adobe Illustrator. An introductory chapter that focuses on examples of curves is followed by a more rigorous and careful look at plane curves. Subsequent chapters explore commutative ring theory
Abhyankar, Shreeram Shankar
1964-01-01
This book provides, for use in a graduate course or for self-study by graduate students, a well-motivated treatment of several topics, especially the following: (1) algebraic treatment of several complex variables; (2) geometric approach to algebraic geometry via analytic sets; (3) survey of local algebra; (4) survey of sheaf theory. The book has been written in the spirit of Weierstrass. Power series play the dominant role. The treatment, being algebraic, is not restricted to complex numbers, but remains valid over any complete-valued field. This makes it applicable to situations arising from
Akopyan, A V
2007-01-01
The book is devoted to the properties of conics (plane curves of second degree) that can be formulated and proved using only elementary geometry. Starting with the well-known optical properties of conics, the authors move to less trivial results, both classical and contemporary. In particular, the chapter on projective properties of conics contains a detailed analysis of the polar correspondence, pencils of conics, and the Poncelet theorem. In the chapter on metric properties of conics the authors discuss, in particular, inscribed conics, normals to conics, and the Poncelet theorem for confocal
2015-01-01
This stimulating volume offers a broad collection of the principles of geometry and trigonometry and contains colorful diagrams to bring mathematical principles to life. Subjects are enriched by references to famous mathematicians and their ideas, and the stories are presented in a very comprehensible way. Readers investigate the relationships of points, lines, surfaces, and solids. They study construction methods for drawing figures, a wealth of facts about these figures, and above all, methods to prove the facts. They learn about triangle measure for circular motion, sine and cosine, tangent
REA, The Editors of
2012-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Geometry I includes methods of proof, points, lines, planes, angles, congruent angles and line segments, triangles, parallelism, quadrilaterals, geometric inequalities, and geometric
Graded geometry and Poisson reduction
Cattaneo, A S; Zambon, M
2009-01-01
The main result of [2] extends the Marsden-Ratiu reduction theorem [4] in Poisson geometry, and is proven by means of graded geometry. In this note we provide the background material about graded geometry necessary for the proof in [2]. Further, we provide an alternative algebraic proof for the main result. ©2009 American Institute of Physics
Quantization of the Schwarzschild geometry
International Nuclear Information System (INIS)
Melas, Evangelos
2013-01-01
The conditional symmetries of the reduced Einstein-Hilbert action emerging from a static, spherically symmetric geometry are used as supplementary conditions on the wave function. Based on their integrability conditions, only one of the three existing symmetries can be consistently imposed, while the unique Casimir invariant, being the product of the remaining two symmetries, is calculated as the only possible second condition on the wave function. This quadratic integral of motion is identified with the reparametrization generator, as an implication of the uniqueness of the dynamical evolution, by fixing a suitable parametrization of the r-lapse function. In this parametrization, the determinant of the supermetric plays the role of the measure. The combined Wheeler-DeWitt and linear conditional symmetry equations are analytically solved. The solutions obtained depend on the product of the two "scale factors".
Thermal geometry from CFT at finite temperature
Directory of Open Access Journals (Sweden)
Wen-Cong Gan
2016-09-01
We present how the thermal geometry emerges from CFT at finite temperature by using the truncated entanglement renormalization network, the cMERA. For the case of 2d CFT, the reduced geometry is the BTZ black hole or the thermal AdS, as expected. In order to determine which spacetime is preferred, we propose a cMERA description of the Hawking-Page phase transition. Our proposal is in agreement with the picture of the recently proposed surface/state correspondence.
Thermal geometry from CFT at finite temperature
Energy Technology Data Exchange (ETDEWEB)
Gan, Wen-Cong, E-mail: ganwencong@gmail.com [Department of Physics, Nanchang University, Nanchang 330031 (China); Center for Relativistic Astrophysics and High Energy Physics, Nanchang University, Nanchang 330031 (China); Shu, Fu-Wen, E-mail: shufuwen@ncu.edu.cn [Department of Physics, Nanchang University, Nanchang 330031 (China); Center for Relativistic Astrophysics and High Energy Physics, Nanchang University, Nanchang 330031 (China); Wu, Meng-He, E-mail: menghewu.physik@gmail.com [Department of Physics, Nanchang University, Nanchang 330031 (China); Center for Relativistic Astrophysics and High Energy Physics, Nanchang University, Nanchang 330031 (China)
2016-09-10
We present how the thermal geometry emerges from CFT at finite temperature by using the truncated entanglement renormalization network, the cMERA. For the case of 2d CFT, the reduced geometry is the BTZ black hole or the thermal AdS, as expected. In order to determine which spacetime is preferred, we propose a cMERA description of the Hawking-Page phase transition. Our proposal is in agreement with the picture of the recently proposed surface/state correspondence.
Geometry of the local equivalence of states
Energy Technology Data Exchange (ETDEWEB)
Sawicki, A; Kus, M, E-mail: assawi@cft.edu.pl, E-mail: marek.kus@cft.edu.pl [Center for Theoretical Physics, Polish Academy of Sciences, Al Lotnikow 32/46, 02-668 Warszawa (Poland)
2011-12-09
We present a description of locally equivalent states in terms of symplectic geometry. Using the moment map between local orbits in the space of states and coadjoint orbits of the local unitary group, we reduce the problem of local unitary equivalence to an easy part consisting of identifying the proper coadjoint orbit and a harder problem of the geometry of fibers of the moment map. We give a detailed analysis of the properties of orbits of 'equally entangled states'. In particular, we show connections between certain symplectic properties of orbits such as their isotropy and coisotropy with effective criteria of local unitary equivalence. (paper)
Dual Numbers Approach in Multiaxis Machines Error Modeling
Directory of Open Access Journals (Sweden)
Jaroslav Hrdina
2014-01-01
Multiaxis machine error modeling is set in the context of modern differential geometry and linear algebra. We apply special classes of matrices over dual numbers and propose a generalization of this concept by means of general Weil algebras. We show that the classification of geometric errors follows directly from the algebraic properties of matrices over dual numbers, and thus the calculus of dual numbers is the proper tool for the methodology of multiaxis machine error modeling.
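The core idea behind the abstract's dual-number calculus can be illustrated with a minimal sketch. The class below is an assumption of mine, not the authors' code: it implements dual numbers a + b·ε with ε² = 0, whose ε part automatically carries a first-order geometric error through arithmetic.

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; the eps slot carries first-order error."""

    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps

    def __add__(self, other):
        return Dual(self.real + other.real, self.eps + other.eps)

    def __mul__(self, other):
        # (a + b*eps)(c + d*eps) = a*c + (a*d + b*c)*eps, since eps**2 = 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

    def __repr__(self):
        return f"{self.real} + {self.eps}eps"


# Nominal axis positions with small geometric errors stored in the eps slot:
x = Dual(2.0, 0.01)    # 2.0 with error +0.01
y = Dual(3.0, -0.02)   # 3.0 with error -0.02
p = x * y              # the first-order error propagates automatically
# p.real == 6.0 and p.eps == 2.0*(-0.02) + 0.01*3.0 == -0.01
```

Because ε² vanishes identically, products of two error terms drop out on their own, which is exactly the first-order error propagation the paper exploits with matrices over dual numbers.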
Simulating Irregular Source Geometries for Ionian Plumes
McDoniel, W. J.; Goldstein, D. B.; Varghese, P. L.; Trafton, L. M.; Buchta, D. A.; Freund, J.; Kieffer, S. W.
2011-05-01
Volcanic plumes on Io represent a complex rarefied flow into a near-vacuum in the presence of gravity. A 3D Direct Simulation Monte Carlo (DSMC) method is used to investigate the gas dynamics of such plumes, with a focus on the effects of source geometry on far-field deposition patterns. A rectangular slit and a semicircular half annulus are simulated to illustrate general principles, especially the effects of vent curvature on deposition ring structure. Then two possible models for the giant plume Pele are presented. One is a curved line source corresponding to an IR image of a particularly hot region in the volcano's caldera and the other is a large area source corresponding to the entire caldera. The former is seen to produce the features seen in observations of Pele's ring, but with an error in orientation. The latter corrects the error in orientation, but loses some structure. A hybrid simulation of 3D slit flow is also discussed.
Simulating Irregular Source Geometries for Ionian Plumes
International Nuclear Information System (INIS)
McDoniel, W. J.; Goldstein, D. B.; Varghese, P. L.; Trafton, L. M.; Buchta, D. A.; Freund, J.; Kieffer, S. W.
2011-01-01
Volcanic plumes on Io represent a complex rarefied flow into a near-vacuum in the presence of gravity. A 3D Direct Simulation Monte Carlo (DSMC) method is used to investigate the gas dynamics of such plumes, with a focus on the effects of source geometry on far-field deposition patterns. A rectangular slit and a semicircular half annulus are simulated to illustrate general principles, especially the effects of vent curvature on deposition ring structure. Then two possible models for the giant plume Pele are presented. One is a curved line source corresponding to an IR image of a particularly hot region in the volcano's caldera and the other is a large area source corresponding to the entire caldera. The former is seen to produce the features seen in observations of Pele's ring, but with an error in orientation. The latter corrects the error in orientation, but loses some structure. A hybrid simulation of 3D slit flow is also discussed.
Error analysis of the crystal orientations obtained by the dictionary approach to EBSD indexing.
Ram, Farangis; Wright, Stuart; Singh, Saransh; De Graef, Marc
2017-10-01
The efficacy of the dictionary approach to Electron Back-Scatter Diffraction (EBSD) indexing was evaluated through the analysis of the error in the retrieved crystal orientations. EBSPs simulated by the Callahan-De Graef forward model were used for this purpose. Patterns were noised, distorted, and binned prior to dictionary indexing. Patterns with a high level of noise, with optical distortions, and with a 25 × 25 pixel size, when the error in projection center was 0.7% of the pattern width and the error in specimen tilt was 0.8°, were indexed with a 0.8° mean error in orientation. The same patterns, but 60 × 60 pixel in size, were indexed by the standard 2D Hough transform based approach with almost the same orientation accuracy. Optimal detection parameters in the Hough space were obtained by minimizing the orientation error. It was shown that if the error in detector geometry can be reduced to 0.1% in projection center and 0.1° in specimen tilt, the dictionary approach can retrieve a crystal orientation with a 0.2° accuracy. Copyright © 2017 Elsevier B.V. All rights reserved.
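The dictionary approach described in this abstract is, at its core, a nearest-neighbor search: an experimental pattern is matched against a precomputed dictionary of simulated patterns. The sketch below is my own illustration under stated assumptions (random vectors stand in for the Callahan-De Graef forward-model simulations, and the noise level is arbitrary), not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dictionary: each row is a flattened, precomputed simulated pattern
# for one candidate crystal orientation (a stand-in for forward-model output).
n_orientations, n_pixels = 500, 25 * 25
dictionary = rng.standard_normal((n_orientations, n_pixels))
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

# A noisy experimental pattern whose true orientation is known here:
true_idx = 123
experimental = dictionary[true_idx] + 0.05 * rng.standard_normal(n_pixels)
experimental /= np.linalg.norm(experimental)

# Dictionary indexing: choose the orientation whose simulated pattern has the
# highest normalized dot product with the experimental pattern.
best = int(np.argmax(dictionary @ experimental))
```

Real implementations compare against hundreds of thousands of orientations and report the misorientation angle between the retrieved and true orientations, which is the error quantity the abstract analyzes.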
Presentation of geometries and transient results of TRAC-calculations
International Nuclear Information System (INIS)
Lutz, A.; Lang, U.; Ruehle, R.
1985-02-01
The computer code TRAC is used to analyze the transient behaviour of nuclear reactors. The input of a TRAC calculation, as well as the produced result files, serve for the graphical presentation of the geometries and transient results. This supports the search for errors during input generation and the understanding of complex processes through dynamic, colour presentation of calculational results. (orig.)
[Medical errors: inevitable but preventable].
Giard, R W
2001-10-27
Medical errors are increasingly reported in the lay press. Studies have shown dramatic error rates of 10 percent or even higher. From a methodological point of view, studying the frequency and causes of medical errors is far from simple. Clinical decisions on diagnostic or therapeutic interventions are always taken within a clinical context. Reviewing outcomes of interventions without taking into account both the intentions and the arguments for a particular action will limit the conclusions from a study on the rate and preventability of errors. The interpretation of the preventability of medical errors is fraught with difficulties and probably highly subjective. Blaming the doctor personally does not do justice to the actual situation and especially the organisational framework. Attention to and improvement of the organisational aspects of error are far more important than litigating against the person. To err is and will remain human, and if we want to reduce the incidence of faults we must be able to learn from our mistakes. That requires an open attitude towards medical mistakes, a continuous effort in their detection, a sound analysis and, where feasible, the institution of preventive measures.
Bochnak, Jacek; Roy, Marie-Françoise
1998-01-01
This book is a systematic treatment of real algebraic geometry, a subject that has strong interrelation with other areas of mathematics: singularity theory, differential topology, quadratic forms, commutative algebra, model theory, complexity theory etc. The careful and clearly written account covers both basic concepts and up-to-date research topics. It may be used as text for a graduate course. The present edition is a substantially revised and expanded English version of the book "Géometrie algébrique réelle" originally published in French, in 1987, as Volume 12 of ERGEBNISSE. Since the publication of the French version the theory has made advances in several directions. Many of these are included in this English version. Thus the English book may be regarded as a completely new treatment of the subject.
Critique of information geometry
International Nuclear Information System (INIS)
Skilling, John
2014-01-01
As applied to probability, information geometry fails because probability distributions do not form a metric space. Probability theory rests on a compelling foundation of elementary symmetries, which also support information (aka minus entropy, Kullback-Leibler) H(p;q) as the unique measure of divergence from source probability distribution q to destination p. Because the only compatible connective H is from ≠ to asymmetric, H(p;q) ≠ H(q;p), there can be no compatible geometrical distance (which would necessarily be from = to symmetric). Hence there is no distance relationship compatible with the structure of probability theory. Metrics g and densities sqrt(det(g)) interpreted as prior probabilities follow from the definition of distance, and must fail likewise. Various metrics and corresponding priors have been proposed, Fisher's being the most popular, but all must behave unacceptably. This is illustrated with simple counter-examples.
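The asymmetry the abstract relies on is easy to exhibit numerically. The following is a minimal sketch (the distributions p and q are arbitrary examples of mine): computing the Kullback-Leibler divergence in both directions gives two different values, so no symmetric distance can reproduce it.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence H(p;q) = sum_i p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.4, 0.1]
q = [0.2, 0.3, 0.5]

d_pq = kl(p, q)  # divergence from source q to destination p
d_qp = kl(q, p)  # the reversed direction
# d_pq != d_qp: the divergence is directional, unlike any metric distance
```

Both values are non-negative and vanish only when p equals q, yet they disagree with each other, which is precisely the from/to asymmetry that rules out a compatible geometrical distance.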
International Nuclear Information System (INIS)
Correa, Diego H.; Silva, Guillermo A.
2008-01-01
We discuss how geometrical and topological aspects of certain (1/2)-BPS type IIB geometries are captured by their dual operators in N = 4 Super Yang-Mills theory. The type IIB solutions are characterized by arbitrary droplet pictures in a plane and we consider, in particular, axially symmetric droplets. The 1-loop anomalous dimension of the dual gauge theory operators probed with single traces is described by some bosonic lattice Hamiltonians. These Hamiltonians are shown to encode the topology of the droplets. In appropriate BMN limits, the spectrum of the Hamiltonians reproduces the spectrum of near-BPS string excitations propagating along each of the individual edges of the droplet. We also study semiclassical regimes for the Hamiltonians. For droplets having disconnected constituents, the Hamiltonian admits different complementary semiclassical descriptions, each one replicating the semiclassical description for closed strings extending in each of the constituents.
Emergent geometry of membranes
Energy Technology Data Exchange (ETDEWEB)
Badyn, Mathias Hudoba de; Karczmarek, Joanna L.; Sabella-Garnier, Philippe; Yeh, Ken Huai-Che [Department of Physics and Astronomy, University of British Columbia,6224 Agricultural Road, Vancouver (Canada)
2015-11-13
In the work http://dx.doi.org/10.1103/PhysRevD.86.086001, a surface embedded in flat ℝ^3 is associated to any three hermitian matrices. We study this emergent surface when the matrices are large, by constructing coherent states corresponding to points in the emergent geometry. We find that the original matrices determine not only the shape of the emergent surface, but also a unique Poisson structure. We prove that commutators of matrix operators correspond to Poisson brackets. Through our construction, we can realize arbitrary noncommutative membranes: for example, we examine a round sphere with a non-spherically symmetric Poisson structure. We also give a natural construction for a noncommutative torus embedded in ℝ^3. Finally, we make remarks about area and find matrix equations for minimal area surfaces.
Energy Technology Data Exchange (ETDEWEB)
Elliott, C.J.; McVey, B. (Los Alamos National Lab., NM (USA)); Quimby, D.C. (Spectra Technology, Inc., Bellevue, WA (USA))
1990-01-01
The level of field errors in an FEL is an important determinant of its performance. We have computed 3D performance of a large laser subsystem subjected to field errors of various types. These calculations have been guided by simple models such as SWOOP. The technique of choice is utilization of the FELEX free electron laser code that now possesses extensive engineering capabilities. Modeling includes the ability to establish tolerances of various types: fast and slow scale field bowing, field error level, beam position monitor error level, gap errors, defocusing errors, energy slew, displacement and pointing errors. Many effects of these errors on relative gain and relative power extraction are displayed and are the essential elements of determining an error budget. The random errors also depend on the particular random number seed used in the calculation. The simultaneous display of the performance versus error level of cases with multiple seeds illustrates the variations attributable to stochasticity of this model. All these errors are evaluated numerically for comprehensive engineering of the system. In particular, gap errors are found to place requirements beyond mechanical tolerances of ±25 μm, and amelioration of these may occur by a procedure utilizing direct measurement of the magnetic fields at assembly time. 4 refs., 12 figs.
Geometry through history Euclidean, hyperbolic, and projective geometries
Dillon, Meighan I
2018-01-01
Presented as an engaging discourse, this textbook invites readers to delve into the historical origins and uses of geometry. The narrative traces the influence of Euclid’s system of geometry, as developed in his classic text The Elements, through the Arabic period, the modern era in the West, and up to twentieth century mathematics. Axioms and proof methods used by mathematicians from those periods are explored alongside the problems in Euclidean geometry that lead to their work. Students cultivate skills applicable to much of modern mathematics through sections that integrate concepts like projective and hyperbolic geometry with representative proof-based exercises. For its sophisticated account of ancient to modern geometries, this text assumes only a year of college mathematics as it builds towards its conclusion with algebraic curves and quaternions. Euclid’s work has affected geometry for thousands of years, so this text has something to offer to anyone who wants to broaden their appreciation for the...
Field error reduction experiment on the REPUTE-1 RFP device
International Nuclear Information System (INIS)
Toyama, H.; Shinohara, S.; Yamagishi, K.
1989-01-01
The vacuum chamber of the RFP device REPUTE-1 is a welded structure using 18 sets of 1 mm thick Inconel bellows (inner minor radius 22 cm) and 2.4 mm thick port segments arranged in toroidal geometry as shown in Fig. 1. The vacuum chamber is surrounded by 5 mm thick stainless steel shells. The time constant of the shell is 1 ms for vertical field penetration. The pulse length in REPUTE-1 is so far 3.2 ms (about 3 times longer than shell skin time). The port bypass plates have been attached as shown in Fig. 2 to reduce field errors so that the pulse length becomes longer and the loop voltage becomes lower. (author) 5 refs., 4 figs
Eliminating US hospital medical errors.
Kumar, Sameer; Steinebach, Marc
2008-01-01
Healthcare costs in the USA have continued to rise steadily since the 1980s. Medical errors are one of the major causes of deaths and injuries of thousands of patients every year, contributing to soaring healthcare costs. The purpose of this study is to examine what has been done to deal with the medical-error problem in the last two decades and present a closed-loop mistake-proof operation system for surgery processes that would likely eliminate preventable medical errors. The design method used is a combination of creating a service blueprint, implementing the six sigma DMAIC cycle, developing cause-and-effect diagrams as well as devising poka-yokes in order to develop a robust surgery operation process for a typical US hospital. In the improve phase of the six sigma DMAIC cycle, a number of poka-yoke techniques are introduced to prevent typical medical errors (identified through cause-and-effect diagrams) that may occur in surgery operation processes in US hospitals. It is the authors' assertion that implementing the new service blueprint along with the poka-yokes will likely result in the current medical error rate improving significantly, to the six-sigma level. Additionally, designing as many redundancies as possible in the delivery of care will help reduce medical errors. Primary healthcare providers should strongly consider investing in adequate doctor and nurse staffing, and improving their education related to the quality of service delivery to minimize clinical errors. This will lead to higher fixed costs, especially in the shorter time frame. This paper focuses additional attention needed to make a sound technical and business case for implementing six sigma tools to eliminate medical errors that will enable hospital managers to increase their hospital's profitability in the long run and also ensure patient safety.
Analysis of errors in forensic science
Directory of Open Access Journals (Sweden)
Mingxiao Du
2017-01-01
Reliability of expert testimony is one of the foundations of judicial justice. Both expert bias and scientific errors affect the reliability of expert opinion, which in turn affects the trustworthiness of the findings of fact in legal proceedings. Expert bias can be eliminated by replacing experts; however, it may be more difficult to eliminate scientific errors. From the perspective of statistics, errors in the operation of forensic science include systematic errors, random errors, and gross errors. In general, process repetition and abiding by the standard ISO/IEC 17025:2005, general requirements for the competence of testing and calibration laboratories, during operation are common measures used to reduce errors that originate from experts and equipment, respectively. For example, to reduce gross errors, the laboratory can ensure that a test is repeated several times by different experts. In applying forensic principles and methods, the Federal Rules of Evidence 702 mandate that judges consider factors such as peer review, to ensure the reliability of the expert testimony. As the scientific principles and methods may not undergo professional review by specialists in a certain field, peer review serves as an exclusive standard. This study also examines two types of statistical errors. As false-positive errors involve a higher possibility of unfair decision-making, they should receive more attention than false-negative errors.
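The distinction the abstract draws between random and systematic errors can be demonstrated with a short simulation. This is a sketch of mine (the true value, bias, and noise level are invented numbers): repeating a measurement many times averages away the random error but leaves the systematic bias untouched.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 10.0
BIAS = 0.5  # systematic error: repetition does NOT remove it

def measure():
    # One measurement = truth + systematic bias + zero-mean random error.
    return TRUE_VALUE + BIAS + random.gauss(0.0, 1.0)

single = measure()
repeated = statistics.mean(measure() for _ in range(10_000))

# The mean of many repetitions converges to TRUE_VALUE + BIAS, not TRUE_VALUE:
# the random component shrinks roughly as 1/sqrt(n), the systematic offset stays.
```

This is why the abstract pairs repetition (against random and gross errors) with calibration standards such as ISO/IEC 17025 (against systematic, equipment-borne errors): neither remedy substitutes for the other.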
Prescription Errors in Psychiatry
African Journals Online (AJOL)
Arun Kumar Agnihotri
clinical pharmacists in detecting errors before they have a (sometimes serious) clinical impact should not be underestimated. Research on medication error in mental health care is limited. .... participation in ward rounds and adverse drug.
On organizing principles of discrete differential geometry. Geometry of spheres
International Nuclear Information System (INIS)
Bobenko, Alexander I; Suris, Yury B
2007-01-01
Discrete differential geometry aims to develop discrete equivalents of the geometric notions and methods of classical differential geometry. This survey contains a discussion of the following two fundamental discretization principles: the transformation group principle (smooth geometric objects and their discretizations are invariant with respect to the same transformation group) and the consistency principle (discretizations of smooth parametrized geometries can be extended to multidimensional consistent nets). The main concrete geometric problem treated here is discretization of curvature-line parametrized surfaces in Lie geometry. Systematic use of the discretization principles leads to a discretization of curvature-line parametrization which unifies circular and conical nets.
Higher geometry an introduction to advanced methods in analytic geometry
Woods, Frederick S
2005-01-01
For students of mathematics with a sound background in analytic geometry and some knowledge of determinants, this volume has long been among the best available expositions of advanced work on projective and algebraic geometry. Developed from Professor Woods' lectures at the Massachusetts Institute of Technology, it bridges the gap between intermediate studies in the field and highly specialized works.With exceptional thoroughness, it presents the most important general concepts and methods of advanced algebraic geometry (as distinguished from differential geometry). It offers a thorough study
An introduction to incidence geometry
De Bruyn, Bart
2016-01-01
This book gives an introduction to the field of Incidence Geometry by discussing the basic families of point-line geometries and introducing some of the mathematical techniques that are essential for their study. The families of geometries covered in this book include among others the generalized polygons, near polygons, polar spaces, dual polar spaces and designs. Also the various relationships between these geometries are investigated. Ovals and ovoids of projective spaces are studied and some applications to particular geometries will be given. A separate chapter introduces the necessary mathematical tools and techniques from graph theory. This chapter itself can be regarded as a self-contained introduction to strongly regular and distance-regular graphs. This book is essentially self-contained, only assuming the knowledge of basic notions from (linear) algebra and projective and affine geometry. Almost all theorems are accompanied with proofs and a list of exercises with full solutions is given at the end...
International Nuclear Information System (INIS)
Buescher, R.
2005-01-01
Casimir interactions are interactions induced by quantum vacuum fluctuations and thermal fluctuations of the electromagnetic field. Using a path integral quantization for the gauge field, an effective Gaussian action will be derived which is the starting point to compute Casimir forces between macroscopic objects analytically and numerically. No assumptions about the independence of the material and shape dependent contributions to the interaction are made. We study the limit of flat surfaces in further detail and obtain a concise derivation of Lifshitz' theory of molecular forces. For the case of ideally conducting boundaries, the Gaussian action will be calculated explicitly. Both limiting cases are also discussed within the framework of a scalar field quantization approach, which is applicable for translationally invariant geometries. We develop a non-perturbative approach to calculate the Casimir interaction from the Gaussian action for periodically deformed and ideally conducting objects numerically. The obtained results reveal two different scaling regimes for the Casimir force as a function of the distance between the objects, their deformation wavelength and amplitude. The results confirm that the interaction is non-additive, especially in the presence of strong geometric deformations. Furthermore, the numerical approach is extended to calculate lateral Casimir forces. The results are consistent with the results of the proximity-force approximation for large deformation wavelengths. A qualitatively different behaviour between the normal and lateral force is revealed. We also establish a relation between the boundary-induced change of the density of states for the scalar Helmholtz equation and the Casimir interaction using the path integral method. For statically deformed boundaries, this relation can be expressed as a novel trace formula, which is formally similar to the so-called Krein-Friedel-Lloyd formula. While the latter formula describes the
Initiation to global Finslerian geometry
Akbar-Zadeh, Hassan
2006-01-01
After a brief description of the evolution of thinking on Finslerian geometry starting from Riemann, Finsler, Berwald and Elie Cartan, the book gives a clear and precise treatment of this geometry. The first three chapters develop the basic notions and methods, introduced by the author, to reach the global problems in Finslerian Geometry. The next five chapters are independent of each other, and deal with among others the geometry of generalized Einstein manifolds, the classification of Finslerian manifolds of constant sectional curvatures. They also give a treatment of isometric, affine, p
Spacecraft and propulsion technician error
Schultz, Daniel Clyde
Commercial aviation and commercial space similarly launch, fly, and land passenger vehicles. Unlike aviation, the U.S. government has not established maintenance policies for commercial space. This study conducted a mixed methods review of 610 U.S. space launches from 1984 through 2011, which included 31 failures. An analysis of the failure causal factors showed that human error accounted for 76% of those failures, which included workmanship error accounting for 29% of the failures. With the imminent future of commercial space travel, the increased potential for the loss of human life demands that changes be made to the standardized procedures, training, and certification to reduce human error and failure rates. Several recommendations were made by this study to the FAA's Office of Commercial Space Transportation, space launch vehicle operators, and maintenance technician schools in an effort to increase the safety of the space transportation passengers.
Directory of Open Access Journals (Sweden)
Šárka Nedomová
2013-01-01
Full Text Available Precise quantification of the profile of an egg can provide a powerful tool for the analysis of egg shape in various biological problems. A new approach to the geometry of an ostrich egg profile is presented here, based on analysing the egg's digital photograph with edge detection techniques. The points obtained on the eggshell contour are fitted by a Fourier series. The resulting equations describing the egg profile have been used to calculate radii of curvature. The radii of curvature at the important points of the egg profile (sharp end, blunt end and point of maximum thickness) are independent of the egg shape index. Exact values of the egg surface area and the egg volume have been obtained; these quantities are also independent of the egg shape index, and they can be estimated from simplified equations expressed in terms of the egg length, L, and its width, B. The surface area of the eggshells also exhibits good correlation with the egg's long circumference length. Some limitations of the most commonly used procedures are also shown.
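The pipeline this abstract describes, fitting the extracted contour points with a Fourier series and then differentiating the fit to obtain radii of curvature, can be sketched in polar form. This is a simplified illustration, not the authors' code; the harmonic count and the polar parameterization are assumptions.

```python
import numpy as np

def fit_fourier(theta, r, n_harmonics=6):
    """Least-squares fit of r(theta) = a0 + sum_k [a_k cos(k theta) + b_k sin(k theta)]."""
    cols = [np.ones_like(theta)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(k * theta))
        cols.append(np.sin(k * theta))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, r, rcond=None)
    return coef

def radius_of_curvature(theta, coef, n_harmonics=6):
    """Radius of curvature of a polar curve: R = (r^2 + r'^2)^(3/2) / |r^2 + 2 r'^2 - r r''|."""
    r = coef[0] * np.ones_like(theta)
    dr = np.zeros_like(theta)
    d2r = np.zeros_like(theta)
    for k in range(1, n_harmonics + 1):
        a, b = coef[2 * k - 1], coef[2 * k]
        r += a * np.cos(k * theta) + b * np.sin(k * theta)
        dr += -a * k * np.sin(k * theta) + b * k * np.cos(k * theta)
        d2r += -a * k**2 * np.cos(k * theta) - b * k**2 * np.sin(k * theta)
    return (r**2 + dr**2) ** 1.5 / np.abs(r**2 + 2 * dr**2 - r * d2r)
```

A quick sanity check is a circle of radius 2: the fit reduces to the constant term and the computed radius of curvature is 2 everywhere.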
Nonperturbative quantum geometries
International Nuclear Information System (INIS)
Jacobson, T.; California Univ., Santa Barbara; Smolin, L.; California Univ., Santa Barbara
1988-01-01
Using the self-dual representation of quantum general relativity, based on Ashtekar's new phase space variables, we present an infinite dimensional family of quantum states of the gravitational field which are exactly annihilated by the hamiltonian constraint. These states are constructed from Wilson loops for Ashtekar's connection (which is the spatial part of the left handed spin connection). We propose a new regularization procedure which allows us to evaluate the action of the hamiltonian constraint on these states. Infinite linear combinations of these states which are formally annihilated by the diffeomorphism constraints as well are also described. These are explicit examples of physical states of the gravitational field - and for the compact case are exact zero eigenstates of the hamiltonian of quantum general relativity. Several different approaches to constructing diffeomorphism invariant states in the self dual representation are also described. The physical interpretation of the states described here is discussed. However, as we do not yet know the physical inner product, any interpretation is at this stage speculative. Nevertheless, this work suggests that quantum geometry at Planck scales might be much simpler when explored in terms of the parallel transport of left-handed spinors than when explored in terms of the three metric. (orig.)
Bhatia, Rajendra
2013-01-01
This book is an outcome of the Indo-French Workshop on Matrix Information Geometries (MIG): Applications in Sensor and Cognitive Systems Engineering, held at Ecole Polytechnique and the Thales Research and Technology Center, Palaiseau, France, on February 23-25, 2011. The workshop was generously funded by the Indo-French Centre for the Promotion of Advanced Research (IFCPAR). During the event, 22 renowned invited French or Indian speakers gave lectures on their areas of expertise within the field of matrix analysis or processing. From these talks, a total of 17 original contribution or state-of-the-art chapters have been assembled in this volume. All articles were thoroughly peer-reviewed and improved according to the suggestions of the international referees. The 17 contributions are organized in three parts: (1) State-of-the-art surveys & original matrix theory work, (2) Advanced matrix theory for radar processing, and (3) Matrix-based signal processing applications.
Random and Systematic Errors Share in Total Error of Probes for CNC Machine Tools
Directory of Open Access Journals (Sweden)
Adam Wozniak
2018-03-01
Full Text Available Probes for CNC machine tools, like every measurement device, have accuracy limited by random errors and by systematic errors. Random errors of these probes are described by a parameter called unidirectional repeatability. Manufacturers of probes for CNC machine tools usually specify only this parameter, while parameters describing systematic errors of the probes, such as pre-travel variation or triggering radius variation, are rarely used. Systematic errors of the probes, linked to the differences in pre-travel values for different measurement directions, can be corrected or compensated, but this is not a widely used procedure. In this paper, the shares of systematic errors and random errors in the total error of exemplary probes are determined. In the case of simple, kinematic probes, systematic errors are much greater than random errors, so compensation would significantly reduce the probing error. Moreover, it is shown that in the case of kinematic probes the commonly specified unidirectional repeatability is significantly better than 2D performance. However, in the case of a more precise strain-gauge probe, systematic errors are of the same order as random errors, which means that error correction or compensation would not, in this case, yield any significant benefits.
Simultaneous calibration phantom commission and geometry calibration in cone beam CT
Xu, Yuan; Yang, Shuai; Ma, Jianhui; Li, Bin; Wu, Shuyu; Qi, Hongliang; Zhou, Linghong
2017-09-01
Geometry calibration is a vital step for describing the geometry of a cone beam computed tomography (CBCT) system and is a prerequisite for CBCT reconstruction. In current methods, calibration phantom commission and geometry calibration are divided into two independent tasks. Small errors in ball-bearing (BB) positioning in the phantom-making step will severely degrade the quality of phantom calibration. To solve this problem, we propose an integrated method to simultaneously realize geometry phantom commission and geometry calibration. Instead of assuming the accuracy of the geometry phantom, the integrated method considers BB centers in the phantom as an optimized parameter in the workflow. Specifically, an evaluation phantom and the corresponding evaluation contrast index are used to evaluate geometry artifacts for optimizing the BB coordinates in the geometry phantom. After utilizing particle swarm optimization, the CBCT geometry and BB coordinates in the geometry phantom are calibrated accurately and are then directly used for the next geometry calibration task in other CBCT systems. To evaluate the proposed method, both qualitative and quantitative studies were performed on simulated and realistic CBCT data. The spatial resolution of reconstructed images using dental CBCT can reach up to 15 line pair cm-1. The proposed method is also superior to the Wiesent method in experiments. This paper shows that the proposed method is attractive for simultaneous and accurate geometry phantom commission and geometry calibration.
Geometrical error calibration in reflective surface testing based on reverse Hartmann test
Gong, Zhidong; Wang, Daodang; Xu, Ping; Wang, Chao; Liang, Rongguang; Kong, Ming; Zhao, Jun; Mo, Linhai; Mo, Shuhui
2017-08-01
In fringe-illumination deflectometry based on a reverse-Hartmann-test configuration, ray tracing of the modeled testing system is performed to reconstruct the test surface error. Careful calibration of the system geometry is required to achieve high testing accuracy. To realize high-precision surface testing with the reverse Hartmann test, a computer-aided geometrical error calibration method is proposed. The aberrations corresponding to various geometrical errors are studied. Using the aberration weights for the various geometrical errors, a computer-aided optimization of the system geometry with iterative ray tracing is carried out to calibrate the geometrical errors, and accuracy on the order of a subnanometer is achieved.
Spur gears: Optimal geometry, methods for generation and Tooth Contact Analysis (TCA) program
Litvin, Faydor L.; Zhang, Jiao
1988-01-01
The contents of this report include the following: (1) development of optimal geometry for crowned spur gears; (2) methods for their generation; and (3) tooth contact analysis (TCA) computer programs for the analysis of meshing and bearing contact on the crowned spur gears. The method developed for synthesis is used for the determination of the optimal geometry for crowned pinion surface and is directed to reduce the sensitivity of the gears to misalignment, localize the bearing contact, and guarantee the favorable shape and low level of the transmission errors. A new method for the generation of the crowned pinion surface has been proposed. This method is based on application of the tool with a surface of revolution that slightly deviates from a regular cone surface. The tool can be used as a grinding wheel or as a shaver. The crowned pinion surface can also be generated by a generating plane whose motion is provided by an automatic grinding machine controlled by a computer. The TCA program simulates the meshing and bearing contact of the misaligned gears. The transmission errors are also determined.
Error-correcting pairs for a public-key cryptosystem
International Nuclear Information System (INIS)
Pellikaan, Ruud; Márquez-Corbella, Irene
2017-01-01
Code-based Cryptography (CBC) is a powerful and promising alternative for quantum-resistant cryptography. Together with lattice-based, multivariate and hash-based cryptography, it is one of the principal available techniques for post-quantum cryptography. CBC was first introduced by McEliece, who designed one of the most efficient public-key encryption schemes, with exceptionally strong security guarantees and other desirable properties that still resist attacks based on the Quantum Fourier Transform and Amplitude Amplification. The original proposal, which remains unbroken, was based on binary Goppa codes. Later, several families of codes were proposed in order to reduce the key size; some of these alternatives have already been broken. One of the main requirements of a code-based cryptosystem is having high-performance t-bounded decoding algorithms, which is achieved when the code has a t-error-correcting pair (ECP). Indeed, those McEliece schemes that use GRS, BCH, Goppa and algebraic geometry codes are in fact using an error-correcting pair as a secret key. That is, the security of these public-key cryptosystems is based not only on the inherent intractability of bounded distance decoding but also on the assumption that it is difficult to efficiently retrieve an error-correcting pair. In this paper, the class of codes with a t-ECP is proposed for the McEliece cryptosystem. Moreover, we study the hardness of distinguishing arbitrary codes from those having a t-error-correcting pair. (paper)
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Surrogate Modeling for Geometry Optimization
DEFF Research Database (Denmark)
Rojas Larrazabal, Marielba de la Caridad; Abraham, Yonas; Holzwarth, Natalie
2009-01-01
A new approach for optimizing the nuclear geometry of an atomic system is described. Instead of the original expensive objective function (energy functional), a small number of simpler surrogates is used.
Kaehler geometry and SUSY mechanics
International Nuclear Information System (INIS)
Bellucci, Stefano; Nersessian, Armen
2001-01-01
We present two examples of SUSY mechanics related with Kaehler geometry. The first system is the N = 4 supersymmetric one-dimensional sigma-model proposed in hep-th/0101065. Another system is the N = 2 SUSY mechanics whose phase space is the external algebra of an arbitrary Kaehler manifold. The relation of these models with antisymplectic geometry is discussed
A prediction for bubbling geometries
Okuda, Takuya
2007-01-01
We study the supersymmetric circular Wilson loops in N=4 Yang-Mills theory. Their vacuum expectation values are computed in the parameter region that admits smooth bubbling geometry duals. The results are a prediction for the supergravity action evaluated on the bubbling geometries for Wilson loops.
Molecular motion in restricted geometries
Indian Academy of Sciences (India)
Molecular dynamics in restricted geometries is known to exhibit anomalous behaviour. Diffusion, translational or rotational, of molecules is altered significantly on confinement in restricted geometries. Quasielastic neutron scattering (QENS) offers a unique possibility of studying molecular motion in such systems. Both time ...
Mathematical model of geometry and fibrous structure of the heart.
Nielsen, P M; Le Grice, I J; Smaill, B H; Hunter, P J
1991-04-01
We developed a mathematical representation of ventricular geometry and muscle fiber organization using three-dimensional finite elements referred to a prolate spheroid coordinate system. Within elements, fields are approximated using basis functions with associated parameters defined at the element nodes. Four parameters per node are used to describe ventricular geometry. The radial coordinate is interpolated using cubic Hermite basis functions that preserve slope continuity, while the angular coordinates are interpolated linearly. Two further nodal parameters describe the orientation of myocardial fibers. The orientation of fibers within coordinate planes bounded by epicardial and endocardial surfaces is interpolated linearly, with transmural variation given by cubic Hermite basis functions. Left and right ventricular geometry and myocardial fiber orientations were characterized for a canine heart arrested in diastole and fixed at zero transmural pressure. The geometry was represented by a 24-element ensemble with 41 nodes. Nodal parameters fitted using least squares provided a realistic description of ventricular epicardial [root mean square (RMS) error less than 0.9 mm] and endocardial (RMS error less than 2.6 mm) surfaces. Measured fiber fields were also fitted (RMS error less than 17 degrees) with a 60-element, 99-node mesh obtained by subdividing the 24-element mesh. These methods provide a compact and accurate anatomic description of the ventricles suitable for use in finite element stress analysis, simulation of cardiac electrical activation, and other cardiac field modeling problems.
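The cubic Hermite basis functions mentioned in this abstract, which preserve both field value and slope across element boundaries, take a standard form. A minimal sketch (illustrative only, not the authors' finite-element code) for one-dimensional interpolation within an element:

```python
def hermite_basis(s):
    """Cubic Hermite basis on local coordinate s in [0, 1].

    h00/h01 weight the field values at nodes 0 and 1;
    h10/h11 weight the field derivatives at nodes 0 and 1.
    """
    h00 = 2 * s**3 - 3 * s**2 + 1
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return h00, h10, h01, h11

def interpolate(s, f0, df0, f1, df1):
    """Field value inside the element from nodal values (f0, f1) and slopes (df0, df1)."""
    h00, h10, h01, h11 = hermite_basis(s)
    return h00 * f0 + h10 * df0 + h01 * f1 + h11 * df1
```

At s = 0 the interpolant reproduces f0 exactly with slope df0, and at s = 1 it reproduces f1 with slope df1, which is what guarantees slope continuity when adjacent elements share nodal parameters.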
Cell homogenization methods for pin-by-pin core calculations tested in slab geometry
International Nuclear Information System (INIS)
Yamamoto, Akio; Kitamura, Yasunori; Yamane, Yoshihiro
2004-01-01
In this paper, the performance of spatial homogenization methods for fuel and non-fuel cells is compared in slab geometry in order to facilitate pin-by-pin core calculations. Since spatial homogenization methods were mainly developed for fuel assemblies, no systematic study of their performance for cell-level homogenization has been carried out. The importance of cell-level homogenization is increasing, since pin-by-pin mesh core calculation in actual three-dimensional geometry, a less approximate approach than the current advanced nodal method, is becoming feasible. Four homogenization methods were investigated in this paper: flux-volume weighting, the generalized equivalence theory, the superhomogenization (SPH) method and the nonlinear iteration method. The last, the nonlinear iteration method, was tested as a homogenization method for the first time. The calculations were carried out in simplified colorset assembly configurations of a PWR, simulated by slab geometries, and homogenization performance was evaluated through comparison with reference cell-heterogeneous calculations. The results revealed that the generalized equivalence theory showed the best performance. Though the nonlinear iteration method can significantly reduce homogenization error, its performance was not as good as that of the generalized equivalence theory. Comparison of the results obtained by the generalized equivalence theory and the superhomogenization method yielded an important byproduct: it clarified a deficiency of the current superhomogenization method, which could be remedied by incorporating a 'cell-level discontinuity factor between assemblies'
The error in total error reduction.
Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R
2014-02-01
Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modeling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. Copyright © 2013 Elsevier Inc. All rights reserved.
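The contrast between total error reduction (TER, as in Rescorla-Wagner-style models) and local error reduction (LER) can be made concrete with a compound-cue sketch. The parameter values and two-cue setup here are hypothetical illustrations, not the paper's actual simulations.

```python
import numpy as np

def train(rule="total", n_trials=200, alpha=0.1, lam=1.0):
    """Two cues A and B always presented together, each trial followed by the outcome."""
    w = np.zeros(2)           # associative strengths of cues A and B
    x = np.array([1.0, 1.0])  # both cues present on every trial
    for _ in range(n_trials):
        if rule == "total":
            # TER: one shared error term, the discrepancy between the outcome
            # and the summed prediction of ALL cues present on the trial
            error = lam - w @ x
            w += alpha * error * x
        else:
            # LER: each cue has its own error term, the discrepancy between
            # the outcome and that cue's individual prediction
            w += alpha * (lam - w) * x
    return w
```

Under TER the two cues come to share the outcome (each weight converges to about 0.5, so the summed prediction equals lambda), whereas under LER each cue independently approaches lambda; cue-competition effects such as blocking and overshadowing fall out of the shared error term only.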
Shafarevich, Igor Rostislavovich
1994-01-01
Shafarevich Basic Algebraic Geometry 2 The second edition of Shafarevich's introduction to algebraic geometry is in two volumes. The second volume covers schemes and complex manifolds, generalisations in two different directions of the affine and projective varieties that form the material of the first volume. Two notable additions in this second edition are the section on moduli spaces and representable functors, motivated by a discussion of the Hilbert scheme, and the section on Kähler geometry. The book ends with a historical sketch discussing the origins of algebraic geometry. From the Zentralblatt review of this volume: "... one can only respectfully repeat what has been said about the first part of the book (...): a great textbook, written by one of the leading algebraic geometers and teachers himself, has been reworked and updated. As a result the author's standard textbook on algebraic geometry has become even more important and valuable. Students, teachers, and active researchers using methods of al...
Optical geometry across the horizon
International Nuclear Information System (INIS)
Jonsson, Rickard
2006-01-01
In a recent paper (Jonsson and Westman 2006 Class. Quantum Grav. 23 61), a generalization of optical geometry, assuming a non-shearing reference congruence, is discussed. Here we illustrate that this formalism can be applied to (a finite four-volume of) any spherically symmetric spacetime. In particular we apply the formalism, using a non-static reference congruence, to do optical geometry across the horizon of a static black hole. While the resulting geometry is in principle time dependent, we can choose the reference congruence in such a manner that an embedding of the geometry always looks the same. Relative to the embedded geometry the reference points are then moving. We discuss the motion of photons, inertial forces and gyroscope precession in this framework
Antonio Boldrini; Rosa T. Scaramuzzo; Armando Cuttano
2013-01-01
Introduction: Danger and errors are inherent in human activities. In medical practice, errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience at the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main err...
National Research Council Canada - National Science Library
Byrne, Michael D
2006-01-01
.... This problem has received surprisingly little attention from cognitive psychologists. The research summarized here examines such errors in some detail both empirically and through computational cognitive modeling...
International Nuclear Information System (INIS)
Wahlstroem, B.
1993-01-01
Human errors are a major contributor to the risks of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. To avoid human errors it is necessary to adapt systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but these models are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research on human error and concludes with suggestions for further work. (orig.)
Three-dimensional ray-tracing model for the study of advanced refractive errors in keratoconus.
Schedin, Staffan; Hallberg, Per; Behndig, Anders
2016-01-20
We propose a numerical three-dimensional (3D) ray-tracing model for the analysis of advanced corneal refractive errors. The 3D modeling was based on measured corneal elevation data by means of Scheimpflug photography. A mathematical description of the measured corneal surfaces from a keratoconus (KC) patient was used for the 3D ray tracing, based on Snell's law of refraction. A model of a commercial intraocular lens (IOL) was included in the analysis. By modifying the posterior IOL surface, it was shown that the imaging quality could be significantly improved. The RMS values were reduced by approximately 50% close to the retina, both for on- and off-axis geometries. The 3D ray-tracing model can constitute a basis for simulation of customized IOLs that are able to correct the advanced, irregular refractive errors in KC.
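Snell's law in its 3D vector form, which underlies the ray tracing described here, can be sketched as follows. This is an illustrative fragment only; the paper's full model also involves the measured corneal surfaces and the IOL geometry.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n (vector Snell's law).

    n1, n2 are the refractive indices on the incident and transmitted sides.
    Returns the refracted unit direction, or None on total internal reflection.
    """
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)
    if cos_i < 0:              # normal points along the ray; flip it
        n, cos_i = -n, -cos_i
    eta = n1 / n2
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        return None            # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n
```

For a ray incident at 45 degrees going from air (n1 = 1.0) into a medium with n2 = 1.5, the transverse component of the returned direction satisfies sin(theta_t) = sin(theta_i) * n1 / n2, as Snell's law requires, and the result remains a unit vector.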
Complex analysis and CR geometry
Zampieri, Giuseppe
2008-01-01
Cauchy-Riemann (CR) geometry is the study of manifolds equipped with a system of CR-type equations. Compared to the early days when the purpose of CR geometry was to supply tools for the analysis of the existence and regularity of solutions to the \bar\partial-Neumann problem, it has rapidly acquired a life of its own and has become an important topic in differential geometry and the study of non-linear partial differential equations. A full understanding of modern CR geometry requires knowledge of various topics such as real/complex differential and symplectic geometry, foliation theory, the geometric theory of PDEs, and microlocal analysis. Nowadays, the subject of CR geometry is very rich in results, and the amount of material required to reach competence is daunting to graduate students who wish to learn it. However, the present book does not aim at introducing all the topics of current interest in CR geometry. Instead, an attempt is made to be friendly to the novice by moving, in a fairly relaxed way, f...
The geometry description markup language
International Nuclear Information System (INIS)
Chytracek, R.
2001-01-01
Currently, a lot of effort is being put into designing complex detectors. A number of simulation and reconstruction frameworks and applications have been developed with the aim of making this job easier. A very important role in this activity is played by the geometry description of the detector apparatus layout and its working environment. However, no truly common approach to representing geometry data is available, and such data can be found in various forms, ranging from custom semi-structured text files and source code (C/C++/FORTRAN) to XML and database solutions. XML (Extensible Markup Language) has proven to provide an interesting approach for describing detector geometries, with several different but incompatible XML-based solutions existing. Therefore, interoperability and geometry data exchange among different frameworks is not possible at present. The author introduces a markup language for geometry descriptions. Its aim is to define a common approach for sharing and exchanging geometry description data. Its requirements and design have been driven by experience and user feedback from existing projects which have their geometry description in XML
Moreno, R.; Bazán, A. M.
2017-10-01
The main purpose of this work is to study improvements to the learning method of technical drawing and descriptive geometry by solving exercises with traditional techniques, which are usually worked manually, through automated processes assisted by high-level CAD templates (HLCts). Given that an exercise with traditional procedures can be solved step by step, as detailed in technical drawing and descriptive geometry manuals, CAD applications allow us to do the same and to generalize it later by incorporating references. Traditional teaching methods have become obsolete and have been relegated in current curricula; they can, however, be applied in certain automation processes. The use of geometric references (using variables in script languages) and their incorporation into HLCts allows the automation of drawing processes. Instead of repeatedly creating similar exercises or modifying data in the same exercises, users can employ HLCts to generate future modifications of these exercises. This paper introduces the automation process for generating exercises based on CAD script files, aided by parametric geometry calculation tools. The proposed method allows us to design new exercises without user intervention. The integration of CAD, mathematics and descriptive geometry facilitates their joint learning. Automation in the generation of exercises not only saves time but also increases the quality of the statements and reduces the possibility of human error.
Hyperbolic geometry of Kuramoto oscillator networks
Chen, Bolun; Engelbrecht, Jan R.; Mirollo, Renato
2017-09-01
Kuramoto oscillator networks have the special property that their trajectories are constrained to lie on the (at most) 3D orbits of the Möbius group acting on the state space T N (the N-fold torus). This result has been used to explain the existence of the N-3 constants of motion discovered by Watanabe and Strogatz for Kuramoto oscillator networks. In this work we investigate geometric consequences of this Möbius group action. The dynamics of Kuramoto phase models can be further reduced to 2D reduced group orbits, which have a natural geometry equivalent to the unit disk
Performance analysis of a decoding algorithm for algebraic-geometry codes
DEFF Research Database (Denmark)
Høholdt, Tom; Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund
1999-01-01
The fast decoding algorithm for one point algebraic-geometry codes of Sakata, Elbrond Jensen, and Hoholdt corrects all error patterns of weight less than half the Feng-Rao minimum distance. In this correspondence we analyze the performance of the algorithm for heavier error patterns. It turns out...
Metcalfe, Janet
2017-01-01
Although error avoidance during learning appears to be the rule in American classrooms, laboratory studies suggest that it may be a counterproductive strategy, at least for neurologically typical students. Experimental investigations indicate that errorful learning followed by corrective feedback is beneficial to learning. Interestingly, the…
Fallow, Stray
2009-01-01
Having trouble with geometry? Do Pi, The Pythagorean Theorem, and angle calculations just make your head spin? Relax. With Head First 2D Geometry, you'll master everything from triangles, quads and polygons to the time-saving secrets of similar and congruent angles -- and it'll be quick, painless, and fun. Through entertaining stories and practical examples from the world around you, this book takes you beyond boring problems. You'll actually use what you learn to make real-life decisions, like using angles and parallel lines to crack a mysterious CSI case. Put geometry to work for you, and
Walsh, Edward T
2014-01-01
This introductory text is designed to help undergraduate students develop a solid foundation in geometry. Early chapters progress slowly, cultivating the necessary understanding and self-confidence for the more rapid development that follows. The extensive treatment can be easily adapted to accommodate shorter courses. Starting with the language of mathematics as expressed in the algebra of logic and sets, the text covers geometric sets of points, separation and angles, triangles, parallel lines, similarity, polygons and area, circles, space geometry, and coordinate geometry. Each chapter incl
Differential geometry curves, surfaces, manifolds
Kühnel, Wolfgang
2002-01-01
This carefully written book is an introduction to the beautiful ideas and results of differential geometry. The first half covers the geometry of curves and surfaces, which provide much of the motivation and intuition for the general theory. Special topics that are explored include Frenet frames, ruled surfaces, minimal surfaces and the Gauss-Bonnet theorem. The second part is an introduction to the geometry of general manifolds, with particular emphasis on connections and curvature. The final two chapters are insightful examinations of the special cases of spaces of constant curvature and Einstein manifolds. The text is illustrated with many figures and examples. The prerequisites are undergraduate analysis and linear algebra.
Vectorising the detector geometry to optimize particle transport
Apostolakis, John; Carminati, Federico; Gheata, Andrei; Wenzel, Sandro
2014-01-01
Among the components contributing to particle transport, geometry navigation is an important consumer of CPU cycles. The tasks performed to answer "basic" queries, such as locating a point within a geometry hierarchy or computing accurately the distance to the next boundary, can become very computing intensive for complex detector setups. So far, the existing geometry algorithms employ mainly scalar optimisation strategies (voxelization, caching) to reduce their CPU consumption. In this paper, we would like to take a different approach and investigate how geometry navigation can benefit from the vector instruction set extensions that are one of the primary sources of performance enhancements on current and future hardware. While on paper this form of microparallelism promises increasing performance opportunities, applying this technology to the highly hierarchical and multiply branched geometry code is a difficult challenge. We refer to the current work done to vectorise an important part of the critica...
Applying Intelligent Algorithms to Automate the Identification of Error Factors.
Jin, Haizhe; Qu, Qingxing; Munechika, Masahiko; Sano, Masataka; Kajihara, Chisato; Duffy, Vincent G; Chen, Han
2018-05-03
Medical errors are the manifestation of defects occurring in medical processes. Extracting and identifying these defects as medical error factors is an effective approach to preventing medical errors. However, it is a difficult and time-consuming task that requires an analyst with a professional medical background, so a method is needed to extract medical error factors and reduce the difficulty of extraction. In this research, a systematic methodology to extract and identify error factors in the medical administration process was proposed. The design of the error report, extraction of the error factors, and identification of the error factors were analyzed. Based on 624 medical error cases across four medical institutes in Japan and China, 19 error-related items and their levels were extracted and then related to 12 error factors. The relational model between the error-related items and error factors was established based on a genetic algorithm (GA)-back-propagation neural network (BPNN) model. Compared with BPNN, partial least squares regression and support vector regression, GA-BPNN exhibited a higher overall prediction accuracy and was able to promptly identify the error factors from the error-related items. The combination of "error-related items, their different levels, and the GA-BPNN model" was proposed as an error-factor identification technology, which could automatically identify medical error factors.
Action errors, error management, and learning in organizations.
Frese, Michael; Keith, Nina
2015-01-03
Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.
Hermeticity of three cryogenic calorimeter geometries
International Nuclear Information System (INIS)
Strovink, M.; Wormersley, W.J.; Forden, G.E.
1989-04-01
We calculate the effect of cracks and dead material on resolution in three simplified cryogenic calorimeter geometries, using a crude approximation that neglects transverse shower spreading and considers only a small set of incident angles. For each dead region, we estimate the average unseen energy using a shower parametrization, and relate it to resolution broadening using a simple approximation that agrees with experimental data. Making reasonable and consistent assumptions on cryostat wall thicknesses, we find that the effects of cracks and dead material dominate the expected resolution in the region where separate "barrel" and "end" cryostats meet. This is particularly true for one geometry in which the end calorimeter caps the barrel and also protrudes into the hole within it. We also find that carefully designed auxiliary "crack filler" detectors can substantially reduce the loss of resolution in these areas. 6 figs
Resolution, coverage, and geometry beyond traditional limits
Energy Technology Data Exchange (ETDEWEB)
Ronen, Shuki; Ferber, Ralf
1998-12-31
The presentation relates to the optimization of the image of seismic data and improved resolution and coverage of acquired data. Non traditional processing methods such as inversion to zero offset (IZO) are used. To realize the potential of saving acquisition cost by reducing in-fill and to plan resolution improvement by processing, geometry QC methods such as DMO Dip Coverage Spectrum (DDCS) and Bull's Eyes Analysis are used. The DDCS is a 2-D spectrum whose entries consist of the DMO (Dip Move Out) coverage for a particular reflector specified by its true time dip and reflector normal strike. The Bull's Eyes Analysis relies on real time processing of synthetic data generated with the real geometry. 4 refs., 6 figs.
Advances in discrete differential geometry
2016-01-01
This is one of the first books on a newly emerging field of discrete differential geometry and an excellent way to access this exciting area. It surveys the fascinating connections between discrete models in differential geometry and complex analysis, integrable systems and applications in computer graphics. The authors take a closer look at discrete models in differential geometry and dynamical systems. Their curves are polygonal, surfaces are made from triangles and quadrilaterals, and time is discrete. Nevertheless, the difference between the corresponding smooth curves, surfaces and classical dynamical systems with continuous time can hardly be seen. This is the paradigm of structure-preserving discretizations. Current advances in this field are stimulated to a large extent by its relevance for computer graphics and mathematical physics. This book is written by specialists working together on a common research project. It is about differential geometry and dynamical systems, smooth and discrete theories, ...
Hyperbolic Metamaterials with Complex Geometry
DEFF Research Database (Denmark)
Lavrinenko, Andrei; Andryieuski, Andrei; Zhukovsky, Sergei
2016-01-01
We investigate new geometries of hyperbolic metamaterials such as highly corrugated structures, nanoparticle monolayer assemblies, super-structured or vertically arranged multilayers and nanopillars. All structures retain basic properties of hyperbolic metamaterials, but have functionality improved...
An introduction to differential geometry
Willmore, T J
2012-01-01
This text employs vector methods to explore the classical theory of curves and surfaces. Topics include basic theory of tensor algebra, tensor calculus, calculus of differential forms, and elements of Riemannian geometry. 1959 edition.
Symplectic geometry and Fourier analysis
Wallach, Nolan R
2018-01-01
Suitable for graduate students in mathematics, this monograph covers differential and symplectic geometry, homogeneous symplectic manifolds, Fourier analysis, metaplectic representation, quantization, Kirillov theory. Includes Appendix on Quantum Mechanics by Robert Hermann. 1977 edition.
Gao, J.
2014-12-01
Reducing modeling error is often a major concern of empirical geophysical models. However, modeling errors can be defined in different ways: When the response variable is continuous, the most commonly used metrics are squared (SQ) and absolute (ABS) errors. For most applications, ABS error is the more natural, but SQ error is mathematically more tractable, so is often used as a substitute with little scientific justification. Existing literature has not thoroughly investigated the implications of using SQ error in place of ABS error, especially not geospatially. This study compares the two metrics through the lens of bias-variance decomposition (BVD). BVD breaks down the expected modeling error of each model evaluation point into bias (systematic error), variance (model sensitivity), and noise (observation instability). It offers a way to probe the composition of various error metrics. I analytically derived the BVD of ABS error and compared it with the well-known SQ error BVD, and found that not only do the two metrics measure the characteristics of the probability distributions of modeling errors differently, but the effects of these characteristics on the overall expected error also differ. Most notably, under SQ error all bias, variance, and noise increase expected error, while under ABS error certain parts of the error components reduce expected error. Since manipulating these subtractive terms is a legitimate way to reduce expected modeling error, SQ error can never capture the complete story embedded in ABS error. I then empirically compared the two metrics with a supervised remote sensing model for mapping surface imperviousness. Pair-wise spatially-explicit comparison for each error component showed that SQ error overstates all error components in comparison to ABS error, especially variance-related terms. Hence, substituting ABS error with SQ error makes model performance appear worse than it actually is, and the analyst would more likely accept a...
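The squared-error decomposition discussed above can be checked numerically. A sketch under simple synthetic assumptions (a sine-curve truth, a linear model refit to many resampled noisy training sets; all names and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(x)

x0, sigma_noise, n_train, n_reps = 1.0, 0.3, 30, 2000

# Refit the model to a fresh noisy training set many times and record
# its prediction at one evaluation point x0.
preds = np.empty(n_reps)
for r in range(n_reps):
    x = rng.uniform(0, np.pi, n_train)
    y = true_f(x) + rng.normal(0, sigma_noise, n_train)
    coef = np.polyfit(x, y, 1)            # a deliberately biased (linear) model
    preds[r] = np.polyval(coef, x0)

bias2 = (preds.mean() - true_f(x0)) ** 2  # systematic error
variance = preds.var()                    # model sensitivity
noise = sigma_noise ** 2                  # observation instability

# Under SQ error the three terms are nonnegative and strictly additive:
expected_sq_error = bias2 + variance + noise
```

The abstract's point is that the analogous decomposition of ABS error contains subtractive terms, so no such clean additive split exists there; the two metrics genuinely tell different stories.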
Attitude Determination Error Analysis System (ADEAS) mathematical specifications document
Nicholson, Mark; Markley, F.; Seidewitz, E.
1988-01-01
The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.
Energy Technology Data Exchange (ETDEWEB)
Poiate, Junior, Edgard; Costa, Alvaro M. da; Amaral, Claudio S; Rocha, Renato S [PETROBRAS, Rio de Janeiro, RJ (Brazil). Centro de Pesquisas (CENPES); Guimaraes, Giuseppe B; Souza, Pablo F [Pontificia Univ. Catolica do Rio de Janeiro, RJ (Brazil)
2003-07-01
In January 2000 PETROBRAS faced a leakage of heavy heated MF380 oil from a 16'' pipeline in Guanabara Bay. The thermal structural buckling of the pipeline, interacting with the soil, induced the rupture of the pipeline wall, causing the leakage of oil. In order to overcome this undesired phenomenon, PETROBRAS studied several alternatives of a new pipeline. As a result of these studies a pipeline with 'ZIGZAG' geometry was adopted, named PE-3. Due to the very few applications of this kind of concept by the oil industry, and in soil conditions different from the existing ones in Guanabara Bay, a very sophisticated procedure was developed, including the simulation of the thermal mechanical interactions between the soil and the pipeline structure. Computer modeling was carried out using the finite element method, considering the soil and pipeline non-linear material behavior and finite displacements. In order to validate the numerical modelling, an experimental test was built on a reduced-scale model with physical similarity to a pipeline with 'ZIGZAG' geometry (PE-3). The numerical and experimental results were compared and show good agreement. (author)
Topology and geometry for physicists
Nash, Charles
1983-01-01
Differential geometry and topology are essential tools for many theoretical physicists, particularly in the study of condensed matter physics, gravity, and particle physics. Written by physicists for physics students, this text introduces geometrical and topological methods in theoretical physics and applied mathematics. It assumes no detailed background in topology or geometry, and it emphasizes physical motivations, enabling students to apply the techniques to their physics formulas and research. "Thoroughly recommended" by The Physics Bulletin, this volume's physics applications range fr...
Uncorrected refractive errors.
Naidoo, Kovin S; Jaggernath, Jyoti
2012-01-01
Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error; of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.
Spectral dimension of quantum geometries
International Nuclear Information System (INIS)
Calcagni, Gianluca; Oriti, Daniele; Thürigen, Johannes
2014-01-01
The spectral dimension is an indicator of geometry and topology of spacetime and a tool to compare the description of quantum geometry in various approaches to quantum gravity. This is possible because it can be defined not only on smooth geometries but also on discrete (e.g., simplicial) ones. In this paper, we consider the spectral dimension of quantum states of spatial geometry defined on combinatorial complexes endowed with additional algebraic data: the kinematical quantum states of loop quantum gravity (LQG). Preliminarily, the effects of topology and discreteness of classical discrete geometries are studied in a systematic manner. We look for states reproducing the spectral dimension of a classical space in the appropriate regime. We also test the hypothesis that in LQG, as in other approaches, there is a scale dependence of the spectral dimension, which runs from the topological dimension at large scales to a smaller one at short distances. While our results do not give any strong support to this hypothesis, we can however pinpoint when the topological dimension is reproduced by LQG quantum states. Overall, by exploring the interplay of combinatorial, topological and geometrical effects, and by considering various kinds of quantum states such as coherent states and their superpositions, we find that the spectral dimension of discrete quantum geometries is more sensitive to the underlying combinatorial structures than to the details of the additional data associated with them. (paper)
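The spectral dimension used here is read off from the return probability P(σ) of a diffusion process, d_s = -2 d ln P(σ)/d ln σ. A small sketch estimating it for a flat 2D lattice from exact simple-random-walk return probabilities, where it should approach the topological dimension 2 (the lattice stand-in for a smooth geometry is an illustrative assumption):

```python
from math import comb, log

def return_prob_2d(n):
    """Exact probability that a simple random walk on Z^2 is back at the
    origin after 2n steps: the square of the 1D return probability."""
    p1d = comb(2 * n, n) / 4 ** n
    return p1d ** 2

# Spectral dimension d_s = -2 * d ln P / d ln t, by finite difference in t = 2n.
n1, n2 = 200, 400
d_s = -2 * (log(return_prob_2d(n2)) - log(return_prob_2d(n1))) \
         / (log(2 * n2) - log(2 * n1))
print(round(d_s, 2))   # prints 2.0, the topological dimension of the lattice
```

On a discrete quantum geometry the same diffusion probe is applied to the combinatorial complex, and the question becomes in which regime, if any, this estimator reproduces the classical value.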
Evaluation of a cone beam computed tomography geometry for image guided small animal irradiation
International Nuclear Information System (INIS)
Yang, Yidong; Armour, Michael; Wang, Ken Kang-Hsin; Gandhi, Nishant; Wong, John; Iordachita, Iulian; Siewerdsen, Jeffrey
2015-01-01
The conventional imaging geometry for small animal cone beam computed tomography (CBCT) is that a detector panel rotates around the head-to-tail axis of an imaged animal (‘tubular’ geometry). Another unusual but possible imaging geometry is that the detector panel rotates around the anterior-to-posterior axis of the animal (‘pancake’ geometry). The small animal radiation research platform developed at Johns Hopkins University employs the pancake geometry where a prone-positioned animal is rotated horizontally between an x-ray source and detector panel. This study aims to assess the CBCT image quality in the pancake geometry and investigate potential methods for improvement. We compared CBCT images acquired in the pancake geometry with those acquired in the tubular geometry when the phantom/animal was placed upright simulating the conventional CBCT geometry. Results showed signal-to-noise and contrast-to-noise ratios in the pancake geometry were reduced in comparison to the tubular geometry at the same dose level, but the overall spatial resolution within the transverse plane of the imaged cylinder/animal was better in the pancake geometry. A modest exposure increase, up to twofold, in the pancake geometry can improve image quality to a level close to the tubular geometry. Image quality can also be improved by inclining the animal, which reduces streak artifacts caused by bony structures. The major factor resulting in the inferior image quality in the pancake geometry is the elevated beam attenuation along the long axis of the phantom/animal and consequently increased scatter-to-primary ratio in that orientation. Notwithstanding, the image quality in the pancake-geometry CBCT is adequate to support image guided animal positioning, while providing unique advantages of non-coplanar and multiple mice irradiation. This study also provides useful knowledge about the image quality in the two very different imaging geometries, i.e. pancake and tubular geometry.
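A minimal sketch of the image-quality metrics compared in the study, under the usual textbook definitions (SNR as mean over standard deviation in a uniform region, CNR as mean difference over pooled noise; the voxel values below are synthetic, not from the study):

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a uniform region of interest (ROI)."""
    return roi.mean() / roi.std()

def cnr(roi_a, roi_b):
    """Contrast-to-noise ratio between two ROIs, using pooled noise."""
    noise = np.sqrt((roi_a.var() + roi_b.var()) / 2)
    return abs(roi_a.mean() - roi_b.mean()) / noise

rng = np.random.default_rng(1)
soft_tissue = rng.normal(100, 5, 10_000)        # synthetic voxel values
bone = rng.normal(400, 5, 10_000)

# Quantum noise scales roughly as 1/sqrt(exposure), so doubling exposure
# raises SNR by about sqrt(2) -- the mechanism behind the paper's finding
# that a twofold exposure increase narrows the gap between geometries.
double_dose = rng.normal(100, 5 / np.sqrt(2), 10_000)
print(snr(double_dose) > snr(soft_tissue))      # True
```

The same two metrics, measured in matched regions of pancake- and tubular-geometry scans at equal dose, quantify the trade-off the abstract describes.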
Preventing Errors in Laterality
Landau, Elliot; Hirschorn, David; Koutras, Iakovos; Malek, Alexander; Demissie, Seleshie
2014-01-01
An error in laterality is the reporting of a finding that is present on the right side as on the left or vice versa. While different medical and surgical specialties have implemented protocols to help prevent such errors, very few studies have been published that describe these errors in radiology reports and ways to prevent them. We devised a system that allows the radiologist to view reports in a separate window, displayed in a simple font and with all terms of laterality highlighted in sep...
International Nuclear Information System (INIS)
Reason, J.
1988-01-01
This paper is in three parts. The first part summarizes the human failures responsible for the Chernobyl disaster and argues that, in considering the human contribution to power plant emergencies, it is necessary to distinguish between errors and violations, and between active and latent failures. The second part presents empirical evidence, drawn from driver behavior, which suggests that errors and violations have different psychological origins. The concluding part outlines a resident pathogen view of accident causation, and seeks to identify the various system pathways along which errors and violations may be propagated
LENUS (Irish Health Repository)
Holmberg, Ola
2002-06-01
When preparing radiation treatment, the prescribed dose and irradiation geometry must be translated into physical machine parameters. An error in the calculations or machine settings can negatively affect the intended treatment outcome. Analysing incidents originating in the treatment preparation chain makes it possible to find weak links and prevent treatment errors. The aim of this work is to study the effectiveness of a multilayered error prevention system by analysing both near misses and actual treatment errors.
Help prevent hospital errors (MedlinePlus patient instructions: //medlineplus.gov/ency/patientinstructions/000618.htm)
2012-03-01
This project examined the prevalence of pedal application errors and the driver, vehicle, roadway and/or environmental characteristics associated with pedal misapplication crashes based on a literature review, analysis of news media reports, a panel ...
International Nuclear Information System (INIS)
Jeach, J.L.
1976-01-01
When rounding error is large relative to weighing error, it cannot be ignored when estimating scale precision and bias from calibration data. Further, if the data grouping is coarse, rounding error is correlated with weighing error and may also have a mean quite different from zero. These facts are taken into account in a moment estimation method. A copy of the program listing for the MERDA program that provides moment estimates is available from the author. Experience suggests that if the data fall into four or more cells or groups, it is not necessary to apply the moment estimation method. Rather, the estimate given by equation (3) is valid in this instance. 5 tables
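The effect described above is easy to reproduce: when the readout grid is coarse relative to the weighing noise, the rounding residual is strongly correlated with the weighing error and its mean can be far from zero. A sketch with assumed illustrative numbers (not the MERDA program's data):

```python
import numpy as np

rng = np.random.default_rng(42)

true_weight = 10.3     # a fixed item re-weighed many times
sigma = 0.02           # weighing-error standard deviation
grid = 0.25            # coarse rounding grid of the scale readout

raw = true_weight + rng.normal(0, sigma, 100_000)   # weighing error only
recorded = np.round(raw / grid) * grid              # rounded scale reading
rounding_err = recorded - raw

# With coarse grouping, almost every reading rounds to the same grid point,
# so the rounding error is nearly minus the weighing error (correlation
# close to -1) and its mean sits near -0.05, not zero.
corr = np.corrcoef(rounding_err, raw - true_weight)[0, 1]
print(round(corr, 1), round(rounding_err.mean(), 2))   # -1.0 -0.05
```

With a fine grid (many cells spanned by the noise) the correlation and the mean offset both wash out, which matches the abstract's rule of thumb that four or more occupied cells make the moment correction unnecessary.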
Spotting software errors sooner
International Nuclear Information System (INIS)
Munro, D.
1989-01-01
Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)
International Nuclear Information System (INIS)
Kop, L.
2001-01-01
On request, the Dutch Association for Energy, Environment and Water (VEMW) checks the energy bills for its customers. It appeared that in the year 2000 many small, but also some big, errors were discovered in the bills of 42 businesses.
Medical Errors Reduction Initiative
National Research Council Canada - National Science Library
Mutter, Michael L
2005-01-01
The Valley Hospital of Ridgewood, New Jersey, is proposing to extend a limited but highly successful specimen management and medication administration medical errors reduction initiative on a hospital-wide basis...
Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris
2014-07-01
Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to...
DEFF Research Database (Denmark)
Rasmussen, Jens
1983-01-01
An important aspect of the optimal design of computer-based operator support systems is the sensitivity of such systems to operator errors. The author discusses how a system might allow for human variability with the use of reversibility and observability.
Human errors related to maintenance and modifications
International Nuclear Information System (INIS)
Laakso, K.; Pyy, P.; Reiman, L.
1998-01-01
...about weaknesses in audits made by the operating organisation and in tests relating to plant operation. The number of plant-specific maintenance records used as input material was high and the findings were discussed thoroughly with the plant maintenance personnel. The results indicated that instrumentation is more prone to human error than the rest of maintenance. Most errors stem from refuelling outage periods and about a half of them were identified during the same outage they were committed. Plant modifications are a significant source of common cause failures. The number of dependent errors could be reduced by improved co-ordination and auditing, post-installation checking, training and start-up testing programmes. (orig.)
2008-01-01
One way in which physicians can respond to a medical error is to apologize. Apologies—statements that acknowledge an error and its consequences, take responsibility, and communicate regret for having caused harm—can decrease blame, decrease anger, increase trust, and improve relationships. Importantly, apologies also have the potential to decrease the risk of a medical malpractice lawsuit and can help settle claims by patients. Patients indicate they want and expect explanations and apologies after medical errors and physicians indicate they want to apologize. However, in practice, physicians tend to provide minimal information to patients after medical errors and infrequently offer complete apologies. Although fears about potential litigation are the most commonly cited barrier to apologizing after medical error, the link between litigation risk and the practice of disclosure and apology is tenuous. Other barriers might include the culture of medicine and the inherent psychological difficulties in facing one’s mistakes and apologizing for them. Despite these barriers, incorporating apology into conversations between physicians and patients can address the needs of both parties and can play a role in the effective resolution of disputes related to medical error. PMID:18972177
Thermodynamics of Error Correction
Directory of Open Access Journals (Sweden)
Pablo Sartori
2015-12-01
Full Text Available Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.
Medication Errors: New EU Good Practice Guide on Risk Minimisation and Error Prevention.
Goedecke, Thomas; Ord, Kathryn; Newbould, Victoria; Brosch, Sabine; Arlett, Peter
2016-06-01
A medication error is an unintended failure in the drug treatment process that leads to, or has the potential to lead to, harm to the patient. Reducing the risk of medication errors is a shared responsibility between patients, healthcare professionals, regulators and the pharmaceutical industry at all levels of healthcare delivery. In 2015, the EU regulatory network released a two-part good practice guide on medication errors to support both the pharmaceutical industry and regulators in the implementation of the changes introduced with the EU pharmacovigilance legislation. These changes included a modification of the 'adverse reaction' definition to include events associated with medication errors, and the requirement for national competent authorities responsible for pharmacovigilance in EU Member States to collaborate and exchange information on medication errors resulting in harm with national patient safety organisations. To facilitate reporting and learning from medication errors, a clear distinction has been made in the guidance between medication errors resulting in adverse reactions, medication errors without harm, intercepted medication errors and potential errors. This distinction is supported by an enhanced MedDRA® terminology that allows for coding all stages of the medication use process where the error occurred in addition to any clinical consequences. To better understand the causes and contributing factors, individual case safety reports involving an error should be followed-up with the primary reporter to gather information relevant for the conduct of root cause analysis where this may be appropriate. Such reports should also be summarised in periodic safety update reports and addressed in risk management plans. Any risk minimisation and prevention strategy for medication errors should consider all stages of a medicinal product's life-cycle, particularly the main sources and types of medication errors during product development. This article...
Heuristics and Cognitive Error in Medical Imaging.
Itri, Jason N; Patel, Sohil H
2018-05-01
The field of cognitive science has provided important insights into mental processes underlying the interpretation of imaging examinations. Despite these insights, diagnostic error remains a major obstacle in the goal to improve quality in radiology. In this article, we describe several types of cognitive bias that lead to diagnostic errors in imaging and discuss approaches to mitigate cognitive biases and diagnostic error. Radiologists rely on heuristic principles to reduce complex tasks of assessing probabilities and predicting values into simpler judgmental operations. These mental shortcuts allow rapid problem solving based on assumptions and past experiences. Heuristics used in the interpretation of imaging studies are generally helpful but can sometimes result in cognitive biases that lead to significant errors. An understanding of the causes of cognitive biases can lead to the development of educational content and systematic improvements that mitigate errors and improve the quality of care provided by radiologists.
The District Nursing Clinical Error Reduction Programme.
McGraw, Caroline; Topping, Claire
2011-01-01
The District Nursing Clinical Error Reduction (DANCER) Programme was initiated in NHS Islington following an increase in the number of reported medication errors. The objectives were to reduce the actual degree of harm and the potential risk of harm associated with medication errors and to maintain the existing positive reporting culture, while robustly addressing performance issues. One hundred medication errors reported in 2007/08 were analysed using a framework that specifies the factors that predispose to adverse medication events in domiciliary care. Various contributory factors were identified and interventions were subsequently developed to address poor drug calculation and medication problem-solving skills and incorrectly transcribed medication administration record charts. Follow-up data were obtained at 12 months and two years. The evaluation has shown that although medication errors do still occur, the programme has resulted in a marked shift towards a reduction in the associated actual degree of harm and the potential risk of harm.
Schofield, Jonathon S; Evans, Katherine R; Hebert, Jacqueline S; Marasco, Paul D; Carey, Jason P
2016-03-21
Force Sensitive Resistors (FSRs) are commercially available thin film polymer sensors commonly employed in a multitude of biomechanical measurement environments. Reasons for such widespread usage lie in the versatility, small profile, and low cost of these sensors. Yet FSRs have limitations. It is commonly accepted that temperature, curvature and biological tissue compliance may impact sensor conductance and resulting force readings. The effect of these variables and the degree to which they interact has yet to be comprehensively investigated and quantified. This work systematically assesses varying levels of temperature, sensor curvature and surface compliance using a full factorial design-of-experiments approach. Three models of Interlink FSRs were evaluated. Calibration equations under 12 unique combinations of temperature, curvature and compliance were determined for each sensor. Root mean squared error, mean absolute error, and maximum error were quantified as measures of the impact these thermo/mechanical factors have on sensor performance. It was found that all three variables have the potential to affect FSR calibration curves. The FSR model and corresponding sensor geometry are sensitive to these three mechanical factors at varying levels. Experimental results suggest that reducing sensor error requires calibration of each sensor in an environment as close to its intended use as possible and if multiple FSRs are used in a system, they must be calibrated independently. Copyright © 2016 Elsevier Ltd. All rights reserved.
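The three error measures named in this abstract (root mean squared error, mean absolute error, maximum error) can be sketched for a fitted calibration curve; the force values below are illustrative only, not data from the study:

```python
import numpy as np

# Hypothetical calibration check: applied reference loads (N) vs. forces
# predicted by a fitted FSR calibration equation (values are made up).
applied_force = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
predicted_force = np.array([1.1, 1.9, 5.4, 9.5, 21.2])

residuals = predicted_force - applied_force
rmse = np.sqrt(np.mean(residuals ** 2))   # root mean squared error
mae = np.mean(np.abs(residuals))          # mean absolute error
max_err = np.max(np.abs(residuals))       # maximum error

print(f"RMSE: {rmse:.3f} N, MAE: {mae:.3f} N, max error: {max_err:.3f} N")
```

In line with the abstract's conclusion, such metrics would be computed per sensor, under the temperature, curvature and compliance conditions of its intended use.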
Barnewitz, Holger; Fritz, Willy; Thiele, Frank
2013-01-01
This volume reports results from the German research initiative MUNA (Management and Minimization of Errors and Uncertainties in Numerical Aerodynamics), which combined development activities of the German Aerospace Center (DLR), German universities and German aircraft industry. The main objective of this five year project was the development of methods and procedures aiming at reducing various types of uncertainties that are typical of numerical flow simulations. The activities were focused on methods for grid manipulation, techniques for increasing the simulation accuracy, sensors for turbulence modelling, methods for handling uncertainties of the geometry and grid deformation as well as stochastic methods for quantifying aleatoric uncertainties.
Variable geometry Darrieus wind machine
Pytlinski, J. T.; Serrano, D.
1983-08-01
A variable geometry Darrieus wind machine is proposed. The lower attachment of the blades to the rotor can move freely up and down the axle, allowing the blades to change shape during rotation. Experimental data for a 17 m diameter Darrieus rotor and a theoretical model for multiple streamtube performance prediction were used to develop a computer simulation program for studying parameters that affect the machine's performance. This new variable geometry concept is described and interrelated with multiple streamtube theory through aerodynamic parameters. The computer simulation study shows that governor behavior of a Darrieus turbine cannot be attained by a standard turbine operating within normally occurring rotational velocity limits. A second-generation variable geometry Darrieus wind turbine which uses a telescopic blade is proposed as a potential improvement on the studied concept.
Flux compactifications and generalized geometries
International Nuclear Information System (INIS)
Grana, Mariana
2006-01-01
Following the lectures given at CERN Winter School 2006, we present a pedagogical overview of flux compactifications and generalized geometries, concentrating on closed string fluxes in type II theories. We start by reviewing the supersymmetric flux configurations with maximally symmetric four-dimensional spaces. We then discuss the no-go theorems (and their evasion) for compactifications with fluxes. We analyse the resulting four-dimensional effective theories for Calabi-Yau and Calabi-Yau orientifold compactifications, concentrating on the flux-induced superpotentials. We discuss the generic mechanism of moduli stabilization and illustrate with two examples: the conifold in IIB and a T^6/(Z_3 x Z_3) torus in IIA. We finish by studying the effective action and flux vacua for generalized geometries in the context of generalized complex geometry.
Flux compactifications and generalized geometries
Energy Technology Data Exchange (ETDEWEB)
Grana, Mariana [Service de Physique Theorique, CEA/Saclay, 91191 Gif-sur-Yvette Cedex (France)
2006-11-07
Following the lectures given at CERN Winter School 2006, we present a pedagogical overview of flux compactifications and generalized geometries, concentrating on closed string fluxes in type II theories. We start by reviewing the supersymmetric flux configurations with maximally symmetric four-dimensional spaces. We then discuss the no-go theorems (and their evasion) for compactifications with fluxes. We analyse the resulting four-dimensional effective theories for Calabi-Yau and Calabi-Yau orientifold compactifications, concentrating on the flux-induced superpotentials. We discuss the generic mechanism of moduli stabilization and illustrate with two examples: the conifold in IIB and a T^6/(Z_3 x Z_3) torus in IIA. We finish by studying the effective action and flux vacua for generalized geometries in the context of generalized complex geometry.
Euclidean geometry and its subgeometries
Specht, Edward John; Calkins, Keith G; Rhoads, Donald H
2015-01-01
In this monograph, the authors present a modern development of Euclidean geometry from independent axioms, using up-to-date language and providing detailed proofs. The axioms for incidence, betweenness, and plane separation are close to those of Hilbert. This is the only axiomatic treatment of Euclidean geometry that uses axioms not involving metric notions and that explores congruence and isometries by means of reflection mappings. The authors present thirteen axioms in sequence, proving as many theorems as possible at each stage and, in the process, building up subgeometries, most notably the Pasch and neutral geometries. Standard topics such as the congruence theorems for triangles, embedding the real numbers in a line, and coordinatization of the plane are included, as well as theorems of Pythagoras, Desargues, Pappus, Menelaus, and Ceva. The final chapter covers consistency and independence of axioms, as well as independence of definition properties. There are over 300 exercises; solutions to many of the...
Guide to Computational Geometry Processing
DEFF Research Database (Denmark)
Bærentzen, Jakob Andreas; Gravesen, Jens; Anton, François
be processed before it is useful. This Guide to Computational Geometry Processing reviews the algorithms for processing geometric data, with a practical focus on important techniques not covered by traditional courses on computer vision and computer graphics. This is balanced with an introduction to the theoretical and mathematical underpinnings of each technique, enabling the reader to not only implement a given method, but also to understand the ideas behind it, its limitations and its advantages. Topics and features: Presents an overview of the underlying mathematical theory, covering vector spaces, metric spaces, affine spaces, differential geometry, and finite difference methods for derivatives and differential equations Reviews geometry representations, including polygonal meshes, splines, and subdivision surfaces Examines techniques for computing curvature from polygonal meshes Describes
Electrodynamics and Spacetime Geometry: Foundations
Cabral, Francisco; Lobo, Francisco S. N.
2017-02-01
We explore the intimate connection between spacetime geometry and electrodynamics. This link is already implicit in the constitutive relations between the field strengths and excitations, which are an essential part of the axiomatic structure of electromagnetism, clearly formulated via integration theory and differential forms. We review the foundations of classical electromagnetism based on charge and magnetic flux conservation, the Lorentz force and the constitutive relations. These relations introduce the conformal part of the metric and allow the study of electrodynamics for specific spacetime geometries. At the foundational level, we discuss the possibility of generalizing the vacuum constitutive relations, by relaxing the fixed conditions of homogeneity and isotropy, and by assuming that the symmetry properties of the electro-vacuum follow the spacetime isometries. The implications of this extension are briefly discussed in the context of the intimate connection between electromagnetism and the geometry (and causal structure) of spacetime.
Dayside merging and cusp geometry
International Nuclear Information System (INIS)
Crooker, N.U.
1979-01-01
Geometrical considerations are presented to show that dayside magnetic merging, when constrained to act only where the fields are antiparallel, results in lines of merging that converge at the polar cusps. An important consequence of this geometry is that no accelerated flows are predicted across the dayside magnetopause. Acceleration owing to merging acts in opposition to the magnetosheath flow at the merging point and produces the variably directed, slower-than-magnetosheath flows observed in the entry layer. Another consequence of the merging geometry is that much of the time closed field lines constitute the subsolar region of the magnetopause. The manner in which the polar cap convection patterns predicted by the proposed geometry change as the interplanetary field is rotated through 360° provides a unifying description of how the observed single circular vortex and the crescent-shaped double vortex patterns mutually evolve under the influence of a single operating principle.
DOGBONE GEOMETRY FOR RECIRCULATING ACCELERATORS
International Nuclear Information System (INIS)
BERG, J.S.; JOHNSTONE, C.; SUMMERS, D.
2001-01-01
Most scenarios for accelerating muons require recirculating acceleration. A racetrack shape for the accelerator requires particles with lower energy in early passes to traverse almost the same length of arc as particles with the highest energy. This extra arc length may lead to excess decays and excess cost. Changing the geometry to a dogbone shape, where there is a single linac and the beam turns completely around at the end of the linac, returning to the same end of the linac from which it exited, addresses this problem. In this design, the arc lengths can be proportional to the particle's momentum. This paper proposes an approximate cost model for a recirculating accelerator, attempts to make cost-optimized designs for both racetrack and dogbone geometries, and demonstrates that the dogbone geometry does appear to be more cost-effective.
Study of Errors among Nursing Students
Directory of Open Access Journals (Sweden)
Ella Koren
2007-09-01
The study of errors in the health system today is a topic of considerable interest aimed at reducing errors through analysis of the phenomenon and the conclusions reached. Errors that occur frequently among health professionals have also been observed among nursing students. True, in most cases they are actually “near errors,” but these could be a future indicator of therapeutic reality and the effect of nurses' work environment on their personal performance. There are two different approaches to such errors: (a) The EPP (error-prone person) approach lays full responsibility at the door of the individual involved in the error, whether a student, nurse, doctor, or pharmacist. According to this approach, handling consists purely in identifying and penalizing the guilty party. (b) The EPE (error-prone environment) approach emphasizes the environment as a primary contributory factor to errors. The environment as an abstract concept includes components and processes of interpersonal communications, work relations, human engineering, workload, pressures, technical apparatus, and new technologies. The objective of the present study was to examine the role played by factors in and components of personal performance as compared to elements and features of the environment. The study was based on both of the aforementioned approaches, which, when combined, enable a comprehensive understanding of the phenomenon of errors among the student population as well as a comparison of factors contributing to human error and to error deriving from the environment. The theoretical basis of the study was a model that combined both approaches: one focusing on the individual and his or her personal performance and the other focusing on the work environment. The findings emphasize the work environment of health professionals as an EPE. However, errors could have been avoided by means of strict adherence to practical procedures. The authors examined error events in the
Information theory, spectral geometry, and quantum gravity.
Kempf, Achim; Martin, Robert
2008-01-18
We show that there exists a deep link between the two disciplines of information theory and spectral geometry. This allows us to obtain new results on a well-known quantum gravity motivated natural ultraviolet cutoff which describes an upper bound on the spatial density of information. Concretely, we show that, together with an infrared cutoff, this natural ultraviolet cutoff beautifully reduces the path integral of quantum field theory on curved space to a finite number of ordinary integrations. We then show, in particular, that the subsequent removal of the infrared cutoff is safe.
Non-holonomic dynamics and Poisson geometry
International Nuclear Information System (INIS)
Borisov, A V; Mamaev, I S; Tsiganov, A V
2014-01-01
This is a survey of basic facts presently known about non-linear Poisson structures in the analysis of integrable systems in non-holonomic mechanics. It is shown that by using the theory of Poisson deformations it is possible to reduce various non-holonomic systems to dynamical systems on well-understood phase spaces equipped with linear Lie-Poisson brackets. As a result, not only can different non-holonomic systems be compared, but also fairly advanced methods of Poisson geometry and topology can be used for investigating them. Bibliography: 95 titles
Geometric Transformations in Engineering Geometry
Directory of Open Access Journals (Sweden)
I. F. Borovikov
2015-01-01
Recently, in view of current trends and world experience in training engineers and research and faculty staff, there has been a need to transform traditional courses of descriptive geometry into a course of engineering geometry in which geometrical transformations become the main section. On the basis of critical analysis, the paper gives suggestions to improve the presentation of this section both in the classroom and in academic literature, to extend the application scope of geometrical transformations to solving position and metric tasks and to the simulation of surfaces, and to design complex engineering configurations which meet a number of pre-specified conditions. The article offers a number of considerable amendments to the terms and definitions used in existing courses of descriptive geometry. It draws conclusions and makes proposals on the feasibility of coordinating the teaching of movement transformations in the courses of analytical and descriptive geometry. This will provide interdisciplinary team teaching and allow students to be convinced that a combination of analytical and graphic ways to solve geometric tasks is useful and reasonable. The traditional sections of the learning courses need to be supplemented with a theory of projective and birational transformations. In terms of application simplicity and convenience, it is enough to consider central transformations when solving applied tasks. These transformations contain a pencil of subinvariant (low-invariant) straight lines on which the invariant curve induces non-involutive and involutive projectivities. The expediency of applying nonlinear transformations is shown by a specific example of geometric modelling of the interfacing surface "spar-blade". Implementation of these suggestions will contribute to a real transformation of the traditional course of descriptive geometry into engineering geometry.
Directory of Open Access Journals (Sweden)
MA. Lendita Kryeziu
2015-06-01
“Errare humanum est”, a well-known and widespread Latin proverb which states that to err is human, and that people make mistakes all the time. However, what counts is that people must learn from mistakes. On these grounds Steve Jobs stated: “Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations.” Similarly, in learning a new language, learners make mistakes, thus it is important to accept them, learn from them, discover the reason why they are made, improve and move on. The significance of studying errors is described by Corder as follows: “There have always been two justifications proposed for the study of learners' errors: the pedagogical justification, namely that a good understanding of the nature of error is necessary before a systematic means of eradicating them could be found, and the theoretical justification, which claims that a study of learners' errors is part of the systematic study of the learners' language which is itself necessary to an understanding of the process of second language acquisition” (Corder, 1982: 1). Thus the aim of this paper is to analyse errors in the process of second language acquisition and the ways in which teachers can benefit from mistakes to help students improve themselves while giving the proper feedback.
Compact disk error measurements
Howe, D.; Harriman, K.; Tehranchi, B.
1993-01-01
The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross-Interleaved Reed-Solomon Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.
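The error-burst and good-data-gap statistics described above can be illustrated in software; this is a minimal sketch assuming a per-byte stream of error flags (the project's actual measurement system is hardware-based):

```python
from itertools import groupby

def burst_and_gap_stats(error_flags):
    """Collect run lengths of consecutive error bytes (bursts) and
    consecutive good bytes (gaps) from a per-byte error-flag stream.
    A simplified software illustration of the burst/gap measurement idea."""
    bursts, gaps = [], []
    for flag, run in groupby(error_flags):
        (bursts if flag else gaps).append(sum(1 for _ in run))
    return bursts, gaps

# Toy data: 1 = byte in error, 0 = good byte
flags = [0, 0, 1, 1, 1, 0, 1, 0, 0, 0, 0]
bursts, gaps = burst_and_gap_stats(flags)
print("error bursts:", bursts)    # run lengths of consecutive error bytes
print("good-data gaps:", gaps)    # run lengths of consecutive good bytes
```

Histograms of such run lengths are the kind of statistics a burst-error model for the CIRC decoder would be fitted to.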
KEMAJUAN BELAJAR SISWA PADA GEOMETRI TRANSFORMASI MENGGUNAKAN AKTIVITAS REFLEKSI GEOMETRI
Directory of Open Access Journals (Sweden)
Irkham Ulil Albab
2014-10-01
Abstrak: Penelitian ini bertujuan untuk mendeskripsikan kemajuan belajar siswa pada materi geometri transformasi yang didukung dengan serangkaian aktivitas belajar berdasarkan Pendidikan Matematika Realistik Indonesia. Penelitian didesain melalui tiga tahap, yaitu tahapan perancangan desain awal, pengujian desain melalui pembelajaran awal dan pembelajaran eksperimental, dan tahap analisis retrospektif. Dalam penelitian ini, Hypothetical Learning Trajectory (HLT) berperan penting sebagai desain pembelajaran sekaligus instrumen penelitian. HLT diujikan terhadap 26 siswa kelas VII. Data dikumpulkan dengan teknik wawancara, pengamatan, dan catatan lapangan. Hasil penelitian menunjukkan bahwa desain pembelajaran ini mampu menstimulasi siswa untuk memberikan karakteristik refleksi dan transformasi geometri lainnya secara informal, mengklasifikasikannya dalam transformasi isometri pada level kedua, dan menemukan garis bantuan refleksi pada level yang lebih formal. Selain itu, garis bantuan refleksi digunakan oleh siswa untuk menggambar bayangan refleksi dan pola pencerminan serta memahami bentuk rotasi dan translasi sebagai kombinasi refleksi adalah level tertinggi. Keywords: transformasi geometri, kombinasi refleksi, rotasi, translasi, design research, HLT STUDENTS’ LEARNING PROGRESS ON TRANSFORMATION GEOMETRY USING THE GEOMETRY REFLECTION ACTIVITIES Abstract: This study was aimed at describing the students’ learning progress on transformation geometry supported by a set of learning activities based on Indonesian Realistic Mathematics Education. The study was designed in three stages, that is, the preliminary design stage, the design testing through initial instruction and experiment, and the retrospective analysis stage. In this study, the Hypothetical Learning Trajectory (HLT) played an important role as an instructional design and a research instrument. The HLT was tested on 26 seventh-grade students. The data were collected through interviews
Graphical debugging of combinational geometry
International Nuclear Information System (INIS)
Burns, T.J.; Smith, M.S.
1992-01-01
A graphical debugger for combinatorial geometry being developed at Oak Ridge National Laboratory is described. The prototype debugger consists of two parts: a FORTRAN-based "view" generator and a Microsoft Windows application for displaying the geometry. Options and features of both modules are discussed. Examples illustrating the various options available are presented. The potential for utilizing the images produced using the debugger as a visualization tool for the output of the radiation transport codes is discussed, as is the future direction of the development.
Lectures on Algebraic Geometry I
Harder, Gunter
2012-01-01
This book and the following second volume are an introduction to modern algebraic geometry. In the first volume the methods of homological algebra, theory of sheaves, and sheaf cohomology are developed. These methods are indispensable for modern algebraic geometry, but they are also fundamental for other branches of mathematics and of great interest in their own right. In the last chapter of volume I these concepts are applied to the theory of compact Riemann surfaces. In this chapter the author makes clear how influential the ideas of Abel, Riemann and Jacobi were and that many of the modern metho
Combinatorial geometry in the plane
Hadwiger, Hugo; Klee, Victor
2014-01-01
Geared toward advanced undergraduates familiar with analysis and college geometry, this concise book discusses theorems on topics restricted to the plane such as convexity, coverings, and graphs. In addition to helping students cultivate rigorous thought, the text encourages the development of mathematical intuition and clarifies the nature of mathematical research. The two-part treatment begins with specific topics including integral distances, covering problems, point set geometry and convexity, simple paradoxes involving point sets, and pure combinatorics, among other subjects. The second pa
Modern differential geometry for physicists
Isham, C J
1989-01-01
These notes are the content of an introductory course on modern, coordinate-free differential geometry which is taken by the first-year theoretical physics PhD students, or by students attending the one-year MSc course "Fundamental Fields and Forces" at Imperial College. The book is concerned entirely with mathematics proper, although the emphasis and detailed topics have been chosen with an eye to the way in which differential geometry is applied these days to modern theoretical physics. This includes not only the traditional area of general relativity but also the theory of Yang-Mills fields
Comparison theorems in Riemannian geometry
Cheeger, Jeff
2008-01-01
The central theme of this book is the interaction between the curvature of a complete Riemannian manifold and its topology and global geometry. The first five chapters are preparatory in nature. They begin with a very concise introduction to Riemannian geometry, followed by an exposition of Toponogov's theorem-the first such treatment in a book in English. Next comes a detailed presentation of homogeneous spaces in which the main goal is to find formulas for their curvature. A quick chapter of Morse theory is followed by one on the injectivity radius. Chapters 6-9 deal with many of the most re
Geometry, topology, and string theory
Energy Technology Data Exchange (ETDEWEB)
Varadarajan, Uday [Univ. of California, Berkeley, CA (United States)
2003-01-01
A variety of scenarios are considered which shed light upon the uses and limitations of classical geometric and topological notions in string theory. The primary focus is on situations in which D-brane or string probes of a given classical space-time see the geometry quite differently than one might naively expect. In particular, situations in which extra dimensions, non-commutative geometries as well as other non-local structures emerge are explored in detail. Further, a preliminary exploration of such issues in Lorentzian space-times with non-trivial causal structures within string theory is initiated.
Spatial geometry and special relativity
DEFF Research Database (Denmark)
Kneubil, Fabiana Botelho
2016-01-01
In this work we show the interplay of relative and absolute entities, which are present in both spatial geometry and special relativity. In order to strengthen the understanding of special relativity, we first discuss an instance of geometry and the existence of both frame-dependent and frame-independent entities. We depart from a subject well known by students, the three-dimensional geometric space, in order to compare it afterwards with the treatment of four-dimensional space in special relativity. The differences and similarities between these two subjects are also...
Stochastic geometry and its applications
Chiu, Sung Nok; Kendall, Wilfrid S; Mecke, Joseph
2013-01-01
An extensive update to a classic text Stochastic geometry and spatial statistics play a fundamental role in many modern branches of physics, materials sciences, engineering, biology and environmental sciences. They offer successful models for the description of random two- and three-dimensional micro and macro structures and statistical methods for their analysis. The previous edition of this book has served as the key reference in its field for over 18 years and is regarded as the best treatment of the subject of stochastic geometry, both as a subject with vital a
Introduction to topology and geometry
Stahl, Saul
2014-01-01
An easily accessible introduction to over three centuries of innovations in geometry Praise for the First Edition ". . . a welcome alternative to compartmentalized treatments bound to the old thinking. This clearly written, well-illustrated book supplies sufficient background to be self-contained." -CHOICE This fully revised new edition offers the most comprehensive coverage of modern geometry currently available at an introductory level. The book strikes a welcome balance between academic rigor and accessibility, providing a complete and cohesive picture of the science with an unparallele
Algebraic geometry and theta functions
Coble, Arthur B
1929-01-01
This book is the result of extending and deepening all questions from algebraic geometry that are connected to the central problem of this book: the determination of the tritangent planes of a space curve of order six and genus four, which the author treated in his Colloquium Lecture in 1928 at Amherst. The first two chapters recall fundamental ideas of algebraic geometry and theta functions in such fashion as will be most helpful in later applications. In order to clearly present the state of the central problem, the author first presents the better-known cases of genus two (Chapter III) and
Geometry, topology, and string theory
International Nuclear Information System (INIS)
Varadarajan, Uday
2003-01-01
A variety of scenarios are considered which shed light upon the uses and limitations of classical geometric and topological notions in string theory. The primary focus is on situations in which D-brane or string probes of a given classical space-time see the geometry quite differently than one might naively expect. In particular, situations in which extra dimensions, non-commutative geometries as well as other non-local structures emerge are explored in detail. Further, a preliminary exploration of such issues in Lorentzian space-times with non-trivial causal structures within string theory is initiated
Research trend on human error reduction
International Nuclear Information System (INIS)
Miyaoka, Sadaoki
1990-01-01
Human error has been a problem in all industries. In 1988, the Bureau of Mines, Department of the Interior, USA, carried out a worldwide survey on human error in all industries in relation to fatal accidents in mines. The results differed according to the methods of collecting data, but the proportion of total accidents attributable to human error ranged widely, from 20% to 85%, averaging 35%. The rate of occurrence of accidents and troubles in Japanese nuclear power stations is shown; the rate of occurrence of human error is 0-0.5 cases/reactor-year, which has not varied much. Therefore, the proportion of the total attributable to human error has tended to increase, and reducing human error has become important for lowering the rate of occurrence of accidents and troubles hereafter. After the TMI accident in 1979 in the USA, research on the man-machine interface became active, and after the Chernobyl accident in 1986 in the USSR, the problem of organization and management has been studied. In Japan, 'Safety 21' was drawn up by the Advisory Committee for Energy, and the annual reports on nuclear safety also pointed out the importance of human factors. The state of research on human factors in Japan and abroad and three targets to reduce human error are reported. (K.I.)
Error vs Rejection Curve for the Perceptron
Parrondo, J. M. R.; Van den Broeck, Christian
1993-01-01
We calculate the generalization error epsilon for a perceptron J, trained by a teacher perceptron T, on input patterns S that form a fixed angle arccos(J·S) with the student. We show that the error is reduced from a power law to an exponentially fast decay by rejecting input patterns that lie within a given neighbourhood of the decision boundary J·S = 0. On the other hand, the error vs. rejection curve epsilon(rho), where rho is the fraction of rejected patterns, is shown to be independent ...
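The mechanism can be sketched numerically. Everything below (dimension, teacher-student overlap, rejection width) is invented for illustration; this toy reproduces only the qualitative effect, not the power-law vs exponential decay rates of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 200_000

T = rng.standard_normal(d); T /= np.linalg.norm(T)             # teacher
J = T + 0.1 * rng.standard_normal(d); J /= np.linalg.norm(J)   # imperfect student

S = rng.standard_normal((n, d))     # random test patterns
field_J, field_T = S @ J, S @ T     # local fields J.S and T.S

def error_with_rejection(width):
    """Generalization error after rejecting patterns with |J.S| < width,
    i.e. patterns lying close to the decision boundary J.S = 0."""
    keep = np.abs(field_J) >= width
    return float(np.mean(np.sign(field_J[keep]) != np.sign(field_T[keep])))

# Rejecting a neighbourhood of the decision boundary lowers the error:
assert error_with_rejection(0.5) < error_with_rejection(0.0)
```

Sweeping `width` and recording the rejected fraction `rho` alongside the error would trace out an epsilon(rho) curve of the kind the abstract describes.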
KMRR thermal power measurement error estimation
International Nuclear Information System (INIS)
Rhee, B.W.; Sim, B.S.; Lim, I.C.; Oh, S.K.
1990-01-01
The thermal power measurement error of the Korea Multi-purpose Research Reactor has been estimated by a statistical Monte Carlo method and compared with the results obtained by other methods, including deterministic and statistical approaches. The results show that the specified thermal power measurement error of 5% cannot be achieved if commercial RTDs are used to measure the coolant temperatures of the secondary cooling system, and that the error can be reduced below the requirement if the commercial RTDs are replaced by precision RTDs. The possible range of thermal power control operation has been identified to be from 100% to 20% of full power.
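The statistical Monte Carlo idea can be sketched with a toy heat balance. The calorimetric model Q = ṁ·cp·ΔT and every flow, temperature, and sensor-accuracy number below are assumptions for illustration, not KMRR design data:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000   # Monte Carlo trials

# Illustrative calorimetric model (not the actual KMRR heat balance):
# Q = m_dot * cp * (T_out - T_in)
m_dot, cp = 300.0, 4.18        # kg/s, kJ/(kg*K) -- assumed values
T_in, T_out = 35.0, 51.0       # degC, so dT = 16 K

Q_true = m_dot * cp * (T_out - T_in)

def power_error(sigma_T, sigma_flow_rel=0.01):
    """Relative 2-sigma thermal power error from sampled sensor errors."""
    flow = m_dot * (1 + sigma_flow_rel * rng.standard_normal(N))
    dT = (T_out + sigma_T * rng.standard_normal(N)) \
       - (T_in + sigma_T * rng.standard_normal(N))
    Q = flow * cp * dT
    return 2 * np.std(Q / Q_true)

commercial = power_error(sigma_T=0.5)   # ~0.5 K RTD accuracy (assumed)
precision = power_error(sigma_T=0.1)    # ~0.1 K RTD accuracy (assumed)
assert precision < 0.05 < commercial    # only the precision RTDs meet a 5% spec
```

Because the temperature error enters through the small difference T_out − T_in, halving the RTD uncertainty has an outsized effect on the power error, which is the qualitative point of the abstract.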
Learning mechanisms to limit medication administration errors.
Drach-Zahavy, Anat; Pud, Dorit
2010-04-01
This paper is a report of a study conducted to identify and test the effectiveness of learning mechanisms applied by the nursing staff of hospital wards as a means of limiting medication administration errors. Since the influential report 'To Err Is Human', research has emphasized the role of team learning in reducing medication administration errors. Nevertheless, little is known about the mechanisms underlying team learning. Thirty-two hospital wards were randomly recruited. Data were collected during 2006 in Israel by a multi-method (observations, interviews and administrative data), multi-source (head nurses, bedside nurses) approach. Medication administration error was defined as any deviation from procedures, policies and/or best practices for medication administration, and was identified using semi-structured observations of nurses administering medication. Organizational learning was measured using semi-structured interviews with head nurses, and the previous year's reported medication administration errors were assessed using administrative data. The interview data revealed four learning mechanism patterns employed in an attempt to learn from medication administration errors: integrated, non-integrated, supervisory and patchy learning. Regression analysis results demonstrated that whereas the integrated pattern of learning mechanisms was associated with decreased errors, the non-integrated pattern was associated with increased errors. Supervisory and patchy learning mechanisms were not associated with errors. Superior learning mechanisms are those that represent the whole cycle of team learning, are enacted by nurses who administer medications to patients, and emphasize a system approach to data analysis instead of analysis of individual cases.
Directory of Open Access Journals (Sweden)
Antonio Boldrini
2013-06-01
Full Text Available Introduction: Danger and errors are inherent in human activities. In medical practice errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main error domains are: medication and total parenteral nutrition, resuscitation and respiratory care, invasive procedures, nosocomial infections, patient identification, and diagnostics. Risk factors include patients' size, prematurity, vulnerability and underlying disease conditions, but also multidisciplinary teams, working conditions that promote fatigue, and the large variety of treatment and investigative modalities needed. Discussion and Conclusions: In our opinion, it is hardly possible to change human beings, but it is likely possible to change the conditions under which they work. Voluntary error report systems can help in preventing adverse events. Education and re-training by means of simulation can be an effective strategy too. In Pisa (Italy), Nina (ceNtro di FormazIone e SimulazioNe NeonAtale) is a simulation centre that offers the possibility of continuous retraining in technical and non-technical skills to optimize neonatological care strategies. Furthermore, we have been working on a novel skill trainer for mechanical ventilation (MEchatronic REspiratory System SImulator for Neonatal Applications, MERESSINA). Finally, in our opinion, national health policy indirectly influences the risk of errors. Proceedings of the 9th International Workshop on Neonatology · Cagliari (Italy) · October 23rd-26th, 2013 · Learned lessons, changing practice and cutting-edge research
LIBERTARISMO & ERROR CATEGORIAL
Directory of Open Access Journals (Sweden)
Carlos G. Patarroyo G.
2009-01-01
Full Text Available This article offers a defence of libertarianism against two accusations according to which it commits a category mistake. Gilbert Ryle's philosophy is used as a tool to explain the reasons behind these accusations and to show why, although certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that seeks the basis of the possibility of human freedom in physicalist indeterminism cannot necessarily be accused of committing them.
Libertarismo & Error Categorial
Patarroyo G., Carlos G.
2009-01-01
This article offers a defence of libertarianism against two accusations according to which it commits a category mistake. Gilbert Ryle's philosophy is used as a tool to explain the reasons behind these accusations and to show why, although certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that seeks in physicalist indeterminism the basis of the possibili...
1985-01-01
A mathematical theory for the development of "higher order" software to catch computer mistakes grew out of a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software that is logically error-free and that, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.
The Idea of Order at Geometry Class.
Rishel, Thomas
The idea of order in geometry is explored using the experience of assignments given to undergraduates in a college geometry course "From Space to Geometry." Discussed are the definition of geometry, and earth measurement using architecture, art, and common experience. This discussion concludes with a consideration of the question of whether…
Teaching Spatial Geometry in a Virtual World
DEFF Research Database (Denmark)
Förster, Klaus-Tycho
2017-01-01
Spatial geometry is one of the fundamental mathematical building blocks of any engineering education. However, it is overshadowed by planar geometry in the curriculum between playful early primary education and later analytical geometry, leaving a multi-year gap where spatial geometry is absent...
The Influence of the Mounting Errors in RodToothed Transmissions
Directory of Open Access Journals (Sweden)
M. Yu. Sachkov
2015-01-01
Full Text Available In the paper we consider an approximate transmission. The work is aimed at the development of a gear-powered transmission on parallel axes, which is RF patent-protected. The paper justifies the relevance of the synthesis of new kinds of engagement with simplified geometry of the contacting condition. A typical solution for powered mechanisms obtained by F. L. Livinin and his disciples is characterized. The paper describes the arrangement of the coordinate systems used to obtain the position function of the gear-powered transmission consisting of two wheels with fifteen leads. For them, the coordinates of the contact points are also obtained, and errors of the position function in tooth changeover are calculated. To obtain the position function, a method of matrix transformation and the equality of radius and unit normal vectors at the contact point were used. This transmission can be used in mechanical and instrumentation engineering, and other sectors of the economy. Both reducers and multipliers can be made on its basis. It has high manufacturability (no special equipment is required for its production), and its displacement function is close to linear. This article describes the influence of the axle spacing error on the quality of the transmission characteristics. The paper presents graph-based relationships and tabular estimates for nominal axle spacing and offsets within 0.2 mm. This error of axle spacing is significant for gearing. From the results of this work we can say that the transmission is almost insensitive to errors of axle spacing. Engagement occurs without an exit of the contact point onto the lead edge. To solve the obtained system of equations, the numerical methods of the MathCAD software package have been applied. In the future, the authors expect to consider other possible manufacturing and mounting errors of the gear-powered transmission (such as the error of the step, misalignment, etc.) to assess their impact on the quality
Analogical Reasoning in Geometry Education
Magdas, Ioana
2015-01-01
The analogical reasoning isn't used only in mathematics but also in everyday life. In this article we approach the analogical reasoning in Geometry Education. The novelty of this article is a classification of geometrical analogies by reasoning type and their exemplification. Our classification includes: analogies for understanding and setting a…
Normal forms in Poisson geometry
Marcut, I.T.
2013-01-01
The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric
Exploring Bundling Theory with Geometry
Eckalbar, John C.
2006-01-01
The author shows how instructors might successfully introduce students in principles and intermediate microeconomic theory classes to the topic of bundling (i.e., the selling of two or more goods as a package, rather than separately). It is surprising how much students can learn using only the tools of high school geometry. To be specific, one can…
Stochastic Modelling of River Geometry
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Schaarup-Jensen, K.
1996-01-01
Numerical hydrodynamic river models are used in a large number of applications to estimate critical events for rivers. These estimates are subject to a number of uncertainties. In this paper, the problem of evaluating these estimates using probabilistic methods is considered. Stochastic models for river geometries are formulated and a coupling between hydraulic computational methods and numerical reliability methods is presented.
Matter in toy dynamical geometries
Konopka, T.J.
2009-01-01
One of the objectives of theories describing quantum dynamical geometry is to compute expectation values of geometrical observables. The results of such computations can be affected by whether or not matter is taken into account. It is thus important to understand to what extent and to what effect
Cañadas, María C.; Molina, Marta; Gallardo, Sandra; Martínez-Santaolalla, Manuel J.; Peñas, María
2010-01-01
In this work we present an activity for High School students in which various mathematical concepts of plane and spatial geometry are involved. The final objective of the proposed tasks is constructing a particular polyhedron, the cube, by using a modality of origami called modular origami.
Granular flows in constrained geometries
Murthy, Tejas; Viswanathan, Koushik
Confined geometries are widespread in granular processing applications. The deformation and flow fields in such a geometry, with non-trivial boundary conditions, determine the resultant mechanical properties of the material (local porosity, density, residual stresses etc.). We present experimental studies of deformation and plastic flow of a prototypical granular medium in different nontrivial geometries- flat-punch compression, Couette-shear flow and a rigid body sliding past a granular half-space. These geometries represent simplified scaled-down versions of common industrial configurations such as compaction and dredging. The corresponding granular flows show a rich variety of flow features, representing the entire gamut of material types, from elastic solids (beam buckling) to fluids (vortex-formation, boundary layers) and even plastically deforming metals (dead material zone, pile-up). The effect of changing particle-level properties (e.g., shape, size, density) on the observed flows is also explicitly demonstrated. Non-smooth contact dynamics particle simulations are shown to reproduce some of the observed flow features quantitatively. These results showcase some central challenges facing continuum-scale constitutive theories for dynamic granular flows.
General Relativity: Geometry Meets Physics
Thomsen, Dietrick E.
1975-01-01
Observing the relationship of general relativity and the geometry of space-time, the author questions whether the rest of physics has geometrical explanations. As a partial answer he discusses current research on subatomic particles employing geometric transformations, and cites the existence of geometrical definitions of physical quantities such…
Learners engaging with transformation geometry
African Journals Online (AJOL)
participants engaged in investigative semi-structured interviews with the researchers. ... Keywords: analysis; conversions; transformation geometry; transformations; treatments ... semiotic systems of representation is not only to designate mathematical objects or to communicate but also to ... Research design. We believe ...
Multivariable calculus and differential geometry
Walschap, Gerard
2015-01-01
This text is a modern in-depth study of the subject that includes all the material needed from linear algebra. It then goes on to investigate topics in differential geometry, such as manifolds in Euclidean space, curvature, and the generalization of the fundamental theorem of calculus known as Stokes' theorem.
College geometry a unified development
Kay, David C
2011-01-01
"The book is a comprehensive textbook on basic geometry. … Key features of the book include numerous figures and many problems, more than half of which come with hints or even complete solutions. Frequent historical comments add to making the reading a pleasant one." -Michael Joswig, Zentralblatt MATH 1273
Visual correlation analytics of event-based error reports for advanced manufacturing
Nazir, Iqbal
2017-01-01
With the growing digitalization and automation in the manufacturing domain, an increasing amount of process data and error reports become available. To minimize the number of errors and maximize the efficiency of the production line, it is important to analyze the generated error reports and find solutions that can reduce future errors. However, not all errors are of equal importance, as some errors may be the result of errors that occurred previously. Therefore, it is important for domain exper...
International Nuclear Information System (INIS)
Yang Xue; Satvat, Nader
2012-01-01
Highlight: ► A two-dimensional numerical code based on the method of characteristics is developed. ► The complex arbitrary geometries are represented by constructive solid geometry and decomposed by unstructured meshing. ► Excellent agreement between Monte Carlo and the developed code is observed. ► High efficiency is achieved by parallel computing. - Abstract: A transport theory code MOCUM based on the method of characteristics as the flux solver with an advanced general geometry processor has been developed for two-dimensional rectangular and hexagonal lattice and full core neutronics modeling. In the code, the core structure is represented by the constructive solid geometry that uses regularized Boolean operations to build complex geometries from simple polygons. Arbitrary-precision arithmetic is also used in the process of building geometry objects to eliminate the round-off error from the commonly used double precision numbers. Then, the constructed core frame will be decomposed and refined into a Conforming Delaunay Triangulation to ensure the quality of the meshes. The code is fully parallelized using OpenMP and is verified and validated by various benchmarks representing rectangular, hexagonal, plate type and CANDU reactor geometries. Compared with Monte Carlo and deterministic reference solution, MOCUM results are highly accurate. The mentioned characteristics of the MOCUM make it a perfect tool for high fidelity full core calculation for current and GenIV reactor core designs. The detailed representation of reactor physics parameters can enhance the safety margins with acceptable confidence levels, which lead to more economically optimized designs.
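The constructive-solid-geometry idea, building complex regions from simple shapes with Boolean operations, can be sketched with a minimal point-membership classifier. The shapes, the `cell` example, and all helper names below are invented; the actual MOCUM processor additionally uses regularized Boolean operations on polygons and arbitrary-precision arithmetic.

```python
# Point-membership classification for a tiny constructive-solid-geometry
# (CSG) tree. Illustrative only; shapes and names are invented.
from typing import Callable

Point = tuple[float, float]
Inside = Callable[[Point], bool]

def rectangle(x0: float, y0: float, x1: float, y1: float) -> Inside:
    return lambda p: x0 <= p[0] <= x1 and y0 <= p[1] <= y1

def disk(cx: float, cy: float, r: float) -> Inside:
    return lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2 <= r * r

# Boolean operations combine membership predicates:
def union(a: Inside, b: Inside) -> Inside:      return lambda p: a(p) or b(p)
def intersect(a: Inside, b: Inside) -> Inside:  return lambda p: a(p) and b(p)
def difference(a: Inside, b: Inside) -> Inside: return lambda p: a(p) and not b(p)

# A "cell" as a square moderator region minus a central pin:
cell = difference(rectangle(0, 0, 2, 2), disk(1, 1, 0.5))
assert cell((0.1, 0.1))        # corner: inside the cell
assert not cell((1.0, 1.0))    # pin centre: removed by the difference
```

A transport code needs the boundaries themselves (for ray tracing and meshing), not just membership tests, which is why the real implementation operates on polygon representations rather than predicates.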
Mahaffey, Michael L.
One of a series of experimental units for children at the preschool level, this booklet deals with geometric concepts. A unit on volume and a unit on linear measurement are covered; for each unit a discussion of mathematical objectives, a list of materials needed, and a sequence of learning activities are provided. Directions are specified for the…
DEFF Research Database (Denmark)
Build your own boomerang, throw it, watch it fly, understand why and how it comes back, and catch it. It is about the lift on the wings when it flies, but it is also, and above all, about the curious gyroscope effect that you use to keep your balance when riding a bicycle. We will use...
Discretisation errors in Landau gauge on the lattice
International Nuclear Information System (INIS)
Bonnet, Frederic D. R.; Bowman, Patrick O.; Leinweber, Derek B.; Williams, Anthony G.; Richards, David G.
1999-01-01
Lattice discretization errors in the Landau gauge condition are examined. An improved gauge fixing algorithm in which O(a²) errors are removed is presented. O(a²) improvement of the gauge fixing condition improves comparison with continuum Landau gauge in two ways: (1) through the elimination of O(a²) errors and (2) through a secondary effect of reducing the size of higher-order errors. These results emphasize the importance of implementing an improved gauge fixing condition
A proposal of an open PET geometry
Energy Technology Data Exchange (ETDEWEB)
Yamaya, Taiga [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba, 263-8555 (Japan); Inaniwa, Taku [Research Center for Charged Particle Therapy, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan); Minohara, Shinichi [Research Center for Charged Particle Therapy, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan); Yoshida, Eiji [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba, 263-8555 (Japan); Inadama, Naoko [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba, 263-8555 (Japan); Nishikido, Fumihiko [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba, 263-8555 (Japan); Shibuya, Kengo [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba, 263-8555 (Japan); Lam, Chih Fung [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba, 263-8555 (Japan); Murayama, Hideo [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba, 263-8555 (Japan)
2008-02-07
The long patient port of a PET scanner tends to put stress on patients, especially patients with claustrophobia. It also prevents doctors and technicians from taking care of patients during scanning. In this paper, we proposed an 'open PET' geometry, which consists of two axially separated detector rings. A long and continuous field-of-view (FOV), including a 360 deg. open gap between the two detector rings, can be imaged, enabling fully 3D image reconstruction of all the possible lines-of-response. The open PET will become practical if iterative image reconstruction methods are applied, even though image reconstruction of the open PET is analytically an incomplete problem. First we implemented a 'masked' 3D ordered subset expectation maximization (OS-EM) in which the system matrix was obtained from a long 'gapless' scanner by applying a mask to detectors corresponding to the open space. Next, in order to evaluate the imaging performance of the proposed open PET geometry, we simulated a dual HR+ scanner (ring diameter D = 827 mm, axial length W = 154 mm x 2) separated by a variable gap. A gap of W was the maximum limit for an axially continuous FOV of 3W, though the maximum diameter of the FOV at the central slice was limited to D/2. Artifacts, observed on both sides of the open space when the gap exceeded W, were effectively reduced by inserting detectors partially into unnecessary open spaces. We also tested the open PET geometry using experimental data obtained by the jPET-D4. The jPET-D4 is a prototype brain scanner, which has 5 rings of 24 detector blocks. We simulated the open jPET-D4 with a gap of 66 mm by eliminating 1 block-ring from the experimental data. Although some artifacts were seen at both ends of the open gap, very similar images were obtained with and without the gap. The proposed open PET geometry is expected to lead to the realization of in-beam PET, which is a method for in situ monitoring of charged particle therapy, by
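The 'masked' reconstruction idea can be sketched with a toy MLEM (OS-EM with one subset) on a random system matrix: detector pairs falling in the open gap are simply zeroed in the system model. All sizes, the mask placement, and the noiseless-data setup below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pix, n_det = 20, 60

A = rng.random((n_det, n_pix))      # toy "gapless" scanner system matrix
mask = np.ones(n_det)
mask[20:40] = 0.0                   # detector pairs lost to the open gap
Am = A * mask[:, None]              # masked system matrix

x_true = 0.5 + rng.random(n_pix)
y = Am @ x_true                     # noiseless masked data

x = np.ones(n_pix)                  # MLEM iterations (one subset)
sens = Am.sum(axis=0)               # sensitivity image
for _ in range(300):
    proj = Am @ x
    ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
    x *= (Am.T @ ratio) / sens

# The iterations drive the data fit far below that of the initial guess:
res0 = np.linalg.norm(Am @ np.ones(n_pix) - y)
assert np.linalg.norm(Am @ x - y) < 0.5 * res0
```

Masked rows contribute nothing to either the forward projection or the backprojection, which is exactly why the approach tolerates an analytically incomplete geometry: the iterative update simply never asks for the missing lines-of-response.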
Discrete differential geometry. Consistency as integrability
Bobenko, Alexander I.; Suris, Yuri B.
2005-01-01
A new field of discrete differential geometry is presently emerging on the border between differential and discrete geometry. Whereas classical differential geometry investigates smooth geometric shapes (such as surfaces), and discrete geometry studies geometric shapes with a finite number of elements (such as polyhedra), discrete differential geometry aims at the development of discrete equivalents of the notions and methods of smooth surface theory. Current interest in this field derives not ...
Indian Academy of Sciences (India)
Science and Automation at ... the Reed-Solomon code contained 223 bytes of data (a byte ... then you have a data storage system with error correction, that ... practical codes, storing such a table is infeasible, as it is generally too large.
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...
Challenge and Error: Critical Events and Attention-Related Errors
Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel
2011-01-01
Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ↔ attention-lapse: Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…
NLO error propagation exercise: statistical results
International Nuclear Information System (INIS)
Pack, D.J.; Downing, D.J.
1985-09-01
Error propagation is the extrapolation and cumulation of uncertainty (variance) about total amounts of special nuclear material, for example, uranium or ²³⁵U, that are present in a defined location at a given time. The uncertainty results from the inevitable inexactness of individual measurements of weight, uranium concentration, ²³⁵U enrichment, etc. The extrapolated and cumulated uncertainty leads directly to quantified limits of error on inventory differences (LEIDs) for such material. The NLO error propagation exercise was planned as a field demonstration of the utilization of statistical error propagation methodology at the Feed Materials Production Center in Fernald, Ohio from April 1 to July 1, 1983 in a single material balance area formed specially for the exercise. Major elements of the error propagation methodology were: variance approximation by Taylor series expansion; variance cumulation by uncorrelated primary error sources as suggested by Jaech; random effects ANOVA model estimation of variance effects (systematic error); provision for inclusion of process variance in addition to measurement variance; and exclusion of static material. The methodology was applied to material balance area transactions from the indicated time period through a FORTRAN computer code developed specifically for this purpose on the NLO HP-3000 computer. This paper contains a complete description of the error propagation methodology and a full summary of the numerical results of applying the methodology in the field demonstration. The error propagation LEIDs did encompass the actual uranium and ²³⁵U inventory differences. Further, one can see that error propagation actually provides guidance for reducing inventory differences and LEIDs in future time periods
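The first two methodology elements, Taylor-series variance approximation and cumulation over uncorrelated primary error sources, can be sketched for a single product term. Every number below is invented for illustration and has no connection to the actual exercise data.

```python
import math

# First-order Taylor-series (delta-method) propagation for a product
# U = W * c * e  (weight * uranium concentration * 235U enrichment).
# All values and uncertainties are invented.
W, sW = 1000.0, 2.0      # kg and its absolute standard deviation
c, sc = 0.85, 0.004      # uranium mass fraction
e, se = 0.04, 0.0004     # 235U enrichment

U = W * c * e
# For a pure product, relative variances of uncorrelated factors add:
rel_var = (sW / W) ** 2 + (sc / c) ** 2 + (se / e) ** 2
sU = U * math.sqrt(rel_var)

# Cumulating over n independent transactions: variances (not std devs) add.
n = 50
sigma_total = math.sqrt(n) * sU     # identical batches, uncorrelated errors
limit_of_error = 2 * sigma_total    # a ~95% LEID-style limit
```

The quadrature rule above is the Taylor expansion truncated at first order; correlated (common) error sources would instead add linearly across transactions before squaring, which is why distinguishing the two is central to the methodology.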
Energy Technology Data Exchange (ETDEWEB)
Nakagawa, Takahiro; Ochiai, Katsuharu [Plant and System Planning Department, Toshiba Corporation, Yokohama, Kanagawa (Japan); Uematsu, Mikio; Hayashida, Yoshihisa [Department of Nuclear Engineering, Toshiba Engineering Corporation, Yokohama, Kanagawa (Japan)
2000-03-01
A boiling water reactor (BWR) plant has a single loop coolant system, in which main steam generated in the reactor core proceeds directly into turbines. Consequently, radioactive ¹⁶N (6.2 MeV photon emitter) contained in the steam contributes to gamma-ray skyshine dose in the vicinity of the BWR plant. The skyshine dose analysis is generally performed with the line-beam method code SKYSHINE, in which calculational geometry consists of a rectangular turbine building and a set of isotropic point sources corresponding to an actual distribution of ¹⁶N sources. For the purpose of upgrading calculational accuracy, the SKYSHINE-CG code has been developed by incorporating the combinatorial geometry (CG) routine into the SKYSHINE code, so that shielding effect of in-building equipment can be properly considered using a three-dimensional model composed of boxes, cylinders, spheres, etc. Skyshine dose rate around a 500 MWe BWR plant was calculated with both SKYSHINE and SKYSHINE-CG codes, and the calculated results were compared with measured data obtained with a NaI(Tl) scintillation detector. The C/E values for SKYSHINE-CG calculation were scattered around 4.0, whereas the ones for SKYSHINE calculation were as large as 6.0. Calculational error was found to be reduced by adopting three-dimensional model based on the combinatorial geometry method. (author)
DEFF Research Database (Denmark)
Hallas, Peter; Ellingsen, Trond
2006-01-01
Evaluation of the circumstances related to errors in diagnosis of fractures at an Emergency Department may suggest ways to reduce the incidence of such errors.
SU-E-T-558: Monte Carlo Photon Transport Simulations On GPU with Quadric Geometry
International Nuclear Information System (INIS)
Chi, Y; Tian, Z; Jiang, S; Jia, X
2015-01-01
Purpose: Monte Carlo simulation on GPU has experienced rapid advancements over the past few years and tremendous accelerations have been achieved. Yet existing packages were developed only in voxelized geometry. In some applications, e.g. radioactive seed modeling, simulations in more complicated geometry are needed. This abstract reports our initial efforts towards developing a quadric geometry module aimed at expanding the application scope of GPU-based MC simulations. Methods: We defined the simulation geometry as consisting of a number of homogeneous bodies, each specified by its material composition and limiting surfaces characterized by quadric functions. A tree data structure was utilized to define the geometric relationship between different bodies. We modified our GPU-based photon MC transport package to incorporate this geometry. Specifically, geometry parameters were loaded into the GPU's shared memory for fast access. Geometry functions were rewritten to enable the identification of the body that contains the current particle location via a fast searching algorithm based on the tree data structure. Results: We tested our package on an example problem of HDR-brachytherapy dose calculation for a shielded cylinder. The dose under the quadric geometry and that under the voxelized geometry agreed in 94.2% of total voxels within the 20% isodose line based on a statistical t-test (95% confidence level), where the reference dose was defined to be the one at 0.5 cm away from the cylinder surface. It took 243 sec to transport 100 million source photons under this quadric geometry on an NVidia Titan GPU card. Compared with the simulation time of 99.6 sec in the voxelized geometry, including quadric geometry reduced efficiency due to the complicated geometry-related computations. Conclusion: Our GPU-based MC package has been extended to support photon transport simulation in quadric geometry. Satisfactory accuracy was observed with a reduced efficiency. Developments for charged
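The body test behind such a geometry module can be sketched with quadric limiting surfaces: each surface is f(x) = xᵀQx + b·x + c = 0, and a body is the region where each of its limiting quadrics has a prescribed sign. The shielded-cylinder shapes and all helper names below are invented; the real package adds a tree-based search over many bodies and runs on the GPU.

```python
import numpy as np

# Minimal quadric-surface body test (illustrative only, not the actual code).
def quadric(Q, b, c):
    """Return f(x) = x^T Q x + b.x + c for a quadric surface f(x) = 0."""
    Q, b = np.asarray(Q, float), np.asarray(b, float)
    return lambda x: float(np.asarray(x, float) @ Q @ np.asarray(x, float)
                           + b @ np.asarray(x, float) + c)

# A cylinder of radius 1 about the z-axis, between z = 0 and z = 5,
# as the intersection of three quadric half-spaces (f <= 0 means "inside"):
side = quadric(np.diag([1.0, 1.0, 0.0]), [0, 0, 0], -1.0)   # x^2 + y^2 - 1
bottom = quadric(np.zeros((3, 3)), [0, 0, -1.0], 0.0)       # -z
top = quadric(np.zeros((3, 3)), [0, 0, 1.0], -5.0)          # z - 5

def in_cylinder(x):
    return side(x) <= 0 and bottom(x) <= 0 and top(x) <= 0

assert in_cylinder([0.5, 0.0, 2.0])
assert not in_cylinder([1.5, 0.0, 2.0])   # outside the radius
assert not in_cylinder([0.0, 0.0, 6.0])   # above the top plane
```

During transport, the same quadric coefficients also yield ray-surface intersection distances by solving a quadratic along the particle direction, which is what makes this representation attractive compared with voxel stepping.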
SU-E-T-558: Monte Carlo Photon Transport Simulations On GPU with Quadric Geometry
Energy Technology Data Exchange (ETDEWEB)
Chi, Y; Tian, Z; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)
2015-06-15
Purpose: Monte Carlo simulation on GPU has experienced rapid advancements over the past few years and tremendous accelerations have been achieved. Yet existing packages were developed only in voxelized geometry. In some applications, e.g. radioactive seed modeling, simulations in more complicated geometry are needed. This abstract reports our initial efforts towards developing a quadric geometry module aimed at expanding the application scope of GPU-based MC simulations. Methods: We defined the simulation geometry as consisting of a number of homogeneous bodies, each specified by its material composition and limiting surfaces characterized by quadric functions. A tree data structure was utilized to define the geometric relationship between different bodies. We modified our GPU-based photon MC transport package to incorporate this geometry. Specifically, geometry parameters were loaded into the GPU's shared memory for fast access. Geometry functions were rewritten to enable the identification of the body that contains the current particle location via a fast searching algorithm based on the tree data structure. Results: We tested our package on an example problem of HDR-brachytherapy dose calculation for a shielded cylinder. The dose under the quadric geometry and that under the voxelized geometry agreed in 94.2% of total voxels within the 20% isodose line based on a statistical t-test (95% confidence level), where the reference dose was defined to be the one at 0.5 cm away from the cylinder surface. It took 243 sec to transport 100 million source photons under this quadric geometry on an NVidia Titan GPU card. Compared with the simulation time of 99.6 sec in the voxelized geometry, including quadric geometry reduced efficiency due to the complicated geometry-related computations. Conclusion: Our GPU-based MC package has been extended to support photon transport simulation in quadric geometry. Satisfactory accuracy was observed with a reduced efficiency. Developments for charged
Evaluation of Data with Systematic Errors
International Nuclear Information System (INIS)
Froehner, F. H.
2003-01-01
Application-oriented evaluated nuclear data libraries such as ENDF and JEFF contain not only recommended values but also uncertainty information in the form of 'covariance' or 'error files'. These can neither be constructed nor utilized properly without a thorough understanding of uncertainties and correlations. It is shown how incomplete information about errors is described by multivariate probability distributions or, more summarily, by covariance matrices, and how correlations are caused by incompletely known common errors. Parameter estimation for the practically most important case of the Gaussian distribution with common errors is developed in close analogy to the more familiar case without common errors. The formalism shows that, contrary to widespread belief, common ('systematic') and uncorrelated ('random' or 'statistical') errors are to be added in quadrature. It also shows explicitly that repetition of a measurement reduces mainly the statistical uncertainties but not the systematic ones. While statistical uncertainties are readily estimated from the scatter of repeatedly measured data, systematic uncertainties can only be inferred from prior information about common errors and their propagation. The optimal way to handle error-affected auxiliary quantities ('nuisance parameters') in data fitting and parameter estimation is to adjust them on the same footing as the parameters of interest and to integrate (marginalize) them out of the joint posterior distribution afterward.
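The quadrature rule and the repetition argument stated above can be illustrated with a small numeric sketch; the measurement values and the size of the common error are made up for the example.

```python
# Combining statistical and common (systematic) errors in quadrature.
import math

measurements = [10.2, 9.8, 10.1, 9.9, 10.0]   # repeated measurements
n = len(measurements)
mean = sum(measurements) / n
# The statistical uncertainty of the mean shrinks as 1/sqrt(n) with repetition...
stat = math.sqrt(sum((m - mean) ** 2 for m in measurements) / (n - 1) / n)
# ...but a common calibration error affects every repetition identically
# and does not shrink, no matter how often the measurement is repeated.
sys_err = 0.15
total = math.sqrt(stat ** 2 + sys_err ** 2)   # quadrature addition
print(round(mean, 3), round(stat, 4), round(total, 4))
```

Repeating the measurement more often drives `stat` toward zero while `total` stays bounded below by `sys_err`, which is exactly the point made in the abstract.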
Rieger, Martina; Martinez, Fanny; Wenke, Dorit
2011-01-01
Using a typing task we investigated whether insufficient imagination of errors and error corrections is related to duration differences between execution and imagination. In Experiment 1 spontaneous error imagination was investigated, whereas in Experiment 2 participants were specifically instructed to imagine errors. Further, in Experiment 2 we…
Code subspaces for LLM geometries
Berenstein, David; Miller, Alexandra
2018-03-01
We consider effective field theory around classical background geometries with a gauge theory dual, specifically those in the class of LLM geometries. These are dual to half-BPS states of N= 4 SYM. We find that the language of code subspaces is natural for discussing the set of nearby states, which are built by acting with effective fields on these backgrounds. This work extends our previous work by going beyond the strict infinite N limit. We further discuss how one can extract the topology of the state beyond N→∞ and find that, as before, uncertainty and entanglement entropy calculations provide a useful tool to do so. Finally, we discuss obstructions to writing down a globally defined metric operator. We find that the answer depends on the choice of reference state that one starts with. Therefore, within this setup, there is ambiguity in trying to write an operator that describes the metric globally.
Euclidean distance geometry an introduction
Liberti, Leo
2017-01-01
This textbook, the first of its kind, presents the fundamentals of distance geometry: theory, useful methodologies for obtaining solutions, and real world applications. Concise proofs are given, and step-by-step algorithms for solving fundamental problems efficiently and precisely are presented in Mathematica®, enabling the reader to experiment with concepts and methods as they are introduced. Descriptive graphics, examples, and problems accompany the real gems of the text, namely the applications in visualization of graphs, localization of sensor networks, protein conformation from distance data, clock synchronization protocols, robotics, and control of unmanned underwater vehicles, to name several. Aimed at intermediate undergraduates, beginning graduate students, researchers, and practitioners, the reader with a basic knowledge of linear algebra will gain an understanding of the basic theories of distance geometry and why they work in real life.
Fractal geometry and computer graphics
Sakas, Georgios; Peitgen, Heinz-Otto; Englert, Gabriele
1992-01-01
Fractal geometry has become popular in the last 15 years; its applications can be found in technology, science, and even the arts. Fractal methods and formalism are seen today as a general, abstract, but nevertheless practical instrument for the description of nature in a wide sense. But it was Computer Graphics which made possible the increasing popularity of fractals several years ago, long after their mathematical formulation. The two disciplines are tightly linked. The book contains the scientific contributions presented at an international workshop in the "Computer Graphics Center" in Darmstadt, Germany. The aim of the workshop was to present the wide spectrum of interrelationships and interactions between Fractal Geometry and Computer Graphics. The topics vary from fundamentals and new theoretical results to various applications and systems development. All contributions are original, unpublished papers. The presentations have been discussed in two working groups; the discussion results, together with a...
The geometry of celestial mechanics
Geiges, Hansjörg
2016-01-01
Celestial mechanics is the branch of mathematical astronomy devoted to studying the motions of celestial bodies subject to the Newtonian law of gravitation. This mathematical introductory textbook reveals that even the most basic question in celestial mechanics, the Kepler problem, leads to a cornucopia of geometric concepts: conformal and projective transformations, spherical and hyperbolic geometry, notions of curvature, and the topology of geodesic flows. For advanced undergraduate and beginning graduate students, this book explores the geometric concepts underlying celestial mechanics and is an ideal companion for introductory courses. The focus on the history of geometric ideas makes it perfect supplementary reading for students in elementary geometry and topology. Numerous exercises, historical notes and an extensive bibliography provide all the contextual information required to gain a solid grounding in celestial mechanics.
Differential geometry and mathematical physics
Rudolph, Gerd
Starting from an undergraduate level, this book systematically develops the basics of • Calculus on manifolds, vector bundles, vector fields and differential forms, • Lie groups and Lie group actions, • Linear symplectic algebra and symplectic geometry, • Hamiltonian systems, symmetries and reduction, integrable systems and Hamilton-Jacobi theory. The topics listed under the first item are relevant for virtually all areas of mathematical physics. The second and third items constitute the link between abstract calculus and the theory of Hamiltonian systems. The last item provides an introduction to various aspects of this theory, including Morse families, the Maslov class and caustics. The book guides the reader from elementary differential geometry to advanced topics in the theory of Hamiltonian systems with the aim of making current research literature accessible. The style is that of a mathematical textbook, with full proofs given in the text or as exercises. The material is illustrated by numerous d...
Grassmannian geometry of scattering amplitudes
Arkani-Hamed, Nima; Cachazo, Freddy; Goncharov, Alexander; Postnikov, Alexander; Trnka, Jaroslav
2016-01-01
Outlining a revolutionary reformulation of the foundations of perturbative quantum field theory, this book is a self-contained and authoritative analysis of the application of this new formulation to the case of planar, maximally supersymmetric Yang–Mills theory. The book begins by deriving connections between scattering amplitudes and Grassmannian geometry from first principles before introducing novel physical and mathematical ideas in a systematic manner accessible to both physicists and mathematicians. The principal players in this process are on-shell functions which are closely related to certain sub-strata of Grassmannian manifolds called positroids - in terms of which the classification of on-shell functions and their relations becomes combinatorially manifest. This is an essential introduction to the geometry and combinatorics of the positroid stratification of the Grassmannian and an ideal text for advanced students and researchers working in the areas of field theory, high energy physics, and the...
Foliation theory in algebraic geometry
McKernan, James; Pereira, Jorge
2016-01-01
Featuring a blend of original research papers and comprehensive surveys from an international team of leading researchers in the thriving fields of foliation theory, holomorphic foliations, and birational geometry, this book presents the proceedings of the conference "Foliation Theory in Algebraic Geometry," hosted by the Simons Foundation in New York City in September 2013. Topics covered include: Fano and del Pezzo foliations; the cone theorem and rank one foliations; the structure of symmetric differentials on a smooth complex surface and a local structure theorem for closed symmetric differentials of rank two; an overview of lifting symmetric differentials from varieties with canonical singularities and the applications to the classification of AT bundles on singular varieties; an overview of the powerful theory of the variety of minimal rational tangents introduced by Hwang and Mok; recent examples of varieties which are hyperbolic and yet the Green-Griffiths locus is the whole of X; and a classificati...
Groups and Geometries : Siena Conference
Kantor, William; Lunardon, Guglielmo; Pasini, Antonio; Tamburini, Maria
1998-01-01
On September 1-7, 1996 a conference on Groups and Geometries took place in lovely Siena, Italy. It brought together experts and interested mathematicians from numerous countries. The scientific program centered around invited expository lectures; there also were shorter research announcements, including talks by younger researchers. The conference concerned a broad range of topics in group theory and geometry, with emphasis on recent results and open problems. Special attention was drawn to the interplay between group-theoretic methods and geometric and combinatorial ones. Expanded versions of many of the talks appear in these Proceedings. This volume is intended to provide a stimulating collection of themes for a broad range of algebraists and geometers. Among those themes, represented within the conference or these Proceedings, are aspects of the following: 1. the classification of finite simple groups, 2. the structure and properties of groups of Lie type over finite and algebraically closed fields of f...
Kim, Min-A; Sim, Hye-Min; Lee, Hye-Seong
2016-11-01
As reformulations and processing changes are increasingly needed in the food industry to produce healthier, more sustainable, and cost-effective products while maintaining superior quality, reliable measurements of consumers' sensory perception and discrimination are becoming more critical. Consumer discrimination methods using a preferred-reference duo-trio test design have been shown to be effective in improving discrimination performance by customizing sample presentation sequences. However, this design can add complexity to the discrimination task for some consumers, resulting in more errors in sensory discrimination. The objective of the present study was to investigate the effects of different types of test instructions using the preferred-reference duo-trio test design, in which a paired-preference test is followed by 6 repeated preferred-reference duo-trio tests, in comparison to the analytical method using the balanced-reference duo-trio. Analyses of d' estimates (product-related measure) and probabilistic sensory discriminators in momentary numbers of subjects showing statistical significance (subject-related measure) revealed that only the preferred-reference duo-trio test using affective reference-framing, either by providing no information about the reference or information on a previously preferred sample, improved sensory discrimination more than the analytical method. No decrease in discrimination performance was observed with any type of instruction, confirming that consumers could handle the test methods. These results suggest that when repeated tests are feasible, using the affective discrimination method would be operationally more efficient as well as ecologically more reliable for measuring consumers' sensory discrimination ability. Copyright © 2016 Elsevier Ltd. All rights reserved.
Correction of refractive errors
Directory of Open Access Journals (Sweden)
Vladimir Pfeifer
2005-10-01
Full Text Available Background: Spectacles and contact lenses are the most frequently used, the safest and the cheapest way to correct refractive errors. The development of keratorefractive surgery has brought new opportunities for correction of refractive errors in patients who need to be less dependent on spectacles or contact lenses. Until recently, RK was the most commonly performed refractive procedure for nearsighted patients. Conclusions: The introduction of the excimer laser in refractive surgery has opened new opportunities for remodelling the cornea. The laser energy can be delivered on the stromal surface, as in PRK, or deeper in the corneal stroma by means of lamellar surgery. In LASIK the flap is created with a microkeratome, in LASEK with ethanol, and in epi-LASIK the ultra-thin flap is created mechanically.
1989-01-01
001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.
A three-dimensional reconstruction algorithm for an inverse-geometry volumetric CT system
International Nuclear Information System (INIS)
Schmidt, Taly Gilat; Fahrig, Rebecca; Pelc, Norbert J.
2005-01-01
An inverse-geometry volumetric computed tomography (IGCT) system has been proposed capable of rapidly acquiring sufficient data to reconstruct a thick volume in one circular scan. The system uses a large-area scanned source opposite a smaller detector. The source and detector have the same extent in the axial, or slice, direction, thus providing sufficient volumetric sampling and avoiding cone-beam artifacts. This paper describes a reconstruction algorithm for the IGCT system. The algorithm first rebins the acquired data into two-dimensional (2D) parallel-ray projections at multiple tilt and azimuthal angles, followed by a 3D filtered backprojection. The rebinning step is performed by gridding the data onto a Cartesian grid in a 4D projection space. We present a new method for correcting the gridding error caused by the finite and asymmetric sampling in the neighborhood of each output grid point in the projection space. The reconstruction algorithm was implemented and tested on simulated IGCT data. Results show that the gridding correction reduces the gridding errors to below one Hounsfield unit. With this correction, the reconstruction algorithm does not introduce significant artifacts or blurring when compared to images reconstructed from simulated 2D parallel-ray projections. We also present an investigation of the noise behavior of the method which verifies that the proposed reconstruction algorithm utilizes cross-plane rays as efficiently as in-plane rays and can provide noise comparable to an in-plane parallel-ray geometry for the same number of photons. Simulations of a resolution test pattern and the modulation transfer function demonstrate that the IGCT system, using the proposed algorithm, is capable of 0.4 mm isotropic resolution. The successful implementation of the reconstruction algorithm is an important step in establishing feasibility of the IGCT system
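The gridding step described above can be illustrated in one dimension: accumulating a kernel-weight image alongside the data and normalizing by it compensates for finite, asymmetric sampling around each output grid point. This toy sketch (the triangle kernel, its width, and the sample layout are assumptions for illustration, not the paper's 4D method) recovers a constant signal exactly wherever samples exist.

```python
# Toy 1-D gridding with density compensation: divide the accumulated
# kernel-weighted data by the accumulated kernel weight at each grid point.
import numpy as np

def grid_1d(sample_pos, sample_val, grid, width=1.0):
    acc = np.zeros_like(grid)   # kernel-weighted data
    wgt = np.zeros_like(grid)   # accumulated kernel weight
    for p, v in zip(sample_pos, sample_val):
        k = np.clip(1.0 - np.abs(grid - p) / width, 0.0, None)  # triangle kernel
        acc += k * v
        wgt += k
    # Normalizing by wgt corrects for asymmetric sampling around each point.
    return np.where(wgt > 0, acc / wgt, 0.0)

grid = np.linspace(0, 4, 5)
pos = np.array([0.2, 0.9, 1.1, 2.0, 3.4])   # non-uniform, asymmetric samples
val = np.ones_like(pos)                      # constant underlying signal
recon = grid_1d(pos, val, grid)
print(recon)                                 # ~1 at every covered grid point
```

Without the division by `wgt`, grid points with more nearby samples would be over-weighted; the normalization is the essence of the gridding-error correction the abstract refers to, here reduced to its simplest form.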
Needle decompositions in Riemannian geometry
Klartag, Bo'az
2017-01-01
The localization technique from convex geometry is generalized to the setting of Riemannian manifolds whose Ricci curvature is bounded from below. In a nutshell, the author's method is based on the following observation: When the Ricci curvature is non-negative, log-concave measures are obtained when conditioning the Riemannian volume measure with respect to a geodesic foliation that is orthogonal to the level sets of a Lipschitz function. The Monge mass transfer problem plays an important role in the author's analysis.
Geometry Dependence of Stellarator Turbulence
International Nuclear Information System (INIS)
Mynick, H.E.; Xanthopoulos, P.; Boozer, A.H.
2009-01-01
Using the nonlinear gyrokinetic code package GENE/GIST, we study the turbulent transport in a broad family of stellarator designs, to understand the geometry-dependence of the microturbulence. By using a set of flux tubes on a given flux surface, we construct a picture of the 2D structure of the microturbulence over that surface, and relate this to relevant geometric quantities, such as the curvature, local shear, and effective potential in the Schrödinger-like equation governing linear drift modes
Superbanana orbits in stellarator geometries
International Nuclear Information System (INIS)
Derr, J.A.; Shohet, J.L.
1979-04-01
The presence of superbanana orbit types localized to either the interior or the exterior of stellarators and torsatrons is numerically investigated for 3.5 MeV alpha particles. The absence of the interior superbanana in both geometries is found to be due to non-conservation of the action. Exterior superbananas are found in the stellarator only, as a consequence of the existence of closed helical magnetic wells. No superbananas of either type are found in the torsatron
Turtle geometry the Python way
Battle, S.
2014-01-01
An introduction to coding using Python’s on-screen ‘turtle’ that can be commanded with a few simple instructions including forward, backward, left and right. The turtle leaves a trace that can be used to draw geometric figures. This workshop is aimed at beginners of all ages. The aim is to learn a smattering of programming and a little bit of geometry in a fun way.
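The forward/backward/left/right commands mentioned above can be sketched without a display. The real workshop uses Python's built-in `turtle` module, which draws on screen; the class below is a hypothetical stand-in that records the trace as line segments so the same commands can be run anywhere.

```python
# A display-free turtle: track position and heading, record each move
# as a line segment, and demonstrate the classic square.
import math

class TextTurtle:
    def __init__(self):
        self.x = self.y = 0.0
        self.heading = 0.0          # degrees, 0 = pointing east
        self.trace = []             # list of (x0, y0, x1, y1) segments
    def forward(self, d):
        rad = math.radians(self.heading)
        nx, ny = self.x + d * math.cos(rad), self.y + d * math.sin(rad)
        self.trace.append((self.x, self.y, nx, ny))
        self.x, self.y = nx, ny
    def backward(self, d):
        self.forward(-d)
    def left(self, a):
        self.heading = (self.heading + a) % 360
    def right(self, a):
        self.heading = (self.heading - a) % 360

t = TextTurtle()
for _ in range(4):                  # four sides, four 90-degree turns
    t.forward(10)
    t.left(90)
print(len(t.trace), abs(t.x) < 1e-9, abs(t.y) < 1e-9)  # back at the origin
```

The geometry lesson is in the loop: four equal sides and four equal turns summing to 360 degrees close the figure, which is why the turtle ends up where it started.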
Topics in modern differential geometry
Verstraelen, Leopold
2017-01-01
A variety of introductory articles is provided on a wide range of topics, including variational problems on curves and surfaces with anisotropic curvature. Experts in the fields of Riemannian, Lorentzian and contact geometry present state-of-the-art reviews of their topics. The contributions are written on a graduate level and contain extended bibliographies. The ten chapters are the result of various doctoral courses which were held in 2009 and 2010 at universities in Leuven, Serbia, Romania and Spain.
Directory of Open Access Journals (Sweden)
Pooyan Vahidi Pashsaki
2016-06-01
Full Text Available The accuracy of a five-axis CNC machine tool is affected by a vast number of error sources. This paper investigates volumetric error modeling and its compensation as the basis for creating new tool paths that improve workpiece accuracy. The volumetric error model of a five-axis machine tool with the configuration RTTTR (tilting head B-axis and rotary table A′ on the work piece side) was set up using rigid-body kinematics and homogeneous transformation matrices, and includes 43 error components, each of which can separately reduce the geometrical and dimensional accuracy of work pieces. Machining accuracy is governed by the position of the tool center point (TCP) relative to the work piece: when the cutting tool deviates from its ideal position relative to the work piece, a machining error results. The compensation process consists of detecting the current tool path, analyzing the geometrical errors of the RTTTR five-axis CNC machine tool, translating current component positions to compensated positions using the kinematic error model, converting the newly created positions into new tool paths using the compensation algorithms, and finally editing the old G-codes using a G-code generator algorithm.
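The homogeneous-transformation idea behind the volumetric error model can be sketched numerically: the TCP position comes from chaining nominal 4x4 transforms, and injecting small error terms into a link shows up as a TCP deviation. The two-link chain and the error magnitudes below are illustrative assumptions, not the paper's 43-component RTTTR model.

```python
# Chained homogeneous transforms: compare nominal vs. error-perturbed TCP.
import numpy as np

def trans(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

# Nominal chain: a rotary axis, a linear slide, then the tool offset.
nominal = rot_z(np.pi / 6) @ trans(100.0, 0.0, 0.0) @ trans(0.0, 0.0, 50.0)
# Same chain with a small angular error on the rotary axis and a
# positioning error on the slide (made-up magnitudes).
actual = rot_z(np.pi / 6 + 1e-4) @ trans(100.02, 0.0, 0.0) @ trans(0.0, 0.0, 50.0)

tcp = np.array([0.0, 0.0, 0.0, 1.0])          # tool center point in tool frame
deviation = (actual @ tcp - nominal @ tcp)[:3]
err = np.linalg.norm(deviation)               # volumetric error at this pose
print(err)
```

Compensation then works in the opposite direction: given the identified error components, the kinematic model is inverted to shift the commanded positions so that the actual TCP lands on the nominal path, which is what the G-code editing step in the abstract implements.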
Soft error mechanisms, modeling and mitigation
Sayil, Selahattin
2016-01-01
This book introduces readers to various radiation soft-error mechanisms such as soft delays, radiation induced clock jitter and pulses, and single event (SE) coupling induced effects. In addition to discussing various radiation hardening techniques for combinational logic, the author also describes new mitigation strategies targeting commercial designs. Coverage includes novel soft error mitigation techniques such as the Dynamic Threshold Technique and Soft Error Filtering based on Transmission gate with varied gate and body bias. The discussion also includes modeling of SE crosstalk noise, delay and speed-up effects. Various mitigation strategies to eliminate SE coupling effects are also introduced. Coverage also includes the reliability of low power energy-efficient designs and the impact of leakage power consumption optimizations on soft error robustness. The author presents an analysis of various power optimization techniques, enabling readers to make design choices that reduce static power consumption an...
Error Sonification of a Complex Motor Task
Directory of Open Access Journals (Sweden)
Riener Robert
2011-12-01
Full Text Available Visual information is mainly used to master complex motor tasks. Thus, additional information providing augmented feedback should be displayed in modalities other than vision, e.g. hearing. The present work evaluated the potential of error sonification to enhance learning of a rowing-type motor task. In contrast to a control group receiving self-controlled terminal feedback, the experimental group could not significantly reduce spatial errors. Thus, motor learning was not enhanced by error sonification, although participants could benefit from it during training. It seems that the motor task was too slow, resulting in immediate corrections of the movement rather than in an internal representation of the general characteristics of the motor task. Therefore, further studies should elaborate the impact of error sonification when general characteristics of the motor tasks are already known.
Minimum Tracking Error Volatility
Luca RICCETTI
2010-01-01
Investors assign part of their funds to asset managers that are given the task of beating a benchmark. The risk management department usually imposes a maximum value of the tracking error volatility (TEV) in order to keep the risk of the portfolio near to that of the selected benchmark. However, risk management does not establish a rule on TEV which enables us to understand whether the asset manager is really active or not and, in practice, asset managers sometimes follow passively the corres...
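The quantity at the center of the abstract above is easy to state concretely: tracking error volatility is the standard deviation of the active returns, i.e. portfolio returns minus benchmark returns. The return series below are made up for illustration.

```python
# Tracking error volatility (TEV): sample standard deviation of active returns.
import statistics

portfolio = [0.012, -0.004, 0.009, 0.003, -0.001, 0.007]
benchmark = [0.010, -0.006, 0.011, 0.002, 0.000, 0.005]

active = [p - b for p, b in zip(portfolio, benchmark)]  # active returns
tev = statistics.stdev(active)                           # per-period TEV
print(round(tev, 6))
```

A risk cap then takes the form `tev <= tev_max`; the paper's point is that such a cap alone does not reveal whether the manager is genuinely active or passively hugging the benchmark with a near-zero TEV.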
Hinds, Erold W. (Principal Investigator)
1996-01-01
This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
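The concatenation idea in the report above can be shown in miniature. The real system uses modulation block codes as the inner code and the interleaved RS(255,223) code as the outer code; the sketch below substitutes a trivial 3x repetition inner code with majority-vote decoding, purely to show how an inner code cleans up channel bit errors before the outer code sees them.

```python
# Toy inner code for a concatenated scheme: 3x repetition + majority vote.
def inner_encode(bits):
    return [b for b in bits for _ in range(3)]

def inner_decode(coded):
    # Majority vote over each group of three received bits.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1, 0]
coded = inner_encode(msg)
coded[4] ^= 1                        # inject a single channel bit error
decoded = inner_decode(coded)
assert decoded == msg                # the inner code corrects it
print("corrected:", decoded)
```

In the full concatenated system, any residual bursts that the inner decoder fails to correct are handled by the interleaved Reed-Solomon outer code, which is exactly the division of labor the report's proposed study analyzes.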