WorldWideScience

Sample records for errors prescribing faults

  1. Medication errors: prescribing faults and prescription errors.

    Science.gov (United States)

    Velo, Giampaolo P; Minuz, Pietro

    2009-06-01

    1. Medication errors are common in general practice and in hospitals. Both errors in the act of writing (prescription errors) and prescribing faults due to erroneous medical decisions can result in harm to patients. 2. Any step in the prescribing process can generate errors. Slips, lapses, or mistakes are sources of errors, as in unintended omissions in the transcription of drugs. Faults in dose selection, omitted transcription, and poor handwriting are common. 3. Inadequate knowledge or competence and incomplete information about clinical characteristics and previous treatment of individual patients can result in prescribing faults, including the use of potentially inappropriate medications. 4. An unsafe working environment, complex or undefined procedures, and inadequate communication among health-care personnel, particularly between doctors and nurses, have been identified as important underlying factors that contribute to prescription errors and prescribing faults. 5. Active interventions aimed at reducing prescription errors and prescribing faults are strongly recommended. These should be focused on the education and training of prescribers and the use of on-line aids. The complexity of the prescribing procedure should be reduced by introducing automated systems or uniform prescribing charts, in order to avoid transcription and omission errors. Feedback control systems and immediate review of prescriptions, which can be performed with the assistance of a hospital pharmacist, are also helpful. Audits should be performed periodically.

  2. Learning from prescribing errors

    OpenAIRE

    Dean, B

    2002-01-01

    

 The importance of learning from medical error has recently received increasing emphasis. This paper focuses on prescribing errors and argues that, while learning from prescribing errors is a laudable goal, there are currently barriers that can prevent this occurring. Learning from errors can take place on an individual level, at a team level, and across an organisation. Barriers to learning from prescribing errors include the non-discovery of many prescribing errors, lack of feedback to th...

  3. Electronic prescribing reduces prescribing error in public hospitals.

    Science.gov (United States)

    Shawahna, Ramzi; Rahman, Nisar-Ur; Ahmad, Mahmood; Debray, Marcel; Yliperttula, Marjo; Declèves, Xavier

    2011-11-01

To examine the incidence of prescribing errors in a main public hospital in Pakistan and to assess the impact of introducing an electronic prescribing system on reducing their incidence. Medication errors are persistent in today's healthcare system. The impact of electronic prescribing on reducing errors has not been tested in the developing world. Prospective review of medication and discharge medication charts before and after the introduction of an electronic inpatient record and prescribing system. Inpatient records (n = 3300) and 1100 discharge medication sheets were reviewed for prescribing errors before and after the installation of the electronic prescribing system in 11 wards. Medications (13,328 and 14,064) were prescribed for inpatients, among which 3008 and 1147 prescribing errors were identified, giving overall error rates of 22.6% and 8.2% during paper-based and electronic prescribing, respectively. Medications (2480 and 2790) were prescribed for discharge patients, among which 418 and 123 errors were detected, giving overall error rates of 16.9% and 4.4% during paper-based and electronic prescribing, respectively. Electronic prescribing had a significant effect on the reduction of prescribing errors. Prescribing errors are commonplace in Pakistan's public hospitals. The study evaluated the impact of introducing electronic inpatient records and electronic prescribing on the reduction of prescribing errors in a public hospital in Pakistan. © 2011 Blackwell Publishing Ltd.

  4. Prescribing Errors in Cardiovascular Diseases in a Tertiary Health ...

    African Journals Online (AJOL)

Prescription errors are now known to contribute to a large number of deaths during the treatment of cardiovascular diseases. However, there is a paucity of information about these errors occurring in health facilities in Nigeria. The objective of this study was to investigate the prevalence of prescribing errors in ...

  5. Prescribing errors in a Brazilian neonatal intensive care unit

    Directory of Open Access Journals (Sweden)

    Ana Paula Cezar Machado

    2015-12-01

Pediatric patients, especially those admitted to the neonatal intensive care unit (ICU), are highly vulnerable to medication errors. This study aimed to measure the prescription error rate in a university hospital neonatal ICU and to identify susceptible patients, types of errors, and the medicines involved. The variables related to the medicines prescribed were compared against the Neofax prescribing protocol. The study enrolled 150 newborns and analyzed 489 prescription order forms, with 1,491 medication items corresponding to 46 drugs. The prescription error rate was 43.5%. Errors were found in dosage, intervals, diluents, and infusion time, distributed across 7 therapeutic classes. Errors were more frequent in preterm newborns. Diluent and dosing were the most frequent sources of error. The therapeutic classes most often involved in errors were antimicrobial agents and drugs acting on the nervous and cardiovascular systems.

  6. Software error masking effect on hardware faults

    International Nuclear Information System (INIS)

    Choi, Jong Gyun; Seong, Poong Hyun

    1999-01-01

Based on the Very High Speed Integrated Circuit (VHSIC) Hardware Description Language (VHDL), a simulation model for fault injection was developed in this work to estimate the dependability of a digital system in the operational phase. We investigated the software masking effect on hardware faults by injecting single bit-flip and stuck-at-x faults into the internal registers of the processor and into memory cells. Fault locations cover all registers and memory cells, and faults are distributed over locations according to a uniform probability distribution. Using this model, we predicted the reliability and the masking effect of application software in a digital system, the Interposing Logic System (ILS) of a nuclear power plant, under four software operational profiles. The results show that the software masking effect on hardware faults should be properly considered in order to predict system dependability accurately in the operational phase, because the magnitude of the masking effect differs with the operational profile.
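The single bit-flip injection campaign described above can be sketched in miniature. The snippet below is a hypothetical register-level model in Python, not the authors' VHDL simulation; the register count, width, and toy program are illustrative assumptions:

```python
import random

def inject_bit_flip(registers, width=16, rng=random):
    """Flip one uniformly chosen bit in one uniformly chosen register."""
    reg = rng.randrange(len(registers))
    bit = rng.randrange(width)
    registers[reg] ^= (1 << bit)
    return reg, bit

def masking_rate(program, n_trials=10000, n_regs=8, width=16, seed=0):
    """Estimate the fraction of injected single bit-flip faults that are
    masked, i.e. leave the program output identical to the fault-free run."""
    rng = random.Random(seed)
    golden = program([0] * n_regs)          # fault-free reference output
    masked = 0
    for _ in range(n_trials):
        regs = [0] * n_regs
        inject_bit_flip(regs, width, rng)
        if program(regs) == golden:
            masked += 1
    return masked / n_trials

# Toy "application software": only registers 0 and 1 feed the output,
# so faults injected into the other six registers are always masked.
prog = lambda regs: (regs[0] + regs[1]) & 0xFFFF
print(masking_rate(prog))   # close to 6/8 = 0.75
```

Different `program` functions (standing in for different operational profiles) yield different masking rates, which mirrors the abstract's finding that the masking effect depends on the operational profile.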

  7. A Technological Innovation to Reduce Prescribing Errors Based on Implementation Intentions: The Acceptability and Feasibility of MyPrescribe.

    Science.gov (United States)

    Keyworth, Chris; Hart, Jo; Thoong, Hong; Ferguson, Jane; Tully, Mary

    2017-08-01

Although prescribing of medication in hospitals is rarely an error-free process, prescribers receive little feedback on their mistakes and on ways to change future practice. Audit and feedback interventions may be an effective approach to modifying the clinical practice of health professionals, but these may pose logistical challenges when used in hospitals, and such interventions are often labor intensive. Consequently, there is a need to develop effective and innovative interventions to overcome these challenges and to improve the delivery of feedback on prescribing. Implementation intentions, which have been shown to be effective in changing behavior, link critical situations with an appropriate response; however, they have rarely been used in the context of improving prescribing practices. Semistructured qualitative interviews were conducted to evaluate the acceptability and feasibility of providing feedback on prescribing errors via MyPrescribe, a mobile-compatible website informed by implementation intentions. Data relating to 200 prescribing errors made by 52 junior doctors were collected by 11 hospital pharmacists. These errors were populated into MyPrescribe, where prescribers were able to construct their own personalized action plans. Qualitative interviews with a subsample of 15 junior doctors were used to explore issues regarding the feasibility and acceptability of MyPrescribe and their experiences of using implementation intentions to construct prescribing action plans. Framework analysis was used to identify prominent themes, with findings mapped to the behavioral components of the COM-B model (capability, opportunity, motivation, and behavior) to inform the development of future interventions. MyPrescribe was perceived to be effective in providing opportunities for critical reflection on prescribing errors and to complement existing training (such as junior doctors' e-portfolio). The participants were able to provide examples of how they would use

  8. Using Fault Trees to Advance Understanding of Diagnostic Errors.

    Science.gov (United States)

    Rogith, Deevakar; Iyengar, M Sriram; Singh, Hardeep

    2017-11-01

    Diagnostic errors annually affect at least 5% of adults in the outpatient setting in the United States. Formal analytic techniques are only infrequently used to understand them, in part because of the complexity of diagnostic processes and clinical work flows involved. In this article, diagnostic errors were modeled using fault tree analysis (FTA), a form of root cause analysis that has been successfully used in other high-complexity, high-risk contexts. How factors contributing to diagnostic errors can be systematically modeled by FTA to inform error understanding and error prevention is demonstrated. A team of three experts reviewed 10 published cases of diagnostic error and constructed fault trees. The fault trees were modeled according to currently available conceptual frameworks characterizing diagnostic error. The 10 trees were then synthesized into a single fault tree to identify common contributing factors and pathways leading to diagnostic error. FTA is a visual, structured, deductive approach that depicts the temporal sequence of events and their interactions in a formal logical hierarchy. The visual FTA enables easier understanding of causative processes and cognitive and system factors, as well as rapid identification of common pathways and interactions in a unified fashion. In addition, it enables calculation of empirical estimates for causative pathways. Thus, fault trees might provide a useful framework for both quantitative and qualitative analysis of diagnostic errors. Future directions include establishing validity and reliability by modeling a wider range of error cases, conducting quantitative evaluations, and undertaking deeper exploration of other FTA capabilities. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
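The FTA logic described above, with AND/OR gates combining contributing factors and empirical estimates propagated to the top event, can be sketched as follows. The tree structure, event names, and probabilities below are hypothetical illustrations, not drawn from the 10 reviewed cases:

```python
def ft_prob(node, p_basic):
    """Probability of the top event of a fault tree, assuming independent
    basic events. A node is ("AND", [...]), ("OR", [...]), or a basic-event
    name looked up in p_basic."""
    if isinstance(node, str):
        return p_basic[node]
    gate, children = node
    probs = [ft_prob(c, p_basic) for c in children]
    if gate == "AND":                       # all children must occur
        out = 1.0
        for p in probs:
            out *= p
        return out
    if gate == "OR":                        # 1 - prod(1 - p_i)
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError("unknown gate: %s" % gate)

# Hypothetical diagnostic-error tree: a missed diagnosis requires both a
# cognitive failure (OR of two factors) and a system failure.
tree = ("AND", [("OR", ["premature_closure", "incomplete_history"]),
                "no_test_followup"])
p = {"premature_closure": 0.10, "incomplete_history": 0.05,
     "no_test_followup": 0.20}
print(ft_prob(tree, p))   # (1 - 0.9 * 0.95) * 0.2, about 0.029
```

This illustrates the "calculation of empirical estimates for causative pathways" the abstract mentions: once each basic event has an estimated frequency, the gate logic yields a top-event probability for every pathway.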

  9. The impact of a closed-loop electronic prescribing and administration system on prescribing errors, administration errors and staff time: a before-and-after study.

    Science.gov (United States)

    Franklin, Bryony Dean; O'Grady, Kara; Donyai, Parastou; Jacklin, Ann; Barber, Nick

    2007-08-01

To assess the impact of a closed-loop electronic prescribing, automated dispensing, barcode patient identification and electronic medication administration record (EMAR) system on prescribing and administration errors, confirmation of patient identity before administration, and staff time. Before-and-after study in a surgical ward of a teaching hospital, involving patients and staff of that ward. Closed-loop electronic prescribing, automated dispensing, barcode patient identification and EMAR system. Percentage of new medication orders with a prescribing error, percentage of doses with medication administration errors (MAEs) and percentage given without checking patient identity. Time spent prescribing and providing a ward pharmacy service. Nursing time on medication tasks. Prescribing errors were identified in 3.8% of 2450 medication orders pre-intervention and 2.0% of 2353 orders afterwards. Medical staff required 15 s to prescribe a regular inpatient drug pre-intervention and 39 s afterwards (p = 0.03; t test). Time spent providing a ward pharmacy service increased from 68 min to 98 min each weekday (p = 0.001; t test); 22% of drug charts were unavailable pre-intervention. Time per drug administration round decreased from 50 min to 40 min (p = 0.006; t test); nursing time on medication tasks outside of drug rounds increased from 21.1% to 28.7% (p = 0.006; χ2 test). A closed-loop electronic prescribing, dispensing and barcode patient identification system reduced prescribing errors and MAEs, and increased confirmation of patient identity before administration. Time spent on medication-related tasks increased.

  10. The impact of a closed‐loop electronic prescribing and administration system on prescribing errors, administration errors and staff time: a before‐and‐after study

    Science.gov (United States)

    Franklin, Bryony Dean; O'Grady, Kara; Donyai, Parastou; Jacklin, Ann; Barber, Nick

    2007-01-01

Objectives To assess the impact of a closed-loop electronic prescribing, automated dispensing, barcode patient identification and electronic medication administration record (EMAR) system on prescribing and administration errors, confirmation of patient identity before administration, and staff time. Design, setting and participants Before-and-after study in a surgical ward of a teaching hospital, involving patients and staff of that ward. Intervention Closed-loop electronic prescribing, automated dispensing, barcode patient identification and EMAR system. Main outcome measures Percentage of new medication orders with a prescribing error, percentage of doses with medication administration errors (MAEs) and percentage given without checking patient identity. Time spent prescribing and providing a ward pharmacy service. Nursing time on medication tasks. Results Prescribing errors were identified in 3.8% of 2450 medication orders pre-intervention and 2.0% of 2353 orders afterwards. Medical staff required 15 s to prescribe a regular inpatient drug pre-intervention and 39 s afterwards (p = 0.03; t test). Time spent providing a ward pharmacy service increased from 68 min to 98 min each weekday (p = 0.001; t test); 22% of drug charts were unavailable pre-intervention. Time per drug administration round decreased from 50 min to 40 min (p = 0.006; t test); nursing time on medication tasks outside of drug rounds increased from 21.1% to 28.7% (p = 0.006; χ2 test). Conclusions A closed-loop electronic prescribing, dispensing and barcode patient identification system reduced prescribing errors and MAEs, and increased confirmation of patient identity before administration. Time spent on medication-related tasks increased. PMID:17693676

  11. [Medication reconciliation errors according to patient risk and type of physician prescriber identified by prescribing tool used].

    Science.gov (United States)

    Bilbao Gómez-Martino, Cristina; Nieto Sánchez, Ángel; Fernández Pérez, Cristina; Borrego Hernando, Mª Isabel; Martín-Sánchez, Francisco Javier

    2017-01-01

To study the frequency of medication reconciliation errors (MREs) in hospitalized patients and explore the profiles of patients at greater risk. To compare the rates of errors in prescriptions written by emergency physicians and ward physicians, who each used a different prescribing tool. Prospective cross-sectional study of a convenience sample of patients admitted to medical, geriatric, and oncology wards over a period of 6 months. A pharmacist undertook the medication reconciliation report, and data were analyzed for possible associations with risk factors or prescriber type (emergency vs ward physician). A total of 148 patients were studied. Emergency physicians had prescribed for 68 (45.9%) and ward physicians for 80 (54.1%). A total of 303 MREs were detected; 113 (76.4%) patients had at least 1 error. No statistically significant differences were found between prescriber types. Factors that conferred risk for a medication error were polypharmacy (odds ratio [OR], 3.4; 95% CI, 1.2-9.0; P=.016) and multiple chronic conditions in patients under the age of 80 years (OR, 3.9; 95% CI, 1.1-14.7; P=.039). The incidence of MREs is high regardless of whether the prescriber is an emergency or ward physician. The patients most at risk are those taking several medications and those under the age of 80 years who have multiple chronic conditions.
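Odds ratios of the kind reported above come from a 2×2 table of exposed/unexposed versus error/no-error, with the confidence interval built on the log scale. A minimal sketch follows; the counts below are hypothetical and not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = at-risk patients with an error,    b = at-risk without,
    c = comparison patients with an error, d = comparison without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 60 of 80 polypharmacy patients had an MRE,
# versus 25 of 68 patients without polypharmacy.
print(odds_ratio_ci(60, 20, 25, 43))
```

If the interval excludes 1, the association is statistically significant at the chosen level, which is how a reader can interpret the CIs quoted in the abstract.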

  12. E-Prescribing Errors in Community Pharmacies: Exploring Consequences and Contributing Factors

    Science.gov (United States)

    Stone, Jamie A.; Chui, Michelle A.

    2014-01-01

Objective To explore types of e-prescribing errors in community pharmacies and their potential consequences, as well as the factors that contribute to e-prescribing errors. Methods Data collection involved 45 total hours of direct observation in five pharmacies. Follow-up interviews were conducted with 20 study participants. Transcripts from observations and interviews were subjected to content analysis using NVivo 10. Results Pharmacy staff detected 75 e-prescription errors during the 45 hours of observation. The most common e-prescribing errors were wrong drug quantity, wrong dosing directions, wrong duration of therapy, and wrong dosage formulation. Participants estimated that 5 in 100 e-prescriptions have errors. Drug classes implicated in e-prescribing errors were anti-infectives, inhalers, and ophthalmic and topical agents. The potential consequences of e-prescribing errors included an increased likelihood of the patient receiving incorrect drug therapy, poor disease management for patients, additional work for pharmacy personnel, increased costs for pharmacies and patients, and frustration for patients and pharmacy staff. Factors that contribute to errors included technology incompatibility between pharmacy and clinic systems, technology design issues such as the use of auto-populate features and drop-down menus, and inadvertent entry of incorrect information. Conclusion Study findings suggest that a wide range of e-prescribing errors are encountered in community pharmacies. Pharmacists and technicians perceive the causes of e-prescribing errors to be multidisciplinary and multifactorial; that is, e-prescribing errors can originate from technology used in both prescriber offices and pharmacies. PMID:24657055

  13. Quantum Error Correction and Fault Tolerant Quantum Computing

    CERN Document Server

    Gaitan, Frank

    2008-01-01

It was once widely believed that quantum computation would never become a reality. However, the discovery of quantum error correction and the proof of the accuracy threshold theorem nearly ten years ago gave rise to extensive development and research aimed at creating a working, scalable quantum computer. More than a decade has passed since this monumental accomplishment, yet no book-length pedagogical presentation of this important theory exists. Quantum Error Correction and Fault Tolerant Quantum Computing offers the first full-length exposition on the realization of a theory once thought impossible.

  14. Measuring the relationship between interruptions, multitasking and prescribing errors in an emergency department: a study protocol.

    Science.gov (United States)

    Raban, Magdalena Z; Walter, Scott R; Douglas, Heather E; Strumpman, Dana; Mackenzie, John; Westbrook, Johanna I

    2015-10-13

Interruptions and multitasking are frequent in clinical settings and have been shown in the cognitive psychology literature to affect performance, increasing the risk of error. However, comparatively little is known about their impact on errors in clinical work. This study will assess the relationship between prescribing errors, interruptions and multitasking in an emergency department (ED) using direct observations and chart review. The study will be conducted in the ED of a 440-bed teaching hospital in Sydney, Australia. Doctors will be shadowed at close proximity by observers for 2 h intervals while working on day shift (between 0800 and 1800). Time-stamped data on tasks, interruptions and multitasking will be recorded on a handheld computer using the validated Work Observation Method by Activity Timing (WOMBAT) tool. The prompts leading to interruptions and multitasking will also be recorded. When doctors prescribe medication, the type of chart and the chart sections written on, along with the patient's medical record number (MRN), will be recorded. A clinical pharmacist will access patient records and assess the medication orders for prescribing errors. The prescribing error rate will be calculated per prescribing task and is defined as the number of errors divided by the number of medication orders written during the prescribing task. The association between prescribing error rates and rates of prompts, interruptions and multitasking will be assessed using statistical modelling. Ethics approval has been obtained from the hospital research ethics committee. Eligible doctors will be provided with written information sheets, and written consent will be obtained if they agree to participate. Doctor details and MRNs will be kept separate from the data on prescribing errors and will not appear in the final data set for analysis. Study results will be disseminated in publications and as feedback to the ED. Published by the BMJ Publishing Group Limited.
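The protocol's error-rate definition (errors divided by medication orders, computed per prescribing task) amounts to the following; the task counts below are made-up illustrations:

```python
def prescribing_error_rates(tasks):
    """tasks: list of (n_errors, n_orders) pairs, one per prescribing task.
    Returns the per-task error rates and the overall pooled rate."""
    per_task = [e / n for e, n in tasks if n > 0]
    total_errors = sum(e for e, _ in tasks)
    total_orders = sum(n for _, n in tasks)
    return per_task, total_errors / total_orders

# Three hypothetical prescribing tasks: 1 error in 4 orders,
# 0 in 3, and 2 in 5.
rates, overall = prescribing_error_rates([(1, 4), (0, 3), (2, 5)])
print(rates, overall)   # [0.25, 0.0, 0.4] 0.25
```

Keeping the per-task rates (rather than only the pooled rate) is what lets the study relate each task's error rate to the interruptions and multitasking observed during that same task.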

  15. Medication prescribing errors in a public teaching hospital in India: A prospective study.

    Directory of Open Access Journals (Sweden)

    Pote S

    2007-03-01

Background: To prevent medication errors in prescribing, one needs to know their types and relative occurrence. Such errors are a great cause of concern as they have the potential to cause patient harm. The aim of this study was to determine the nature and types of medication prescribing errors in an Indian setting. Methods: The medication errors were analyzed in a prospective observational study conducted in 3 medical wards of a public teaching hospital in India, by means of the Micromedex Drug-Reax database. Results: Of the 312 patients, 304 were included in the study; 103 (34%) of these cases had at least one error. The total number of errors found was 157. Drug-drug interactions were the most frequent type of error (68.2%), followed by incorrect dosing interval (12%) and dosing errors (9.5%). The medication classes most involved were antimicrobial agents (29.4%), cardiovascular agents (15.4%), GI agents (8.6%) and CNS agents (8.2%). Moderate errors accounted for the largest share of the total (61.8%), compared with major (25.5%) and minor (12.7%) errors. The results showed that the number of errors increases with age and with the number of medicines prescribed. Conclusion: The results point to the need to establish medication error reporting at each hospital and to share the data with other hospitals. The involvement of a clinical pharmacist appears to be a strong intervention; initially, the clinical pharmacist could focus on identifying medication errors.

  16. Barriers and facilitators to recovering from e-prescribing errors in community pharmacies.

    Science.gov (United States)

    Odukoya, Olufunmilola K; Stone, Jamie A; Chui, Michelle A

    2015-01-01

    To explore barriers and facilitators to recovery from e-prescribing errors in community pharmacies and to explore practical solutions for work system redesign to ensure successful recovery from errors. Cross-sectional qualitative design using direct observations, interviews, and focus groups. Five community pharmacies in Wisconsin. 13 pharmacists and 14 pharmacy technicians. Observational field notes and transcribed interviews and focus groups were subjected to thematic analysis guided by the Systems Engineering Initiative for Patient Safety (SEIPS) work system and patient safety model. Barriers and facilitators to recovering from e-prescription errors in community pharmacies. Organizational factors, such as communication, training, teamwork, and staffing levels, play an important role in recovering from e-prescription errors. Other factors that could positively or negatively affect recovery of e-prescription errors include level of experience, knowledge of the pharmacy personnel, availability or usability of tools and technology, interruptions and time pressure when performing tasks, and noise in the physical environment. The SEIPS model sheds light on key factors that may influence recovery from e-prescribing errors in pharmacies, including the environment, teamwork, communication, technology, tasks, and other organizational variables. To be successful in recovering from e-prescribing errors, pharmacies must provide the appropriate working conditions that support recovery from errors.

  17. The causes of prescribing errors in English general practices: a qualitative study.

    Science.gov (United States)

    Slight, Sarah P; Howard, Rachel; Ghaleb, Maisoon; Barber, Nick; Franklin, Bryony Dean; Avery, Anthony J

    2013-10-01

    Few detailed studies exist of the underlying causes of prescribing errors in the UK. To examine the causes of prescribing and monitoring errors in general practice and provide recommendations for how they may be overcome. Qualitative interview and focus group study with purposive sampling of English general practices. General practice staff from 15 general practices across three PCTs in England participated in a combination of semi-structured interviews (n = 34) and six focus groups (n = 46). Thematic analysis informed by Reason's Accident Causation Model was used. Seven categories of high-level error-producing conditions were identified: the prescriber, the patient, the team, the working environment, the task, the computer system, and the primary-secondary care interface. These were broken down to reveal various error-producing conditions: the prescriber's therapeutic training, drug knowledge and experience, knowledge of the patient, perception of risk, and their physical and emotional health; the patient's characteristics and the complexity of the individual clinical case; the importance of feeling comfortable within the practice team was highlighted, as well as the safety implications of GPs signing prescriptions generated by nurses when they had not seen the patient for themselves; the working environment with its extensive workload, time pressures, and interruptions; and computer-related issues associated with mis-selecting drugs from electronic pick-lists and overriding alerts were all highlighted as possible causes of prescribing errors and were often interconnected. Complex underlying causes of prescribing and monitoring errors in general practices were highlighted, several of which are amenable to intervention.

  18. Antiretroviral medication prescribing errors are common with hospitalization of HIV-infected patients.

    Science.gov (United States)

    Commers, Tessa; Swindells, Susan; Sayles, Harlan; Gross, Alan E; Devetten, Marcel; Sandkovsky, Uriel

    2014-01-01

    Errors in prescribing antiretroviral therapy (ART) often occur with the hospitalization of HIV-infected patients. The rapid identification and prevention of errors may reduce patient harm and healthcare-associated costs. A retrospective review of hospitalized HIV-infected patients was carried out between 1 January 2009 and 31 December 2011. Errors were documented as omission, underdose, overdose, duplicate therapy, incorrect scheduling and/or incorrect therapy. The time to error correction was recorded. Relative risks (RRs) were computed to evaluate patient characteristics and error rates. A total of 289 medication errors were identified in 146/416 admissions (35%). The most common was drug omission (69%). At an error rate of 31%, nucleoside reverse transcriptase inhibitors were associated with an increased risk of error when compared with protease inhibitors (RR 1.32; 95% CI 1.04-1.69) and co-formulated drugs (RR 1.59; 95% CI 1.19-2.09). Of the errors, 31% were corrected within the first 24 h, but over half (55%) were never remedied. Admissions with an omission error were 7.4 times more likely to have all errors corrected within 24 h than were admissions without an omission. Drug interactions with ART were detected on 51 occasions. For the study population (n = 177), an increased risk of admission error was observed for black (43%) compared with white (28%) individuals (RR 1.53; 95% CI 1.16-2.03) but no significant differences were observed between white patients and other minorities or between men and women. Errors in inpatient ART were common, and the majority were never detected. The most common errors involved omission of medication, and nucleoside reverse transcriptase inhibitors had the highest rate of prescribing error. Interventions to prevent and correct errors are urgently needed.

  19. Chronology of prescribing error during the hospital stay and prediction of pharmacist's alerts overriding: a prospective analysis

    Directory of Open Access Journals (Sweden)

    Bruni Vanida

    2010-01-01

Background Drug prescribing errors are frequent in the hospital setting, and pharmacists play an important role in detecting them. The objectives of this study were (1) to describe the drug prescribing error rate during the patient's stay and (2) to find which characteristics of a prescribing error are most predictive of its repetition the next day despite a pharmacist's alert (i.e., overriding of the alert). Methods We prospectively collected all medication order lines and prescribing errors during 18 days in 7 medical wards using computerized physician order entry. We described and modelled the error rate according to the chronology of the hospital stay. We performed a classification and regression tree analysis to find which characteristics of alerts were predictive of their being overridden (i.e., the prescribing error repeated). Results 12,533 order lines were reviewed and 117 errors (error rate 0.9%) were observed; 51% of these errors occurred on the first day of the hospital stay. The risk of a prescribing error decreased over time. 52% of the alerts were overridden (i.e., the error was left uncorrected by prescribers on the following day). Drug omissions were the errors most frequently taken into account by prescribers. The classification and regression tree analysis showed that overriding of pharmacist's alerts is related first to the ward of the prescriber and then to either the Anatomical Therapeutic Chemical class of the drug or the type of error. Conclusions Since 51% of prescribing errors occurred on the first day of stay, pharmacists should concentrate their analysis of drug prescriptions on this day. The differences in overriding behavior between wards, and according to drug Anatomical Therapeutic Chemical class or type of error, could also guide validation tasks and the programming of electronic alerts.
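The classification and regression tree analysis used here selects, at each node, the split that most reduces class impurity. A minimal sketch of the first Gini-impurity split follows; the alert records and feature values are hypothetical, not the study's data:

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a set of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(rows, labels):
    """Find the (feature_index, value, weighted_gini) split minimising
    weighted Gini impurity, as at the root node of a CART tree."""
    best = (None, None, gini(labels))
    for i in range(len(rows[0])):
        for v in {r[i] for r in rows}:
            left = [l for r, l in zip(rows, labels) if r[i] == v]
            right = [l for r, l in zip(rows, labels) if r[i] != v]
            if not left or not right:
                continue
            w = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if w < best[2]:
                best = (i, v, w)
    return best

# Hypothetical alerts. Features: (ward, error_type); label: overridden?
rows = [("cardio", "omission"), ("cardio", "dose"),
        ("ortho", "omission"), ("ortho", "dose"),
        ("cardio", "omission"), ("ortho", "dose")]
labels = [True, True, False, False, True, False]
print(best_split(rows, labels))  # splits on feature 0 (ward)
```

In this toy data the ward perfectly separates overridden from corrected alerts, so CART splits on it first, which is the same kind of finding the study reports (ward before drug class or error type).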

  20. Medication errors : the impact of prescribing and transcribing errors on preventable harm in hospitalised patients

    NARCIS (Netherlands)

    van Doormaal, J.E.; van der Bemt, P.M.L.A.; Mol, P.G.M.; Egberts, A.C.G.; Haaijer-Ruskamp, F.M.; Kosterink, J.G.W.; Zaal, Rianne J.

    Background: Medication errors (MEs) affect patient safety to a significant extent. Because these errors can lead to preventable adverse drug events (pADEs), it is important to know what type of ME is the most prevalent cause of these pADEs. This study determined the impact of the various types of

  1. Prescribing errors during hospital inpatient care: factors influencing identification by pharmacists.

    Science.gov (United States)

    Tully, Mary P; Buchan, Iain E

    2009-12-01

    To investigate the prevalence of prescribing errors identified by pharmacists in hospital inpatients and the factors influencing error identification rates by pharmacists throughout hospital admission. 880-bed university teaching hospital in North-west England. Data about prescribing errors identified by pharmacists (median 9, range 4-17, collecting data per day) when conducting routine work were prospectively recorded on 38 randomly selected days over 18 months. Proportion of new medication orders in which an error was identified; predictors of error identification rate, adjusted for workload and seniority of pharmacist, day of week, type of ward, or stage of patient admission. 33,012 new medication orders were reviewed for 5,199 patients; 3,455 errors (in 10.5% of orders) were identified for 2,040 patients (39.2%; median 1, range 1-12). Most were problem orders (1,456, 42.1%) or potentially significant errors (1,748, 50.6%); 197 (5.7%) were potentially serious; 1.6% (n = 54) were potentially severe or fatal. Errors were 41% (CI: 28-56%) more likely to be identified at the patient's admission than at other times, independent of confounders. Workload was the strongest predictor of error identification rates, with 40% (33-46%) fewer errors identified on the busiest days than at other times. Errors identified fell by 1.9% (1.5-2.3%) for every additional chart checked, independent of confounders. Pharmacists routinely identify errors, but increasing workload may reduce identification rates. Where resources are limited, they may be better spent on identifying and addressing errors immediately after admission to hospital.
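
    The headline 10.5% figure can be reproduced from the reported counts. A minimal sketch with a Wald 95% confidence interval (the study's adjusted estimates come from regression modelling, which this does not attempt):

```python
import math

def error_rate_ci(errors, orders, z=1.96):
    """Point estimate and Wald 95% CI for the proportion of orders with an error."""
    p = errors / orders
    se = math.sqrt(p * (1 - p) / orders)  # standard error of a proportion
    return p, p - z * se, p + z * se

# Counts reported in the abstract: 3,455 errors in 33,012 new medication orders.
p, lo, hi = error_rate_ci(3455, 33012)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")   # 10.5% (95% CI 10.1%-10.8%)
```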

  2. Medication errors with electronic prescribing (eP): Two views of the same picture

    Science.gov (United States)

    2010-01-01

    Background: Quantitative prospective methods are widely used to evaluate the impact of new technologies such as electronic prescribing (eP) on medication errors. However, they are labour-intensive and it is not always feasible to obtain pre-intervention data. Our objective was to compare the eP medication error picture obtained with retrospective quantitative and qualitative methods. Methods: The study was carried out at one English district general hospital approximately two years after implementation of an integrated electronic prescribing, administration and records system. Quantitative: a structured retrospective analysis was carried out of clinical records and medication orders for 75 randomly selected patients admitted to three wards (medicine, surgery and paediatrics) six months after eP implementation. Qualitative: 8 doctors, 6 nurses, 8 pharmacy staff and 4 other staff at senior, middle and junior grades, and 19 adult patients on acute surgical and medical wards were interviewed. Staff interviews explored experiences of developing and working with the system; patient interviews focused on experiences of medicine prescribing and administration on the ward. Interview transcripts were searched systematically for accounts of medication incidents. A classification scheme was developed and applied to the errors identified in the records review. Results: The two approaches produced similar pictures of the drug use process. Interviews revealed the types of error identified in the retrospective notes review, plus two eP-specific errors which were not detected by record review. Interview data took less time to collect than record review, and provided rich data on the prescribing process and reasons for delays or non-administration of medicines, including "once only" orders and "as required" medicines. Conclusions: The qualitative approach provided more understanding of processes, and some insights into why medication errors can happen. The method is cost-effective and

  3. Relating faults in diagnostic reasoning with diagnostic errors and patient harm.

    NARCIS (Netherlands)

    Zwaan, L.; Thijs, A.; Wagner, C.; Wal, G. van der; Timmermans, D.R.M.

    2012-01-01

    Purpose: The relationship between faults in diagnostic reasoning, diagnostic errors, and patient harm has hardly been studied. This study examined suboptimal cognitive acts (SCAs; i.e., faults in diagnostic reasoning), related them to the occurrence of diagnostic errors and patient harm, and studied

  4. Prescribing error at hospital discharge: a retrospective review of medication information in an Irish hospital.

    Science.gov (United States)

    Michaelson, M; Walsh, E; Bradley, C P; McCague, P; Owens, R; Sahm, L J

    2017-08-01

    Prescribing error may result in adverse clinical outcomes, leading to increased patient morbidity, mortality, and economic burden. Many errors occur during transitional care as patients move between different stages and settings of care. To conduct a review of medication information and identify prescribing error among an adult population in an urban hospital. A retrospective review of medication information was conducted. Part 1: an audit of discharge prescriptions which assessed legibility, compliance with legal requirements, therapeutic errors (strength, dose and frequency), and drug interactions. Part 2: a review of all sources of medication information (namely pre-admission medication list, drug Kardex, discharge prescription, discharge letter) for 15 inpatients to identify unintentional prescription discrepancies, defined as "undocumented and/or unjustified medication alteration" throughout the hospital stay. Part 1: of the 5,910 prescribed items, 53 (0.9%) were deemed illegible. Of the controlled drug prescriptions, 11.1% (n = 167) met all the legal requirements. Therapeutic errors occurred in 41% of prescriptions (n = 479). More than 1 in 5 patients (21.9%) received a prescription containing a drug interaction. Part 2: 175 discrepancies were identified across all sources of medication information, of which 78 were deemed unintentional. Of these, 10.2% (n = 8) occurred at the point of admission, while 76.9% (n = 60) occurred at the point of discharge. The study identified the time of discharge as a point at which prescribing errors are likely to occur. This has implications for patient safety and provider workload in both primary and secondary care.

  5. Ground Motion Synthetics For Spontaneous Versus Prescribed Rupture On A 45° Thrust Fault

    Science.gov (United States)

    Gottschämmer, E.; Olsen, K. B.

    We have compared prescribed (kinematic) and spontaneous dynamic rupture propagation on a 45° dipping thrust fault buried up to 5 km in a half-space model, as well as ground motions on the free surface for frequencies less than 1 Hz. The computations are carried out using a 3D finite-difference method with rate-and-state friction on a planar, 20 km by 20 km fault. We use a slip-weakening distance of 15 cm and a slip-velocity weakening distance of 9.2 cm/s, similar to those for the dynamic study of the 1994 M6.7 Northridge earthquake by Nielsen and Olsen (2000), which generated satisfactory fits to selected strong motion data in the San Fernando Valley. The prescribed rupture propagation was designed to mimic that of the dynamic simulation at depth in order to isolate the dynamic free-surface effects. In this way, the results reflect the dynamic (normal-stress) interaction with the free surface for various depths of burial of the fault. We find that the moment, peak slip, and peak sliprate for the rupture breaking the surface are increased by up to 60%, 80%, and 10%, respectively, compared to the values for the scenario buried 5 km deep. The inclusion of these effects increases the peak displacements and velocities above the fault by factors of up to 3.4 and 2.9, including the increase in moment due to normal-stress effects at the free surface, and up to 2.1 and 2.0 when scaled to a Northridge-size event with surface rupture. Similar differences were found by Aagaard et al. (2001). Significant dynamic effects on the ground motions include earlier arrival times caused by super-shear rupture velocities (break-out phases), in agreement with the dynamic finite-element simulations by Oglesby et al. (1998, 2000). The presence of shallow low-velocity layers tends to increase the rupture time and the sliprate. In particular, they promote earlier transitions to super-shear velocities and decrease the rupture velocity within the layers. Our results suggest that dynamic

  6. Prevalence, Nature, Severity and Risk Factors for Prescribing Errors in Hospital Inpatients: Prospective Study in 20 UK Hospitals.

    Science.gov (United States)

    Ashcroft, Darren M; Lewis, Penny J; Tully, Mary P; Farragher, Tracey M; Taylor, David; Wass, Valerie; Williams, Steven D; Dornan, Tim

    2015-09-01

    It has been suggested that doctors in their first year of post-graduate training make a disproportionate number of prescribing errors. This study aimed to compare the prevalence of prescribing errors made by first-year post-graduate doctors with that of errors by senior doctors and non-medical prescribers, and to investigate the predictors of potentially serious prescribing errors. Pharmacists in 20 hospitals over 7 prospectively selected days collected data on the number of medication orders checked, the grade of prescriber and details of any prescribing errors. Logistic regression models (adjusted for clustering by hospital) identified factors predicting the likelihood of prescribing erroneously and the severity of prescribing errors. Pharmacists reviewed 26,019 patients and 124,260 medication orders; 11,235 prescribing errors were detected in 10,986 orders. The mean error rate was 8.8 (95% confidence interval [CI] 8.6-9.1) errors per 100 medication orders. Rates of errors for all doctors in training were significantly higher than rates for medical consultants. Doctors who were 1 year (odds ratio [OR] 2.13; 95% CI 1.80-2.52) or 2 years in training (OR 2.23; 95% CI 1.89-2.65) were more than twice as likely to prescribe erroneously. Prescribing errors were 70% (OR 1.70; 95% CI 1.61-1.80) more likely to occur at the time of hospital admission than when medication orders were issued during the hospital stay. No significant differences in severity of error were observed between grades of prescriber. Potentially serious errors were more likely to be associated with prescriptions for parenteral administration, especially for cardiovascular or endocrine disorders. The problem of prescribing errors in hospitals is substantial and not solely a problem of the most junior medical prescribers, particularly for those errors most likely to cause significant patient harm. Interventions are needed to target these high-risk errors by all grades of staff and hence
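
    The odds ratios above come from logistic regression adjusted for clustering by hospital; the crude calculation underneath them is a 2x2-table odds ratio with a Wald confidence interval. The counts below are hypothetical, chosen only to illustrate the arithmetic:

```python
import math

def odds_ratio(exposed_events, exposed_total, control_events, control_total, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table."""
    a, b = exposed_events, exposed_total - exposed_events
    c, d = control_events, control_total - control_events
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical counts: erroneous vs total orders for trainees and consultants.
or_, lo, hi = odds_ratio(120, 1000, 60, 1000)
print(round(or_, 2))   # 2.14
```

    A study of this design would report the adjusted, not the crude, estimate; the sketch shows only the basic quantity being modelled.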

  7. Impact of Internally Developed Electronic Prescription on Prescribing Errors at Discharge from the Emergency Department.

    Science.gov (United States)

    Hitti, Eveline; Tamim, Hani; Bakhti, Rinad; Zebian, Dina; Mufarrij, Afif

    2017-08-01

    Medication errors are common, with studies reporting at least one error per patient encounter. At hospital discharge, medication error rates vary from 15%-38%. However, few studies have assessed the effect of an internally developed electronic (E)-prescription system at discharge from an emergency department (ED). Additionally, commercially available electronic solutions are cost-prohibitive in many resource-limited settings. We assessed the impact of introducing an internally developed, low-cost E-prescription system, with a list of commonly prescribed medications, on prescription error rates at discharge from the ED, compared to handwritten prescriptions. We conducted a pre- and post-intervention study comparing error rates in a randomly selected sample of discharge prescriptions (handwritten versus electronic) in the five months before and four months after the introduction of the E-prescription. The internally developed E-prescription system included a list of 166 commonly prescribed medications with the generic name, strength, dose, frequency, and duration. We included a total of 2,883 prescriptions in this study: 1,475 in the pre-intervention phase were handwritten (HW) and 1,408 in the post-intervention phase were electronic. We calculated rates of 14 different errors and compared them between the pre- and post-intervention periods. Overall, E-prescriptions included fewer prescription errors than HW prescriptions; in particular, E-prescriptions reduced missing-dose errors (11.3% to 4.3%). E-prescriptions, however, were associated with a significant increase in duplication errors, specifically with home medication (1.7% to 3%, p = 0.02). A basic, internally developed E-prescription system, featuring commonly used medications, effectively reduced medication errors in a low-resource setting where the costs of sophisticated commercial electronic solutions are prohibitive.
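
    A pre/post comparison of this kind reduces to a two-proportion test. The sketch below reconstructs approximate counts from the reported missing-dose rates (11.3% of 1,475 handwritten vs 4.3% of 1,408 electronic prescriptions); the exact counts and the specific test the authors used are not given in the abstract:

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns z and the two-sided p-value."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1/n1 + 1/n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via math.erf.
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Counts reconstructed from the reported rates (approximate).
z, p = two_prop_ztest(167, 1475, 61, 1408)
print(z > 0, p < 0.001)   # True True: the reduction is highly significant
```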

  8. Synchronization of multiple 3-DOF helicopters under actuator faults and saturations with prescribed performance.

    Science.gov (United States)

    Yang, Huiliao; Jiang, Bin; Yang, Hao; Liu, Hugh H T

    2018-04-01

    The distributed cooperative control strategy is proposed to make networked nonlinear 3-DOF helicopters achieve attitude synchronization in the presence of actuator faults and saturations. Based on robust adaptive control, the proposed control method can both compensate for the uncertain partial loss of control effectiveness and deal with the system uncertainties. To address the actuator saturation problem, the control scheme is designed to ensure that the saturation constraint on the actuation will not be violated during operation in spite of the actuator faults. It is shown that with the proposed control strategy, both the tracking errors of the leading helicopter and the attitude synchronization errors of each following helicopter are bounded in the presence of faulty actuators and actuator saturations. Moreover, the state responses of the entire group will not exceed the predesigned performance functions, which are totally independent of the underlying interaction topology. Simulation results illustrate the effectiveness of the proposed control scheme. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Tracking error constrained robust adaptive neural prescribed performance control for flexible hypersonic flight vehicle

    Directory of Open Access Journals (Sweden)

    Zhonghua Wu

    2017-02-01

    A robust adaptive neural control scheme based on a back-stepping technique is developed for the longitudinal dynamics of a flexible hypersonic flight vehicle, which is able to ensure that the state tracking error remains confined within prescribed bounds, in spite of the existing model uncertainties and actuator constraints. Minimal learning parameter technique-based neural networks are used to estimate the model uncertainties; thus, the number of online-updated parameters is greatly reduced, and prior information about the aerodynamic parameters is dispensable. With the utilization of an assistant compensation system, the problem of actuator constraint is overcome. By combining the prescribed performance function and a sliding mode differentiator into the neural back-stepping control design procedure, a composite state-tracking-error-constrained adaptive neural control approach is presented, and a new type of adaptive law is constructed. As compared with other adaptive neural control designs for hypersonic flight vehicles, the proposed composite control scheme exhibits not only a low-computation property but also strong robustness. Finally, two comparative simulations are performed to demonstrate the robustness of this neural prescribed performance controller.
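
    A prescribed performance function is typically a decaying envelope that the tracking error must stay inside. The sketch below uses the commonly cited exponential form rho(t) = (rho0 - rho_inf) * exp(-l*t) + rho_inf with illustrative constants; it is not necessarily the exact function used in this paper:

```python
import math

def performance_bound(t, rho0=2.0, rho_inf=0.05, l=1.0):
    """Exponentially decaying performance function: starts at rho0 and
    converges to the steady-state bound rho_inf at decay rate l."""
    return (rho0 - rho_inf) * math.exp(-l * t) + rho_inf

def within_envelope(e, t, delta=1.0):
    """Prescribed-performance constraint: -delta*rho(t) < e(t) < rho(t)."""
    rho = performance_bound(t)
    return -delta * rho < e < rho

# The same error magnitude is acceptable early on but violates the bound later.
print(within_envelope(1.5, 0.0), within_envelope(1.5, 3.0))   # True False
```

    Transforming the constrained error through such an envelope is what lets back-stepping designs guarantee both transient and steady-state tracking performance.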

  10. A survey of the criteria for prescribing in cases of borderline refractive errors

    Directory of Open Access Journals (Sweden)

    Einat Shneor

    2016-01-01

    Conclusions: The prescribing criteria found in this study are broadly comparable with those in previous studies and with published prescribing guidelines. Subtle indications suggest that optometrists may become more conservative in their prescribing criteria with experience.

  11. Novel prescribed performance neural control of a flexible air-breathing hypersonic vehicle with unknown initial errors.

    Science.gov (United States)

    Bu, Xiangwei; Wu, Xiaoyan; Zhu, Fujing; Huang, Jiaqi; Ma, Zhen; Zhang, Rui

    2015-11-01

    A novel prescribed performance neural controller with unknown initial errors is addressed for the longitudinal dynamic model of a flexible air-breathing hypersonic vehicle (FAHV) subject to parametric uncertainties. Unlike traditional prescribed performance control (PPC), which requires that the initial errors be known accurately, this paper investigates tracking control without accurate initial errors by exploiting a new performance function. A combined neural back-stepping and minimal learning parameter (MLP) technology is employed for exploring a prescribed performance controller that provides robust tracking of velocity and altitude reference trajectories. The highlight is that the transient performance of the velocity and altitude tracking errors is satisfactory and the computational load of neural approximation is low. Finally, numerical simulation results from a nonlinear FAHV model demonstrate the efficacy of the proposed strategy. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Incidence and Severity of Prescribing Errors in Parenteral Nutrition for Pediatric Inpatients at a Neonatal and Pediatric Intensive Care Unit

    Directory of Open Access Journals (Sweden)

    Theresa Hermanspann

    2017-06-01

    Objectives: Pediatric inpatients are particularly vulnerable to medication errors (MEs), especially in highly individualized preparations like parenteral nutrition (PN). Aside from prescribing via a computerized physician order entry (CPOE) system, we evaluated the effect of cross-checking by a clinical pharmacist to prevent harm from PN order errors in a neonatal and pediatric intensive care unit (NICU/PICU). Methods: The incidence of prescribing errors in PN in a tertiary-level NICU/PICU was surveyed prospectively between March 2012 and July 2013 (n = 3,012 orders). A pharmacist cross-checked all PN orders prior to preparation. Errors were assigned to seven different error-type categories. Three independent experts from different academic tertiary-level NICUs judged the severity of each error according to the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) Index (categories A-I). Results: The error rate was 3.9% for all 3,012 orders (118 prescribing errors in 111 orders). 77 errors (6.0% of 1,277 orders) occurred in the concentration-range category, all concerning a relative overdose of calcium gluconate for peripheral infusion. The majority of all events (60%) were assigned to categories C and D (without major harmful consequences), while 28% could not be assigned due to a missing majority decision. Potentially harmful consequences requiring interventions (category E) could have occurred in 12% of assessments. Conclusion: Next to systematic application of clinical guidelines and prescribing via CPOE, order review by a clinical pharmacist is still required to effectively reduce MEs and thus to prevent minor and major adverse drug events, with the aim of enhancing medication safety.
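
    The severity assignment by three independent raters can be sketched as a simple majority vote over NCC MERP categories, with ties left unassigned, mirroring the 28% of events for which no majority decision was reached:

```python
from collections import Counter

def assign_severity(ratings):
    """Majority decision among raters on NCC MERP categories (A-I).
    Returns None when no category reaches a majority (>= 2 of 3 votes)."""
    cat, votes = Counter(ratings).most_common(1)[0]
    return cat if votes >= 2 else None

print(assign_severity(["C", "C", "D"]))   # C
print(assign_severity(["C", "D", "E"]))   # None -- no majority, unassigned
```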

  13. Medication prescribing errors in the medical intensive care unit of Tikur Anbessa Specialized Hospital, Addis Ababa, Ethiopia.

    Science.gov (United States)

    Sada, Oumer; Melkie, Addisu; Shibeshi, Workineh

    2015-09-16

    Medication errors (MEs) are important problems in all hospitalized populations, especially in the intensive care unit (ICU). Little is known about the prevalence of medication prescribing errors in the ICUs of hospitals in Ethiopia. The aim of this study was to assess medication prescribing errors in the ICU of Tikur Anbessa Specialized Hospital using a retrospective cross-sectional analysis of patient cards and medication charts. About 220 patient charts were reviewed, covering a total of 1,311 patient-days and 882 prescription episodes. 359 MEs were detected, a prevalence of 40 per 100 orders. Common prescribing errors were omission errors (154; 42.89%), wrong combination (101; 28.13%), wrong abbreviation (48; 13.37%), wrong dose (30; 8.36%), wrong frequency (18; 5.01%), and wrong indication (8; 2.23%). The present study shows that medication errors are common in the medical ICU of Tikur Anbessa Specialized Hospital. These results suggest future targets for prevention strategies to reduce the rate of medication error.

  14. Physician Acceptance of Pharmacist Recommendations about Medication Prescribing Errors in Iraqi Hospitals

    Directory of Open Access Journals (Sweden)

    ALI AZEEZ ALI AL-JUMAILI

    2016-08-01

    The objectives of this study were to measure the incidence and types of medication prescribing errors (MPEs) in Iraqi hospitals, to calculate for the first time the percentage of physician agreement with pharmacist medication regimen review (MRR) recommendations regarding MPEs, and to identify the factors influencing the physician agreement rate with these recommendations. Methods: Fourteen pharmacists (10 females and 4 males) reviewed each hand-written physician order for 1,506 patients who were admitted to two public hospitals in Al-Najaf, Iraq during August 2015. The pharmacists identified medication prescribing errors using the Medscape (WebMD LLC) phone application as a reference. The pharmacists contacted the physicians (2 females and 34 males) in person to address the MPEs that were identified. Results: The pharmacists identified 78 physician orders containing 99 MPEs, an incidence of 6.57 percent of all the physician orders reviewed. The patients with MPEs were taking 4.8 medications on average. The MPEs included drug-drug interactions (65.7%), incorrect doses (16.2%), unnecessary medications (8.1%), contra-indications (7.1%), incorrect drug duration (2%), and untreated conditions (1%). The physicians implemented 37 (37.4%) pharmacist recommendations. Three factors were significantly related to physician acceptance of pharmacist recommendations: physician specialty, pharmacist gender, and patient gender. Pediatricians were less likely (OR = 0.1) to accept pharmacist recommendations compared to internal medicine physicians. Male pharmacists received more positive responses from physicians (OR = 7.11) than female pharmacists. Lastly, the recommendations were significantly more likely to be accepted (OR = 3.72) when the patients were female. Conclusions: The incidence of MPEs is higher in Iraqi hospitalized patients than in the U.S. and U.K., but lower than in Brazil, Ethiopia, India, and Croatia. Drug-drug interactions were the most common type of

  15. Fault-tolerant quantum computing in the Pauli or Clifford frame with slow error diagnostics

    Directory of Open Access Journals (Sweden)

    Christopher Chamberland

    2018-01-01

    We consider the problem of fault-tolerant quantum computation in the presence of slow error diagnostics, caused either by measurement latencies or by slow decoding algorithms. Our scheme offers a few improvements over previously existing solutions; for instance, it does not require active error correction and results in a reduced error-correction overhead when error diagnostics is much slower than the gate time. In addition, we adapt our protocol to cases where the underlying error-correction strategy chooses the optimal correction amongst all Clifford gates instead of the usual Pauli gates. The resulting Clifford frame protocol is of independent interest, as it can increase error thresholds and could find applications in other areas of quantum computation.

  16. Fault tree model of human error based on error-forcing contexts

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Jang, Seung Cheol; Ha, Jae Joo

    2004-01-01

    In safety-critical systems such as nuclear power plants, safety-feature actuation is fully automated. In an emergency, the human operator can also play the role of a backup for the automated systems. That is, failure to generate a safety-feature-actuation signal implies the concurrent failure of the automated systems and of manual actuation. The human operator's manual actuation failure is largely affected by error-forcing contexts (EFCs), of which the failures of sensors and automated systems are among the most important. The sensors, the automated actuation system, and the human operators are correlated in a complex manner, and it is hard to develop a proper model of them. In this paper, we explain the condition-based human reliability assessment (CBHRA) method as a practical way to treat these complicated conditions. In this study, we apply the CBHRA method to the manual actuation of safety features such as reactor trip and safety injection in Korean Standard Nuclear Power Plants.
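
    At its simplest, the signal-generation failure described above is an AND gate: the signal fails only if the automated system fails and the manual backup also fails, with the human error probability (HEP) conditioned on the error-forcing context. The numbers below are hypothetical, and CBHRA itself treats the sensor/automation/operator correlations far more carefully than this independence-style sketch:

```python
def signal_failure_prob(p_auto_fail, hep_by_context):
    """AND-gate combination: signal failure requires the automated system to
    fail AND the operator's manual backup to fail, where the human error
    probability depends on the error-forcing context (EFC)."""
    return {ctx: p_auto_fail * hep for ctx, hep in hep_by_context.items()}

# Hypothetical numbers: the HEP rises sharply when a sensor fault masks the demand.
probs = signal_failure_prob(1e-3, {"nominal": 1e-2, "sensor_fault": 0.5})
print(probs["sensor_fault"] / probs["nominal"])   # the EFC dominates the result
```

    The sketch illustrates why EFCs dominate the assessment: the same hardware failure probability yields a much larger system failure probability once the operator's context is degraded.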

  17. Non-intercepted dose errors in prescribing anti-neoplastic treatment

    DEFF Research Database (Denmark)

    Mattsson, T O; Holm, B; Michelsen, H

    2015-01-01

    BACKGROUND: The incidence of non-intercepted prescription errors and the risk factors involved, including the impact of computerised physician order entry (CPOE) systems on such errors, are unknown. Our objective was to determine the incidence, type, severity, and related risk factors of non-intercepted prescription errors. [...] Strategies to prevent future prescription errors could usefully focus on integrated computerised systems that can aid dose calculations and reduce transcription errors between databases.

  18. Learning curves, taking instructions, and patient safety: using a theoretical domains framework in an interview study to investigate prescribing errors among trainee doctors

    Directory of Open Access Journals (Sweden)

    Duncan Eilidh M

    2012-09-01

    Background: Prescribing errors are a major source of morbidity and mortality and represent a significant patient safety concern. Evidence suggests that trainee doctors are responsible for most prescribing errors. Understanding the factors that influence prescribing behavior may lead to effective interventions to reduce errors. Existing investigations of prescribing errors have been based on Human Error Theory but not on other relevant behavioral theories. The aim of this study was to apply a broad theory-based approach using the Theoretical Domains Framework (TDF) to investigate prescribing in the hospital context among a sample of trainee doctors. Method: Semistructured interviews, based on 12 theoretical domains, were conducted with 22 trainee doctors to explore views, opinions, and experiences of prescribing and prescribing errors. Content analysis was conducted, followed by applying relevance criteria and a novel stage of critical appraisal, to identify which theoretical domains could be targeted in interventions to improve prescribing. Results: Seven theoretical domains met the criteria of relevance: “social professional role and identity,” “environmental context and resources,” “social influences,” “knowledge,” “skills,” “memory, attention, and decision making,” and “behavioral regulation.” From critical appraisal of the interview data, “beliefs about consequences” and “beliefs about capabilities” were also identified as potentially important domains. Interrelationships between domains were evident. Additionally, the data supported theoretical elaboration of the domain “behavioral regulation.” Conclusions: In this investigation of hospital-based prescribing, participants’ attributions about causes of errors were used to identify domains that could be targeted in interventions to improve prescribing. In a departure from previous TDF practice, critical appraisal was used to identify additional domains

  19. Learning curves, taking instructions, and patient safety: using a theoretical domains framework in an interview study to investigate prescribing errors among trainee doctors.

    Science.gov (United States)

    Duncan, Eilidh M; Francis, Jill J; Johnston, Marie; Davey, Peter; Maxwell, Simon; McKay, Gerard A; McLay, James; Ross, Sarah; Ryan, Cristín; Webb, David J; Bond, Christine

    2012-09-11

    Prescribing errors are a major source of morbidity and mortality and represent a significant patient safety concern. Evidence suggests that trainee doctors are responsible for most prescribing errors. Understanding the factors that influence prescribing behavior may lead to effective interventions to reduce errors. Existing investigations of prescribing errors have been based on Human Error Theory but not on other relevant behavioral theories. The aim of this study was to apply a broad theory-based approach using the Theoretical Domains Framework (TDF) to investigate prescribing in the hospital context among a sample of trainee doctors. Semistructured interviews, based on 12 theoretical domains, were conducted with 22 trainee doctors to explore views, opinions, and experiences of prescribing and prescribing errors. Content analysis was conducted, followed by applying relevance criteria and a novel stage of critical appraisal, to identify which theoretical domains could be targeted in interventions to improve prescribing. Seven theoretical domains met the criteria of relevance: "social professional role and identity," "environmental context and resources," "social influences," "knowledge," "skills," "memory, attention, and decision making," and "behavioral regulation." From critical appraisal of the interview data, "beliefs about consequences" and "beliefs about capabilities" were also identified as potentially important domains. Interrelationships between domains were evident. Additionally, the data supported theoretical elaboration of the domain behavioral regulation. In this investigation of hospital-based prescribing, participants' attributions about causes of errors were used to identify domains that could be targeted in interventions to improve prescribing. In a departure from previous TDF practice, critical appraisal was used to identify additional domains that should also be targeted, despite participants' perceptions that they were not relevant to

  20. FPGAs and parallel architectures for aerospace applications soft errors and fault-tolerant design

    CERN Document Server

    Rech, Paolo

    2016-01-01

    This book introduces the concepts of soft errors in FPGAs, as well as the motivation for using commercial, off-the-shelf (COTS) FPGAs in mission-critical and remote applications, such as aerospace.  The authors describe the effects of radiation in FPGAs, present a large set of soft-error mitigation techniques that can be applied in these circuits, as well as methods for qualifying these circuits under radiation.  Coverage includes radiation effects in FPGAs, fault-tolerant techniques for FPGAs, use of COTS FPGAs in aerospace applications, experimental data of FPGAs under radiation, FPGA embedded processors under radiation, and fault injection in FPGAs. Since dedicated parallel processing architectures such as GPUs have become more desirable in aerospace applications due to high computational power, GPU analysis under radiation is also discussed. ·         Discusses features and drawbacks of reconfigurability methods for FPGAs, focused on aerospace applications; ·         Explains how radia...

  1. Detection and Localization of Tooth Breakage Fault on Wind Turbine Planetary Gear System considering Gear Manufacturing Errors

    Directory of Open Access Journals (Sweden)

    Y. Gui

    2014-01-01

    Full Text Available Sidebands of the vibration spectrum are sensitive to the fault degree and have been proved to be useful for tooth fault detection and localization. However, the amplitude and frequency modulation due to manufacturing errors (which are inevitable in actual planetary gear systems) lead to much more complex sidebands. Thus, in the paper, a lumped parameter model for a typical planetary gear system with various types of errors is established. In the model, the influences of tooth faults on time-varying mesh stiffness and tooth impact force are derived analytically. Numerical methods are then utilized to obtain the response spectra of the system with tooth faults, with and without errors. Three system components (including sun, planet, and ring gears) with tooth faults are considered in the discussion, respectively. Through detailed comparisons of spectral sidebands, fault characteristic frequencies of the system are acquired. Dynamic experiments on a planetary gear-box test rig are carried out to verify the simulation results, and these results are of great significance for the detection and localization of tooth faults in wind turbines.

  2. Fault diagnosis of generation IV nuclear HTGR components – Part II: The area error enthalpy–entropy graph approach

    International Nuclear Information System (INIS)

    Rand, C.P. du; Schoor, G. van

    2012-01-01

    Highlights: ► Different uncorrelated fault signatures are derived for HTGR component faults. ► A multiple classifier ensemble increases confidence in classification accuracy. ► Detailed simulation model of system is not required for fault diagnosis. - Abstract: The second paper in a two part series presents the area error method for generation of representative enthalpy–entropy (h–s) fault signatures to classify malfunctions in generation IV nuclear high temperature gas-cooled reactor (HTGR) components. The second classifier is devised to ultimately address the fault diagnosis (FD) problem via the proposed methods in a multiple classifier (MC) ensemble. FD is realized by way of different input feature sets to the classification algorithm based on the area and trajectory of the residual shift between the fault-free and the actual operating h–s graph models. The application of the proposed technique is specifically demonstrated for 24 single fault transients considered in the main power system (MPS) of the Pebble Bed Modular Reactor (PBMR). The results show that the area error technique produces different fault signatures with low correlation for all the examined component faults. A brief evaluation of the two fault signature generation techniques is presented and the performance of the area error method is documented using the fault classification index (FCI) presented in Part I of the series. The final part of this work reports the application of the proposed approach for classification of an emulated fault transient in data from the prototype Pebble Bed Micro Model (PBMM) plant. Reference data values are calculated for the plant via a thermo-hydraulic simulation model of the MPS. The results show that the correspondence between the fault signatures, generated via experimental plant data and simulated reference values, are generally good. The work presented in the two part series, related to the classification of component faults in the MPS of different

  3. Fault Analysis of Wind Turbines Based on Error Messages and Work Orders

    DEFF Research Database (Denmark)

    Borchersen, Anders Bech; Larsen, Jesper Abildgaard; Stoustrup, Jakob

    2012-01-01

    In this paper data describing the operation and maintenance of an offshore wind farm are presented and analysed. Two different sets of data are presented: the first is auto-generated error messages from the Supervisory Control and Data Acquisition (SCADA) system; the other is the work orders describing the service performed at the individual turbines. The auto-generated alarms are analysed by applying a cleaning procedure to identify the alarms related to components. A severity, occurrence, and detection analysis is performed on the work orders. The outcomes of the two analyses are then compared to identify common fault types and areas where further data analysis would be beneficial for improving the operation and maintenance of wind turbines in the future.

  4. Error Mitigation of Point-to-Point Communication for Fault-Tolerant Computing

    Science.gov (United States)

    Akamine, Robert L.; Hodson, Robert F.; LaMeres, Brock J.; Ray, Robert E.

    2011-01-01

    Fault tolerant systems require the ability to detect and recover from physical damage caused by the hardware s environment, faulty connectors, and system degradation over time. This ability applies to military, space, and industrial computing applications. The integrity of Point-to-Point (P2P) communication, between two microcontrollers for example, is an essential part of fault tolerant computing systems. In this paper, different methods of fault detection and recovery are presented and analyzed.

  5. On-ward participation of a hospital pharmacist in a Dutch intensive care unit reduces prescribing errors and related patient harm: an intervention study

    NARCIS (Netherlands)

    Klopotowska, J.E.; Kuiper, R.; van Kan, H.J.; de Pont, A.C.; Dijkgraaf, M.G.; Lie-A-Huen, L.; Vroom, M.B.; Smorenburg, S.M.

    2010-01-01

    Introduction: Patients admitted to an intensive care unit (ICU) are at high risk for prescribing errors and related adverse drug events (ADEs). An effective intervention to decrease this risk, based on studies conducted mainly in North America, is on-ward participation of a clinical pharmacist in an...

  6. On-ward participation of a hospital pharmacist in a Dutch intensive care unit reduces prescribing errors and related patient harm: an intervention study

    NARCIS (Netherlands)

    Klopotowska, Joanna E.; Kuiper, Rob; van Kan, Hendrikus J.; de Pont, Anne-Cornelie; Dijkgraaf, Marcel G.; Lie-A-Huen, Loraine; Vroom, Margreeth B.; Smorenburg, Susanne M.

    2010-01-01

    Patients admitted to an intensive care unit (ICU) are at high risk for prescribing errors and related adverse drug events (ADEs). An effective intervention to decrease this risk, based on studies conducted mainly in North America, is on-ward participation of a clinical pharmacist in an ICU team. As...

  7. Architecture Fault Modeling and Analysis with the Error Model Annex, Version 2

    Science.gov (United States)

    2016-06-01

    specification of fault propagation in EMV2 corresponds to the Fault Propagation and Transformation Calculus (FPTC) [Paige 2009]. The following concepts...definition of security includes accidental malicious indication of anomalous behavior either from outside a system or by unauthorized crossing of a...

  8. Who Do Hospital Physicians and Nurses Go to for Advice About Medications? A Social Network Analysis and Examination of Prescribing Error Rates.

    Science.gov (United States)

    Creswick, Nerida; Westbrook, Johanna Irene

    2015-09-01

    To measure the weekly medication advice-seeking networks of hospital staff, to compare patterns across professional groups, and to examine these in the context of prescribing error rates. A social network analysis was conducted. All 101 staff in 2 wards in a large, academic teaching hospital in Sydney, Australia, were surveyed (response rate, 90%) using a detailed social network questionnaire. The extent of weekly medication advice seeking was measured by the density of connections, the proportion of reciprocal relationships (reciprocity), the number of colleagues to whom each person provided advice (in-degree), and perceptions of the amount and impact of advice seeking between physicians and nurses. Data on prescribing error rates from the 2 wards were compared. Weekly medication advice-seeking networks were sparse (density: 7% ward A and 12% ward B). Information sharing across professional groups was modest, and rates of reciprocation of advice were low (9% ward A, 14% ward B). Pharmacists provided advice to most people, and junior physicians also played central roles. Senior physicians provided medication advice to few people. Many staff perceived that physicians rarely sought advice from nurses when prescribing, but almost all believed that an increase in communication between physicians and nurses about medications would improve patient safety. The medication networks in ward B had higher measures of density and reciprocation and fewer senior physicians who were isolates. Ward B had a significantly lower rate of both procedural and clinical prescribing errors than ward A (0.63 clinical prescribing errors per admission [95% CI, 0.47-0.79] versus 1.81 per admission [95% CI, 1.49-2.13]). Medication advice-seeking networks among staff on hospital wards are limited. Hubs of advice provision include pharmacists, junior physicians, and senior nurses. Senior physicians are poorly integrated into medication advice networks. Strategies to improve the advice-giving networks between senior...
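The network measures named in the record above (density, reciprocity, in-degree) have standard definitions for a directed network; the following is a minimal sketch of those definitions, with a made-up toy advice network (the node names are hypothetical, not the study's data).

```python
def density(edges, n):
    # Directed density: observed ties over the n*(n-1) possible ordered pairs.
    return len(set(edges)) / (n * (n - 1))

def reciprocity(edges):
    # Fraction of ties whose reverse tie is also present.
    ties = set(edges)
    return sum((b, a) in ties for (a, b) in ties) / len(ties)

def in_degree(edges, node):
    # In an advice-seeking network (edge: seeker -> provider), a provider's
    # in-degree counts the colleagues to whom they give advice.
    return sum(1 for (_, b) in set(edges) if b == node)

# Hypothetical toy network: three staff seek advice from the pharmacist,
# and the pharmacist seeks advice from one junior physician.
edges = [("nurse1", "pharm"), ("jnr1", "pharm"),
         ("pharm", "jnr1"), ("jnr2", "pharm")]
```

With 4 staff there are 12 possible ordered ties, so this toy network has density 4/12; exactly one pair of ties is reciprocated, giving reciprocity 0.5.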

  9. Medication errors: definitions and classification

    Science.gov (United States)

    Aronson, Jeffrey K

    2009-01-01

    To understand medication errors and to identify preventive strategies, we need to classify them and define the terms that describe them. The four main approaches to defining technical terms consider etymology, usage, previous definitions, and the Ramsey–Lewis method (based on an understanding of theory and practice). A medication error is ‘a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient’. Prescribing faults, a subset of medication errors, should be distinguished from prescription errors. A prescribing fault is ‘a failure in the prescribing [decision-making] process that leads to, or has the potential to lead to, harm to the patient’. The converse of this, ‘balanced prescribing’ is ‘the use of a medicine that is appropriate to the patient's condition and, within the limits created by the uncertainty that attends therapeutic decisions, in a dosage regimen that optimizes the balance of benefit to harm’. This excludes all forms of prescribing faults, such as irrational, inappropriate, and ineffective prescribing, underprescribing and overprescribing. A prescription error is ‘a failure in the prescription writing process that results in a wrong instruction about one or more of the normal features of a prescription’. The ‘normal features’ include the identity of the recipient, the identity of the drug, the formulation, dose, route, timing, frequency, and duration of administration. Medication errors can be classified, invoking psychological theory, as knowledge-based mistakes, rule-based mistakes, action-based slips, and memory-based lapses. This classification informs preventive strategies. PMID:19594526

  10. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)
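A classic software structure associated with this line of work is the recovery block: a primary alternate runs, an acceptance test performs error detection, and on failure the result is discarded and a fallback alternate is tried. The sketch below is an illustrative generic implementation, not code from the survey; `faulty_sqrt` and `crude_sqrt` are hypothetical alternates.

```python
def recovery_block(alternates, acceptance_test, *args):
    # Error detection: run an alternate, then check its result with the
    # acceptance test. Error recovery: on failure, discard the result and
    # fall back to the next alternate.
    for alternate in alternates:
        try:
            result = alternate(*args)
        except Exception:
            continue  # fault treatment: this alternate is abandoned
        if acceptance_test(result):
            return result
    raise RuntimeError("all alternates exhausted")

# Hypothetical alternates: a primary that fails and a crude fallback.
def faulty_sqrt(x):
    raise ValueError("primary implementation failed")

def crude_sqrt(x):
    return x ** 0.5
```

The design choice is that detection is behavioural (does the result pass the test?) rather than structural, so independently written alternates can back each other up.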

  11. A method for the estimation of the residual error in the SALP approach for fault tree analysis

    International Nuclear Information System (INIS)

    Astolfi, M.; Contini, S.

    1980-01-01

    The aim of this report is the illustration of the algorithms implemented in the SALP-MP code for the estimation of the residual error. These algorithms are of more general use, and it would be possible to implement them on all codes of the series SALP previously developed, as well as, with minor modifications, to analysis procedures based on 'top-down' approaches. At the time, combined 'top-down' - 'bottom up' procedures are being studied in order to take advantage from both approaches for further reduction of computer time and better estimation of the residual error, for which the developed algorithms are still applicable

  12. Rationalising prescribing

    DEFF Research Database (Denmark)

    Wadmann, Sarah; Bang, Lia Evi

    2015-01-01

    Initiatives in the name of 'rational pharmacotherapy' have been launched to alter what is seen as 'inappropriate' prescribing practices of physicians. Based on observations and interviews with 20 general practitioners (GPs) in 2009-2011, we explored how attempts to rationalise prescribing interac...

  13. Prescribing procrastination

    Science.gov (United States)

    Thomson, George H.

    1979-01-01

    In his everyday work the family physician sees many patients whose problems have been diagnosed but for whom postponement of an active treatment plan is indicated. The physician must therefore prescribe procrastination in a carefully planned way. I describe some ideas and practical methods for doing this. PMID:529244

  14. Use of FMEA analysis to reduce risk of errors in prescribing and administering drugs in paediatric wards: a quality improvement report.

    Science.gov (United States)

    Lago, Paola; Bizzarri, Giancarlo; Scalzotto, Francesca; Parpaiola, Antonella; Amigoni, Angela; Putoto, Giovanni; Perilongo, Giorgio

    2012-01-01

    Administering medication to hospitalised infants and children is a complex process at high risk of error. Failure mode and effect analysis (FMEA) is a proactive tool used to analyse risks, identify failures before they happen and prioritise remedial measures. To examine the hazards associated with the process of drug delivery to children, we performed a proactive risk-assessment analysis. Five multidisciplinary teams, representing different divisions of the paediatric department at Padua University Hospital, were trained to analyse the drug-delivery process, to identify possible causes of failures and their potential effects, to calculate a risk priority number (RPN) for each failure and to plan changes in practice. The aims were to identify higher-priority potential failure modes as defined by RPNs and to plan changes in clinical practice to reduce the risk of patient harm and improve safety in the process of medication use in children. In all, 37 higher-priority potential failure modes and 71 associated causes and effects were identified. The highest RPNs (>48) related mainly to errors in calculating drug doses and concentrations. Many of these failure modes were found in all five units, suggesting the presence of common targets for improvement, particularly in enhancing the safety of prescription and preparation of intravenous drugs. The introduction of new activities into the revised process of administering drugs reduced the high-risk failure modes by 60%. FMEA is an effective proactive risk-assessment tool, useful for helping multidisciplinary groups understand a process of care and identify the errors that may occur, prioritising remedial interventions and possibly enhancing the safety of drug delivery in children.
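In FMEA the risk priority number is conventionally the product of severity, occurrence, and detection scores (each commonly rated 1-10). The sketch below illustrates that calculation and the kind of thresholding described in the record; the failure-mode names and scores are invented for the example, and only the >48 cut-off comes from the abstract.

```python
def rpn(severity, occurrence, detection):
    # Risk priority number: product of three 1-10 expert scores.
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("scores must lie in 1..10")
    return severity * occurrence * detection

def prioritize(failure_modes, threshold=48):
    # Keep failure modes whose RPN exceeds the threshold, highest first.
    scored = [(name, rpn(*scores)) for name, scores in failure_modes]
    return sorted((fm for fm in scored if fm[1] > threshold),
                  key=lambda fm: fm[1], reverse=True)

# Hypothetical failure modes as (name, (severity, occurrence, detection)).
modes = [("dose miscalculation", (9, 4, 3)),
         ("label misread", (5, 2, 2)),
         ("wrong dilution", (8, 3, 4))]
```

Here "dose miscalculation" (RPN 108) and "wrong dilution" (RPN 96) exceed the threshold and would be targeted first, while "label misread" (RPN 20) would not.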

  15. Prescribing Antibiotics

    DEFF Research Database (Denmark)

    Pedersen, Inge Kryger; Jepsen, Kim Sune

    2018-01-01

    The medical professions will lose an indispensable tool in clinical practice if even simple infections cannot be cured because antibiotics have lost effectiveness. This article presents results from an exploratory enquiry into “good doctoring” in the case of antibiotic prescribing at a time when the knowledge base in the healthcare field is shifting. Drawing on in-depth interviews about diagnosing and prescribing, the article demonstrates how the problem of antimicrobial resistance is understood and engaged with by Danish general practitioners. When general practitioners speak of managing “non-medical issues,” they refer to routines, clinical expertise, experiences with their patients, and decision-making based more on contextual circumstances than molecular conditions, and on the fact that such conditions can be hard to assess. This article’s contribution to knowledge about how new and global health...

  16. Safe prescribing: a titanic challenge.

    Science.gov (United States)

    Routledge, Philip A

    2012-10-01

    The challenge to achieve safe prescribing merits the adjective 'titanic'. The organisational and human errors leading to poor prescribing (e.g. underprescribing, overprescribing, misprescribing or medication errors) have parallels in the organisational and human errors that led to the loss of the Titanic 100 years ago this year. Prescribing can be adversely affected by communication failures, critical conditions, complacency, corner cutting, callowness and a lack of courage of conviction, all of which were also factors leading to the Titanic tragedy. These issues need to be addressed by a commitment to excellence, the final component of the 'Seven C's'. Optimal prescribing is dependent upon close communication and collaborative working between highly trained health professionals, whose role is to ensure maximum clinical effectiveness, whilst also protecting their patients from avoidable harm. Since humans are prone to error, and the environments in which they work are imperfect, it is not surprising that medication errors are common, occurring more often during the prescribing stage than during dispensing or administration. A commitment to excellence in prescribing includes a continued focus on lifelong learning (including interprofessional learning) in pharmacology and therapeutics. This should be accompanied by improvements in the clinical working environment of prescribers, and the encouragement of a strong safety culture (including reporting of adverse incidents as well as suspected adverse drug reactions whenever appropriate). Finally, members of the clinical team must be prepared to challenge each other, when necessary, to ensure that prescribing combines the highest likelihood of benefit with the lowest potential for harm. © 2012 The Author. British Journal of Clinical Pharmacology © 2012 The British Pharmacological Society.

  17. Safe prescribing: a titanic challenge

    Science.gov (United States)

    Routledge, Philip A

    2012-01-01

    The challenge to achieve safe prescribing merits the adjective ‘titanic’. The organisational and human errors leading to poor prescribing (e.g. underprescribing, overprescribing, misprescribing or medication errors) have parallels in the organisational and human errors that led to the loss of the Titanic 100 years ago this year. Prescribing can be adversely affected by communication failures, critical conditions, complacency, corner cutting, callowness and a lack of courage of conviction, all of which were also factors leading to the Titanic tragedy. These issues need to be addressed by a commitment to excellence, the final component of the ‘Seven C's’. Optimal prescribing is dependent upon close communication and collaborative working between highly trained health professionals, whose role is to ensure maximum clinical effectiveness, whilst also protecting their patients from avoidable harm. Since humans are prone to error, and the environments in which they work are imperfect, it is not surprising that medication errors are common, occurring more often during the prescribing stage than during dispensing or administration. A commitment to excellence in prescribing includes a continued focus on lifelong learning (including interprofessional learning) in pharmacology and therapeutics. This should be accompanied by improvements in the clinical working environment of prescribers, and the encouragement of a strong safety culture (including reporting of adverse incidents as well as suspected adverse drug reactions whenever appropriate). Finally, members of the clinical team must be prepared to challenge each other, when necessary, to ensure that prescribing combines the highest likelihood of benefit with the lowest potential for harm. PMID:22738396

  18. Automation bias in electronic prescribing.

    Science.gov (United States)

    Lyell, David; Magrabi, Farah; Raban, Magdalena Z; Pont, L G; Baysari, Melissa T; Day, Richard O; Coiera, Enrico

    2017-03-16

    Clinical decision support (CDS) in e-prescribing can improve safety by alerting prescribers to potential errors, but it introduces new sources of risk. Automation bias (AB) occurs when users over-rely on CDS, reducing vigilance in information seeking and processing. Evidence of AB has been found in other clinical tasks, but it had not yet been tested in e-prescribing. This study tests for the presence of AB in e-prescribing and for the impact of task complexity and interruptions on AB. One hundred and twenty students in the final two years of a medical degree prescribed medicines for nine clinical scenarios using a simulated e-prescribing system. Quality of CDS (correct, incorrect and no CDS) and task complexity (low, low + interruption and high) were varied between conditions. Omission errors (failure to detect prescribing errors) and commission errors (acceptance of false positive alerts) were measured. Compared to scenarios with no CDS, correct CDS reduced omission errors by 38.3% (p < .0001, n = 120), 46.6% (p < .0001, n = 70), and 39.2% (p < .0001, n = 120) for low, low + interrupt and high complexity scenarios respectively. Incorrect CDS increased omission errors by 33.3% (p < .0001, n = 120), 24.5% (p < .009, n = 82), and 26.7% (p < .0001, n = 120). Participants also made commission errors: 65.8% (p < .0001, n = 120), 53.5% (p < .0001, n = 82), and 51.7% (p < .0001, n = 120). Task complexity and interruptions had no impact on AB. This study found evidence of AB omission and commission errors in e-prescribing. Verification of CDS alerts is key to avoiding AB errors; however, interventions focused on this have had limited success to date. Clinicians should remain vigilant to the risks of CDS failures and verify CDS alerts.

  19. Optimization of electronic prescribing in pediatric patients

    NARCIS (Netherlands)

    Maat, B.

    2014-01-01

    Improving pediatric patient safety by preventing medication errors that may result in adverse drug events and consequent healthcare expenditure is a worldwide challenge to healthcare. In pediatrics, reported medication error rates in general, and prescribing error rates in particular, vary between...

  20. H infinity Integrated Fault Estimation and Fault Tolerant Control of Discrete-time Piecewise Linear Systems

    DEFF Research Database (Denmark)

    Tabatabaeipour, Seyed Mojtaba; Bak, Thomas

    2012-01-01

    In this paper we consider the problem of fault estimation and accommodation for discrete time piecewise linear systems. A robust fault estimator is designed to estimate the fault such that the estimation error converges to zero and H∞ performance of the fault estimation is minimized. Then, the es...

  1. Neural network-based robust actuator fault diagnosis for a non-linear multi-tank system.

    Science.gov (United States)

    Mrugalski, Marcin; Luzar, Marcel; Pazera, Marcin; Witczak, Marcin; Aubrun, Christophe

    2016-03-01

    The paper is devoted to the problem of robust actuator fault diagnosis of dynamic non-linear systems. In the proposed method, it is assumed that the diagnosed system can be modelled by a recurrent neural network, which can be transformed into linear parameter varying form. Such a system description allows developing a design scheme for a robust unknown input observer within the H∞ framework for a class of non-linear systems. The proposed approach is designed in such a way that a prescribed disturbance attenuation level is achieved with respect to the actuator fault estimation error, while guaranteeing the convergence of the observer. The application of the robust unknown input observer enables actuator fault estimation, which allows applying the developed approach to fault-tolerant control tasks. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  2. An efficient diagnostic technique for distribution systems based on under fault voltages and currents

    Energy Technology Data Exchange (ETDEWEB)

    Campoccia, A.; Di Silvestre, M.L.; Incontrera, I.; Riva Sanseverino, E. [Dipartimento di Ingegneria Elettrica elettronica e delle Telecomunicazioni, Universita degli Studi di Palermo, viale delle Scienze, 90128 Palermo (Italy); Spoto, G. [Centro per la Ricerca Elettronica in Sicilia, Monreale, Via Regione Siciliana 49, 90046 Palermo (Italy)

    2010-10-15

    Service continuity is one of the major aspects in the definition of the quality of electrical energy; for this reason, research in the field of fault diagnostics for distribution systems continues to expand. Moreover, increasing interest in modern distribution system automation for management purposes gives fault diagnostics more tools to detect outages precisely and quickly. In this paper, the applicability of an efficient fault location and characterization methodology within a centralized monitoring system is discussed. The methodology, appropriate for any kind of fault, is based on the use of the analytical model of the network lines and uses the fundamental-component rms values taken from transient measurements of line currents and voltages at the MV/LV substations. The fault location and identification algorithm, proposed by the authors and suitably restated, has been implemented on a microprocessor-based device that can be installed at each MV/LV substation. The speed and precision of the algorithm have been tested against the errors deriving from the fundamental extraction within the prescribed fault clearing times, and against the inherent precision of the electronic device used for computation. The tests were carried out using Matlab Simulink to simulate the faulted system. (author)

  3. Faults Images

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Through the study of faults and their effects, much can be learned about the size and recurrence intervals of earthquakes. Faults also teach us about crustal...

  4. Fault finder

    Science.gov (United States)

    Bunch, Richard H.

    1986-01-01

    A fault finder for locating faults along a high voltage electrical transmission line. Real time monitoring of background noise and improved filtering of input signals is used to identify the occurrence of a fault. A fault is detected at both a master and remote unit spaced along the line. A master clock synchronizes operation of a similar clock at the remote unit. Both units include modulator and demodulator circuits for transmission of clock signals and data. All data is received at the master unit for processing to determine an accurate fault distance calculation.
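With synchronized clocks at the two terminals, fault distance can be computed from the difference in arrival times of the fault-generated disturbance at each end. The sketch below is the textbook double-ended calculation, not necessarily the patented device's algorithm; units are whatever is consistent between length, speed, and time.

```python
def fault_distance(line_length, wave_speed, t_master, t_remote):
    # A fault at distance x from the master launches a wave arriving at the
    # master terminal at x/v and at the remote terminal at (L - x)/v, so
    #   x = (L + v * (t_master - t_remote)) / 2.
    x = (line_length + wave_speed * (t_master - t_remote)) / 2
    if not 0 <= x <= line_length:
        raise ValueError("arrival times inconsistent with line length")
    return x
```

For a 300 km line with a wave speed near 300 km/ms, a fault 100 km from the master arrives there at 1/3 ms and at the remote at 2/3 ms; equal arrival times place the fault at midspan.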

  5. Approximation errors during variance propagation

    International Nuclear Information System (INIS)

    Dinsmore, Stephen

    1986-01-01

    Risk and reliability analyses are often performed by constructing and quantifying large fault trees. The inputs to these models are component failure events whose probabilities of occurring are best represented as random variables. This paper examines the errors inherent in two approximation techniques used to calculate the top event's variance from the inputs' variances. Two sample fault trees are evaluated, and several three-dimensional plots illustrating the magnitude of the error over a wide range of input means and variances are given.
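The kind of approximation error at issue can be illustrated on the smallest possible case, an AND gate whose top-event probability is the product of two independent basic-event probabilities. The first-order (delta-method) variance drops the cross term that the exact product variance retains; this is a generic illustration, not the specific techniques examined in the paper.

```python
def var_product_first_order(m1, v1, m2, v2):
    # Delta-method approximation for Var(p1 * p2) with independent inputs:
    # (dP/dp1)^2 * v1 + (dP/dp2)^2 * v2, derivatives evaluated at the means.
    return m2 ** 2 * v1 + m1 ** 2 * v2

def var_product_exact(m1, v1, m2, v2):
    # Exact variance of a product of independent random variables; the
    # approximation above drops the v1 * v2 cross term.
    return m2 ** 2 * v1 + m1 ** 2 * v2 + v1 * v2
```

With means 0.1 and 0.2 and variances 0.01 and 0.04, the first-order value is 0.0008 while the exact value is 0.0012, so the neglected cross term here accounts for a third of the true variance.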

  6. Fault diagnosis

    Science.gov (United States)

    Abbott, Kathy

    1990-01-01

    The objective of the research in this area of fault management is to develop and implement a decision aiding concept for diagnosing faults, especially faults which are difficult for pilots to identify, and to develop methods for presenting the diagnosis information to the flight crew in a timely and comprehensible manner. The requirements for the diagnosis concept were identified by interviewing pilots, analyzing actual incident and accident cases, and examining psychology literature on how humans perform diagnosis. The diagnosis decision aiding concept developed based on those requirements takes abnormal sensor readings as input, as identified by a fault monitor. Based on these abnormal sensor readings, the diagnosis concept identifies the cause or source of the fault and all components affected by the fault. This concept was implemented for diagnosis of aircraft propulsion and hydraulic subsystems in a computer program called Draphys (Diagnostic Reasoning About Physical Systems). Draphys is unique in two important ways. First, it uses models of both functional and physical relationships in the subsystems. Using both models enables the diagnostic reasoning to identify the fault propagation as the faulted system continues to operate, and to diagnose physical damage. Draphys also reasons about the behavior of the faulted system over time, to eliminate possibilities as more information becomes available, and to update the system status as more components are affected by the fault. The crew interface research is examining display issues associated with presenting diagnosis information to the flight crew. One study examined issues for presenting system status information. One lesson learned from that study was that pilots found fault situations to be more complex if they involved multiple subsystems. Another was that pilots could identify the faulted systems more quickly if the system status was presented in pictorial or text format. Another study is currently under way to...

  7. Concatenated codes for fault tolerant quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E.; Laflamme, R.; Zurek, W.

    1995-05-01

    The application of concatenated codes to fault tolerant quantum computing is discussed. We have previously shown that for quantum memories and quantum communication, a state can be transmitted with error ε provided each gate has error at most cε. We show how this can be used with Shor's fault tolerant operations to reduce the accuracy requirements when maintaining states not currently participating in the computation. Viewing Shor's fault tolerant operations as a method for reducing the error of operations, we give a concatenated implementation which promises to propagate the reduction hierarchically. This has the potential of reducing the accuracy requirements in long computations.
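The hierarchical reduction described above is often summarized by a simplified recurrence in which each concatenation level maps the error rate ε to cε²; below the threshold ε < 1/c the logical error shrinks doubly exponentially with level. The constant and numbers below are illustrative, not from the paper.

```python
def concatenated_error(eps, c, levels):
    # Simplified concatenation recurrence: each level replaces eps by
    # c * eps**2, so in closed form c * eps_k = (c * eps_0) ** (2 ** k).
    # Below threshold (c * eps < 1) the error is suppressed doubly
    # exponentially; above it, concatenation makes things worse.
    for _ in range(levels):
        eps = c * eps * eps
    return eps
```

For example, with a (hypothetical) c = 100 and physical error 1e-4, one level gives 1e-6 and two levels give 1e-10, whereas starting above threshold at 0.5 one level inflates the error instead.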

  8. Opioid Prescribing PSA (:60)

    Centers for Disease Control (CDC) Podcasts

    This 60 second public service announcement is based on the July 2017 CDC Vital Signs report. Higher opioid prescribing puts patients at risk for addiction and overdose. Learn what can be done about this serious problem.

  9. About faults, errors, and other dangerous things

    NARCIS (Netherlands)

    Rauterberg, G.W.M.

    1997-01-01

    In this paper the traditional paradigm for the learning and training of operators in complex systems is discussed and criticised. There is a strong influence coming from research carried out in artificial intelligence (AI). The best-known arguments against the AI approach are presented and...

  10. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

    This paper analyzes the root causes of safety-related software faults. Faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  11. Beyond the basics: refills by electronic prescribing.

    Science.gov (United States)

    Goldman, Roberta E; Dubé, Catherine; Lapane, Kate L

    2010-07-01

    E-prescribing is part of a new generation of electronic solutions for the medical industry that may have great potential for improving work flow and communication between medical practices and pharmacies. In the US, it has been introduced with minimal monitoring of errors and general usability. This paper examines refill functionality in e-prescribing software. A mixed-method study including focus groups and surveys was conducted. Qualitative data were collected in on-site focus groups or individual interviews with clinicians and medical office staff at 64 physician office practices. Focus group participants described their experiences with the refill functionality of e-prescribing software, provided suggestions for improving it, and suggested improvements in office procedures and software functionality. Overall, an approximately 50% reduction in time spent each day on refills was reported. Overall reports of refill functionality were positive, but clinicians and staff identified numerous difficulties and glitches associated with managing prescription refills. These glitches diminished over time. Benefits included time saved as well as patient convenience. The potential for refilling without thought because of the ease of use was noted. Clinicians and staff appreciated the ability to track whether patients are filling and refilling prescriptions. E-prescribing software for managing medication refills has not yet reached its full potential. To reduce work-flow barriers and medication errors, software companies need to develop error-reporting systems and response teams to deal effectively with problems experienced by users. Examining usability issues on both the medical office and pharmacy ends is required to identify the behavioral and cultural changes that accompany technological innovation and ease the transition to full use of e-prescribing software. 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Quantum error correction for beginners

    International Nuclear Information System (INIS)

    Devitt, Simon J; Nemoto, Kae; Munro, William J

    2013-01-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation have since grown into a much larger field, and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future. (review article)
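The simplest example used in introductions of this kind is the three-qubit repetition code for bit-flip errors. The following purely classical Python toy (our illustration, not taken from the article) shows the core ideas of syndrome extraction via parity checks and recovery by majority vote:

```python
import random

def encode(bit):
    """Triplicate one logical bit (classical analogue of the 3-qubit code)."""
    return [bit] * 3

def apply_error(codeword, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in codeword]

def syndrome(codeword):
    """Parity checks on pairs (1,2) and (2,3); a nonzero pattern
    localizes a single bit flip without reading the data directly."""
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword):
    """Majority vote recovers the logical bit if at most one flip occurred."""
    return int(sum(codeword) >= 2)

# A single injected bit flip is detected and corrected:
cw = encode(1)
cw[0] ^= 1
assert syndrome(cw) == (1, 0)   # syndrome points at the first bit
assert correct(cw) == 1

# The same holds for random errors whenever at most one bit flips:
random.seed(1)
noisy = apply_error(encode(0), 0.2)
assert correct(noisy) == 0
```

The quantum version replaces the parity checks with stabilizer measurements, which reveal the error location without collapsing the encoded superposition.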

  13. Opioid Prescribing PSA (:60)

    Centers for Disease Control (CDC) Podcasts

    2017-07-06

    This 60 second public service announcement is based on the July 2017 CDC Vital Signs report. Higher opioid prescribing puts patients at risk for addiction and overdose. Learn what can be done about this serious problem.  Created: 7/6/2017 by Centers for Disease Control and Prevention (CDC).   Date Released: 7/6/2017.

  14. Matrix with Prescribed Eigenvectors

    Science.gov (United States)

    Ahmad, Faiz

    2011-01-01

    It is a routine matter for undergraduates to find eigenvalues and eigenvectors of a given matrix. But the converse problem of finding a matrix with prescribed eigenvalues and eigenvectors is rarely discussed in elementary texts on linear algebra. This problem is related to the "spectral" decomposition of a matrix and has important technical…
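The converse construction described in this record follows directly from the spectral decomposition: given an invertible matrix V whose columns are the prescribed eigenvectors and a diagonal matrix of the prescribed eigenvalues, A = V diag(lambda) V^(-1) has exactly those eigenpairs. A minimal numpy sketch, with illustration values chosen arbitrarily:

```python
import numpy as np

# Prescribed eigenvalues and linearly independent eigenvectors (columns of V);
# the numbers are arbitrary illustration values, not from the article.
eigvals = np.array([2.0, -1.0, 3.0])
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Spectral synthesis: A = V diag(lambda) V^{-1}
A = V @ np.diag(eigvals) @ np.linalg.inv(V)

# Verify each prescribed eigenpair: A v_i = lambda_i v_i
for lam, v in zip(eigvals, V.T):
    assert np.allclose(A @ v, lam * v)
```

The construction only requires that the prescribed eigenvectors be linearly independent, so that V is invertible.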

  15. Negligence, genuine error, and litigation

    Directory of Open Access Journals (Sweden)

    Sohn DH

    2013-02-01

    Full Text Available David H Sohn, Department of Orthopedic Surgery, University of Toledo Medical Center, Toledo, OH, USA. Abstract: Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine, or due to system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and review current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment into more rational oversight systems, such as health courts or no-fault systems, may reap both quantitative and qualitative benefits for a less costly and safer health system. Keywords: medical malpractice, tort reform, no fault compensation, alternative dispute resolution, system errors

  16. Neuroadaptive Fault-Tolerant Control of Nonlinear Systems Under Output Constraints and Actuation Faults.

    Science.gov (United States)

    Zhao, Kai; Song, Yongduan; Shen, Zhixi

    2018-02-01

    In this paper, a neuroadaptive fault-tolerant tracking control method is proposed for a class of time-delay pure-feedback systems in the presence of external disturbances and actuation faults. The proposed controller can achieve prescribed transient and steady-state performance, despite uncertain time delays and output constraints as well as actuation faults. By combining a tangent barrier Lyapunov-Krasovskii function with the dynamic surface control technique, the neural network unit in the developed control scheme is able to take its action from the very beginning and play its learning/approximating role safely during the entire system operational envelope, leading to enhanced control performance without the danger of violating compact set precondition. Furthermore, prescribed transient performance and output constraints are strictly ensured in the presence of nonaffine uncertainties, external disturbances, and undetectable actuation faults. The control strategy is also validated by numerical simulation.

  17. A model of methods for influencing prescribing: Part I. A review of prescribing models, persuasion theories, and administrative and educational methods.

    Science.gov (United States)

    Raisch, D W

    1990-04-01

    The purpose of this literature review is to develop a model of methods to be used to influence prescribing. Four bodies of literature were identified as being important for developing the model: (1) Theoretical prescribing models furnish information concerning factors that affect prescribing and how prescribing decisions are made. (2) Theories of persuasion provide insight into important components of educational communications. (3) Research articles of programs to improve prescribing identify types of programs that have been found to be successful. (4) Theories of human inference describe how judgments are formulated and identify errors in judgment that can play a role in prescribing. This review is presented in two parts. This article reviews prescribing models, theories of persuasion, studies of administrative programs to control prescribing, and sub-optimally designed studies of educational efforts to influence drug prescribing.

  18. Prescribing practices for pediatric out-patients: A case study of two ...

    African Journals Online (AJOL)

    Purpose: The objective of this study was to evaluate drug utilization pattern in the pediatric ... Medication error can affect ... medication error may be caused by many factors ... pharmacokinetic .... prescriber's performance, patients experience at.

  19. Development of methods for evaluating active faults

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    The report for the long-term evaluation of active faults was published by the Headquarters for Earthquake Research Promotion in November 2010. After the occurrence of the 2011 Tohoku-oki earthquake, the safety review guide with regard to the geology and ground of sites was revised by the Nuclear Safety Commission in March 2012 with scientific knowledge of the earthquake. The Nuclear Regulation Authority, established in September 2012, is newly planning the New Safety Design Standard related to Earthquakes and Tsunamis of Light Water Nuclear Power Reactor Facilities. With respect to those guides and standards, our investigations for developing methods of evaluating active faults are as follows: (1) For better evaluation of offshore fault activity, we proposed a workflow to date marine terraces (indicators of offshore fault activity) during the last 400,000 years. We also developed the analysis of fault-related folds for evaluating blind faults. (2) To clarify the activities of active faults without superstratum, we carried out color analysis of fault gouge and divided the activities into thousands of years and tens of thousands of years. (3) To reduce uncertainties in fault activities and earthquake frequency, we compiled the survey data and possible errors. (4) To improve seismic hazard analysis, we compiled the fault activities of the Yunotake and Itozawa faults, induced by the 2011 Tohoku-oki earthquake. (author)

  20. Changing doctor prescribing behaviour

    DEFF Research Database (Denmark)

    Gill, P.S.; Mäkelä, M.; Vermeulen, K.M.

    1999-01-01

    The aim of this overview was to identify interventions that change doctor prescribing behaviour and to derive conclusions for practice and further research. Relevant studies (indicating prescribing as a behaviour change) were located from a database of studies maintained by the Cochrane Collaboration on Effective Professional Practice. This register is kept up to date by searching the following databases for reports of relevant research: DHSS-DATA; EMBASE; MEDLINE; SIGLE; Resource Database in Continuing Medical Education (1975-1994), along with bibliographies of related topics, hand searching of key journals and personal contact with content area experts. Randomised controlled trials and non-equivalent group designs with pre- and post-intervention measures were included. Outcome measures were those used by the study authors. For each study we determined whether these were positive, negative

  1. Prescribed fire research in Pennsylvania

    Science.gov (United States)

    Patrick Brose

    2009-01-01

    Prescribed fire in Pennsylvania is a relatively new forestry practice because of the State's adverse experience with highly destructive wildfires in the early 1900s. The recent introduction of prescribed fire raises a myriad of questions regarding its correct and safe use. This poster briefly describes the prescribed fire research projects of the Forestry Sciences...

  2. Using total quality management approach to improve patient safety by preventing medication error incidences*.

    Science.gov (United States)

    Yousef, Nadin; Yousef, Farah

    2017-09-04

    Whereas one of the predominant causes of medication errors is drug administration error, a previous study related to our investigations and reviews estimated that medication errors occurred in 6.7 out of 100 administered medication doses. We therefore aimed, using the six sigma approach, to propose a way to reduce these errors to fewer than 1 out of 100 administered medication doses by improving healthcare professional education and producing clearer handwritten prescriptions. The study was held in a General Government Hospital. First, we systematically studied the current medication use process. Second, we used the six sigma approach, utilizing the five-step DMAIC process (Define, Measure, Analyze, Improve, Control), to find the real reasons behind such errors and to identify a useful solution for avoiding medication error incidences in daily healthcare professional practice. A data sheet was used as the data tool and Pareto diagrams as the analysis tool. In our investigation, we identified the real cause behind administered medication errors. The Pareto diagrams used in our study showed that the fault percentage in the administration phase was 24.8%, while the percentage of errors related to the prescribing phase was 42.8%, 1.7 times higher. This means that mistakes in the prescribing phase, especially poor handwritten prescriptions (17.6% of the errors in this phase), are responsible for consequent mistakes later in the treatment process. We therefore proposed in this study an effective, low-cost strategy based on the behavior of healthcare workers: guideline recommendations to be followed by physicians. This precaution can decrease errors in the prescribing phase, which may in turn decrease administered medication error incidences to less than 1%. This behavioral improvement can help improve handwritten prescriptions and decrease the consequent errors related to medication administration.
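A Pareto analysis of the kind the abstract describes can be sketched briefly. The counts below are hypothetical stand-ins (chosen to reproduce the quoted 42.8% and 24.8% shares), not the study's raw data:

```python
# Hypothetical error counts per phase of the medication use process, chosen
# to reproduce the shares quoted in the abstract; not the study's raw data.
error_counts = {
    "prescribing": 428,
    "administration": 248,
    "transcription": 180,
    "dispensing": 144,
}

total = sum(error_counts.values())

# Sort descending and accumulate shares, as a Pareto diagram does, to expose
# the "vital few" phases that account for most errors.
cumulative = 0.0
pareto = []
for phase, count in sorted(error_counts.items(), key=lambda kv: -kv[1]):
    share = 100.0 * count / total
    cumulative += share
    pareto.append((phase, round(share, 1), round(cumulative, 1)))

for phase, share, cum in pareto:
    print(f"{phase:15s} {share:5.1f}%  cumulative {cum:5.1f}%")
```

Reading the cumulative column top-down identifies the phases to target first; here the prescribing phase alone carries the largest share, matching the study's conclusion.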

  3. Optimal fault signal estimation

    NARCIS (Netherlands)

    Stoorvogel, Antonie Arij; Niemann, H.H.; Saberi, A.; Sannuti, P.

    2002-01-01

    We consider here both fault identification and fault signal estimation. Regarding fault identification, we seek either exact or almost fault identification. On the other hand, regarding fault signal estimation, we seek either $H_2$ optimal, $H_2$ suboptimal or Hinfinity suboptimal estimation. By

  4. Detecting and classifying faults on transmission systems using a backpropagation neural network; Deteccion y clasificacion de fallas en sistemas de transmision empleando una red neuronal con retropropagacion del error

    Energy Technology Data Exchange (ETDEWEB)

    Rosas Ortiz, German

    2000-01-01

    Fault detection and diagnosis on transmission systems is an interesting area of investigation for Artificial Intelligence (AI) based systems. Neurocomputing is one of the fastest growing areas of research in the fields of AI and pattern recognition. This work explores the suitability of the pattern recognition approach of neural networks for fault detection and classification on power systems. The conventional detection techniques in modern relays are based on digital signal processing and need some time (around one cycle) to send a tripping signal; they are also likely to make incorrect decisions if the signals are noisy. It is desirable to develop a fast, accurate, and robust approach that performs accurately under changing system conditions (such as load variations and fault resistance). The aim of this work is to develop a novel technique based on Artificial Neural Networks (ANN) that explores the suitability of a pattern classification approach for fault detection and diagnosis. The suggested approach is based on the fact that when a fault occurs, a change in the system impedance takes place and, as a consequence, changes in the amplitude and phase of line voltage and current signals occur. The ANN-based fault discriminator is trained to detect these changes as indicators of the instant of fault inception. This detector uses instantaneous values of these signals to make decisions. The suitability of using neural networks as pattern classifiers for transmission system fault diagnosis is described in detail, and a neural network design and simulation environment for real-time use is presented. Results showing the performance of this approach indicate that it is fast, secure, and sufficiently exact, and that it can be used in high-speed fault detection and classification schemes.
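As a loose illustration of the pattern-classification idea (not the thesis's network, features, or data), a single perceptron can learn to flag faults from amplitude-change features. The feature values and decision boundary below are invented:

```python
# Toy stand-in for an ANN fault discriminator: a single perceptron trained on
# invented (|voltage change|, |current change|) feature pairs. Label 1 = fault.
normal = [(0.00, 0.00), (0.05, 0.02), (0.10, 0.05), (0.02, 0.10)]
fault  = [(0.40, 0.60), (0.70, 0.90), (0.50, 0.80), (0.90, 0.70)]
data = [(x, y, 0) for x, y in normal] + [(x, y, 1) for x, y in fault]

w, b = [0.0, 0.0], 0.0

def predict(dv, di):
    """Linear decision: positive activation flags a fault."""
    return int(w[0] * dv + w[1] * di + b > 0)

# Classic error-driven perceptron rule; the two classes are linearly
# separable, so training is guaranteed to converge.
for _ in range(1000):
    mistakes = 0
    for dv, di, label in data:
        err = label - predict(dv, di)
        if err:
            mistakes += 1
            w[0] += 0.1 * err * dv
            w[1] += 0.1 * err * di
            b += 0.1 * err
    if mistakes == 0:
        break

assert predict(0.60, 0.80) == 1   # large amplitude changes -> fault
assert predict(0.02, 0.03) == 0   # quiescent signals -> no fault
```

A real relay would feed sampled instantaneous voltage and current values into a multilayer network, but the training loop above shows the error-driven weight update that underlies the approach.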

  5. Dependability validation by means of fault injection: method, implementation, application

    International Nuclear Information System (INIS)

    Arlat, Jean

    1990-01-01

    This dissertation presents theoretical and practical results concerning the use of fault injection as a means for testing fault tolerance in the framework of the experimental dependability validation of computer systems. The dissertation first presents the state of the art of published work on fault injection, encompassing both hardware (fault simulation, physical fault injection) and software (mutation testing) issues. Next, the major attributes of fault injection (faults and their activation, experimental readouts and measures) are characterized taking into account: i) the abstraction levels used to represent the system during the various phases of its development (analytical, empirical and physical models), and ii) the validation objectives (verification and evaluation). An evaluation method is subsequently proposed that combines the analytical modeling approaches (Monte Carlo simulations, closed-form expressions, Markov chains) used for the representation of the fault occurrence process with the experimental fault injection approaches (fault simulation and physical injection) characterizing the error processing and fault treatment provided by the fault tolerance mechanisms. An experimental tool - MESSALINE - is then defined and presented. This tool enables physical faults to be injected in a hardware and software prototype of the system to be validated. Finally, the application of MESSALINE for testing two fault-tolerant systems possessing very dissimilar features, and the utilization of the experimental results obtained - both as design feedback and for the evaluation of dependability measures - illustrate the relevance of the method. (author) [fr

  6. A fault-tolerant software strategy for digital systems

    Science.gov (United States)

    Hitt, E. F.; Webb, J. J.

    1984-01-01

    Techniques developed for producing fault-tolerant software are described. Tolerance is required because of the impossibility of defining fault-free software. Faults are caused by humans and can appear anywhere in the software life cycle. Tolerance is effected through error detection, damage assessment, recovery, and fault treatment, followed by return of the system to service. Multiversion software comprises two or more versions of the software yielding solutions which are examined by a decision algorithm. Errors can also be detected by extrapolation from previous results or by the acceptability of results. Violations of timing specifications can reveal errors, or the system can roll back to an error-free state when a defect is detected. The software, when used in flight control systems, must not impinge on time-critical responses. Efforts are still needed to reduce the costs of developing the fault-tolerant systems.
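Multiversion software as summarized above can be sketched with independently written "versions" of a routine and a majority-vote decision algorithm. The example, including its deliberately faulty third version, is invented for illustration:

```python
from collections import Counter

def sqrt_v1(x):
    """Version 1: Newton's method."""
    g = x or 1.0
    for _ in range(50):
        g = 0.5 * (g + x / g)
    return round(g, 6)

def sqrt_v2(x):
    """Version 2: exponentiation operator."""
    return round(x ** 0.5, 6)

def sqrt_v3(x):
    """Version 3: deliberately faulty, to exercise the voter."""
    return round(x / 2, 6)

def vote(results):
    """Decision algorithm: accept the value a majority of versions agree on."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority -- fail safe")
    return value

results = [version(9.0) for version in (sqrt_v1, sqrt_v2, sqrt_v3)]
assert vote(results) == 3.0   # the faulty version's 4.5 is outvoted
```

Rounding the results before comparison stands in for the acceptance tolerance a real decision algorithm needs, since independent implementations rarely agree bit-for-bit on floating-point output.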

  7. Error Patterns

    NARCIS (Netherlands)

    Hoede, C.; Li, Z.

    2001-01-01

    In coding theory the problem of decoding focuses on error vectors. In the simplest situation code words are $(0,1)$-vectors, as are the received messages and the error vectors. Comparison of a received word with the code words yields a set of error vectors. In deciding on the original code word,
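The decoding step sketched in this abstract, comparing a received word with each code word and preferring the code word whose error vector has the fewest ones, can be illustrated with a toy repetition code (the code and words are our illustration, not the paper's):

```python
# A toy binary code: the two code words of the (3,1) repetition code.
CODE = [(0, 0, 0), (1, 1, 1)]

def error_vector(received, codeword):
    """Componentwise XOR: the (0,1)-vector of positions that differ."""
    return tuple(a ^ b for a, b in zip(received, codeword))

def hamming(u, v):
    """Weight of the error vector: the number of differing positions."""
    return sum(error_vector(u, v))

def decode(received):
    """Minimum-distance decoding: choose the code word whose error
    vector has the fewest ones."""
    return min(CODE, key=lambda c: hamming(received, c))

assert error_vector((0, 1, 0), (0, 0, 0)) == (0, 1, 0)
assert decode((0, 1, 0)) == (0, 0, 0)   # single error corrected
assert decode((1, 1, 0)) == (1, 1, 1)
```

Choosing the lightest error vector is exactly maximum-likelihood decoding on a binary symmetric channel with crossover probability below one half.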

  8. Social determinants of prescribed and non-prescribed medicine use

    Directory of Open Access Journals (Sweden)

    García-Altés Anna

    2010-05-01

    Full Text Available Abstract Background: The aim of the present study was to describe the use of prescribed and non-prescribed medicines in a non-institutionalised population older than 15 years of an urban area during the year 2000, in terms of age and gender, social class, employment status and type of primary health care. Methods: Cross-sectional study. Information came from the 2000 Barcelona Health Interview Survey. The indicators used were the prevalence of use of prescribed and non-prescribed medicines in the two weeks prior to the interview. Descriptive analyses and bivariate and multivariate logistic regression analyses were carried out. Results: More women than men took medicines (75.8% vs. 60%, respectively). The prevalence of use of prescribed medicines increased with age while the prevalence of non-prescribed use decreased. These age differences are smaller among those with poor perceived health. In terms of social class, a higher percentage of men with good health in the more advantaged classes took non-prescribed medicines compared with the disadvantaged classes (38.7% vs 31.8%). In contrast, among the group with poor health, more people from the more advantaged classes took prescribed medicines, compared with the disadvantaged classes (51.4% vs 33.3%). A higher proportion of people who were either retired, unemployed or students, with good health, used prescribed medicines. Conclusion: This study shows that besides health needs, there are social determinants affecting medicine consumption in the city of Barcelona.

  9. Fault latency in the memory - An experimental study on VAX 11/780

    Science.gov (United States)

    Chillarege, Ram; Iyer, Ravishankar K.

    1986-01-01

    Fault latency is the time between the physical occurrence of a fault and its corruption of data, causing an error. This time is difficult to measure because the time of occurrence of a fault and the exact moment of generation of an error are not known. This paper describes an experiment to accurately study fault latency in the memory subsystem. The experiment employs real memory data from a VAX 11/780 at the University of Illinois. Fault latency distributions are generated for s-a-0 and s-a-1 permanent fault models. Results show that the mean fault latency of an s-a-0 fault is nearly 5 times that of the s-a-1 fault. Large variations in fault latency are found for different regions in memory. An analysis-of-variance model quantifying the relative influence of various workload measures on the evaluated latency is also given.

  10. The mechanics of fault-bend folding and tear-fault systems in the Niger Delta

    Science.gov (United States)

    Benesh, Nathan Philip

    This dissertation investigates the mechanics of fault-bend folding using the discrete element method (DEM) and explores the nature of tear-fault systems in the deep-water Niger Delta fold-and-thrust belt. In Chapter 1, we employ the DEM to investigate the development of growth structures in anticlinal fault-bend folds. This work was inspired by observations that growth strata in active folds show a pronounced upward decrease in bed dip, in contrast to traditional kinematic fault-bend fold models. Our analysis shows that the modeled folds grow largely by parallel folding as specified by the kinematic theory; however, the process of folding over a broad axial surface zone yields a component of fold growth by limb rotation that is consistent with the patterns observed in natural folds. This result has important implications for how growth structures can be used to constrain slip and paleo-earthquake ages on active blind-thrust faults. In Chapter 2, we expand our DEM study to investigate the development of a wider range of fault-bend folds. We examine the influence of mechanical stratigraphy and quantitatively compare our models with the relationships between fold and fault shape prescribed by the kinematic theory. While the synclinal fault-bend models closely match the kinematic theory, the modeled anticlinal fault-bend folds show robust behavior that is distinct from the kinematic theory. Specifically, we observe that modeled structures maintain a linear relationship between fold shape (gamma) and fault-horizon cutoff angle (theta), rather than expressing the non-linear relationship with two distinct modes of anticlinal folding that is prescribed by the kinematic theory. These observations lead to a revised quantitative relationship for fault-bend folds that can serve as a useful interpretation tool. Finally, in Chapter 3, we examine the 3D relationships of tear- and thrust-fault systems in the western, deep-water Niger Delta. Using 3D seismic reflection data and new

  11. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  12. Refractive Errors

    Science.gov (United States)

    ... power glasses focus the light rays on the retina and improve vision. Myopia usually progresses yearly and stabilizes by the ... several points (in front and/or behind the retina). Near and distant vision is affected. Cylindrical power glasses are prescribed for ...

  13. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    In addition, at spectacular events a combination of component failure and human error is often found. The Rasmussen Report and the German Risk Assessment Study in particular show for pressurised water reactors that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if thorough training of personnel is combined with an adequate design of the plant against accidents. Contrary to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  14. Information Based Fault Diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2008-01-01

    Fault detection and isolation (FDI) of parametric faults in dynamic systems is considered in this paper. An active fault diagnosis (AFD) approach is applied. The fault diagnosis is investigated with respect to different information levels from the external inputs to the systems. These ...

  15. A fault-tolerant one-way quantum computer

    International Nuclear Information System (INIS)

    Raussendorf, R.; Harrington, J.; Goyal, K.

    2006-01-01

    We describe a fault-tolerant one-way quantum computer on cluster states in three dimensions. The presented scheme uses methods of topological error correction resulting from a link between cluster states and surface codes. The error threshold is 1.4% for local depolarizing error and 0.11% for each source in an error model with preparation-, gate-, storage-, and measurement errors

  16. Prescribing antibiotics in general practice:

    DEFF Research Database (Denmark)

    Sydenham, Rikke Vognbjerg; Pedersen, Line Bjørnskov; Plejdrup Hansen, Malene

    Objectives: The majority of antibiotics are prescribed from general practice. The use of broad-spectrum antibiotics increases the risk of development of bacteria resistant to antibiotic treatment. In spite of guidelines aiming to minimize the use of broad-spectrum antibiotics, we see an increase in the use of these agents. The overall aim of the project is to explore factors influencing the decision process and the prescribing behaviour of GPs when prescribing antibiotics. We will study the impact of microbiological testing on the choice of antibiotic. Furthermore, the project will explore how the GPs' prescribing behaviour is influenced by selected factors. Method: The study consists of a register-based study and a questionnaire study. The register-based study is based on data from the Register of Medicinal Product Statistics (prescribed antibiotics), Statistics Denmark (socio-demographic data...

  17. Soft errors in modern electronic systems

    CERN Document Server

    Nicolaidis, Michael

    2010-01-01

    This book provides a comprehensive presentation of the most advanced research results and technological developments enabling understanding, qualifying and mitigating the effects of soft errors in advanced electronics, including the fundamental physical mechanisms of radiation-induced soft errors, the various steps that lead to a system failure, the modelling and simulation of soft errors at various levels (including physical, electrical, netlist, event-driven, RTL, and system-level modelling and simulation), hardware fault injection, accelerated radiation testing and natural environment testing, s

  18. Active Fault-Tolerant Control for Wind Turbine with Simultaneous Actuator and Sensor Faults

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2017-01-01

    Full Text Available The purpose of this paper is to present a novel fault-tolerant tracking control (FTC) strategy with robust fault estimation and compensation for simultaneous actuator and sensor faults. Within the framework of fault-tolerant control, developing an FTC design method for wind turbines that can tolerate simultaneous pitch actuator and pitch sensor faults with bounded first time derivatives is a challenge. The paper's key contribution is a descriptor sliding mode method: by introducing an auxiliary descriptor state vector composed of the system state vector, the actuator fault vector, and the sensor fault vector, a novel augmented descriptor system is established, with which the system state can be estimated and the faults reconstructed by designing a descriptor sliding mode observer. Using LMI optimization, stability conditions for the estimation error dynamics are established to determine the designed parameters. With this estimate, a fault-tolerant controller is designed and the system's stability can be maintained. The effectiveness of the design strategy is verified by implementing the controller on the National Renewable Energy Laboratory's 5-MW nonlinear, high-fidelity wind turbine model (FAST) and simulating it in MATLAB/Simulink.

  19. Summary: beyond fault trees to fault graphs

    International Nuclear Information System (INIS)

    Alesso, H.P.; Prassinos, P.; Smith, C.F.

    1984-09-01

    Fault Graphs are the natural evolutionary step over a traditional fault-tree model. A Fault Graph is a failure-oriented directed graph with logic connectives that allows cycles. We intentionally construct the Fault Graph to trace the piping and instrumentation drawing (P and ID) of the system, but with logical AND and OR conditions added. Then we evaluate the Fault Graph with computer codes based on graph-theoretic methods. Fault Graph computer codes are based on graph concepts, such as path set (a set of nodes traveled on a path from one node to another) and reachability (the complete set of all possible paths between any two nodes). These codes are used to find the cut-sets (any minimal set of component failures that will fail the system) and to evaluate the system reliability
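The graph-theoretic notions this summary defines, path sets and reachability on a failure-oriented directed graph that may contain cycles, can be sketched in a few lines. The example graph below is invented for illustration (the logic connectives are omitted for brevity):

```python
# A tiny failure-oriented directed graph; node names and topology are
# invented. Edges point along failure propagation, and cycles are allowed,
# as in a Fault Graph.
edges = {
    "pump_fails":     ["flow_lost"],
    "valve_stuck":    ["flow_lost"],
    "flow_lost":      ["tank_overheats"],
    "tank_overheats": ["flow_lost", "system_down"],   # feedback cycle
}

def path_sets(start, goal, path=None):
    """All simple paths (path sets) from start to goal."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    found = []
    for nxt in edges.get(start, []):
        if nxt not in path:          # skip revisited nodes so cycles terminate
            found += path_sets(nxt, goal, path)
    return found

def reachable(start):
    """Reachability: the complete set of nodes some path from start reaches."""
    seen, stack = set(), [start]
    while stack:
        for nxt in edges.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

assert path_sets("pump_fails", "system_down") == [
    ["pump_fails", "flow_lost", "tank_overheats", "system_down"]]
assert reachable("valve_stuck") == {"flow_lost", "tank_overheats", "system_down"}
```

Restricting to simple paths is what lets the evaluation terminate on graphs with cycles, which is the capability that distinguishes Fault Graphs from acyclic fault trees.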

  20. Fault tree handbook

    International Nuclear Information System (INIS)

    Haasl, D.F.; Roberts, N.H.; Vesely, W.E.; Goldberg, F.F.

    1981-01-01

    This handbook describes a methodology for reliability analysis of complex systems such as those which comprise the engineered safety features of nuclear power generating stations. After an initial overview of the available system analysis approaches, the handbook focuses on a description of the deductive method known as fault tree analysis. The following aspects of fault tree analysis are covered: basic concepts for fault tree analysis; basic elements of a fault tree; fault tree construction; probability, statistics, and Boolean algebra for the fault tree analyst; qualitative and quantitative fault tree evaluation techniques; and computer codes for fault tree evaluation. Also discussed are several example problems illustrating the basic concepts of fault tree construction and evaluation
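The qualitative evaluation step the handbook covers, finding minimal cut sets, can be illustrated on a toy tree. The brute-force enumeration below is a didactic sketch (tractable only for small trees), not the handbook's algorithms:

```python
from itertools import combinations

# Toy fault tree: TOP = OR(AND(A, B), C); the system fails if basic event C
# occurs, or if A and B both occur. The tree is invented for illustration.
TREE = ("OR", ("AND", "A", "B"), "C")

def fails(node, failed):
    """Evaluate the top event for a given set of failed basic events."""
    if isinstance(node, str):
        return node in failed
    gate, *children = node
    results = [fails(child, failed) for child in children]
    return all(results) if gate == "AND" else any(results)

def minimal_cut_sets(tree, events):
    """Brute-force search for the minimal sets of basic-event failures
    that fail the system."""
    cuts = []
    for size in range(1, len(events) + 1):
        for combo in combinations(events, size):
            candidate = set(combo)
            # Keep the candidate only if no smaller cut set is contained in it.
            if fails(tree, candidate) and not any(c <= candidate for c in cuts):
                cuts.append(candidate)
    return cuts

assert minimal_cut_sets(TREE, ["A", "B", "C"]) == [{"C"}, {"A", "B"}]
```

Enumerating by increasing set size guarantees minimality: any superset of an already-found cut set is filtered out before it can be recorded.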

  1. A fault tolerant system by using distributed RTOS

    International Nuclear Information System (INIS)

    Ge Yingan; Liu Songqiang; Wang Yanfang

    1999-01-01

    The author describes the design and implementation of a prototype distributed fault-tolerant system, developed under the QNX RTOS by networking two standard PCs. By using a watchdog timer for error detection, the system can tolerate fail-silent and transient faults of a single node.

  2. A Novel Design for Drug-Drug Interaction Alerts Improves Prescribing Efficiency.

    Science.gov (United States)

    Russ, Alissa L; Chen, Siying; Melton, Brittany L; Johnson, Elizabette G; Spina, Jeffrey R; Weiner, Michael; Zillich, Alan J

    2015-09-01

    Drug-drug interactions (DDIs) are common in clinical care and pose serious risks for patients. Electronic health records display DDI alerts that can influence prescribers, but the interface design of DDI alerts has largely been unstudied. In this study, the objective was to apply human factors engineering principles to alert design. It was hypothesized that redesigned DDI alerts would significantly improve prescribers' efficiency and reduce prescribing errors. In a counterbalanced, crossover study with prescribers, two DDI alert designs were evaluated. Department of Veterans Affairs (VA) prescribers were video recorded as they completed fictitious patient scenarios, which included DDI alerts of varying severity. Efficiency was measured from time-stamped recordings. Prescribing errors were evaluated against predefined criteria. Efficiency and prescribing errors were analyzed with the Wilcoxon signed-rank test. Other usability data were collected on the adequacy of alert content, prescribers' use of the DDI monograph, and alert navigation. Twenty prescribers completed patient scenarios for both designs. Prescribers resolved redesigned alerts in about half the time (redesign: 52 seconds versus original design: 97 seconds; p<.001). Prescribing errors were not significantly different between the two designs. Usability results indicate that DDI alerts might be enhanced by facilitating easier access to laboratory data and dosing information and by allowing prescribers to cancel either interacting medication directly from the alert. Results also suggest that neither design provided adequate information for decision making via the primary interface. Applying human factors principles to DDI alerts improved overall efficiency. Aspects of DDI alert design that could be further enhanced prior to implementation were also identified.

  3. Software fault tolerance in computer operating systems

    Science.gov (United States)

    Iyer, Ravishankar K.; Lee, Inhwan

    1994-01-01

    This chapter provides data and analysis of the dependability and fault tolerance for three operating systems: the Tandem/GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Based on measurements from these systems, basic software error characteristics are investigated. Fault tolerance in operating systems resulting from the use of process pairs and recovery routines is evaluated. Two levels of models are developed to analyze error and recovery processes inside an operating system and interactions among multiple instances of an operating system running in a distributed environment. The measurements show that the use of process pairs in Tandem systems, which was originally intended for tolerating hardware faults, allows the system to tolerate about 70% of defects in system software that result in processor failures. The loose coupling between processors which results in the backup execution (the processor state and the sequence of events occurring) being different from the original execution is a major reason for the measured software fault tolerance. The IBM/MVS system fault tolerance almost doubles when recovery routines are provided, in comparison to the case in which no recovery routines are available. However, even when recovery routines are provided, there is almost a 50% chance of system failure when critical system jobs are involved.

  4. Negligence, genuine error, and litigation

    Science.gov (United States)

    Sohn, David H

    2013-01-01

    Not all medical injuries are the result of negligence. In fact, most medical injuries result either from the inherent risk in the practice of medicine or from system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment in more rational oversight systems, such as health courts or no-fault systems, may reap both quantitative and qualitative benefits for a less costly and safer health system. PMID:23426783

  5. A study on quantification of unavailability of DPPS with fault tolerant techniques considering fault tolerant techniques' characteristics

    International Nuclear Information System (INIS)

    Kim, B. G.; Kang, H. G.; Kim, H. E.; Seung, P. H.; Kang, H. G.; Lee, S. J.

    2012-01-01

    With the improvement of digital technologies, digital I&C systems have come to include a wider variety of fault tolerant techniques than conventional analog I&C systems, in order to increase fault detection and to help the system safely perform the required functions in spite of the presence of faults. In the reliability evaluation of digital systems, therefore, the fault tolerant techniques (FTTs) and their fault coverage must be considered. Several studies have already modeled the effects of FTTs on digital system reliability. Building on a literature survey, this research attempts to develop a model to evaluate the plant-level reliability of the digital plant protection system (DPPS) with fault tolerant techniques, considering detection and process characteristics as well as human errors. Sensitivity analysis is performed to identify, from the proposed model, the variables with the greatest influence on fault management coverage and unavailability.

  6. Fault Tolerant Feedback Control

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, H.

    2001-01-01

    An architecture for fault tolerant feedback controllers based on the Youla parameterization is suggested. It is shown that the Youla parameterization will give a residual vector directly in connection with the fault diagnosis part of the fault tolerant feedback controller. It turns out that there is a separation between the feedback controller and the fault tolerant part. The closed loop feedback properties are handled by the nominal feedback controller, and the fault tolerant part is handled by the design of the Youla parameter. The design of the fault tolerant part will not affect the design of the nominal feedback controller.

  7. Einstein's error

    International Nuclear Information System (INIS)

    Winterflood, A.H.

    1980-01-01

    In discussing Einstein's Special Relativity theory it is claimed that the theory violates the principle of relativity itself and that an anomalous sign appears in the mathematical factor which transforms one inertial observer's measurements into those of another inertial observer. The apparent source of this error is discussed. Having corrected the error, a new theory, called Observational Kinematics, is introduced to replace Einstein's Special Relativity. (U.K.)

  8. Unrealized potential and residual consequences of electronic prescribing on pharmacy workflow in the outpatient pharmacy.

    Science.gov (United States)

    Nanji, Karen C; Rothschild, Jeffrey M; Boehne, Jennifer J; Keohane, Carol A; Ash, Joan S; Poon, Eric G

    2014-01-01

    Electronic prescribing systems have often been promoted as a tool for reducing medication errors and adverse drug events. Recent evidence has revealed that adoption of electronic prescribing systems can lead to unintended consequences such as the introduction of new errors. The purpose of this study is to identify and characterize the unrealized potential and residual consequences of electronic prescribing on pharmacy workflow in an outpatient pharmacy. A multidisciplinary team conducted direct observations of workflow in an independent pharmacy and semi-structured interviews with pharmacy staff members about their perceptions of the unrealized potential and residual consequences of electronic prescribing systems. We used qualitative methods to iteratively analyze text data using a grounded theory approach, and derive a list of major themes and subthemes related to the unrealized potential and residual consequences of electronic prescribing. We identified the following five themes: Communication, workflow disruption, cost, technology, and opportunity for new errors. These contained 26 unique subthemes representing different facets of our observations and the pharmacy staff's perceptions of the unrealized potential and residual consequences of electronic prescribing. We offer targeted solutions to improve electronic prescribing systems by addressing the unrealized potential and residual consequences that we identified. These recommendations may be applied not only to improve staff perceptions of electronic prescribing systems but also to improve the design and/or selection of these systems in order to optimize communication and workflow within pharmacies while minimizing both cost and the potential for the introduction of new errors.

  9. Prescribed burning: a topical issue

    Directory of Open Access Journals (Sweden)

    Bovio G

    2013-11-01

    Prescribed burning is a promising technique for the prevention of forest fires in Italy. Research has explored several of its ecological and operative aspects; however, the legal issues involved still need to be thoroughly investigated.

  10. Introduction to prescribed fires in Southern ecosystems

    Science.gov (United States)

    Thomas A. Waldrop; Scott L. Goodrick

    2012-01-01

    This publication is a guide for resource managers on planning and executing prescribed burns in Southern forests and grasslands. It includes explanations of reasons for prescribed burning, environmental effects, weather, and techniques as well as general information on prescribed burning.

  11. A SAFE approach towards early design space exploration of Fault-tolerant multimedia MPSoCs

    NARCIS (Netherlands)

    van Stralen, P.; Pimentel, A.

    2012-01-01

    With the reduction in feature size, transient errors start to play an important role in modern embedded systems. It is therefore important to make fault-tolerance a first-class citizen in embedded system design. Fault-tolerance patterns are techniques for making an application fault-tolerant.

  12. Design of fault simulator

    Energy Technology Data Exchange (ETDEWEB)

    Gabbar, Hossam A. [Faculty of Energy Systems and Nuclear Science, University of Ontario Institute of Technology (UOIT), Ontario, L1H 7K4 (Canada)], E-mail: hossam.gabbar@uoit.ca; Sayed, Hanaa E.; Osunleke, Ajiboye S. [Okayama University, Graduate School of Natural Science and Technology, Division of Industrial Innovation Sciences Department of Intelligent Systems Engineering, Okayama 700-8530 (Japan); Masanobu, Hara [AspenTech Japan Co., Ltd., Kojimachi Crystal City 10F, Kojimachi, Chiyoda-ku, Tokyo 102-0083 (Japan)

    2009-08-15

    A fault simulator is proposed to understand and evaluate all possible fault propagation scenarios, an essential part of safety design, operation design, and support for chemical/production processes. Process models are constructed and integrated with fault models, which are formulated qualitatively using fault semantic networks (FSN). Trend analysis techniques are used to map real-time and simulation quantitative data into the qualitative fault models for better decision support and tuning of the FSN. The design of the proposed fault simulator is described and applied to an experimental plant (G-Plant) to diagnose several fault scenarios. The proposed fault simulator will enable industrial plants to specify and validate safety requirements as part of safety system design, and to support recovery and shutdown operation and disaster management.
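
    A fault semantic network of the kind described above can be viewed, in its simplest qualitative form, as a directed graph of "can propagate to" edges; enumerating fault propagation scenarios then amounts to graph reachability. The sketch below assumes that simplification; the node names are invented, not taken from the paper's G-Plant case study.

```python
from collections import deque

# Hypothetical fault semantic network: a directed edge means "this fault
# can propagate to that one". Node names are illustrative assumptions.
fsn = {
    "pump_cavitation": ["low_discharge_pressure"],
    "low_discharge_pressure": ["low_feed_flow", "pump_overheat"],
    "low_feed_flow": ["tank_level_drop"],
    "pump_overheat": ["pump_trip"],
    "pump_trip": ["low_feed_flow"],
    "tank_level_drop": [],
}

def propagation_consequences(graph, root):
    """Breadth-first search for every fault reachable from a root fault."""
    seen, queue = {root}, deque([root])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {root}

print(sorted(propagation_consequences(fsn, "pump_cavitation")))
```

    A real FSN additionally carries qualitative states and trend conditions on the edges; this sketch only shows the propagation-enumeration skeleton.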

  13. Iowa Bedrock Faults

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — This fault coverage locates and identifies all currently known/interpreted fault zones in Iowa, that demonstrate offset of geologic units in exposure or subsurface...

  14. Layered Fault Management Architecture

    National Research Council Canada - National Science Library

    Sztipanovits, Janos

    2004-01-01

    ... UAVs or Organic Air Vehicles. The approach of this effort was to analyze fault management requirements of formation flight for fleets of UAVs, and develop a layered fault management architecture which demonstrates significant...

  15. Fault detection and isolation in systems with parametric faults

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, Hans Henrik

    1999-01-01

    The problem of fault detection and isolation of parametric faults is considered in this paper. Parametric faults are associated with internal parameter variations in the dynamical system. A fault detection and isolation method for parametric faults is formulated...

  16. Which non-technical skills do junior doctors require to prescribe safely? A systematic review.

    Science.gov (United States)

    Dearden, Effie; Mellanby, Edward; Cameron, Helen; Harden, Jeni

    2015-12-01

    Prescribing errors are a major source of avoidable morbidity and mortality. Junior doctors write most in-hospital prescriptions and are the least experienced members of the healthcare team. This puts them at high risk of error and makes them attractive targets for interventions to improve prescription safety. Error analysis has shown a background of complex environments with multiple contributory conditions. Similar conditions in other high risk industries, such as aviation, have led to an increased understanding of so-called human factors and the use of non-technical skills (NTS) training to try to reduce error. To date no research has examined the NTS required for safe prescribing. The aim of this review was to develop a prototype NTS taxonomy for safe prescribing, by junior doctors, in hospital settings. A systematic search identified 14 studies analyzing prescribing behaviours and errors by junior doctors. Framework analysis was used to extract data from the studies and identify behaviours related to categories of NTS that might be relevant to safe and effective prescribing performance by junior doctors. Categories were derived from existing literature and inductively from the data. A prototype taxonomy of relevant categories (situational awareness, decision making, communication and team working, and task management) and elements was constructed. This prototype will form the basis of future work to create a tool that can be used for training and assessment of medical students and junior doctors to reduce prescribing error in the future. © 2015 The British Pharmacological Society.

  17. Fault zone hydrogeology

    Science.gov (United States)

    Bense, V. F.; Gleeson, T.; Loveless, S. E.; Bour, O.; Scibek, J.

    2013-12-01

    Deformation along faults in the shallow crust strongly affects subsurface fluid flow, and understanding it requires the combined research effort of structural geologists and hydrogeologists. However, we find that these disciplines often use different methods with little interaction between them. In this review, we document the current multi-disciplinary understanding of fault zone hydrogeology. We discuss surface and subsurface observations from diverse rock types, from unlithified and lithified clastic sediments through to carbonate, crystalline, and volcanic rocks. For each rock type, we evaluate geological deformation mechanisms, hydrogeologic observations, and conceptual models of fault zone hydrogeology. Outcrop observations indicate that fault zones commonly have a permeability structure suggesting they should act as complex conduit-barrier systems in which along-fault flow is encouraged and across-fault flow is impeded. Hydrogeological observations of fault zones reported in the literature show a broad qualitative agreement with outcrop-based conceptual models of fault zone hydrogeology. Nevertheless, the specific impact of a particular fault permeability structure on fault zone hydrogeology can only be assessed when the hydrogeological context of the fault zone is considered, not from outcrop observations alone. To gain a more integrated, comprehensive understanding of fault zone hydrogeology, we foresee numerous synergistic opportunities for the disciplines of structural geology and hydrogeology to co-evolve and address remaining challenges by co-locating study areas, sharing approaches and fusing data, developing conceptual models from hydrogeologic data, numerical modeling, and training interdisciplinary scientists.

  18. Performance based fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2002-01-01

    Different aspects of fault detection and fault isolation in closed-loop systems are considered. It is shown that using the standard setup known from feedback control, it is possible to formulate fault diagnosis problems based on a performance index in this general standard setup. It is also shown...

  19. Algorithmic fault tree construction by component-based system modeling

    International Nuclear Information System (INIS)

    Majdara, Aref; Wakabayashi, Toshio

    2008-01-01

    Computer-aided fault tree generation can be easier, faster and less vulnerable to errors than conventional manual fault tree construction. In this paper, a new approach for algorithmic fault tree generation is presented. The method mainly consists of a component-based system modeling procedure and a trace-back algorithm for fault tree synthesis. Components, as the building blocks of systems, are modeled using function tables and state transition tables. The proposed method can be used for a wide range of systems with various kinds of components, provided an inclusive component database is developed. (author)
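
    The trace-back idea described above can be sketched in miniature: each component's function table maps an output deviation to the sets of events that can cause it, and the synthesis algorithm recursively expands the top event into OR/AND structure until only basic events remain. The tables and event names below are hypothetical, and this is a simplification of the paper's method (no state transition tables).

```python
# Hypothetical component function tables: each output deviation maps to a
# list of alternative causes (OR); each inner list is an AND-combination.
function_tables = {
    "no_flow_out": [["valve_stuck_closed"], ["no_flow_in"]],
    "no_flow_in": [["pump_failed"], ["tank_empty"]],
}

def trace_back(event):
    """Recursively expand an event into a nested OR/AND fault tree.

    Events with no function-table entry are basic events (leaves)."""
    causes = function_tables.get(event)
    if not causes:
        return event
    branches = []
    for cause in causes:
        expanded = [trace_back(c) for c in cause]
        branches.append(expanded[0] if len(expanded) == 1
                        else {"AND": expanded})
    return {"OR": branches}

tree = trace_back("no_flow_out")
print(tree)
```

    Running the trace-back from the top event "no_flow_out" yields a two-level OR tree whose leaves are the basic events of the two components.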

  20. Inappropriate prescribing in the elderly.

    LENUS (Irish Health Repository)

    Gallagher, P

    2012-02-03

    BACKGROUND AND OBJECTIVE: Drug therapy is necessary to treat acute illness, maintain current health and prevent further decline. However, optimizing drug therapy for older patients is challenging and sometimes, drug therapy can do more harm than good. Drug utilization review tools can highlight instances of potentially inappropriate prescribing to those involved in elderly pharmacotherapy, i.e. doctors, nurses and pharmacists. We aim to provide a review of the literature on potentially inappropriate prescribing in the elderly and also to review the explicit criteria that have been designed to detect potentially inappropriate prescribing in the elderly. METHODS: We performed an electronic search of the PUBMED database for articles published between 1991 and 2006 and a manual search through major journals for articles referenced in those located through PUBMED. Search terms were elderly, inappropriate prescribing, prescriptions, prevalence, Beers criteria, health outcomes and Europe. RESULTS AND DISCUSSION: Prescription of potentially inappropriate medications to older people is highly prevalent in the United States and Europe, ranging from 12% in community-dwelling elderly to 40% in nursing home residents. Inappropriate prescribing is associated with adverse drug events. Limited data exist on health outcomes from use of inappropriate medications. There are no prospective randomized controlled studies that test the tangible clinical benefit to patients of using drug utilization review tools. Existing drug utilization review tools have been designed on the basis of North American and Canadian drug formularies and may not be appropriate for use in European countries because of the differences in national drug formularies and prescribing attitudes. CONCLUSION: Given the high prevalence of inappropriate prescribing despite the widespread use of drug utilization review tools, prospective randomized controlled trials are necessary to identify useful interventions.

  1. Measurement and analysis of operating system fault tolerance

    Science.gov (United States)

    Lee, I.; Tang, D.; Iyer, R. K.

    1992-01-01

    This paper demonstrates a methodology to model and evaluate the fault tolerance characteristics of operational software. The methodology is illustrated through case studies on three different operating systems: the Tandem GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Measurements are made on these systems for substantial periods to collect software error and recovery data. In addition to investigating basic dependability characteristics such as major software problems and error distributions, we develop two levels of models to describe error and recovery processes inside an operating system and on multiple instances of an operating system running in a distributed environment. Based on the models, reward analysis is conducted to evaluate the loss of service due to software errors and the effect of the fault-tolerance techniques implemented in the systems. Software error correlation in multicomputer systems is also investigated.

  2. Managing systems faults on the commercial flight deck: Analysis of pilots' organization and prioritization of fault management information

    Science.gov (United States)

    Rogers, William H.

    1993-01-01

    In rare instances, flight crews of commercial aircraft must manage complex systems faults in addition to all their normal flight tasks. Pilot errors in fault management have been attributed, at least in part, to an incomplete or inaccurate awareness of the fault situation. The current study is part of a program aimed at assuring that the types of information potentially available from an intelligent fault management aiding concept developed at NASA Langley called 'Faultfinder' (see Abbott, Schutte, Palmer, and Ricks, 1987) are an asset rather than a liability: additional information should improve pilot performance and aircraft safety, but it should not confuse, distract, overload, mislead, or generally exacerbate already difficult circumstances.

  3. An evaluation of the appropriateness and safety of nurse and midwife prescribing in Ireland.

    LENUS (Irish Health Repository)

    Naughton, Corina

    2012-09-19

    AIM: To evaluate the clinical appropriateness and safety of nurse and midwife prescribing practice. BACKGROUND: The number of countries introducing nurse and midwife prescribing is increasing; however, concerns over patient safety remain. DESIGN: A multi-site documentation evaluation was conducted using purposeful and random sampling. The sample included 142 patients' records and 208 medications prescribed by 25 Registered Nurse Prescribers. METHODS: Data were extracted from patient and prescription records between March-May 2009. Two expert reviewers applied the modified Medication Appropriate Index tool (8 criteria) to each drug. The percentage of appropriate or inappropriate responses for each criterion was reported. Reviewer concordance was measured using the Cohen's kappa statistic (inter-rater reliability). RESULTS: Nurse or midwife prescribers from eight hospitals working in seventeen different areas of practice were included. The reviewers judged that 95-96% of medicines prescribed were indicated and effective for the diagnosed condition. Criteria relating to dosage, directions, drug-drugs or disease-condition interaction, and duplication of therapy were judged appropriate in 87-92% of prescriptions. Duration of therapy received the lowest value at 76%. Overall, reviewers indicated that between 69 (reviewer 2)-80% (reviewer 1) of prescribing decisions met all eight criteria. CONCLUSION: The majority of nurse and midwife prescribing decisions were deemed safe and clinically appropriate. However, risk of inappropriate prescribing with the potential for drug errors was detected. Continuing education and evaluation of prescribing practice, especially related to drug and condition interactions, is required to maximize appropriate and safe prescribing.

  4. Spent fuel bundle counter sequence error manual - BRUCE NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However, if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpreting the various sequence error messages that can occur and suggests the probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults; therefore the card file contains sequences with only one fault at a time. Some exceptions have been included, however, where experience has indicated that several faults can occur simultaneously

  5. Spent fuel bundle counter sequence error manual - DARLINGTON NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However, if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpreting the various sequence error messages that can occur and suggests the probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults; therefore the card file contains sequences with only one fault at a time. Some exceptions have been included, however, where experience has indicated that several faults can occur simultaneously

  6. Fault Estimation for Fuzzy Delay Systems: A Minimum Norm Least Squares Solution Approach.

    Science.gov (United States)

    Huang, Sheng-Juan; Yang, Guang-Hong

    2017-09-01

    This paper mainly focuses on the problem of fault estimation for a class of Takagi-Sugeno fuzzy systems with state delays. A minimum norm least squares solution (MNLSS) approach is first introduced to establish a fault estimation compensator, which is able to optimize the fault estimator. Compared with most of the existing fault estimation methods, the MNLSS-based fault estimation method can effectively decrease the effect of state errors on the accuracy of fault estimation. Finally, three examples are given to illustrate the effectiveness and merits of the proposed method.
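
    The minimum norm least squares solution named above is, in linear-algebra terms, what the Moore-Penrose pseudoinverse computes: among all least-squares solutions of an underdetermined system it returns the one of smallest norm. The following sketch illustrates that property on an invented matrix pair; E and r are not the paper's fault-estimator quantities.

```python
import numpy as np

# MNLSS of E @ f = r via the pseudoinverse. The system is underdetermined
# (2 equations, 3 unknowns), so infinitely many exact solutions exist;
# pinv selects the minimum-norm one.
E = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
r = np.array([3.0, 2.0])

f_hat = np.linalg.pinv(E) @ r          # minimum norm least squares estimate
print(f_hat, float(np.linalg.norm(E @ f_hat - r)))
```

    Any multiple of the null-space vector (2, -1, 1) can be added to f_hat without changing E @ f, but only f_hat itself has minimal Euclidean norm, which is the optimality the MNLSS approach exploits.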

  7. A study of antibiotic prescribing

    DEFF Research Database (Denmark)

    Jaruseviciene, L.; Radzeviciene-Jurgute, R.; Jurgutis, A.

    2012-01-01

    Background. Globally, general practitioners (GPs) write more than 90% of all antibiotic prescriptions. This study examines the experiences of Lithuanian and Russian GPs in antibiotic prescription for upper respiratory tract infections, including their perceptions of when it is not indicated clinically or pharmacologically. Methods. 22 Lithuanian and 29 Russian GPs participated in five focus group discussions. Thematic analysis was used to analyse the data. Results. We identified four main thematic categories: patients' faith in antibiotics as medication for upper respiratory tract infections; patient potential to influence a GP's decision to prescribe antibiotics for upper respiratory tract infections; impediments perceived by GPs in advocating clinically grounded antibiotic prescribing with their patients; and strategies applied in physician-patient negotiation about antibiotic prescribing.

  8. ELECTORAL PRESCRIBERS. WHO ARE THEY?

    Directory of Open Access Journals (Sweden)

    Constantin SASU

    2016-12-01

    The decision to vote, and the choice among candidates, is an extremely important one, with repercussions on everyday life and, ultimately, on the quality of life of the whole society. The process by which the voter decides is therefore a central concern. Prescribers, assumed to have a strong influence on the electoral market, are a component of the microenvironment of political organizations. They are people in important positions who can influence the behavior of others. In the political environment, prescribers are known as "opinion formers", "opinion leaders", "mediators" (Beciu, 2009) or "influencers" (Keller and Berry, 2003; Weimann, 1994). This paper reviews the central opinions on the influence of prescribers and opinion makers on voting behavior and voting decisions, and on whether and how they act.

  9. Microcomputer applications of, and modifications to, the modular fault trees

    International Nuclear Information System (INIS)

    Zimmerman, T.L.; Graves, N.L.; Payne, A.C. Jr.; Whitehead, D.W.

    1994-10-01

    The LaSalle Probabilistic Risk Assessment was the first major application of the modular logic fault trees after the IREP program. In the process of performing the analysis, many errors were discovered in the fault tree modules that led to difficulties in combining the modules to form the final system fault trees. These errors are corrected in the revised modules listed in this report. In addition, the application of the modules, in terms of editing them and forming them into the system fault trees, was inefficient. Originally, the editing had to be done line by line and no error checking was performed by the computer. This led to many typos and other logic errors in the construction of the modular fault tree files. Two programs were written to help alleviate this problem: (1) MODEDIT - allows an operator to retrieve a file for editing, edit the file for the plant-specific application, perform some general error checking while the file is being modified, and store the file for later use; and (2) INDEX - checks that the modules that are supposed to form one fault tree all link up appropriately before the files are loaded onto the mainframe computer. Lastly, the modules were not designed for the relay-type logic common in BWR designs but for solid-state logic. Some additional modules were defined for modeling relay logic, and an explanation and example of their use are included in this report

  10. To prescribe codeine or not to prescribe codeine?

    Science.gov (United States)

    Fleming, Marc L; Wanat, Matthew A

    2014-09-01

    A recently published study in Pediatrics by Kaiser et al. (2014; Epub April 21, DOI: 10.1542/peds.2013-3171) reported that on average, over the past decade, children aged 3 to 17 were prescribed approximately 700,000 prescriptions for codeine-containing products each year in association with emergency department (ED) visits. Although guidelines from the American Academy of Pediatrics issued warnings in 1997 and reaffirmed their concerns regarding the safety and effectiveness of codeine in 2006, it is still often prescribed for pain and cough associated with upper respiratory infection. With the impending rescheduling of hydrocodone combination products to Schedule II, physicians and mid-level prescribers may be compelled to prescribe codeine-containing products (e.g., with acetaminophen) due to reduced administrative burden and limits on Schedule II prescriptive authority for nurse practitioners and physician assistants in some states. This commentary expounds on the safety and effectiveness concerns of codeine, with a primary focus on patients in the ED setting.

  11. Psychologists' right to prescribe – should prescribing privileges be ...

    African Journals Online (AJOL)

    Current changes in legislation regarding prescription rights increase the possibility of non-medical practitioners being authorised to prescribe medication. There has been ongoing debate about granting psychologists in South Africa a limited right to prescribe (RTP) psychotropic medication. The main reasons advanced for ...

  12. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
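
    Once a fault tree has been constructed (manually or automatically), the standard quantification step evaluates the top-event probability from the minimal cut sets, assuming independent basic events. The sketch below uses an invented three-event tree; names and probabilities are illustrative, not from the paper.

```python
from itertools import combinations

# Hypothetical basic-event failure probabilities (independent events).
p = {"A": 0.01, "B": 0.02, "C": 0.005}

# Minimal cut sets of a small tree: top event = (A AND B) OR C.
cut_sets = [{"A", "B"}, {"C"}]

def cut_prob(events):
    """Probability that every basic event in the set occurs."""
    prob = 1.0
    for e in events:
        prob *= p[e]
    return prob

def top_event_probability(cut_sets):
    """Exact top-event probability by inclusion-exclusion over cut sets."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            union = set().union(*combo)     # joint occurrence of k cut sets
            total += (-1) ** (k + 1) * cut_prob(union)
    return total

print(top_event_probability(cut_sets))  # approximately 0.005199
```

    For trees with many cut sets the full inclusion-exclusion is exponential, which is why production codes truncate it or use the rare-event (sum of cut-set probabilities) approximation.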

  13. Recurrent fuzzy neural network backstepping control for the prescribed output tracking performance of nonlinear dynamic systems.

    Science.gov (United States)

    Han, Seong-Ik; Lee, Jang-Myung

    2014-01-01

    This paper proposes a backstepping control system that uses a tracking error constraint and recurrent fuzzy neural networks (RFNNs) to achieve a prescribed tracking performance for a strict-feedback nonlinear dynamic system. A new constraint variable was defined to generate the virtual control that forces the tracking error to fall within prescribed boundaries. An adaptive RFNN was also used to obtain the required improvement in approximation performance, in order to avoid calculating the explosive number of terms generated by the recursive steps of traditional backstepping control. The boundedness and convergence of the closed-loop system were confirmed based on Lyapunov stability theory. The prescribed performance of the proposed control scheme was validated by using it to control the prescribed error of a nonlinear system and a robot manipulator. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  14. Clinical errors and medical negligence.

    Science.gov (United States)

    Oyebode, Femi

    2013-01-01

    This paper discusses the definition, nature and origins of clinical errors including their prevention. The relationship between clinical errors and medical negligence is examined as are the characteristics of litigants and events that are the source of litigation. The pattern of malpractice claims in different specialties and settings is examined. Among hospitalized patients worldwide, 3-16% suffer injury as a result of medical intervention, the most common being the adverse effects of drugs. The frequency of adverse drug effects appears superficially to be higher in intensive care units and emergency departments but once rates have been corrected for volume of patients, comorbidity of conditions and number of drugs prescribed, the difference is not significant. It is concluded that probably no more than 1 in 7 adverse events in medicine result in a malpractice claim and the factors that predict that a patient will resort to litigation include a prior poor relationship with the clinician and the feeling that the patient is not being kept informed. Methods for preventing clinical errors are still in their infancy. The most promising include new technologies such as electronic prescribing systems, diagnostic and clinical decision-making aids and error-resistant systems. Copyright © 2013 S. Karger AG, Basel.

  15. Electronic prescribing in pediatrics: toward safer and more effective medication management.

    Science.gov (United States)

    Johnson, Kevin B; Lehmann, Christoph U

    2013-04-01

    This technical report discusses recent advances in electronic prescribing (e-prescribing) systems, including the evidence base supporting their limitations and potential benefits. Specifically, this report acknowledges that there are limited but positive pediatric data supporting the role of e-prescribing in mitigating medication errors, improving communication with dispensing pharmacists, and improving medication adherence. On the basis of these data and on the basis of federal statutes that provide incentives for the use of e-prescribing systems, the American Academy of Pediatrics recommends the adoption of e-prescribing systems with pediatric functionality. This report supports the accompanying policy statement from the American Academy of Pediatrics recommending the adoption of e-prescribing by pediatric health care providers.

  16. TREDRA, Minimal Cut Sets Fault Tree Plot Program

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1983-01-01

    1 - Description of problem or function: TREDRA is a computer program for drafting report-quality fault trees. The input to TREDRA is similar to input for standard computer programs that find minimal cut sets from fault trees. Output includes fault tree plots containing all standard fault tree logic and event symbols, gate and event labels, and an output description for each event in the fault tree. TREDRA contains the following features: a variety of program options that allow flexibility in the program output; capability for automatic pagination of the output fault tree, when necessary; input groups which allow labeling of gates, events, and their output descriptions; a symbol library which includes standard fault tree symbols plus several less frequently used symbols; user control of character size and overall plot size; and extensive input error checking and diagnostic oriented output. 2 - Method of solution: Fault trees are generated by user-supplied control parameters and a coded description of the fault tree structure consisting of the name of each gate, the gate type, the number of inputs to the gate, and the names of these inputs. 3 - Restrictions on the complexity of the problem: TREDRA can produce fault trees with a minimum of 3 and a maximum of 56 levels. The width of each level may range from 3 to 37. A total of 50 transfers is allowed during pagination

  17. Prescribed burning for understory restoration

    Science.gov (United States)

    Kenneth W. Outcalt

    2006-01-01

    Because the longleaf ecosystem evolved with and is adapted to frequent fire, every 2 to 8 years, prescribed burning is often useful for restoring understory communities to a diverse ground layer of grasses, herbs, and small shrubs. This restoration provides habitat for a number of plant and animal species that are restricted to or found mostly in longleaf pine...

  18. Fuzzy Inference System Approach for Locating Series, Shunt, and Simultaneous Series-Shunt Faults in Double Circuit Transmission Lines.

    Science.gov (United States)

    Swetapadma, Aleena; Yadav, Anamika

    2015-01-01

    Many schemes have been reported for shunt fault location estimation, but fault location estimation of series or open conductor faults has not been dealt with so far. Existing numerical relays only detect an open conductor (series) fault and indicate the faulty phase(s), but they are unable to locate the series fault, so the repair crew must patrol the complete line to find its location. In this paper, fuzzy-based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series and shunt faults. The fault simulation studies and fault location algorithm have been developed using Matlab/Simulink. Synchronized phasors of the voltage and current signals at both ends of the line are used as input to the proposed fuzzy-based fault location scheme. The percentage error in location is within 1% for series faults and within 5% for shunt faults for all tested fault cases. The percentage error in location estimation is validated using a chi-square test at both the 1% and 5% levels of significance.

  19. Fault tolerant control for uncertain systems with parametric faults

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2006-01-01

    A fault tolerant control (FTC) architecture based on active fault diagnosis (AFD) and the YJBK (Youla, Jabr, Bongiorno and Kucera) parameterization is applied in this paper. Based on the FTC architecture, fault tolerant control of uncertain systems with slowly varying parametric faults is investigated. Conditions are given for closed-loop stability in case of false alarms or missed fault detection/isolation.

  20. Chaos Synchronization Based Novel Real-Time Intelligent Fault Diagnosis for Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Chin-Tsung Hsieh

    2014-01-01

    Full Text Available The traditional solar photovoltaic fault diagnosis system needs two to three sets of sensing elements to capture fault signals as fault features, and many fault diagnosis methods cannot be applied in real time. The fault diagnosis method proposed in this study needs only one set of sensing elements to capture the fault features of the system, which can be diagnosed in real time from the fault data of that single sensor set. These two points reduce both the cost and the fault diagnosis time, and simplify the construction of the otherwise huge database. This study used Matlab to simulate faults in the solar photovoltaic system. The maximum power point tracker (MPPT) is used to keep a stable power supply to the system when the system has faults. The characteristic signal of the system fault voltage is captured and recorded, and the dynamic error of the fault voltage signal is extracted by chaos synchronization. Then, extension engineering is used to implement the fault diagnosis. Finally, the overall fault diagnosis system only needs to capture the voltage signal of the solar photovoltaic system, and the fault type can be diagnosed instantly.

  1. Homogeneity of small-scale earthquake faulting, stress, and fault strength

    Science.gov (United States)

    Hardebeck, J.L.

    2006-01-01

    Small-scale faulting at seismogenic depths in the crust appears to be more homogeneous than previously thought. I study three new high-quality focal-mechanism datasets of small (M angular difference between their focal mechanisms. Closely spaced earthquakes (interhypocentral distance faults of many orientations may or may not be present, only similarly oriented fault planes produce earthquakes contemporaneously. On these short length scales, the crustal stress orientation and fault strength (coefficient of friction) are inferred to be homogeneous as well, to produce such similar earthquakes. Over larger length scales (∼2-50 km), focal mechanisms become more diverse with increasing interhypocentral distance (differing on average by 40-70°). Mechanism variability on ∼2- to 50 km length scales can be explained by relatively small variations (∼30%) in stress or fault strength. It is possible that most of this small apparent heterogeneity in stress or strength comes from measurement error in the focal mechanisms, as negligible variation in stress or fault strength (<10%) is needed if each earthquake is assigned the optimally oriented focal mechanism within the 1-sigma confidence region. This local homogeneity in stress orientation and fault strength is encouraging, implying it may be possible to measure these parameters with enough precision to be useful in studying and modeling large earthquakes.

  2. Multi-link faults localization and restoration based on fuzzy fault set for dynamic optical networks.

    Science.gov (United States)

    Zhao, Yongli; Li, Xin; Li, Huadong; Wang, Xinbo; Zhang, Jie; Huang, Shanguo

    2013-01-28

    Based on a distributed method of bit-error-rate (BER) monitoring, a novel multi-link faults restoration algorithm is proposed for dynamic optical networks. The concept of fuzzy fault set (FFS) is first introduced for multi-link faults localization, which includes all possible optical equipment or fiber links with a membership describing the possibility of faults. Such a set is characterized by a membership function which assigns each object a grade of membership ranging from zero to one. OSPF protocol extension is designed for the BER information flooding in the network. The BER information can be correlated to link faults through FFS. Based on the BER information and FFS, multi-link faults localization mechanism and restoration algorithm are implemented and experimentally demonstrated on a GMPLS enabled optical network testbed with 40 wavelengths in each fiber link. Experimental results show that the novel localization mechanism has better performance compared with the extended limited perimeter vector matching (LVM) protocol and the restoration algorithm can improve the restoration success rate under multi-link faults scenario.
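    A fuzzy fault set assigns each piece of equipment a fault membership grade between zero and one, here correlated to the monitored BER. One way to picture such a membership function is a log-scale ramp; the thresholds and the ramp shape below are illustrative assumptions, not the paper's actual function:

```python
import math

# Illustrative only: map a monitored bit-error-rate (BER) onto a [0, 1]
# fault membership grade, in the spirit of a fuzzy fault set.
BER_OK, BER_FAULT = 1e-12, 1e-6   # clearly healthy / clearly faulty (assumed)

def fault_membership(ber):
    """0 below BER_OK, 1 above BER_FAULT, log-linear ramp in between."""
    if ber <= BER_OK:
        return 0.0
    if ber >= BER_FAULT:
        return 1.0
    return (math.log10(ber) - math.log10(BER_OK)) / (
        math.log10(BER_FAULT) - math.log10(BER_OK))

print(fault_membership(1e-9))  # prints a value ≈ 0.5 (halfway on the log scale)
```

Localization then amounts to correlating high-membership links and shared equipment across the flooded BER reports, rather than thresholding each link in isolation.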

  3. The pattern of the discovery of medication errors in a tertiary hospital in Hong Kong.

    Science.gov (United States)

    Samaranayake, N R; Cheung, S T D; Chui, W C M; Cheung, B M Y

    2013-06-01

    The primary goal of reducing medication errors is to eliminate those that reach the patient. We aimed to study the pattern of interceptions to tackle medication errors along the medication use processes. Tertiary care hospital in Hong Kong. The 'Swiss Cheese Model' was used to explain the interceptions targeting medication error reporting over 5 years (2006-2010). Proportions of prescribing, dispensing and drug administration errors intercepted by pharmacists and nurses; proportions of prescribing, dispensing and drug administration errors that reached the patient. Our analysis included 1,268 in-patient medication errors, of which 53.4% were related to prescribing, 29.0% to administration and 17.6% to dispensing. 34.1% of all medication errors (4.9% prescribing, 26.8% drug administration and 2.4% dispensing) were not intercepted. Pharmacy staff intercepted 85.4% of the prescribing errors. Nurses detected 83.0% of dispensing and 5.0% of prescribing errors. However, 92.4% of all drug administration errors reached the patient. Having a preventive measure at each stage of the medication use process helps to prevent most errors. Most drug administration errors reach the patient as there is no defense against these. Therefore, more interventions to prevent drug administration errors are warranted.
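    The stage-level interception proportions the study reports follow from simple counts. The counts below are reconstructed approximately from the abstract's percentages (53.4%, 29.0%, and 17.6% of the 1,268 errors, with 4.9%, 26.8%, and 2.4% of all errors reaching the patient), so they are illustrative rather than the study's raw data:

```python
# Approximate per-stage counts, reconstructed from the abstract's percentages.
errors = {
    "prescribing":    {"total": 677, "reached_patient": 62},
    "administration": {"total": 368, "reached_patient": 340},
    "dispensing":     {"total": 223, "reached_patient": 30},
}

for stage, c in errors.items():
    intercepted = c["total"] - c["reached_patient"]
    print(f"{stage}: {intercepted / c['total']:.1%} intercepted")
```

The contrast is stark once normalized per stage: roughly nine in ten prescribing errors are caught downstream, while fewer than one in ten administration errors are, matching the paper's point that administration has no defense behind it.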

  4. [Medical errors: inevitable but preventable].

    Science.gov (United States)

    Giard, R W

    2001-10-27

    Medical errors are increasingly reported in the lay press. Studies have shown dramatic error rates of 10 percent or even higher. From a methodological point of view, studying the frequency and causes of medical errors is far from simple. Clinical decisions on diagnostic or therapeutic interventions are always taken within a clinical context. Reviewing outcomes of interventions without taking into account both the intentions and the arguments for a particular action will limit the conclusions of a study on the rate and preventability of errors. The interpretation of the preventability of medical errors is fraught with difficulties and probably highly subjective. Blaming the doctor personally does not do justice to the actual situation, and especially to the organisational framework. Attention to and improvement of the organisational aspects of error are far more important than litigating against the person. To err is and will remain human, and if we want to reduce the incidence of faults we must be able to learn from our mistakes. That requires an open attitude towards medical mistakes, a continuous effort to detect them, sound analysis and, where feasible, the institution of preventive measures.

  5. Internal Leakage Fault Detection and Tolerant Control of Single-Rod Hydraulic Actuators

    Directory of Open Access Journals (Sweden)

    Jianyong Yao

    2014-01-01

    Full Text Available The integration of internal leakage fault detection and tolerant control for single-rod hydraulic actuators is presented in this paper. Fault detection is a potential technique to provide efficient condition monitoring and/or preventive maintenance, and fault tolerant control is a critical method to improve the safety and reliability of hydraulic servo systems. Based on quadratic Lyapunov functions, a performance-oriented fault detection method is proposed, which has a simple structure and is easy to implement in practice. The main feature is that, when a prescribed performance index is satisfied (even if a slight fault has occurred), no fault is alarmed; otherwise (i.e., when a severe fault has occurred), the fault is detected and a fault tolerant controller is activated. The proposed tolerant controller, which is based on the parameter adaptive methodology, is also easy to realize, and the learning mechanism is simple since only the internal leakage is considered in parameter adaptation and thus the persistent excitation (PE) condition is easily satisfied. After the activation of the fault tolerant controller, the control performance is gradually recovered. Simulation results on a hydraulic servo system with both abrupt and incipient internal leakage faults demonstrate the effectiveness of the proposed fault detection and tolerant control method.

  6. Medication error detection in two major teaching hospitals: What are the types of errors?

    Directory of Open Access Journals (Sweden)

    Fatemeh Saghafi

    2014-01-01

    Full Text Available Background: The increasing number of reports on medication errors and the subsequent damage, especially in medical centers, has become a growing concern for patient safety in recent decades. Patient safety, and in particular medication safety, is a major concern and challenge for health care professionals around the world. Our prospective study was designed to detect prescribing, transcribing, dispensing, and administering medication errors in two major university hospitals. Materials and Methods: After choosing 20 similar hospital wards in two large teaching hospitals in the city of Isfahan, Iran, the sequence was randomly selected. Diagrams for drug distribution were drawn with the help of the pharmacy directors. Direct observation was chosen as the method for detecting errors. A total of 50 doses were studied in each ward to detect prescribing, transcribing, and administering errors. Dispensing errors were studied on 1000 doses dispensed in each hospital pharmacy. Results: A total of 8162 doses of medications were studied during the four stages, of which 8000 yielded complete data for analysis. 73% of prescribing orders were incomplete and did not include all six parameters (name, dosage form, dose and measuring unit, administration route, and intervals of administration). We found 15% transcribing errors. On average, one-third of drug administrations were erroneous in both hospitals. Dispensing errors ranged between 1.4% and 2.2%. Conclusion: Although prescribing and administering account for most of the medication errors, improvements are needed in all four stages. Clear guidelines must be written and executed in both hospitals to reduce the incidence of medication errors.
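    The completeness check behind a figure like "73% of prescribing orders were incomplete" amounts to verifying that every required order parameter is present. A minimal sketch, with field names that are assumptions for illustration:

```python
# Required parameters per the study: name, dosage form, dose, measuring
# unit, administration route, interval (field names are illustrative).
REQUIRED = ("drug_name", "dosage_form", "dose", "unit", "route", "interval")

def is_complete(order):
    """An order is complete only if every required field has a value."""
    return all(order.get(field) for field in REQUIRED)

order = {"drug_name": "amoxicillin", "dose": 500, "unit": "mg",
         "route": "oral", "interval": "q8h"}     # dosage_form missing
print(is_complete(order))  # False
```

The incomplete-order rate over a set of observed orders is then just the fraction failing this check.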

  7. Minimizing Experimental Error in Thinning Research

    Science.gov (United States)

    C. B. Briscoe

    1964-01-01

    Many diverse approaches have been made to prescribing and evaluating thinnings on an objective basis. None of the techniques proposed has been widely accepted. Indeed, none has been proven superior to the others, nor even widely applicable. There are at least two possible reasons for this: none of the techniques suggested is of any general utility and/or experimental error...

  8. Can we influence prescribing patterns?

    Science.gov (United States)

    Sbarbaro, J A

    2001-09-15

    A variety of programming techniques and methods of training have been employed to change physician behavior. Didactic continuing medical education lectures and clinical guidelines have had minimal impact, although endorsement of national professional guidelines by local opinion leaders appears to have a positive influence on the impact of professional guidelines. Interactive, hands-on workshops, performance reporting, and peer/patient feedback are also effective. Changing prescribing habits has been equally difficult. Drug utilization letters involving both pharmacist and physician have more impact than do letters sent only to the physician. Academic detailing, when properly executed, has been consistently effective. When combined with these strategies, closed formularies become a powerful tool in changing prescribing behavior.

  9. Development of direct dating methods of fault gouges: Deep drilling into Nojima Fault, Japan

    Science.gov (United States)

    Miyawaki, M.; Uchida, J. I.; Satsukawa, T.

    2017-12-01

    It is crucial to develop a direct dating method for fault gouges for the assessment of recent fault activity in terms of site evaluation for nuclear power plants. This method would be useful in regions without overlying Late Pleistocene sediments. In order to estimate the age of the latest fault slip event, it is necessary to use fault gouges which have experienced frictional heating sufficient for age resetting. Frictional heating is thought to be greater at depth, because the heat generated by fault movement depends on the shear stress. Therefore, we should determine the depth at which age resetting is reliable, as fault gouges from the ground surface have likely been dated as older than the actual age of the latest fault movement due to incomplete resetting. In this project, we target the Nojima fault, which triggered the 1995 Kobe earthquake in Japan. Samples are collected from various depths (300-1,500 m) by trenching and drilling to investigate age resetting conditions and depth using several methods, including electron spin resonance (ESR) and optically stimulated luminescence (OSL), which are applicable to ages from the Late Pleistocene onwards. The preliminary results by the ESR method show approx. 1.1 Ma1) at the ground surface and 0.15-0.28 Ma2) at 388 m depth, respectively. These results indicate that samples from deeper depths preserve a younger age. In contrast, the OSL method dated approx. 2,200 yr1) at the ground surface. Although further consideration is still needed as there is a large margin of error, this result indicates that the age resetting depth for OSL is relatively shallow due to the high thermosensitivity of OSL compared to ESR. In the future, we plan to carry out further investigations to date fault gouges from various depths up to approx. 1,500 m to verify the use of these direct dating methods.
    1) Kyoto University, 2017. FY27 Commissioned for the disaster presentation on nuclear facilities (Drilling

  10. Which prosthetic foot to prescribe?

    OpenAIRE

    De Asha, AR; Barnett, CT; Struchkov, V; Buckley, JG

    2017-01-01

    Introduction: Clinicians typically use findings from cohort studies to objectively inform judgements regarding the potential (dis)advantages of prescribing a new prosthetic device. However, before finalising prescription a clinician will typically ask a patient to 'try out' a change of prosthetic device while the patient is at the clinic. Observed differences in gait when using the new device should be the result of the device's mechanical function, but could also conceivably be due to pa...

  11. Prescribing patterns in premenstrual syndrome

    Directory of Open Access Journals (Sweden)

    Jones Paul W

    2002-06-01

    Full Text Available Abstract Background: Over 300 therapies have been proposed for premenstrual syndrome. To date there has been only one survey conducted in the UK of PMS treatments prescribed by GPs, a questionnaire-based study by the National Association of Premenstrual Syndrome in 1989. Since then, selective serotonin re-uptake inhibitors have been licensed for severe PMS/PMDD, and governmental recommendations have been made to reduce the dosage of vitamin B6 (the first-choice over-the-counter treatment for many women with PMS). This study investigates the annual rates of diagnoses and prescribing patterns for premenstrual syndrome (1993–1998) within a computerised general practitioner database. Methods: Retrospective survey of prescribing data for premenstrual syndrome between 1993 and 1998, using the General Practice Research Database for the West Midlands Region, which contains information on 282,600 female patients. Results: Overall, the proportion of women with a prescription-linked diagnosis of premenstrual syndrome halved over the five years. Progestogens, including progesterone, were the most commonly recorded treatment for premenstrual syndrome during the whole study period, accounting for over 40% of all prescriptions. Selective serotonin re-uptake inhibitors accounted for only 2% of the prescriptions in 1993 but rose to over 16% by 1998, becoming the second most commonly recorded treatment. Vitamin B6 accounted for 22% of the prescriptions in 1993 but dropped markedly between 1997 and 1998 to 11%. Conclusions: This study shows a yearly decrease in the number of prescriptions linked to diagnoses of premenstrual syndrome. Progestogens, including progesterone, are the most widely prescribed treatment for premenstrual syndrome despite the lack of evidence demonstrating their efficacy.

  12. Medication Errors - A Review

    OpenAIRE

    Vinay BC; Nikhitha MK; Patel Sunil B

    2015-01-01

    This review article explains medication errors clearly and with well-organised tables: their definition, the scale of the problem, types of medication errors, common causes, monitoring, consequences, prevention, and management.

  13. Fault tree graphics

    International Nuclear Information System (INIS)

    Bass, L.; Wynholds, H.W.; Porterfield, W.R.

    1975-01-01

    Described is an operational system that enables the user, through an intelligent graphics terminal, to construct, modify, analyze, and store fault trees. With this system, complex engineering designs can be analyzed. This paper discusses the system and its capabilities. Included is a brief discussion of fault tree analysis, which represents an aspect of reliability and safety modeling

  14. How do normal faults grow?

    OpenAIRE

    Blækkan, Ingvild; Bell, Rebecca; Rotevatn, Atle; Jackson, Christopher; Tvedt, Anette

    2018-01-01

    Faults grow via a sympathetic increase in their displacement and length (isolated fault model), or by rapid length establishment and subsequent displacement accrual (constant-length fault model). To test the significance and applicability of these two models, we use time-series displacement (D) and length (L) data extracted for faults from nature and experiments. We document a range of fault behaviours, from sympathetic D-L fault growth (isolated growth) to sub-vertical D-L growth trajectorie...

  15. Characterization of leaky faults

    International Nuclear Information System (INIS)

    Shan, Chao.

    1990-05-01

    Leaky faults provide a flow path for fluids to move underground. It is very important to characterize such faults in various engineering projects. The purpose of this work is to develop mathematical solutions for this characterization. The flow of water in an aquifer system and the flow of air in the unsaturated fault-rock system were studied. If the leaky fault cuts through two aquifers, characterization of the fault can be achieved by pumping water from one of the aquifers, which are assumed to be horizontal and of uniform thickness. Analytical solutions have been developed for two cases of either a negligibly small or a significantly large drawdown in the unpumped aquifer. Some practical methods for using these solutions are presented. 45 refs., 72 figs., 11 tabs

  16. Solar system fault detection

    Science.gov (United States)

    Farrington, R.B.; Pruett, J.C. Jr.

    1984-05-14

    A fault detecting apparatus and method are provided for use with an active solar system. The apparatus provides an indication as to whether one or more predetermined faults have occurred in the solar system. The apparatus includes a plurality of sensors, each sensor being used in determining whether a predetermined condition is present. The outputs of the sensors are combined in a pre-established manner in accordance with the kind of predetermined faults to be detected. Indicators communicate with the outputs generated by combining the sensor outputs to give the user of the solar system and the apparatus an indication as to whether a predetermined fault has occurred. Upon detection and indication of any predetermined fault, the user can take appropriate corrective action so that the overall reliability and efficiency of the active solar system are increased.

  17. Common errors of drug administration in infants: causes and avoidance.

    Science.gov (United States)

    Anderson, B J; Ellis, J F

    1999-01-01

    Drug administration errors are common in infants. Although the infant population has a high exposure to drugs, there are few data concerning pharmacokinetics or pharmacodynamics, or the influence of paediatric diseases on these processes. Children remain therapeutic orphans. Formulations are often suitable only for adults; in addition, the lack of maturation of drug elimination processes, alteration of body composition and influence of size render the calculation of drug doses complex in infants. The commonest drug administration error in infants is one of dose, and the commonest hospital site for this error is the intensive care unit. Drug errors are a consequence of system error, and preventive strategies are possible through system analysis. The goal of a zero drug error rate should be aggressively sought, with systems in place that aim to eliminate the effects of inevitable human error. This involves review of the entire system from drug manufacture to drug administration. The nuclear industry, telecommunications and air traffic control services all practise error reduction policies with zero error as a clear goal, not by finding fault in the individual, but by identifying faults in the system and building into that system mechanisms for picking up faults before they occur. Such policies could be adapted to medicine using interventions both specific (the production of formulations which are for children only and clearly labelled, regular audit by pharmacists, legible prescriptions, standardised dose tables) and general (paediatric drug trials, education programmes, nonpunitive error reporting) to reduce the number of errors made in giving medication to infants.

  18. Unit of measurement used and parent medication dosing errors.

    Science.gov (United States)

    Yin, H Shonna; Dreyer, Benard P; Ugboaja, Donna C; Sanchez, Dayana C; Paul, Ian M; Moreira, Hannah A; Rodriguez, Luis; Mendelsohn, Alan L

    2014-08-01

    Adopting the milliliter as the preferred unit of measurement has been suggested as a strategy to improve the clarity of medication instructions; teaspoon and tablespoon units may inadvertently endorse nonstandard kitchen spoon use. We examined the association between unit used and parent medication errors and whether nonstandard instruments mediate this relationship. Cross-sectional analysis of baseline data from a larger study of provider communication and medication errors. English- or Spanish-speaking parents (n = 287) whose children were prescribed liquid medications in 2 emergency departments were enrolled. Medication error defined as: error in knowledge of prescribed dose, error in observed dose measurement (compared to intended or prescribed dose); >20% deviation threshold for error. Multiple logistic regression performed adjusting for parent age, language, country, race/ethnicity, socioeconomic status, education, health literacy (Short Test of Functional Health Literacy in Adults); child age, chronic disease; site. Medication errors were common: 39.4% of parents made an error in measurement of the intended dose, 41.1% made an error in the prescribed dose. Furthermore, 16.7% used a nonstandard instrument. Compared with parents who used milliliter-only, parents who used teaspoon or tablespoon units had twice the odds of making an error with the intended (42.5% vs 27.6%, P = .02; adjusted odds ratio=2.3; 95% confidence interval, 1.2-4.4) and prescribed (45.1% vs 31.4%, P = .04; adjusted odds ratio=1.9; 95% confidence interval, 1.03-3.5) dose; associations greater for parents with low health literacy and non-English speakers. Nonstandard instrument use partially mediated teaspoon and tablespoon-associated measurement errors. Findings support a milliliter-only standard to reduce medication errors. Copyright © 2014 by the American Academy of Pediatrics.
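    The study's ">20% deviation" error criterion can be stated precisely as a relative-deviation test against the intended or prescribed dose. A minimal sketch (function name and example values are illustrative):

```python
# A measured dose counts as an error if it deviates from the prescribed
# dose by more than 20% in either direction (the study's threshold).
def is_dosing_error(measured_ml, prescribed_ml, threshold=0.20):
    return abs(measured_ml - prescribed_ml) / prescribed_ml > threshold

print(is_dosing_error(6.25, 5.0))  # True  (25% over the prescribed dose)
print(is_dosing_error(5.5, 5.0))   # False (10% over, within threshold)
```

The same check applies to both error definitions in the paper, once against the prescribed dose and once against the dose the parent intended to give.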

  19. Medicare Provider Data - Part D Prescriber

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Part D Prescriber Public Use File (PUF) provides information on prescription drugs prescribed by individual physicians and other health care providers and paid...

  20. Dutch Travel Health Nurses: Prepared to Prescribe?

    NARCIS (Netherlands)

    Overbosch, Femke W.; Koeman, Susan C.; van den Hoek, Anneke; Sonder, Gerard J. B.

    2012-01-01

    Background. In travel medicine, as in other specialties, independent prescribing of medication has traditionally been the domain of practitioners like physicians, dentists, and midwives. However, a 2011 ruling in the Netherlands expands independent prescribing and introduces supplementary

  1. The quality of outpatient antimicrobial prescribing

    DEFF Research Database (Denmark)

    Malo, Sara; Bjerrum, Lars; Feja, Cristina

    2013-01-01

    The aim of the study was to analyse and compare the quality of outpatient antimicrobial prescribing in Denmark and Aragón (in northeastern Spain), with the objective of assessing inappropriate prescribing....

  2. Quality Improvement Initiative to Decrease Variability of Emergency Physician Opioid Analgesic Prescribing

    Directory of Open Access Journals (Sweden)

    John H. Burton

    2016-05-01

    Full Text Available Introduction: Addressing pain is a crucial aspect of emergency medicine. Prescription opioids are commonly prescribed for moderate to severe pain in the emergency department (ED); unfortunately, prescribing practices are variable. High variability of opioid prescribing decisions suggests a lack of consensus and an opportunity to improve care. This quality improvement (QI) initiative aimed to reduce variability in ED opioid analgesic prescribing. Methods: We evaluated the impact of a three-part QI initiative on ED opioid prescribing by physicians at seven sites. Stage 1: Retrospective baseline period (nine months). Stage 2: Physicians were informed that opioid prescribing information would be prospectively collected and that feedback on their prescribing and that of the group would be shared at the end of the stage (three months). Stage 3: After physicians received their individual opioid prescribing data with blinded comparison to the group means (from Stage 2), they were informed that individual prescribing data would be unblinded and shared with the group after three months. The primary outcome was variability, as the standard error of the mean and standard deviation of the opioid prescribing rate (defined as the number of patients discharged with an opioid divided by the total number of discharges for each provider). Secondary observations included the mean quantity of pills per opioid prescription and the overall frequency of opioid prescribing. Results: The study group included 47 physicians with 149,884 ED patient encounters. The variability in prescribing decreased through each stage of the initiative, as represented by the distributions for the opioid prescribing rate: Stage 1 mean 20%; Stage 2 mean 13% (46% reduction, p<0.01); and Stage 3 mean 8% (60% reduction, p<0.01). The mean quantity of pills prescribed per prescription was 16 pills in Stage 1, 14 pills in Stage 2 (18% reduction, p<0.01), and 13 pills in Stage 3 (18% reduction, p<0.01). The group mean
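
The study's primary outcome, the per-provider opioid prescribing rate and its spread across physicians, can be sketched as follows (the per-provider counts below are hypothetical, not data from the study):

```python
import statistics

def prescribing_rate(opioid_discharges, total_discharges):
    """Number of patients discharged with an opioid divided by total discharges."""
    return opioid_discharges / total_discharges

# Hypothetical per-provider counts: (discharges with an opioid, total discharges)
providers = [(120, 500), (45, 400), (200, 600), (30, 450)]
rates = [prescribing_rate(o, n) for o, n in providers]

mean_rate = statistics.mean(rates)   # the group mean shared as feedback
spread = statistics.stdev(rates)     # the variability the QI initiative targets
```

Feedback stages like those in the study aim to shrink `spread` over time while leaving clinically appropriate prescribing intact.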

  3. Working Conditions-Aware Fault Injection Technique

    OpenAIRE

    Alouani , Ihsen; Niar , Smail; Jemai , Mohamed; Kuradi , Fadi; Abid , Mohamed

    2012-01-01

    International audience; With today's integration densities, circuits' sensitivity to environmental and working conditions has increased dramatically. Designing reliable, energy-efficient and error-resilient architectures has thus become one of the major problems to deal with, and evaluating the robustness and effectiveness of proposed architectures is an equally urgent need. In this paper, we present an extension of the SimpleScalar simulation tool with the ability to inject faults in a given...

  4. Error Budgeting

    Energy Technology Data Exchange (ETDEWEB)

    Vinyard, Natalia Sergeevna [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Perry, Theodore Sonne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Usov, Igor Olegovich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-04

    We calculate opacity from k(hν) = -ln[T(hν)]/(ρL), where T(hν) is the transmission for photon energy hν, ρ is sample density, and L is path length through the sample. The density and path length are measured together by Rutherford backscatter. Δk = (∂k/∂T)ΔT + (∂k/∂(ρL))Δ(ρL). We can re-write this in terms of fractional error as Δk/k = Δln(T)/ln(T) + Δ(ρL)/(ρL). Transmission itself is calculated from T = (U-E)/(V-E) = B/B0, where B is the transmitted backlighter (BL) signal and B0 is the unattenuated backlighter signal. Then ΔT/T = Δln(T) = ΔB/B + ΔB0/B0, and consequently Δk/k = (1/ln(T))(ΔB/B + ΔB0/B0) + Δ(ρL)/(ρL). Transmission is measured in the range of 0.2
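
The fractional error budget for the opacity k = -ln(T)/(ρL), with T = B/B0, can be evaluated numerically. A minimal sketch (the example values are illustrative, not from the record):

```python
import math

def opacity_fractional_error(B, dB, B0, dB0, rhoL, drhoL):
    """Fractional error in opacity k = -ln(T)/(rho*L) with T = B/B0.
    Since k*(rho*L) = -ln(T):  dk/k = d(lnT)/ln(T) + d(rhoL)/(rhoL),
    and d(lnT) = dT/T = dB/B + dB0/B0 (worst-case linear addition)."""
    T = B / B0
    d_lnT = dB / B + dB0 / B0
    return abs(d_lnT / math.log(T)) + drhoL / rhoL

# Illustrative numbers: T = 0.5, 2% error on each backlighter signal,
# 3% error on the areal density rho*L from Rutherford backscatter.
err = opacity_fractional_error(B=0.5, dB=0.01, B0=1.0, dB0=0.02, rhoL=1.0, drhoL=0.03)
```

Note that the 1/ln(T) factor blows up as T approaches 1, which is why transmission must be kept well below unity for a useful opacity measurement.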

  5. Measurement and analysis of workload effects on fault latency in real-time systems

    Science.gov (United States)

    Woodbury, Michael H.; Shin, Kang G.

    1990-01-01

    The authors demonstrate the need to address fault latency in highly reliable real-time control computer systems. It is noted that the effectiveness of all known recovery mechanisms is greatly reduced in the presence of multiple latent faults. The presence of multiple latent faults increases the possibility of multiple errors, which could result in coverage failure. The authors present experimental evidence indicating that the duration of fault latency is dependent on workload. A synthetic workload generator is used to vary the workload, and a hardware fault injector is applied to inject transient faults of varying durations. This method makes it possible to derive the distribution of fault latency duration. Experimental results obtained from the fault-tolerant multiprocessor at the NASA Airlab are presented and discussed.

  6. Feature-based handling of surface faults in compact disc players

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Andersen, Palle

    2006-01-01

    In this paper a novel method called feature-based control is presented. The method is designed to improve compact disc players’ handling of surface faults on the discs. The method is based on a fault-tolerant control scheme, which uses extracted features of the surface faults to remove those from...... the detector signals used for control during the occurrence of surface faults. The extracted features are coefficients of Karhunen–Loève approximations of the surface faults. The performance of the feature-based control scheme controlling compact disc players playing discs with surface faults has been...... validated experimentally. The proposed scheme reduces the control errors due to the surface faults, and in some cases where the standard fault handling scheme fails, our scheme keeps the CD-player playing....

  7. Risk Management and the Concept of Human Error

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1995-01-01

    by a stochastic coincidence of faults and human errors, but by a systemic erosion of the defenses due to decision making under competitive pressure in a dynamic environment. The presentation will discuss the nature of human error and the risk management problems found in a dynamic, competitive society facing...

  8. Inappropriate prescribing: criteria, detection and prevention.

    LENUS (Irish Health Repository)

    O'Connor, Marie N

    2012-06-01

    Inappropriate prescribing is highly prevalent in older people and is a major healthcare concern because of its association with negative healthcare outcomes including adverse drug events, related morbidity and hospitalization. With changing population demographics resulting in increasing proportions of older people worldwide, improving the quality and safety of prescribing in older people poses a global challenge. To date a number of different strategies have been used to identify potentially inappropriate prescribing in older people. Over the last two decades, a number of criteria have been published to assist prescribers in detecting inappropriate prescribing, the majority of which have been explicit sets of criteria, though some are implicit. The majority of these prescribing indicators pertain to overprescribing and misprescribing, with only a minority focussing on the underprescribing of indicated medicines. Additional interventions to optimize prescribing in older people include comprehensive geriatric assessment, clinical pharmacist review, and education of prescribers as well as computerized prescribing with clinical decision support systems. In this review, we describe the inappropriate prescribing detection tools or criteria most frequently cited in the literature and examine their role in preventing inappropriate prescribing and other related healthcare outcomes. We also discuss other measures commonly used in the detection and prevention of inappropriate prescribing in older people and the evidence supporting their use and their application in everyday clinical practice.

  9. Sensor fault detection and recovery in satellite attitude control

    Science.gov (United States)

    Nasrolahi, Seiied Saeed; Abdollahi, Farzaneh

    2018-04-01

    This paper proposes an integrated sensor fault detection and recovery scheme for the satellite attitude control system. By introducing a nonlinear observer, healthy sensor measurements are provided. Considering attitude dynamics and kinematics, a novel observer is developed to detect faults in angular rate as well as attitude sensors, individually or simultaneously. There is no limit on the type and configuration of attitude sensors. By designing a state-feedback-based control signal and applying the Lyapunov stability criterion, uniform ultimate boundedness of the tracking errors in the presence of sensor faults is guaranteed. Finally, simulation results are presented to illustrate the performance of the integrated scheme.

  10. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  11. Fault isolability conditions for linear systems with additive faults

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2006-01-01

    In this paper, we shall show that an unlimited number of additive single faults can be isolated under mild conditions if a general isolation scheme is applied. Multiple faults are also covered. The approach is algebraic and is based on a set representation of faults, where all faults within a set...

  12. Line-to-Line Fault Analysis and Location in a VSC-Based Low-Voltage DC Distribution Network

    Directory of Open Access Journals (Sweden)

    Shi-Min Xue

    2018-03-01

    Full Text Available A DC cable short-circuit fault is the most severe fault type that occurs in DC distribution networks, having a negative impact on transmission equipment and the stability of system operation. When a short-circuit fault occurs in a DC distribution network based on a voltage source converter (VSC), an in-depth analysis and characterization of the fault is of great significance for establishing relay protection, devising fault current limiters and realizing fault location. However, research on short-circuit faults in VSC-based low-voltage DC (LVDC) systems, which differ greatly from high-voltage DC (HVDC) systems, has stagnated. The existing research in this area is not conclusive, and further study is required where findings carried over from HVDC systems do not fit simulated results or lack thorough theoretical analysis. In this paper, faults are divided into transient- and steady-state faults, and detailed formulas are provided. A more thorough and practical theoretical analysis, with fewer errors, can be used to develop protection schemes and locate short-circuit faults based on the transient- and steady-state analytic formulas. Compared with classical methods, the fault analyses in this paper give more accurate computed fault currents, so the fault location method can rapidly evaluate the distance between the fault and the converter. An analysis of error growth and an improved handshaking method that coordinates with the proposed location method are also presented.

  13. Medication Administration Errors Involving Paediatric In-Patients in a ...

    African Journals Online (AJOL)

    Erah

    In-Patients in a Hospital in Ethiopia. Yemisirach Feleke ... Purpose: To assess the type and frequency of medication administration errors (MAEs) in the paediatric ward of .... prescribers, does not go beyond obeying ... specialists, 43 general practitioners, 2 health officers ..... Medication Errors, International Council of Nurses.

  14. Fault Analysis in Cryptography

    CERN Document Server

    Joye, Marc

    2012-01-01

    In the 1970s researchers noticed that radioactive particles produced by elements naturally present in packaging material could cause bits to flip in sensitive areas of electronic chips. Research into the effect of cosmic rays on semiconductors, an area of particular interest in the aerospace industry, led to methods of hardening electronic devices designed for harsh environments. Ultimately various mechanisms for fault creation and propagation were discovered, and in particular it was noted that many cryptographic algorithms succumb to so-called fault attacks. Preventing fault attacks without

  15. Fault tolerant control based on active fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2005-01-01

    An active fault diagnosis (AFD) method will be considered in this paper in connection with a Fault Tolerant Control (FTC) architecture based on the YJBK parameterization of all stabilizing controllers. The architecture consists of a fault diagnosis (FD) part and a controller reconfiguration (CR......) part. The FTC architecture can be applied for additive faults, parametric faults, and for system structural changes. Only parametric faults will be considered in this paper. The main focus in this paper is on the use of the new approach of active fault diagnosis in connection with FTC. The active fault...... diagnosis approach is based on including an auxiliary input in the system. A fault signature matrix is introduced in connection with AFD, given as the transfer function from the auxiliary input to the residual output. This can be considered as a generalization of the passive fault diagnosis case, where...

  16. Fault-tolerant measurement-based quantum computing with continuous-variable cluster states.

    Science.gov (United States)

    Menicucci, Nicolas C

    2014-03-28

    A long-standing open question about Gaussian continuous-variable cluster states is whether they enable fault-tolerant measurement-based quantum computation. The answer is yes. Initial squeezing in the cluster above a threshold value of 20.5 dB ensures that errors from finite squeezing acting on encoded qubits are below the fault-tolerance threshold of known qubit-based error-correcting codes. By concatenating with one of these codes and using ancilla-based error correction, fault-tolerant measurement-based quantum computation of theoretically indefinite length is possible with finitely squeezed cluster states.

  17. Quaternary Fault Lines

    Data.gov (United States)

    Department of Homeland Security — This data set contains locations and information on faults and associated folds in the United States that are believed to be sources of M>6 earthquakes during the...

  18. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s⁻¹) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s⁻¹. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  19. Vipava fault (Slovenia

    Directory of Open Access Journals (Sweden)

    Ladislav Placer

    2008-06-01

    Full Text Available During mapping of the already completed Razdrto-Senožeče section of the motorway and geologic surveying of construction operations on the trunk road between Razdrto and Vipava, in the northwestern part of the External Dinarides on the southwestern slope of Mt. Nanos known as Rebrnice, a steep NW-SE striking fault was recognized between the Predjama and Raša faults. The fault was named the Vipava fault after the town of Vipava. An analysis of subrecent gravitational slips at Rebrnice indicates that they were probably associated with the activity of this fault. Unpublished results of a repeated levelling line along the regional road passing across the Vipava fault zone suggest its possible present activity. It would be worthwhile to verify this by appropriate geodetic measurements and to study the active gravitational slips at Rebrnice. The association between tectonics and gravitational slips in this and similar extreme cases in the Alps and Dinarides points to the need for integrated study of geologic processes.

  20. Medication errors reported to the National Medication Error Reporting System in Malaysia: a 4-year retrospective review (2009 to 2012).

    Science.gov (United States)

    Samsiah, A; Othman, Noordin; Jamshed, Shazia; Hassali, Mohamed Azmi; Wan-Mohaina, W M

    2016-12-01

    Reporting and analysing data on medication errors (MEs) is important and contributes to a better understanding of the error-prone environment. This study examines the characteristics of errors submitted to the National Medication Error Reporting System (MERS) in Malaysia. A retrospective review of reports received from 1 January 2009 to 31 December 2012 was undertaken, using descriptive statistics. A total of 17,357 reported MEs were reviewed. The majority of errors came from public-funded hospitals. Near misses were classified in 86.3% of the errors, and the majority of errors (98.1%) had no harmful effects on patients. Prescribing contributed more than three-quarters of the overall errors (76.1%). Pharmacists detected and reported the majority of errors (92.1%). Erroneous dosage or strength of medicine (30.75%) was the leading type of error, whilst cardiovascular (25.4%) was the most common category of drug found. MERS provides rich information on the characteristics of reported MEs. The low contribution to reporting from healthcare facilities other than government hospitals, and from non-pharmacists, requires further investigation. A feasible approach to promote MERS among healthcare providers in both the public and private sectors therefore needs to be formulated and strengthened. Preventive measures to minimise MEs should be directed at improving prescribing competency among the fallible prescribers identified.

  1. Gearbox fault diagnosis based on time-frequency domain synchronous averaging and feature extraction technique

    Science.gov (United States)

    Zhang, Shengli; Tang, Jiong

    2016-04-01

    The gearbox is one of the most vulnerable subsystems in a wind turbine; its health status significantly affects the efficiency and function of the entire system. Vibration-based fault diagnosis methods are widely applied; however, vibration signals are always contaminated by noise arising from data acquisition errors, structural geometric errors, operation errors, etc. As a result, it is difficult to identify potential gear failures directly from vibration signals, especially for early-stage faults. This paper utilizes a synchronous averaging technique in the time-frequency domain to remove non-synchronous noise and enhance the fault-related time-frequency features. The enhanced time-frequency information is further employed in gear fault classification and identification through feature extraction algorithms including Kernel Principal Component Analysis (KPCA), Multilinear Principal Component Analysis (MPCA), and Locally Linear Embedding (LLE). Results show that the LLE approach is the most effective at classifying and identifying different gear faults.
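
The synchronous averaging idea is easiest to see in the plain time domain (the paper applies it in the time-frequency domain): averaging the signal over whole rotation periods cancels components that are not locked to shaft rotation. A simplified sketch with synthetic data (all signal parameters are invented for illustration):

```python
import numpy as np

def synchronous_average(signal, period_samples):
    """Average over whole rotation periods; components not synchronous
    with the rotation (e.g. broadband noise) average toward zero."""
    n_periods = len(signal) // period_samples
    trimmed = signal[:n_periods * period_samples]
    return trimmed.reshape(n_periods, period_samples).mean(axis=0)

# Synthetic example: a gear-mesh harmonic buried in noise
rng = np.random.default_rng(0)
period = 128                                  # samples per shaft revolution
t = np.arange(50 * period)                    # 50 revolutions of data
mesh = np.sin(2 * np.pi * 4 * t / period)     # 4th shaft harmonic (gear mesh)
noisy = mesh + rng.normal(scale=1.0, size=t.size)

avg = synchronous_average(noisy, period)      # noise variance drops ~50x
```

Averaging over N periods reduces non-synchronous noise power by a factor of N, which is why early-stage fault signatures become visible in the averaged waveform.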

  2. Inappropriate prescribing in geriatric patients.

    LENUS (Irish Health Repository)

    Barry, Patrick J

    2012-02-03

    Inappropriate prescribing in older people is a common condition associated with significant morbidity, mortality, and financial costs. Medication use increases with age, and this, in conjunction with an increasing disease burden, is associated with adverse drug reactions. This review outlines why older people are more likely to develop adverse drug reactions and how common the problem is. The use of different tools to identify and measure the problem is reviewed. Common syndromes seen in older adults (eg, falling, cognitive impairment, sleep disturbance) are considered, and recent evidence in relation to medication use for these conditions is reviewed. Finally, we present a brief summary of significant developments in the recent literature for those caring for older people.

  3. Fault morphology of the Iyo Fault, the Median Tectonic Line Active Fault System

    OpenAIRE

    後藤, 秀昭

    1996-01-01

    In this paper, we investigated the various fault features of the Iyo fault and depicted fault lines on detailed topographic maps. The results of this paper are summarized as follows: 1) Distinct evidence of right-lateral movement is continuously discernible along the Iyo fault. 2) Active fault traces are remarkably linear, suggesting that the angle of the fault plane is high. 3) The Iyo fault can be divided into four segments by jogs between left-stepping traces. 4) The mean slip rate is 1.3 ~ ...

  4. Residents' numeric inputting error in computerized physician order entry prescription.

    Science.gov (United States)

    Wu, Xue; Wu, Changxu; Zhang, Kan; Wei, Dong

    2016-04-01

    Computerized physician order entry (CPOE) systems with embedded clinical decision support (CDS) can significantly reduce certain types of prescription error. However, prescription errors still occur. Various factors, such as the numeric inputting methods used in human computer interaction (HCI), produce different error rates and types, but this has received relatively little attention. This study aimed to examine the effects of numeric inputting methods and urgency levels on numeric inputting errors in prescriptions, as well as to categorize the types of errors. Thirty residents participated in four prescribing tasks in which two factors were manipulated: numeric inputting method (numeric row in the main keyboard vs. numeric keypad) and urgency level (urgent vs. non-urgent situation). Multiple aspects of participants' prescribing behavior were also measured in sober prescribing situations. The results revealed that in urgent situations, participants were prone to make mistakes when using the numeric row in the main keyboard. With control for performance in the sober prescribing situation, the effects of the input methods disappeared, and urgency was found to play a significant role in the generalized linear model. Most errors were either omission or substitution types, but the proportions of transposition and intrusion error types were significantly higher than in previous research. Among the numbers 3, 8, and 9, the less common digits used in prescriptions, the error rate was higher, which poses a great risk to patient safety. Urgency played a more important role in CPOE numeric typing errors than typing skills and typing habits. Inputting with the numeric keypad is recommended for its lower error rate in urgent situations. An alternative design could consider increasing the sensitivity of the keys with lower frequency of occurrence and decimals. To improve the usability of CPOE, numeric keyboard design and error detection could benefit from spatial

  5. Do final-year medical students have sufficient prescribing competencies? A systematic literature review.

    Science.gov (United States)

    Brinkman, David J; Tichelaar, Jelle; Graaf, Sanne; Otten, René H J; Richir, Milan C; van Agtmael, Michiel A

    2018-04-01

    Prescribing errors are an important cause of patient safety incidents and are frequently caused by junior doctors. This might be because the prescribing competence of final-year medical students is poor as a result of inadequate clinical pharmacology and therapeutic (CPT) education. We reviewed the literature to investigate which prescribing competencies medical students should have acquired in order to prescribe safely and effectively, and whether these have been attained by the time they graduate. PubMed, EMBASE and ERIC databases were searched from the earliest dates up to and including January 2017, using the terms 'prescribing', 'competence' and 'medical students' in combination. Articles describing or evaluating essential prescribing competencies of final-year medical students were included. Twenty-five articles describing, and 47 articles evaluating, the prescribing competencies of final-year students were included. Although there seems to be some agreement, we found no clear consensus among CPT teachers on which prescribing competencies medical students should have when they graduate. Studies showed that students had a general lack of preparedness, self-confidence, knowledge and skills, specifically regarding general and antimicrobial prescribing and pharmacovigilance. However, the results should be interpreted with caution, given the heterogeneity and methodological weaknesses of the included studies. There is considerable evidence that final-year students have insufficient competencies to prescribe safely and effectively, although there is a need for a greater consensus among CPT teachers on the required competencies. Changes in undergraduate CPT education are urgently required in order to improve the prescribing of future doctors. © 2018 VU University Medical Centre. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.

  6. Nurse prescribing in dermatology: doctors' and non-prescribing nurses' views.

    Science.gov (United States)

    Stenner, Karen; Carey, Nicola; Courtenay, Molly

    2009-04-01

    This paper is a report of a study conducted to explore doctor and non-prescribing nurse views about nurse prescribing in the light of their experience in dermatology. The cooperation of healthcare professionals and peers is of key importance in enabling and supporting nurse prescribing. Lack of understanding of and opposition to nurse prescribing are known barriers to its implementation. Given the important role they play, it is necessary to consider how the recent expansion of nurse prescribing rights in England impacts on the views of healthcare professionals. Interviews with 12 doctors and six non-prescribing nurses were conducted in 10 case study sites across England between 2006 and 2007. Participants all worked with nurses who prescribed for patients with dermatological conditions in secondary or primary care. Thematic analysis was conducted on the interview data. Participants were positive about their experiences of nurse prescribing having witnessed benefits from it, but had reservations about nurse prescribing in general. Acceptance was conditional upon the nurses' level of experience, awareness of their own limitations and the context in which they prescribed. Fears that nurses would prescribe beyond their level of competence were expected to reduce as understanding and experience of nurse prescribing increased. Indications are that nurse prescribing can be acceptable to doctors and nurses so long as it operates within recommended parameters. Greater promotion and assessment of standards and criteria are recommended to improve understanding and acceptance of nurse prescribing.

  7. Guidelines for system modeling: fault tree analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong

    2004-07-01

    This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide the analyst with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, they provide the essential and basic guidelines, and related content, to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally the main objective of system analysis is to assess the reliability of a system modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to system unavailability. These should include contributions due to mechanical failures of components, Common Cause Failures (CCFs), human errors, and outages for testing and maintenance. This document identifies and describes the definitions and general procedures of FTA and the essential and basic guidelines for revising fault trees. Accordingly, the guidelines will be able to guide FTA to the level of capability category II of the ASME PRA standard.
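
The AND/OR gate logic described here reduces, under an independence assumption, to simple probability arithmetic. A minimal sketch of a two-train redundant system (the event probabilities and tree shape are invented for illustration, not taken from the guide):

```python
def and_gate(*probs):
    """P(all input events occur) under independence: product of the inputs."""
    p = 1.0
    for pi in probs:
        p *= pi
    return p

def or_gate(*probs):
    """P(at least one input event occurs) under independence:
    1 minus the product of the complements."""
    q = 1.0
    for pi in probs:
        q *= (1.0 - pi)
    return 1.0 - q

# Hypothetical tree: each train fails via a mechanical fault OR a human error;
# the top event requires both redundant trains to fail.
train = or_gate(1e-3, 5e-4)    # mechanical failure, maintenance error
top = and_gate(train, train)   # both trains fail
```

Real PSA tools additionally handle NOT gates, common cause failure groups and minimal cut sets; this sketch covers only the independent AND/OR arithmetic that a fault tree ultimately evaluates.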

  8. Guidelines for system modeling: fault tree analysis

    International Nuclear Information System (INIS)

    Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong

    2004-07-01

    This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide guidelines for the analyst to construct fault trees at the level of capability category II of the ASME PRA standard. In particular, it provides the essential and basic guidelines and related contents to be used in support of revising the Ulchin 3 and 4 PSA model for risk monitoring within capability category II of the ASME PRA standard. Normally, the main objective of system analysis is to assess the reliability of the systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to mechanical failures of the components, Common Cause Failures (CCFs), human errors, and outages for testing and maintenance. This document identifies and describes the definitions and general procedures of FTA and the essential and basic guidelines for revising the fault trees. Accordingly, the guidelines will be able to guide the FTA to the level of capability category II of the ASME PRA standard.

  9. Medication errors detected in non-traditional databases

    DEFF Research Database (Denmark)

    Perregaard, Helene; Aronson, Jeffrey K; Dalhoff, Kim

    2015-01-01

    AIMS: We have looked for medication errors involving the use of low-dose methotrexate, by extracting information from Danish sources other than traditional pharmacovigilance databases. We used the data to establish the relative frequencies of different types of errors. METHODS: We searched four...... errors, whereas knowledge-based errors more often resulted in near misses. CONCLUSIONS: The medication errors in this survey were most often action-based (50%) and knowledge-based (34%), suggesting that greater attention should be paid to education and surveillance of medical personnel who prescribe...

  10. Fault-tolerant architectures for superconducting qubits

    International Nuclear Information System (INIS)

    DiVincenzo, David P

    2009-01-01

    In this short review, I draw attention to new developments in the theory of fault tolerance in quantum computation that may give concrete direction to future work in the development of superconducting qubit systems. The basics of quantum error-correction codes, which I will briefly review, have not significantly changed since their introduction 15 years ago. But an interesting picture has emerged of an efficient use of these codes that may put fault-tolerant operation within reach. It is now understood that two-dimensional surface codes, close relatives of the original toric code of Kitaev, can be adapted as shown by Raussendorf and Harrington to effectively perform logical gate operations in a very simple planar architecture, with error thresholds for fault-tolerant operation simulated to be 0.75%. This architecture uses topological ideas in its functioning, but it is not 'topological quantum computation'-there are no non-abelian anyons in sight. I offer some speculations on the crucial pieces of superconducting hardware that could be demonstrated in the next couple of years that would be clear stepping stones towards this surface-code architecture.

  11. Modeling coherent errors in quantum error correction

    Science.gov (United States)

    Greenbaum, Daniel; Dutton, Zachary

    2018-01-01

    Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ε) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ε^(-(dn-1)) error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.

  12. Dose error analysis for a scanned proton beam delivery system

    International Nuclear Information System (INIS)

    Coutrakon, G; Wang, N; Miller, D W; Yang, Y

    2010-01-01

    All particle beam scanning systems are subject to dose delivery errors due to errors in position, energy and intensity of the delivered beam. In addition, finite scan speeds, beam spill non-uniformities, and delays in detector, detector electronics and magnet responses will all contribute errors in delivery. In this paper, we present dose errors for an 8 × 10 × 8 cm³ target of uniform water equivalent density with an 8 cm spread-out Bragg peak and a prescribed dose of 2 Gy. Lower doses are also analyzed and presented later in the paper. Beam energy errors and errors due to limitations of scanning system hardware have been included in the analysis. By using Gaussian shaped pencil beams derived from measurements in the research room of the James M Slater Proton Treatment and Research Center at Loma Linda, CA and executing treatment simulations multiple times, statistical dose errors have been calculated in each 2.5 mm cubic voxel in the target. These errors were calculated by delivering multiple treatments to the same volume and calculating the rms variation in delivered dose at each voxel in the target. The variations in dose were the result of random beam delivery errors such as proton energy, spot position and intensity fluctuations. The results show that with reasonable assumptions of random beam delivery errors, the spot scanning technique yielded an rms dose error in each voxel less than 2% or 3% of the 2 Gy prescribed dose. These calculated errors are within acceptable clinical limits for radiation therapy.
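The per-voxel rms error estimate described in this record can be sketched with a toy Monte Carlo: deliver the same plan many times with random intensity errors and take the rms deviation from the prescribed dose in each voxel. The voxel count, treatment count, and 2% relative fluctuation below are illustrative assumptions, not the paper's actual simulation parameters.

```python
# Toy Monte Carlo estimate of per-voxel rms dose error: repeat the
# delivery with random Gaussian spot-intensity errors and compute the
# rms deviation from the prescription in each voxel.
import math
import random

random.seed(0)
prescribed = 2.0            # Gy, prescribed dose per voxel
n_voxels = 100              # illustrative voxel count
n_treatments = 200          # repeated simulated deliveries
sigma_rel = 0.02            # assumed 2% relative intensity fluctuation

# deliveries[t][v] = dose delivered to voxel v during treatment t
deliveries = [
    [prescribed * (1.0 + random.gauss(0.0, sigma_rel)) for _ in range(n_voxels)]
    for _ in range(n_treatments)
]

# rms deviation from the prescription, voxel by voxel
rms = [
    math.sqrt(sum((d[v] - prescribed) ** 2 for d in deliveries) / n_treatments)
    for v in range(n_voxels)
]
worst_pct = 100.0 * max(rms) / prescribed
print(f"worst-voxel rms error: {worst_pct:.2f}% of prescription")
```

With purely random 2% intensity errors, the worst-voxel rms lands near 2-3% of the prescription, consistent in spirit with the clinical limits the record quotes.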

  13. Fault tolerance with noisy and slow measurements and preparation.

    Science.gov (United States)

    Paz-Silva, Gerardo A; Brennen, Gavin K; Twamley, Jason

    2010-09-03

    It is not so well known that measurement-free quantum error correction protocols can be designed to achieve fault-tolerant quantum computing. Despite their potential advantages in terms of the relaxation of accuracy, speed, and addressing requirements, they have usually been overlooked since they are expected to yield a very bad threshold. We show that this is not the case. We design fault-tolerant circuits for the 9-qubit Bacon-Shor code and find an error threshold for unitary gates and preparation of p_thresh^(p,g) = 3.76×10^(-5) (30% of the best known result for the same code using measurement) while admitting up to 1/3 error rates for measurements and allocating no constraints on measurement speed. We further show that demanding gate error rates sufficiently below the threshold pushes the preparation threshold up to p_thresh^(p) = 1/3.

  14. Prescribing Safety in Ambulatory Care: Physician Perspectives

    National Research Council Canada - National Science Library

    Rundall, Thomas G; Hsu, John; Lafata, Jennifer E; Fung, Vicki; Paez, Kathryn A; Simpkins, Jan; Simon, Steven R; Robinson, Scott B; Uratsu, Connie; Gunter, Margaret J; Soumerai, Stephen B; Selby, Joseph V

    2005-01-01

    .... We asked about current safety practices and perceptions of ambulatory prescribing safety. Using a content analysis approach, three investigators independently coded responses into thematic categories...

  15. Active Fault Isolation in MIMO Systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2014-01-01

    Active fault isolation of parametric faults in closed-loop MIMO systems is considered in this paper. The fault isolation consists of two steps. The first step is group-wise fault isolation: here, a group of faults is isolated from other possible faults in the system. The group-wise fault isolation is based directly on the input/output signals applied for the fault detection, and it is guaranteed that the fault group includes the fault that had occurred in the system. The second step is individual fault isolation within the fault group. Both types of isolation are obtained by applying dedicated......

  16. Analysis of error patterns in clinical radiotherapy

    International Nuclear Information System (INIS)

    Macklis, Roger; Meier, Tim; Barrett, Patricia; Weinhous, Martin

    1996-01-01

    Purpose: Until very recently, prescription errors and adverse treatment events have rarely been studied or reported systematically in oncology. We wished to understand the spectrum and severity of radiotherapy errors that take place on a day-to-day basis in a high-volume academic practice and to understand the resource needs and quality assurance challenges placed on a department by rapid upswings in contract-based clinical volumes requiring additional operating hours, procedures, and personnel. The goal was to define clinical benchmarks for operating safety and to detect error-prone treatment processes that might function as 'early warning' signs. Methods: A multi-tiered prospective and retrospective system for clinical error detection and classification was developed, with formal analysis of the antecedents and consequences of all deviations from prescribed treatment delivery, no matter how trivial. A department-wide record-and-verify system was operational during this period and was used as one method of treatment verification and error detection. Brachytherapy discrepancies were analyzed separately. Results: During the analysis year, over 2000 patients were treated with over 93,000 individual fields. A total of 59 errors affecting a total of 170 individual treated fields were reported or detected during this period. After review, all of these errors were classified as Level 1 (minor discrepancy with essentially no potential for negative clinical implications). This total treatment delivery error rate (170/93,332, or 0.18%) is significantly better than corresponding error rates reported for other hospital and oncology treatment services, perhaps reflecting the relatively sophisticated error avoidance and detection procedures used in modern clinical radiation oncology. Error rates were independent of linac model and manufacturer, time of day (normal operating hours versus late evening or early morning) or clinical machine volumes. There was some relationship to

  17. Overview of error-tolerant cockpit research

    Science.gov (United States)

    Abbott, Kathy

    1990-01-01

    The objectives of research in intelligent cockpit aids and intelligent error-tolerant systems are stated. In intelligent cockpit aids research, the objective is to provide increased aid and support to the flight crew of civil transport aircraft through the use of artificial intelligence techniques combined with traditional automation. In intelligent error-tolerant systems, the objective is to develop and evaluate cockpit systems that provide flight crews with safe and effective ways and means to manage aircraft systems, plan and replan flights, and respond to contingencies. A subsystems fault management functional diagram is given. All information is in viewgraph form.

  18. Fault Detection for Industrial Processes

    Directory of Open Access Journals (Sweden)

    Yingwei Zhang

    2012-01-01

    A new fault-relevant KPCA algorithm is proposed. Then the fault detection approach is proposed based on the fault-relevant KPCA algorithm. The proposed method further decomposes both the KPCA principal space and residual space into two subspaces. Compared with traditional statistical techniques, the fault subspace is separated based on the fault-relevant influence. This method can find fault-relevant principal directions and principal components of systematic subspace and residual subspace for process monitoring. The proposed monitoring approach is applied to Tennessee Eastman process and penicillin fermentation process. The simulation results show the effectiveness of the proposed method.
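The record above concerns KPCA-based monitoring; as a far simpler stand-in for the same detection idea, the sketch below builds a control limit from fault-free training data and flags samples outside it. This is not KPCA itself, just the underlying statistical-monitoring principle, and the data are synthetic assumptions.

```python
# Minimal statistical process monitoring sketch: learn a 3-sigma
# control band from fault-free training data, then flag any sample
# outside the band as a fault. (A stand-in for KPCA monitoring,
# which applies the same idea in a kernel feature space.)
import random
import statistics

random.seed(1)
train = [random.gauss(10.0, 0.5) for _ in range(500)]  # normal operation
mu = statistics.fmean(train)
sigma = statistics.stdev(train)
upper, lower = mu + 3 * sigma, mu - 3 * sigma

def is_fault(x):
    """Flag a sample falling outside the 3-sigma control band."""
    return x > upper or x < lower

print(is_fault(10.2), is_fault(13.0))
```

KPCA replaces the single measured variable with principal components of a kernel feature space and the 3-sigma band with T² and SPE control limits, but the flagging logic is the same.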

  19. Fault tree analysis

    International Nuclear Information System (INIS)

    1981-09-01

    Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and finding of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides a clear and reconstructable documentation of the examination. (orig./HP) [de

  20. Analog fault diagnosis by inverse problem technique

    KAUST Repository

    Ahmed, Rania F.

    2011-12-01

    A novel algorithm for detecting soft faults in linear analog circuits based on the inverse problem concept is proposed. The proposed approach utilizes optimization techniques with the aid of sensitivity analysis. The main contribution of this work is to apply the inverse problem technique to estimate the actual parameter values of the tested circuit and thereby to detect and diagnose single faults in analog circuits. The validation of the algorithm is illustrated by applying it to a Sallen-Key second-order band-pass filter; the results show a fault detection efficiency of 100% and a maximum error in the estimated parameter values of 0.7%. This technique can be applied to any other linear circuit and can also be extended to non-linear circuits. © 2011 IEEE.
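The inverse-problem idea in this record can be sketched as a least-squares fit: estimate a component value from measured responses, then compare it against the nominal design value to flag a soft fault. The first-order RC low-pass, component values, and 5% fault tolerance below are illustrative assumptions, not the Sallen-Key filter or method details from the paper.

```python
# Inverse-problem sketch for soft-fault diagnosis: fit a circuit model
# parameter to "measured" responses by least squares, then flag a fault
# when the estimate deviates from the nominal design value.
import math

def gain(R, C, f):
    """Magnitude response of a first-order RC low-pass filter."""
    return 1.0 / math.sqrt(1.0 + (2 * math.pi * f * R * C) ** 2)

C = 100e-9                 # farads, assumed known
R_true = 1.2e3             # actual (drifted, faulty) resistance, ohms
R_nominal = 1.0e3          # design value, ohms
freqs = [100, 500, 1000, 2000, 5000]
measured = [gain(R_true, C, f) for f in freqs]   # simulated measurements

# grid-search least squares over candidate resistances (1-ohm grid)
def sse(R):
    return sum((gain(R, C, f) - m) ** 2 for f, m in zip(freqs, measured))

R_est = min(range(500, 2001), key=sse)
fault = abs(R_est - R_nominal) / R_nominal > 0.05  # >5% drift = soft fault
print(R_est, fault)
```

Here the estimate recovers the drifted value exactly (it lies on the search grid and the data are noise-free), and the 20% deviation from nominal is flagged as a soft fault.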

  1. Error Correcting Codes -34 ...

    Indian Academy of Sciences (India)

    information and coding theory. A large scale relay computer had failed to deliver the expected results due to a hardware fault. Hamming, one of the active proponents of computer usage, was determined to find an efficient means by which computers could detect and correct their own faults. A mathematician by training.
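Hamming's solution can be made concrete with the classic (7,4) code: three parity bits are interleaved with four data bits so that any single flipped bit can be located by its syndrome and corrected. The implementation below is a standard textbook sketch, not code from this article.

```python
# Hamming (7,4) code: encode 4 data bits into a 7-bit codeword whose
# parity-bit syndrome pinpoints (and so corrects) any single bit flip.

def hamming74_encode(d):
    """d: 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 0 = clean, else 1-based error position
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = hamming74_encode(word)
code[4] ^= 1                          # inject a single-bit hardware fault
print(hamming74_decode(code) == word)
```

Flipping any one of the seven bits produces a distinct nonzero syndrome, which is exactly the faulty position, so the decoder always recovers the original data word.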

  2. An Intelligent Actuator Fault Reconstruction Scheme for Robotic Manipulators.

    Science.gov (United States)

    Xiao, Bing; Yin, Shen

    2018-02-01

    This paper investigates the difficult problem of reconstructing actuator faults for robotic manipulators. An intelligent approach with a fast reconstruction property is developed, achieved by using an observer technique. The scheme is capable of precisely reconstructing the actual actuator fault. It is shown by Lyapunov stability analysis that the reconstruction error converges to zero in finite time, so that a precise and fast reconstruction performance can be provided for actuator faults. The most important feature of the scheme is that it does not depend on the control law, the dynamic model of the actuator, the fault type, or the fault time profile. This reconstruction performance and the capability of the proposed approach are further validated by simulation and experimental results.

  3. Multiple Embedded Processors for Fault-Tolerant Computing

    Science.gov (United States)

    Bolotin, Gary; Watson, Robert; Katanyoutanant, Sunant; Burke, Gary; Wang, Mandy

    2005-01-01

    A fault-tolerant computer architecture has been conceived in an effort to reduce vulnerability to single-event upsets (spurious bit flips caused by impingement of energetic ionizing particles or photons). As in some prior fault-tolerant architectures, the redundancy needed for fault tolerance is obtained by use of multiple processors in one computer. Unlike prior architectures, the multiple processors are embedded in a single field-programmable gate array (FPGA). What makes this new approach practical is the recent commercial availability of FPGAs that are capable of having multiple embedded processors. A working prototype (see figure) consists of two embedded IBM PowerPC 405 processor cores and a comparator built on a Xilinx Virtex-II Pro FPGA. This relatively simple instantiation of the architecture implements an error-detection scheme. A planned future version, incorporating four processors and two comparators, would correct some errors in addition to detecting them.
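The error-detection scheme in this prototype (two redundant processor cores feeding one comparator) can be sketched abstractly: both copies run the same task, and the comparator raises an error signal on any disagreement. Detection only; correcting the error, as in the planned four-processor version, requires additional copies. The workload and bit-flip fault below are illustrative assumptions.

```python
# Lockstep dual-redundancy sketch: two "processors" compute the same
# task and a comparator flags any mismatch (single-event upset detection).

def compute(x, bit_flip=False):
    """Stand-in for one processor core executing the shared task."""
    result = x * x + 1
    if bit_flip:
        result ^= 1 << 3        # simulate a single-event upset in one core
    return result

def comparator(a, b):
    """Raise the error signal when the redundant results disagree."""
    return a != b

healthy = comparator(compute(7), compute(7))            # both cores agree
upset = comparator(compute(7), compute(7, bit_flip=True))  # one core upset
print(healthy, upset)
```

With two copies a mismatch tells you *that* an upset occurred but not which core is wrong; that is why the four-processor, two-comparator version described above can also correct some errors.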

  4. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.

  5. Fault Tolerant Computer Architecture

    CERN Document Server

    Sorin, Daniel

    2009-01-01

    For many years, most computer architects have pursued one primary goal: performance. Architects have translated the ever-increasing abundance of ever-faster transistors provided by Moore's law into remarkable increases in performance. Recently, however, the bounty provided by Moore's law has been accompanied by several challenges that have arisen as devices have become smaller, including a decrease in dependability due to physical faults. In this book, we focus on the dependability challenge and the fault tolerance solutions that architects are developing to overcome it. The two main purposes

  6. Fault tolerant linear actuator

    Science.gov (United States)

    Tesar, Delbert

    2004-09-14

    In varying embodiments, the fault tolerant linear actuator of the present invention is a new and improved linear actuator with fault tolerance and positional control that may incorporate velocity summing, force summing, or a combination of the two. In one embodiment, the invention offers a velocity summing arrangement with a differential gear between two prime movers driving a cage, which then drives a linear spindle screw transmission. Other embodiments feature two prime movers driving separate linear spindle screw transmissions, one internal and one external, in a totally concentric and compact integrated module.

  7. Design of passive fault-tolerant flight controller against actuator failures

    Directory of Open Access Journals (Sweden)

    Xiang Yu

    2015-02-01

    The problem of designing a passive fault-tolerant flight controller is addressed when the normal and faulty cases are prescribed. First of all, the considered fault and fault-free cases are formed by polytopes. Considering that the safety of a post-fault system is directly related to the maximum values of physical variables in the system, peak-to-peak gain is selected to represent the relationships among the amplitudes of actuator outputs, system outputs, and reference commands. Based on the parameter-dependent Lyapunov and slack methods, passive fault-tolerant flight controllers in the absence/presence of system uncertainty for actuator failure cases are designed, respectively. Case studies of an airplane under actuator failures are carried out to validate the effectiveness of the proposed approach.

  8. Do final‐year medical students have sufficient prescribing competencies? A systematic literature review

    Science.gov (United States)

    Tichelaar, Jelle; Graaf, Sanne; Otten, René H. J.; Richir, Milan C.; van Agtmael, Michiel A.

    2018-01-01

    Aims Prescribing errors are an important cause of patient safety incidents and are frequently caused by junior doctors. This might be because the prescribing competence of final‐year medical students is poor as a result of inadequate clinical pharmacology and therapeutic (CPT) education. We reviewed the literature to investigate which prescribing competencies medical students should have acquired in order to prescribe safely and effectively, and whether these have been attained by the time they graduate. Methods PubMed, EMBASE and ERIC databases were searched from the earliest dates up to and including January 2017, using the terms ‘prescribing’, ‘competence’ and ‘medical students’ in combination. Articles describing or evaluating essential prescribing competencies of final‐year medical students were included. Results Twenty‐five articles describing, and 47 articles evaluating, the prescribing competencies of final‐year students were included. Although there seems to be some agreement, we found no clear consensus among CPT teachers on which prescribing competencies medical students should have when they graduate. Studies showed that students had a general lack of preparedness, self‐confidence, knowledge and skills, specifically regarding general and antimicrobial prescribing and pharmacovigilance. However, the results should be interpreted with caution, given the heterogeneity and methodological weaknesses of the included studies. Conclusions There is considerable evidence that final‐year students have insufficient competencies to prescribe safely and effectively, although there is a need for a greater consensus among CPT teachers on the required competencies. Changes in undergraduate CPT education are urgently required in order to improve the prescribing of future doctors. PMID:29315721

  9. Wind turbine fault detection and fault tolerant control

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Johnson, Kathryn

    2013-01-01

    In this updated edition of a previous wind turbine fault detection and fault tolerant control challenge, we present a more sophisticated wind turbine model and updated fault scenarios to enhance the realism of the challenge and therefore the value of the solutions. This paper describes...

  10. Analysis of Medication Errors in Simulated Pediatric Resuscitation by Residents

    Directory of Open Access Journals (Sweden)

    Evelyn Porter

    2014-07-01

    Introduction: The objective of our study was to estimate the incidence of prescribing medication errors specifically made by a trainee and identify factors associated with these errors during the simulated resuscitation of a critically ill child. Methods: The results of the simulated resuscitation are described. We analyzed data from the simulated resuscitation for the occurrence of a prescribing medication error. We compared univariate analysis of each variable to medication error rate and performed a separate multiple logistic regression analysis on the significant univariate variables to assess the association between the selected variables. Results: We reviewed 49 simulated resuscitations. The final medication error rate for the simulation was 26.5% (95% CI 13.7%-39.3%). On univariate analysis, statistically significant findings for decreased prescribing medication error rates included senior residents in charge, presence of a pharmacist, sleeping greater than 8 hours prior to the simulation, and a visual analog scale score showing more confidence in caring for critically ill children. Multiple logistic regression analysis using the above significant variables showed only the presence of a pharmacist to remain significantly associated with decreased medication error, odds ratio of 0.09 (95% CI 0.01-0.64). Conclusion: Our results indicate that the presence of a clinical pharmacist during the resuscitation of a critically ill child reduces the medication errors made by resident physician trainees.
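The odds ratio with 95% confidence interval reported in this record follows the standard computation from a 2×2 table (log odds ratio ± 1.96 × standard error). The sketch below shows that computation on made-up counts; these are not the study's data, and the study's own estimate came from multivariable logistic regression rather than a raw 2×2 table.

```python
# Standard odds ratio and 95% CI from a 2x2 table, via the log-OR
# normal approximation: exp(ln(OR) +/- 1.96 * SE), SE = sqrt(sum 1/n).
import math

def odds_ratio_ci(a, b, c, d):
    """a,b = error / no-error with pharmacist; c,d = without pharmacist."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(2, 18, 11, 18)
print(f"OR = {or_:.2f}  (95% CI {lo:.2f} - {hi:.2f})")
```

An odds ratio below 1 with a confidence interval that excludes 1 is what licenses a conclusion like "the presence of a pharmacist reduces errors".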

  11. Prevalence and Predictors of Inappropriate Medications Prescribing ...

    African Journals Online (AJOL)

    Data analysis involved use of World Health Organization (WHO) prescribing indicators, Updated 2002 Beer's criteria and DRUG-REAX® system software package of MICROMEDEX (R) Healthcare Series to assess the prescribing pattern, identify potentially inappropriate medications and potential drug-drug interactions, ...

  12. Psychiatric Prescribers' Experiences With Doctor Shoppers.

    Science.gov (United States)

    Worley, Julie; Johnson, Mary; Karnik, Niranjan

    2015-01-01

    Doctor shopping is a primary method of prescription medication diversion. After opioids, benzodiazepines and stimulants are the next most common prescription medications used nonmedically. Studies have shown that patients who engage in doctor shopping find it fun, exciting, and easy to do. There is a lack of research on the prescriber's perspective on the phenomenon of doctor shopping. This study investigates the experiences of prescribers in psychiatry with patients who engage in doctor shopping. Fifteen prescribers including psychiatrists and psychiatric nurse practitioners working in outpatient psychiatry were interviewed to elicit detailed information about their experiences with patients who engage in doctor shopping. Themes found throughout the interview were that psychiatric prescribers' experience with patients who engage in doctor shopping includes (a) detecting red flags, (b) negative emotional responding, (c) addressing the patient and the problem, and (d) inconsistently implementing precautions. When red flags were detected when prescribing controlled drugs, prescribers in psychiatry experienced both their own negative emotional responses such as disappointment and resentment as well as the negative emotions of the patients such as anger and other extreme emotional responses. Psychiatric prescribers responded to patient's doctor shopping in a variety of ways such as changing their practice, discharging the patients or taking steps to not accept certain patients identified as being at risk for doctor shopping, as well as by talking to the patient and trying to offer them help. Despite experiencing doctor shopping, the prescribers inconsistently implemented precautionary measures such as checking prescription drug monitoring programs. © The Author(s) 2015.

  13. SIFT - Design and analysis of a fault-tolerant computer for aircraft control. [Software Implemented Fault Tolerant systems

    Science.gov (United States)

    Wensley, J. H.; Lamport, L.; Goldberg, J.; Green, M. W.; Levitt, K. N.; Melliar-Smith, P. M.; Shostak, R. E.; Weinstock, C. B.

    1978-01-01

    SIFT (Software Implemented Fault Tolerance) is an ultrareliable computer for critical aircraft control applications that achieves fault tolerance by the replication of tasks among processing units. The main processing units are off-the-shelf minicomputers, with standard microcomputers serving as the interface to the I/O system. Fault isolation is achieved by using a specially designed redundant bus system to interconnect the processing units. Error detection and analysis and system reconfiguration are performed by software. Iterative tasks are redundantly executed, and the results of each iteration are voted upon before being used. Thus, any single failure in a processing unit or bus can be tolerated with triplication of tasks, and subsequent failures can be tolerated after reconfiguration. Independent execution by separate processors means that the processors need only be loosely synchronized, and a novel fault-tolerant synchronization method is described.
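The replicate-and-vote mechanism at the heart of SIFT can be sketched abstractly: each iterative task runs on several processors, and the results are voted upon before being used, so any single faulty unit is outvoted. The task function and injected fault below are illustrative assumptions, not SIFT code.

```python
# SIFT-style task replication: run the same task on three processors
# and majority-vote the results before use, masking one faulty unit.
from collections import Counter

def majority_vote(results):
    """Return the value reported by a strict majority of replicas."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: more than one replica faulty")
    return value

def task(x):
    """Stand-in for one iteration of a replicated control task."""
    return 3 * x + 2

# Three replicas of the same iteration; the third unit is faulty.
replicas = [task(5), task(5), task(5) + 99]
print(majority_vote(replicas))
```

Triplication masks any single failure; as the record notes, subsequent failures can then be tolerated only after the system reconfigures around the unit that was outvoted.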

  14. Standardized Competencies for Parenteral Nutrition Prescribing: The American Society for Parenteral and Enteral Nutrition Model.

    Science.gov (United States)

    Guenter, Peggi; Boullata, Joseph I; Ayers, Phil; Gervasio, Jane; Malone, Ainsley; Raymond, Erica; Holcombe, Beverly; Kraft, Michael; Sacks, Gordon; Seres, David

    2015-08-01

    Parenteral nutrition (PN) provision is complex, as it is a high-alert medication and prone to a variety of potential errors. With changes in clinical practice models and recent federal rulings, the number of PN prescribers may be increasing. Safe prescribing of this therapy requires that competency for prescribers from all disciplines be demonstrated using a standardized process. A standardized model for PN prescribing competency is proposed based on a competency framework, the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.)-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines. This framework will guide institutions and agencies in developing and maintaining competency for safe PN prescription by their staff. © 2015 American Society for Parenteral and Enteral Nutrition.

  15. Fault management and systems knowledge

    Science.gov (United States)

    2016-12-01

    Pilots are asked to manage faults during flight operations. This leads to the training question of the type and depth of system knowledge required to respond to these faults. Based on discussions with multiple airline operators, there is agreement th...

  16. ESR dating of fault rocks

    International Nuclear Information System (INIS)

    Lee, Hee Kwon

    2002-03-01

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs grain size shows a plateau for grains below a critical size: these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected from the Yangsan fault system. ESR dates from this fault system range from 870 to 240 ka. Results of this research suggest that long-term cyclic fault activity continued into the Pleistocene.
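The age relation stated in this record (age = equivalent dose / dose rate) is direct to apply. The dose values below are illustrative, chosen only to land in the record's reported ka range, and are not measurements from the study.

```python
# ESR age from the record's relation: the ratio of the equivalent dose
# (Gy needed to reproduce the observed signal) to the dose rate.

def esr_age_ka(equivalent_dose_gy, dose_rate_gy_per_ka):
    """ESR age in thousands of years (ka)."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

# e.g. an equivalent dose of 2610 Gy at an assumed dose rate of 3 Gy/ka
print(esr_age_ka(2610.0, 3.0))
```

Consistent units matter: with the dose rate in Gy per ka, the quotient is directly an age in ka, matching the 870-240 ka range quoted above.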

  17. Fault diagnosis of induction motors

    CERN Document Server

    Faiz, Jawad; Joksimović, Gojko

    2017-01-01

    This book is a comprehensive, structural approach to fault diagnosis strategy. The different fault types, signal processing techniques, and loss characterisation are addressed in the book. This is essential reading for work with induction motors for transportation and energy.

  18. ESR dating of fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2002-03-15

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs grain size shows a plateau for grains below a critical size: these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected from the Yangsan fault system. ESR dates from this fault system range from 870 to 240 ka. Results of this research suggest that long-term cyclic fault activity continued into the Pleistocene.

  19. Scalable error correction in distributed ion trap computers

    International Nuclear Information System (INIS)

    Oi, Daniel K. L.; Devitt, Simon J.; Hollenberg, Lloyd C. L.

    2006-01-01

    A major challenge for quantum computation in ion trap systems is the scalable integration of error correction and fault tolerance. We analyze a distributed architecture with rapid high-fidelity local control within nodes and entangled links between nodes, alleviating long-distance ion transport. We demonstrate fault-tolerant operator measurements which are used for error correction and nonlocal gates. This scheme is readily applied to linear ion traps, which cannot be scaled up beyond a few ions per individual trap but which have access to a probabilistic entanglement mechanism. A proof-of-concept system is presented which is within the reach of current experiments.

  20. Influences on the prescribing of new drugs.

    Science.gov (United States)

    Tobin, Luke; de Almedia Neto, Abelio C; Wutzke, Sonia; Patterson, Craig; Mackson, Judith; Weekes, Lynn; Williamson, Margaret

    2008-01-01

    The aim of this study was to identify the factors that influence prescribing of new drugs among general practitioners, endocrinologists and psychiatrists. Four focus groups were conducted with GPs, endocrinologists and psychiatrists on sources of awareness and influences on prescribing of new drugs. Pharmaceutical companies were the most important source for becoming aware of new drugs. There were many influences on the decision to prescribe a new drug, the most important being efficacy, safety, cost and advantage over existing therapies. Endocrinologists placed greater emphasis on evidence from clinical trials and scientific conferences, and psychiatrists and GPs placed more weight on pharmaceutical representatives, colleagues and specialists. New drug prescribing occurs in a complex environment with many influences. Effective interventions to promote rational, safe and effective prescribing of new drugs will need to be cognisant of these factors.

  1. Antipsychotic prescribing in older people.

    Science.gov (United States)

    Neil, Wendy; Curran, Stephen; Wattis, John

    2003-09-01

    older people. There is a need to redress this balance to ensure that the prescribing of antipsychotics in older people is evidence based.

  2. Introduction to fault tree analysis

    International Nuclear Information System (INIS)

    Barlow, R.E.; Lambert, H.E.

    1975-01-01

    An elementary, engineering-oriented introduction to fault tree analysis is presented. The basic concepts, techniques and applications of fault tree analysis (FTA) are described. The two major steps of FTA are identified as (1) the construction of the fault tree and (2) its evaluation. The evaluation of the fault tree can be qualitative or quantitative, depending upon the scope, extensiveness and use of the analysis. The advantages, limitations and usefulness of FTA are discussed.
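
The quantitative evaluation step mentioned above can be illustrated for a small tree of independent basic events; the two-pump-and-valve tree and its failure probabilities below are hypothetical, chosen only to show how AND and OR gates combine.

```python
from math import prod

def and_gate(probs):
    """P(all inputs fail); basic events assumed independent."""
    return prod(probs)

def or_gate(probs):
    """P(at least one input fails); basic events assumed independent."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical tree: TOP = OR( AND(pump_a, pump_b), control_valve )
pump_a, pump_b, control_valve = 0.01, 0.01, 0.001
p_top = or_gate([and_gate([pump_a, pump_b]), control_valve])
```

With these numbers the redundant pumps contribute only 1e-4 to the top event, so the single valve dominates the result.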

  3. Fault Tolerant Wind Farm Control

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2013-01-01

    In the recent years the wind turbine industry has focused on optimizing the cost of energy. One of the important factors in this is to increase reliability of the wind turbines. Advanced fault detection, isolation and accommodation are important tools in this process. Clearly most faults are deal...... scenarios. This benchmark model is used in an international competition dealing with Wind Farm fault detection and isolation and fault tolerant control....

  4. Medication errors with the use of allopurinol and colchicine: a retrospective study of a national, anonymous Internet-accessible error reporting system.

    Science.gov (United States)

    Mikuls, Ted R; Curtis, Jeffrey R; Allison, Jeroan J; Hicks, Rodney W; Saag, Kenneth G

    2006-03-01

    To more closely assess medication errors in gout care, we examined data from a national, Internet-accessible error reporting program over a 5-year reporting period. We examined data from the MEDMARX database, covering the period from January 1, 1999 through December 31, 2003. For allopurinol and colchicine, we examined error severity, source, type, contributing factors, and healthcare personnel involved in errors, and we detailed errors resulting in patient harm. Causes of error and the frequency of other error characteristics were compared for gout medications versus other musculoskeletal treatments using the chi-square statistic. Gout medication errors occurred in 39% (n = 273) of facilities participating in the MEDMARX program. Reported errors were predominantly from the inpatient hospital setting and related to the use of allopurinol (n = 524), followed by colchicine (n = 315), probenecid (n = 50), and sulfinpyrazone (n = 2). Compared to errors involving other musculoskeletal treatments, allopurinol and colchicine errors were more often ascribed to problems with physician prescribing (7% for other therapies versus 23-39% for allopurinol and colchicine, p < 0.0001) and less often due to problems with drug administration or nursing error (50% vs 23-27%, p < 0.0001). Our results suggest that inappropriate prescribing practices are characteristic of errors occurring with the use of allopurinol and colchicine. Physician prescribing practices are a potential target for quality improvement interventions in gout care.
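
The chi-square comparison used in the study (error characteristics for gout medications versus other musculoskeletal treatments) reduces, for a 2x2 table, to the standard Pearson statistic. The sketch below is a generic illustration with invented counts, not the MEDMARX data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. prescribing-related vs other error sources,
    for one drug group vs another."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented counts for illustration only:
stat = chi_square_2x2(10, 20, 30, 40)
```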

  5. Row fault detection system

    Science.gov (United States)

    Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN

    2008-10-14

    An apparatus, program product and method checks for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine a presence of a faulty node or connection.
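
A minimal sketch of the neighbor-communication idea in this record: each node pings its adjacent neighbors in the row, and the failed links are analyzed to locate the suspect node. The `ping_ok` callback and the count-based analysis are assumptions for illustration, not the patented method's actual logic.

```python
def find_faulty_nodes(row_size, ping_ok):
    """Identify suspect nodes in a row from neighbor ping results.
    `ping_ok(i, j)` reports whether node i successfully reached adjacent
    node j. Nodes implicated in the most failed links are flagged."""
    failures = {}
    for i in range(row_size - 1):
        for a, b in ((i, i + 1), (i + 1, i)):
            if not ping_ok(a, b):
                failures[a] = failures.get(a, 0) + 1
                failures[b] = failures.get(b, 0) + 1
    if not failures:
        return []
    worst = max(failures.values())
    return sorted(n for n, c in failures.items() if c == worst)

# Hypothetical scenario: node 2 of 5 is dead, so links 1<->2 and 2<->3 fail.
dead = {2}
ok = lambda a, b: a not in dead and b not in dead
suspects = find_faulty_nodes(5, ok)
```

Because a dead interior node fails every link it touches, it accumulates more failure counts than its healthy neighbors and is singled out.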

  6. Fault isolation techniques

    Science.gov (United States)

    Dumas, A.

    1981-01-01

    Three major areas that are considered in the development of an overall maintenance scheme of computer equipment are described. The areas of concern related to fault isolation techniques are: the programmer (or user), company and its policies, and the manufacturer of the equipment.

  7. Fault Tolerant Control Systems

    DEFF Research Database (Denmark)

    Bøgh, S. A.

    This thesis considered the development of fault tolerant control systems. The focus was on the category of automated processes that do not necessarily comprise a high number of identical sensors and actuators to maintain safe operation, but still have a potential for improving immunity to component...

  8. Fault-Related Sanctuaries

    Science.gov (United States)

    Piccardi, L.

    2001-12-01

    Beyond the study of historical surface faulting events, this work investigates the possibility, in specific cases, of identifying pre-historical events whose memory survives in myths and legends. The myths of many famous sacred places of the ancient world contain relevant telluric references: "sacred" earthquakes, openings to the Underworld and/or chthonic dragons. Given the strong correspondence with local geological evidence, these myths may be considered as describing natural phenomena. It has been possible in this way to shed light on the geologic origin of famous myths (Piccardi, 1999, 2000 and 2001). Interdisciplinary researches reveal that the origin of several ancient sanctuaries may be linked in particular to peculiar geological phenomena observed on local active faults (like ground shaking and coseismic surface ruptures, gas and flames emissions, strong underground rumours). In many of these sanctuaries the sacred area is laid directly above the active fault. In a few cases, faulting has affected also the archaeological relics, right through the main temple (e.g. Delphi, Cnidus, Hierapolis of Phrygia). As such, the arrangement of the cult site and content of relative myths suggest that specific points along the trace of active faults have been noticed in the past and worshiped as special `sacred' places, most likely interpreted as Hades' Doors. The mythological stratification of most of these sanctuaries dates back to prehistory, and points to a common derivation from the cult of the Mother Goddess (the Lady of the Doors), which was largely widespread since at least 25000 BC. The cult itself was later reconverted into various different divinities, while the `sacred doors' of the Great Goddess and/or the dragons (offspring of Mother Earth and generally regarded as Keepers of the Doors) persisted in more recent mythologies. 
Piccardi L., 1999: The "Footprints" of the Archangel: Evidence of Early-Medieval Surface Faulting at Monte Sant'Angelo (Gargano, Italy

  9. Simultaneous Event-Triggered Fault Detection and Estimation for Stochastic Systems Subject to Deception Attacks.

    Science.gov (United States)

    Li, Yunji; Wu, QingE; Peng, Li

    2018-01-23

    In this paper, a synthesized design of a fault-detection filter and fault estimator is considered for a class of discrete-time stochastic systems in the framework of an event-triggered transmission scheme subject to unknown disturbances and deception attacks. A random variable obeying the Bernoulli distribution is employed to characterize the randomly occurring deception attacks. To make the fault-detection residual sensitive only to faults while remaining robust to disturbances, a coordinate transformation approach is exploited. This approach transforms the considered system into two subsystems, and the unknown disturbances are removed from one of the subsystems. The gain of the fault-detection filter is derived by minimizing an upper bound of the filter error covariance. Meanwhile, system faults can be reconstructed by the remote fault estimator. A recursive approach is developed to obtain the fault estimator gains as well as to guarantee the fault estimator performance. Furthermore, the corresponding event-triggered sensor data transmission scheme is also presented for improving the working life of the wireless sensor node when measurement information is transmitted aperiodically. Finally, a scaled version of an industrial system consisting of a local PC, remote estimator and wireless sensor node is used to experimentally evaluate the proposed theoretical results. In particular, a novel fault-alarming strategy is proposed so that the real-time capacity of fault detection is guaranteed when the event condition is triggered.
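
The two ingredients named in the abstract, an event-triggered transmission rule and a Bernoulli-distributed deception attack, can be sketched as follows. The send-on-deviation trigger and the additive attack bias are simplifying assumptions for illustration, not the paper's exact scheme.

```python
import random

def simulate_transmissions(measurements, threshold, attack_prob, attack_bias, rng):
    """Event-triggered transmission with randomly occurring deception attacks.
    A sample is sent only when it deviates from the last sent value by more
    than `threshold`; with Bernoulli probability `attack_prob` the sent value
    is corrupted by `attack_bias` (the deception attack)."""
    sent = []
    last = None
    for y in measurements:
        if last is None or abs(y - last) > threshold:
            attacked = rng.random() < attack_prob  # Bernoulli indicator
            sent.append(y + attack_bias if attacked else y)
            last = y
    return sent

rng = random.Random(0)
# With attack_prob = 0 only the trigger acts: small changes are suppressed.
out = simulate_transmissions([0.0, 0.05, 0.3, 0.31, 1.0], 0.1, 0.0, 5.0, rng)
```

Only three of the five samples are transmitted, which is the point of event triggering: sensor energy is spent only on informative measurements.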

  10. Fractional-order adaptive fault estimation for a class of nonlinear fractional-order systems

    KAUST Repository

    N'Doye, Ibrahima; Laleg-Kirati, Taous-Meriem

    2015-01-01

    This paper studies the problem of fractional-order adaptive fault estimation for a class of fractional-order Lipschitz nonlinear systems using a fractional-order adaptive fault observer. Sufficient conditions for the asymptotic convergence of the fractional-order state estimation error and of the conventional integer-order and fractional-order fault estimation errors are derived in terms of a linear matrix inequality (LMI) formulation, by introducing a continuous frequency distributed equivalent model and using an indirect Lyapunov approach, where the fractional order α satisfies 0 < α < 1. A numerical example is given to demonstrate the validity of the proposed approach.

  11. Fractional-order adaptive fault estimation for a class of nonlinear fractional-order systems

    KAUST Repository

    N'Doye, Ibrahima

    2015-07-01

    This paper studies the problem of fractional-order adaptive fault estimation for a class of fractional-order Lipschitz nonlinear systems using a fractional-order adaptive fault observer. Sufficient conditions for the asymptotic convergence of the fractional-order state estimation error and of the conventional integer-order and fractional-order fault estimation errors are derived in terms of a linear matrix inequality (LMI) formulation, by introducing a continuous frequency distributed equivalent model and using an indirect Lyapunov approach, where the fractional order α satisfies 0 < α < 1. A numerical example is given to demonstrate the validity of the proposed approach.

  12. Spent fuel bundle counter sequence error manual - RAPPS (200 MW) NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However, if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpreting the various sequence error messages that can occur and suggests the probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults; therefore the card file contains sequences with only one fault at a time. Some exceptions have been included, however, where experience has indicated that several faults can occur simultaneously.
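
The card-file lookup described in the manual can be emulated as a simple table keyed by message code, with a fallback for combinations not on file. The codes and cause texts below are hypothetical examples, not actual SFBC messages.

```python
# Hypothetical card file: one probable cause per single-fault sequence error.
SEQUENCE_ERROR_CARDS = {
    "E01": "Bundle sensed out of order: transfer executed in a non-standard manner",
    "E02": "Count mismatch: SFBC detector malfunction suspected",
}

def probable_cause(code):
    """Return the card text for a sequence error code, or a fallback for
    codes not on file (e.g. several simultaneous faults)."""
    return SEQUENCE_ERROR_CARDS.get(
        code, "No card on file: possible multiple simultaneous faults")
```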

  13. Spent fuel bundle counter sequence error manual - KANUPP (125 MW) NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However, if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message may contain adequate information to determine the cause of the message. This manual provides a guide to interpreting the various sequence error messages that can occur and suggests the probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults; therefore the card file contains sequences with only one fault at a time. Some exceptions have been included, however, where experience has indicated that several faults can occur simultaneously.

  14. LAMPF first-fault identifier for fast transient faults

    International Nuclear Information System (INIS)

    Swanson, A.R.; Hill, R.E.

    1979-01-01

    The LAMPF accelerator is presently producing 800-MeV proton beams at 0.5 mA average current. Machine protection for such a high-intensity accelerator requires a fast shutdown mechanism which can turn off the beam within a few microseconds of the occurrence of a machine fault. The resulting beam unloading transients cause the rf systems to exceed control loop tolerances and consequently generate multiple fault indications for identification by the control computer. The problem is to isolate the primary fault, or cause of beam shutdown, while disregarding as many as 50 secondary fault indications that occur as a result of beam shutdown. The LAMPF First-Fault Identifier (FFI) for fast transient faults is operational and has proven capable of first-fault identification. The FFI design utilized features of the Fast Protection System that were previously implemented for beam chopping and rf power conservation; no software changes were required.

  15. Analysis of large fault trees based on functional decomposition

    International Nuclear Information System (INIS)

    Contini, Sergio; Matuzas, Vaidas

    2011-01-01

    With the advent of the Binary Decision Diagrams (BDD) approach in fault tree analysis, a significant enhancement has been achieved with respect to previous approaches, both in terms of efficiency and accuracy of the overall outcome of the analysis. However, the exponential increase of the number of nodes with the complexity of the fault tree may prevent the construction of the BDD. In these cases, the only way to complete the analysis is to reduce the complexity of the BDD by applying the truncation technique, which nevertheless implies the problem of estimating the truncation error or upper and lower bounds of the top-event unavailability. This paper describes a new method to analyze large coherent fault trees which can be advantageously applied when the working memory is not sufficient to construct the BDD. It is based on the decomposition of the fault tree into simpler disjoint fault trees containing a lower number of variables. The analysis of each simple fault tree is performed by using all the computational resources. The results from the analysis of all simpler fault trees are re-combined to obtain the results for the original fault tree. Two decomposition methods are herewith described: the first aims at determining the minimal cut sets (MCS) and the upper and lower bounds of the top-event unavailability; the second can be applied to determine the exact value of the top-event unavailability. Potentialities, limitations and possible variations of these methods will be discussed with reference to the results of their application to some complex fault trees.
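
Exact top-event evaluation by conditioning on basic events one at a time (pivotal, or Shannon, decomposition) is one simple instance of decomposition-based analysis. The sketch below is illustrative only: it enumerates all assignments and so scales far worse than the BDD-based and functional-decomposition methods the paper describes, but it shows the decompose-then-recombine idea on a tiny hypothetical tree.

```python
def top_event_prob(structure, probs):
    """Exact top-event probability by pivotal (Shannon) decomposition:
    P(f) = p_x * P(f | x=1) + (1 - p_x) * P(f | x=0), applied recursively.
    `structure` maps a dict of basic-event states {name: 0/1} to 0/1;
    `probs` maps each basic event to its failure probability."""
    names = list(probs)

    def recurse(assignment, remaining):
        if not remaining:
            return float(structure(assignment))
        x, rest = remaining[0], remaining[1:]
        p = probs[x]
        return (p * recurse({**assignment, x: 1}, rest)
                + (1 - p) * recurse({**assignment, x: 0}, rest))

    return recurse({}, names)

# Hypothetical coherent tree: TOP = a AND (b OR c)
tree = lambda s: s["a"] and (s["b"] or s["c"])
p = top_event_prob(tree, {"a": 0.1, "b": 0.2, "c": 0.3})
```

For this tree the independent-events formula gives 0.1 * (1 - 0.8 * 0.7) = 0.044, which the decomposition reproduces exactly.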

  16. Analysis of large fault trees based on functional decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Contini, Sergio, E-mail: sergio.contini@jrc.i [European Commission, Joint Research Centre, Institute for the Protection and Security of the Citizen, 21020 Ispra (Italy); Matuzas, Vaidas [European Commission, Joint Research Centre, Institute for the Protection and Security of the Citizen, 21020 Ispra (Italy)

    2011-03-15

    With the advent of the Binary Decision Diagrams (BDD) approach in fault tree analysis, a significant enhancement has been achieved with respect to previous approaches, both in terms of efficiency and accuracy of the overall outcome of the analysis. However, the exponential increase of the number of nodes with the complexity of the fault tree may prevent the construction of the BDD. In these cases, the only way to complete the analysis is to reduce the complexity of the BDD by applying the truncation technique, which nevertheless implies the problem of estimating the truncation error or upper and lower bounds of the top-event unavailability. This paper describes a new method to analyze large coherent fault trees which can be advantageously applied when the working memory is not sufficient to construct the BDD. It is based on the decomposition of the fault tree into simpler disjoint fault trees containing a lower number of variables. The analysis of each simple fault tree is performed by using all the computational resources. The results from the analysis of all simpler fault trees are re-combined to obtain the results for the original fault tree. Two decomposition methods are herewith described: the first aims at determining the minimal cut sets (MCS) and the upper and lower bounds of the top-event unavailability; the second can be applied to determine the exact value of the top-event unavailability. Potentialities, limitations and possible variations of these methods will be discussed with reference to the results of their application to some complex fault trees.

  17. Wind Power and Fault Clearance. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Vikesjoe, Johnny; Messing, Lars (Gothia Power (Sweden))

    2011-04-15

    in case of a fault occurring elsewhere in the network. This can occur in a feeder bay connecting generation. Directional short circuit protection is proposed in these cases. - Prevention of feeder protection function. Fault current infeed from generation connected along a feeder will influence the fault current through the feeding bay. This problem probably occurs only for very long feeders with large infeed close to the feeding substation. - Clearance of busbar faults in the feeding substation. The normally used blocked overcurrent protection of busbars must be modified in case of fault current infeed from any feeder. Two solutions are possible: use of arc detection protection and/or directional short circuit protection in bays connecting feeders with generation. The impact of fault current infeed from wind generator systems during grid faults is discussed. Fault currents from the new types of generator systems, the DFIG (Doubly Fed Induction Generator) and the full power converter connected generator, differ from those of conventional synchronous generators. It is therefore concluded that conventional fault calculations will not give correct fault current levels. In many applications this error is negligible, but not always. Three different types of wind power applications are studied: - Protection with a limited number of wind power units connected to a distribution feeder - Protection with a small wind farm connected to one feeder in a distribution system - Protection of a wind farm connected to the sub-transmission or transmission system Short circuits and earth faults are studied for different fault locations: in the wind power plant, on a feeder in the distribution/collection grid and in the connecting subtransmission/transmission grid. For these faults different kinds of protection are discussed. Protection against deviating voltage and frequency is also discussed. In conclusion, guidelines are given for the choice of protection of different objects: - Protection in a substation bay

  18. Neuropharmacology and mental health nurse prescribers.

    Science.gov (United States)

    Skingsley, David; Bradley, Eleanor J; Nolan, Peter

    2006-08-01

    To outline the development and content of a 'top-up' neuropharmacology module for mental health nurse prescribers and consider how much pharmacology training is required to ensure effective mental health prescribing practice. Debate about the content of prescribing training courses has persisted within the United Kingdom since the mid-1980s. In early 2003 supplementary prescribing was introduced and gave mental health nurses the opportunity to become prescribers. The challenge of the nurse prescribing curriculum for universities is that they have only a short time to provide nurses from a range of backgrounds with enough knowledge to ensure that they meet agreed levels of competency for safe prescribing. There is growing concern within mental health care that the prescribing of medication in mental health services falls short of what would be deemed good practice. Over the past two decades, nurse training has increasingly adopted a psychosocial approach to nursing care, raising concerns that, although nurses attending prescribing training may be able to communicate effectively with service users, they may lack the basic knowledge of biology and pharmacology to make effective decisions about medication. Following the completion of a general nurse prescribing course, the mental health nurses who attended were asked to identify their specific needs during the evaluation phase. Although they had covered basic pharmacological principles in their training, they stated that they needed more specific information about drugs used in mental health, particularly how to select appropriate drug treatments for mental health conditions. This paper describes how the nurses were involved in the design of a specific module which would enable them to transfer their theoretical learning to practice and, in so doing, increase their confidence in their new roles. The findings of this study suggest that the understanding and confidence of mental health nurse prescribers about the drugs they

  19. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. 
In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  20. Antibiotic prescribing in dental practice in Belgium.

    Science.gov (United States)

    Mainjot, A; D'Hoore, W; Vanheusden, A; Van Nieuwenhuysen, J-P

    2009-12-01

    To assess the types and frequency of antibiotic prescriptions by Belgian dentists, the indications for antibiotic prescription, and dentists' knowledge about recommended practice in antibiotic use. In this cross-sectional survey, dental practitioners were asked to record information about all antibiotics prescribed to their patients during a 2-week period. The dental practitioners were also asked to complete a self-administered questionnaire regarding demographic data, prescribing practices, and knowledge about antibiotic use. A random sample of 268 Belgian dentists participated in the survey. During the 2-week period, 24 421 patient encounters were recorded; 1033 patients were prescribed an antibiotic (4.2%). The median number of prescriptions per dentist for the 2 weeks was 3. Broad spectrum antibiotics were most commonly prescribed: 82% of all prescriptions were for amoxycillin, amoxycillin-clavulanic acid and clindamycin. Antibiotics were often prescribed in the absence of fever (92.2%) and without any local treatment (54.2%). The most frequent diagnosis for which antibiotics were prescribed was periapical abscess (51.9%). Antibiotics were prescribed to 63.3% of patients with periapical abscess and 4.3% of patients with pulpitis. Patterns of prescriptions were confirmed by the data from the self-reported practice. Discrepancies between observed and recommended practice support the need for educational initiatives to promote rational use of antibiotics in dentistry in Belgium.

  1. Fault-tolerant computing systems

    International Nuclear Information System (INIS)

    Dal Cin, M.; Hohl, W.

    1991-01-01

    Tests, Diagnosis and Fault Treatment were chosen as the guiding themes of the conference. However, the scope of the conference also included reliability, availability, safety and security issues in software and hardware systems. The sessions organized for the conference, which was completed by an industrial presentation, were: Keynote Address, Reconfiguration and Recovery, System Level Diagnosis, Voting and Agreement, Testing, Fault-Tolerant Circuits, Array Testing, Modelling, Applied Fault Tolerance, Fault-Tolerant Arrays and Systems, Interconnection Networks, and Fault-Tolerant Software. One paper has been indexed separately in the database. (orig./HP)

  2. Fault rocks and uranium mineralization

    International Nuclear Information System (INIS)

    Tong Hangshou.

    1991-01-01

    The types of fault rocks, the microstructural characteristics of fault tectonites and their relationship with uranium mineralization in uranium-productive granite areas are discussed. According to a synthetic analysis of the nature of stress, the extent of cracking and the microstructural characteristics of fault rocks, they can be classified into five groups and sixteen subgroups. The author especially emphasizes the control exerted by the cataclasite and fault breccia groups over uranium mineralization in uranium-productive granite areas. It is argued that the macrostructure and microstructure of fault rocks deserve more thorough study, which is of important practical significance in uranium exploration.

  3. Network Fault Diagnosis Using DSM

    Institute of Scientific and Technical Information of China (English)

    Jiang Hao; Yan Pu-liu; Chen Xiao; Wu Jing

    2004-01-01

    The difference similitude matrix (DSM) is effective in reducing an information system, with a higher reduction rate and higher validity. We use the DSM method to analyze the fault data of computer networks and obtain fault diagnosis rules. By discretizing the relative values of the fault data, we build the information system of the fault data. The DSM method reduces the information system and yields the diagnosis rules. Simulation with an actual scenario shows that fault diagnosis based on DSM can produce a small set of effective rules.

  4. Nurse practitioner prescribing: an international perspective

    Directory of Open Access Journals (Sweden)

    Fong J

    2015-10-01

    Full Text Available Jacqueline Fong,1,2 Thomas Buckley,2 Andrew Cashin3 1St George Hospital, Kogarah, 2Sydney Nursing School, University of Sydney, Camperdown, NSW, Australia; 3School of Health and Human Sciences, Southern Cross University, Lismore, NSW, Australia Background: Internationally, the delivery of care provided by nurses and midwives has undergone a significant change due to a variety of interrelated factors, including economic circumstances, a diminishing number of medical providers, the unavailability of adequate health care services in underserved and rural areas, and growing specialization among the professions. One solution to the challenges of care delivery has been the introduction of nurse practitioners (NPs and the authorization of NPs to prescribe medicines. Aim: The aim of this paper was to review the current international literature related to NP prescribing and compare the findings to the Australian context. The review focuses on literature from the United States, Canada, Europe, Australia, and New Zealand. Methods: Databases were searched from January 2000 to January 2015. The following keywords: “nurse practitioner”, “advanced nurse”, “advanced practice nurse”, “prescri*”, “Australia”, “United States America”, “UK”, “New Zealand”, “Canada”, “Europe”, “drug prescri*”, “prescri* authority”, and “prescri* legislation” were used. Findings: NPs tend to prescribe in differing contexts of practice to provide care in underserved populations and require good systems literacy to practice across complex systems. The key themes identified internationally related to NP prescribing relate to barriers to prescribing, confidence in prescribing, and the unique role of NPs in prescribing medicines, eg, the high prevalence of prescribing pain medicines in several countries, including Australia. Conclusion: Across all countries reviewed, there appears a need for further research into the organizational and

  5. Two-dimensional errors

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter addresses the extension of previous work in one-dimensional (linear) error theory to two-dimensional error analysis. The topics of the chapter include the definition of two-dimensional error, the probability ellipse, the probability circle, elliptical (circular) error evaluation, the application to position accuracy, and the use of control systems (points) in measurements
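The probability ellipse described above can be computed directly from a 2-D error covariance. The following is a minimal sketch, not from the chapter itself: the eigenvalues of the covariance matrix give the principal variances, and the 95% scale factor follows the standard chi-square relation for two degrees of freedom.

```python
import math

def error_ellipse(sxx, syy, sxy, p=0.95):
    """Semi-axes (a, b) and orientation theta of the probability ellipse for a
    2-D Gaussian error with covariance [[sxx, sxy], [sxy, syy]]."""
    # Eigenvalues of the 2x2 covariance matrix are the principal variances.
    t = (sxx + syy) / 2.0
    d = math.hypot((sxx - syy) / 2.0, sxy)
    lam1, lam2 = t + d, t - d
    # Scale factor k: P(chi^2 with 2 dof <= k^2) = p  =>  k^2 = -2 ln(1 - p)
    k = math.sqrt(-2.0 * math.log(1.0 - p))
    a, b = k * math.sqrt(lam1), k * math.sqrt(lam2)
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)  # orientation of major axis
    return a, b, theta
```

For a circular error (sxx == syy, sxy == 0) the ellipse degenerates to the probability circle, a == b.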

  6. Part two: Error propagation

    International Nuclear Information System (INIS)

    Picard, R.R.

    1989-01-01

    Topics covered in this chapter include a discussion of exact results as related to nuclear materials management and accounting in nuclear facilities; propagation of error for a single measured value; propagation of error for several measured values; error propagation for materials balances; and an application of error propagation to an example of uranium hexafluoride conversion process
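The propagation of error for several measured values can be sketched with the first-order Gaussian formula for independent inputs; this is an illustrative example, not the chapter's uranium hexafluoride worked case.

```python
import math

def propagate(partials, sigmas):
    """First-order error propagation for independent measured values:
    sigma_f = sqrt( sum_i (df/dx_i)^2 * sigma_i^2 )."""
    return math.sqrt(sum((p * s) ** 2 for p, s in zip(partials, sigmas)))

# Example: f = x * y with x = 10 +/- 0.1 and y = 5 +/- 0.2,
# so df/dx = y and df/dy = x evaluated at the measured values.
x, sx = 10.0, 0.1
y, sy = 5.0, 0.2
sigma_f = propagate([y, x], [sx, sy])  # sqrt((5*0.1)^2 + (10*0.2)^2)
```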

  7. Learning from Errors

    OpenAIRE

    Martínez-Legaz, Juan Enrique; Soubeyran, Antoine

    2003-01-01

    We present a model of learning in which agents learn from errors. If an action turns out to be an error, the agent rejects not only that action but also neighboring actions. We find that, keeping memory of his errors, under mild assumptions an acceptable solution is asymptotically reached. Moreover, one can take advantage of big errors for a faster learning.

  8. Design of fault tolerant control system for steam generator using

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Ki; Seo, Mi Ro [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

A controller and sensor fault tolerant system for a steam generator is designed with fuzzy logic. The proposed fault tolerant redundant system is composed of a supervisor and two fuzzy weighting modulators. The supervisor alternately checks the controller-induced and sensor-induced performance to identify which part, the controller or a sensor, is faulty. To analyze controller-induced performance, both the error and the change in error of the system output are chosen as fuzzy variables. The fuzzy logic for sensor-induced performance uses two variables: the deviation between the two sensor outputs and its frequency. Each fuzzy weighting modulator generates an output signal compensated for the faulty input signal. Simulations show that the proposed fault tolerant control scheme regulates the steam generator water level well by suppressing the effect of either controller or sensor faults. Therefore, by duplicating sensors and controllers under the proposed fault tolerant scheme, the reliability of the steam generator control and sensor system, and hence of the power plant, increases even further. 2 refs., 9 figs., 1 tab. (Author)
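The fuzzy weighting idea can be sketched as follows. This is a minimal illustration, not the paper's design: the membership thresholds and the fallback estimate (standing in for a model-based or last-good value) are assumptions.

```python
def fuzzy_weight(deviation, low=0.5, high=2.0):
    """Membership value for 'redundant sensors agree': 1 below `low`,
    0 above `high`, linear in between (thresholds are illustrative)."""
    if deviation <= low:
        return 1.0
    if deviation >= high:
        return 0.0
    return (high - deviation) / (high - low)

def blended_level(sensor_a, sensor_b, fallback):
    """Blend two redundant level sensors; as their deviation grows, the
    weight shifts toward `fallback` (a hypothetical reference estimate)."""
    w = fuzzy_weight(abs(sensor_a - sensor_b))
    return w * 0.5 * (sensor_a + sensor_b) + (1.0 - w) * fallback
```

When the sensors agree the modulator passes their average; when one drifts, its influence is smoothly faded out rather than switched off abruptly.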

  9. A Review Of Fault Tolerant Scheduling In Multicore Systems

    Directory of Open Access Journals (Sweden)

    Shefali Malhotra

    2015-05-01

Full Text Available Abstract In this paper we discuss various fault tolerant task scheduling algorithms for multicore systems, based on hardware and software. The hardware-based algorithm is a blend of Triple Modular Redundancy and Double Modular Redundancy, in which an Architectural Vulnerability Factor is considered when deciding the schedule, alongside the EDF and LLF scheduling algorithms. In most real-time systems the dominant part is shared memory. A low-overhead software-based fault tolerance approach can be implemented at the user-space level so that it does not require any changes at the application level. Here, redundant multi-threaded processes are used; with these processes soft errors can be detected and recovered from. This method gives a low-overhead, fast error detection and recovery mechanism. The overhead incurred by this method ranges from 0 to 18 for selected benchmarks. The hybrid scheduling method is another scheduling approach for real-time systems. Dynamic fault tolerant scheduling gives a high feasibility rate, whereas task criticality is used to select the type of fault recovery method in order to tolerate the maximum number of faults.
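The Triple Modular Redundancy mentioned above can be sketched as a majority vote over three redundant executions of the same task; this is an illustrative sketch of the masking principle, not the paper's scheduler.

```python
from collections import Counter

def tmr_vote(results):
    """Triple Modular Redundancy: majority-vote the results of three
    redundant executions. A single soft error is masked; total
    disagreement signals a fault that cannot be masked."""
    value, n = Counter(results).most_common(1)[0]
    if n >= 2:
        return value
    raise RuntimeError("no majority: fault not maskable, re-execute task")

# One corrupted replica is outvoted by the two correct ones:
assert tmr_vote([42, 42, 41]) == 42
```

Double Modular Redundancy with two replicas can only detect a mismatch, not decide which copy is correct, which is why recovery there falls back to re-execution.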

  10. Generalized Gaussian Error Calculus

    CERN Document Server

    Grabe, Michael

    2010-01-01

For the first time in 200 years, Generalized Gaussian Error Calculus addresses a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures, which scrutinize the consequences of random errors alone, turned out to be obsolete. As a matter of course, the error calculus to-be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. In addition, random errors are required to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...

  11. Real-time fault tolerant full adder design for critical applications

    Directory of Open Access Journals (Sweden)

    Pankaj Kumar

    2016-09-01

Full Text Available In complex computing systems, processing units are built from ever smaller devices, which are sensitive to transient faults. A transient fault in a circuit is caused by electromagnetic noise, cosmic rays, crosstalk, or power supply noise. Such faults are very difficult to detect during offline testing. Hence, an area-efficient fault tolerant full adder is proposed for detecting and repairing transient faults occurring in single and multiple nets. Additionally, the proposed architecture can also detect and repair permanent faults. This design incurs much lower hardware overhead relative to the traditional hardware architecture. In addition, the proposed design also provides higher error detection and correction efficiency when compared to existing designs.
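A TMR-style voted full adder illustrates the fault-masking behavior behaviorally; the paper's actual gate-level architecture differs, and the fault-injection hook here is purely a testing convenience.

```python
def full_adder(a, b, cin):
    """Plain one-bit full adder: returns (sum, carry-out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def voted_full_adder(a, b, cin, faulty=None):
    """Three replica adders with a bitwise majority vote on the outputs.
    `faulty` optionally flips one replica's sum bit to model a transient
    fault; the vote masks any single-replica error."""
    replicas = [full_adder(a, b, cin) for _ in range(3)]
    if faulty is not None:
        s, c = replicas[faulty]
        replicas[faulty] = (s ^ 1, c)  # inject a transient error
    def majority(x, y, z):
        return (x & y) | (y & z) | (x & z)
    s = majority(*(r[0] for r in replicas))
    cout = majority(*(r[1] for r in replicas))
    return s, cout
```

With any single replica corrupted, the voted outputs still match the fault-free adder for all eight input combinations.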

  12. Modeling of outpatient prescribing process in iran: a gateway toward electronic prescribing system.

    Science.gov (United States)

    Ahmadi, Maryam; Samadbeik, Mahnaz; Sadoughi, Farahnaz

    2014-01-01

Implementation of an electronic prescribing system can overcome many problems of the paper prescribing system and provide numerous opportunities for more effective and advantageous prescribing. Successful implementation of such a system requires a complete and deep understanding of the work content, human force, and workflow of paper prescribing. The current study was designed to model the current business process of outpatient prescribing in Iran and clarify the different actions during this process. To describe the prescribing process and the system features in Iran, the methodology of business process modeling and analysis was used in the present study. The results of the process documentation were analyzed using a conceptual model of workflow elements and the technique of modeling "As-Is" business processes. Analysis of the current (as-is) prescribing process demonstrated that Iran stood at the first levels of sophistication in the graduated levels of electronic prescribing, namely electronic prescription reference, and that there were problematic areas including bottlenecks, redundant and duplicated work, concentration of decision nodes, and communicative weaknesses among stakeholders of the process. Using information technology in some activities of medication prescription in Iran has not eliminated the dependence of the stakeholders on paper-based documents and prescriptions. Therefore, it is necessary to implement proper system programming in order to support change management and solve the problems in the existing prescribing process. To this end, a suitable basis should be provided for the reorganization and improvement of the prescribing process for future electronic systems.

  13. Understanding the determinants of antimicrobial prescribing within hospitals: the role of "prescribing etiquette".

    Science.gov (United States)

    Charani, E; Castro-Sanchez, E; Sevdalis, N; Kyratsis, Y; Drumright, L; Shah, N; Holmes, A

    2013-07-01

    There is limited knowledge of the key determinants of antimicrobial prescribing behavior (APB) in hospitals. An understanding of these determinants is required for the successful design, adoption, and implementation of quality improvement interventions in antimicrobial stewardship programs. Qualitative semistructured interviews were conducted with doctors (n = 10), pharmacists (n = 10), and nurses and midwives (n = 19) in 4 hospitals in London. Interviews were conducted until thematic saturation was reached. Thematic analysis was applied to the data to identify the key determinants of antimicrobial prescribing behaviors. The APB of healthcare professionals is governed by a set of cultural rules. Antimicrobial prescribing is performed in an environment where the behavior of clinical leaders or seniors influences practice of junior doctors. Senior doctors consider themselves exempt from following policy and practice within a culture of perceived autonomous decision making that relies more on personal knowledge and experience than formal policy. Prescribers identify with the clinical groups in which they work and adjust their APB according to the prevailing practice within these groups. A culture of "noninterference" in the antimicrobial prescribing practice of peers prevents intervention into prescribing of colleagues. These sets of cultural rules demonstrate the existence of a "prescribing etiquette," which dominates the APB of healthcare professionals. Prescribing etiquette creates an environment in which professional hierarchy and clinical groups act as key determinants of APB. To influence the antimicrobial prescribing of individual healthcare professionals, interventions need to address prescribing etiquette and use clinical leadership within existing clinical groups to influence practice.

  14. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation demanded by the MC method. In addition, when informative data for statistical analysis are not sufficient, or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express as probabilistic distributions. In order to reduce the computation time and quantify the uncertainties of top events when there are basic events whose uncertainties are difficult to express as probabilistic distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in a relatively short time and cover the results obtained by probabilistic uncertainty propagation.
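Fuzzy propagation through a fault tree can be sketched with alpha-cut interval arithmetic on triangular fuzzy probabilities pushed through AND/OR gates; the event values below are illustrative, not taken from the paper.

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at level alpha in [0, 1]."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(iv1, iv2):
    # Probabilities lie in [0, 1] and the product is monotone increasing,
    # so interval endpoints map to endpoints.
    return (iv1[0] * iv2[0], iv1[1] * iv2[1])

def or_gate(iv1, iv2):
    return (1 - (1 - iv1[0]) * (1 - iv2[0]),
            1 - (1 - iv1[1]) * (1 - iv2[1]))

# Top event = (A AND B), with illustrative fuzzy basic-event probabilities:
A = (0.01, 0.02, 0.04)
B = (0.10, 0.20, 0.30)
top_core = and_gate(alpha_cut(A, 1.0), alpha_cut(B, 1.0))     # most-likely value
top_support = and_gate(alpha_cut(A, 0.0), alpha_cut(B, 0.0))  # widest interval
```

Sweeping alpha from 0 to 1 and collecting the resulting intervals reconstructs the fuzzy membership function of the top-event probability without any Monte Carlo sampling.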

  15. Prescribing Practices and Polypharmacy in Kitovu Hospital

    African Journals Online (AJOL)

    admin

    1Division of Medicine and Therapeutics, Centre for Medical Education, The Queen's ... This audit of prescribing practices explores recent trends at Kitovu Hospital, Uganda ..... This creates a cycle of poor ... interventions to remedy these is vital.

  16. The social act of electronic medication prescribing

    NARCIS (Netherlands)

    J.E.C.M. Aarts (Jos)

    2013-01-01

__Abstract__ Prescribing medication is embedded in social norms and cultures. In modern Western health care, professionals and policy makers have attempted to rationalize medicine by addressing cost-effectiveness of diagnostic and therapeutic treatments and the development of

  17. Customization in prescribing for bipolar disorder.

    Science.gov (United States)

    Hodgkin, Dominic; Volpe-Vartanian, Joanna; Merrick, Elizabeth L; Horgan, Constance M; Nierenberg, Andrew A; Frank, Richard G; Lee, Sue

    2012-06-01

    For many disorders, patient heterogeneity requires physicians to customize their treatment to each patient's needs. We test for the existence of customization in physicians' prescribing for bipolar disorder, using data from a naturalistic clinical effectiveness trial of bipolar disorder treatment (STEP-BD), which did not constrain physician prescribing. Multinomial logit is used to model the physician's choice among five combinations of drug classes. We find that our observed measure of the patient's clinical status played only a limited role in the choice among drug class combinations, even for conditions such as mania that are expected to affect class choice. However, treatment of a patient with given characteristics differed widely depending on which physician was seen. The explanatory power of the model was low. There was variation within each physician's prescribing, but the results do not suggest a high degree of customization in physicians' prescribing, based on our measure of clinical status. Copyright © 2011 John Wiley & Sons, Ltd.

  18. Progressive retry for software error recovery in distributed systems

    Science.gov (United States)

    Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.

    1993-01-01

    In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.
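The checkpoint, rollback, and replay-with-reordering loop can be sketched as follows. This is an illustrative sketch: the deterministic rotation stands in for the paper's message reordering, and the ordering-dependent task is a toy example.

```python
import copy

def progressive_retry(task, checkpoint, max_level=3):
    """Bypass a transient software error by rolling back to a checkpoint
    and replaying; each retry level widens the scope of nondeterminism
    by reordering the replayed messages."""
    for level in range(max_level + 1):
        state = copy.deepcopy(checkpoint)       # rollback to the checkpoint
        msgs = state["messages"]
        msgs = msgs[level:] + msgs[:level]      # reorder (rotation as a stand-in)
        try:
            return task(state["value"], msgs)
        except Exception:
            continue                            # escalate to the next level
    raise RuntimeError("retry exhausted; restart required")

def task(value, messages):
    # A toy ordering-dependent bug: fails only when 'poison' arrives first.
    if messages[0] == "poison":
        raise ValueError("ordering-dependent software error")
    return value + len(messages)

ckpt = {"value": 10, "messages": ["poison", "m1", "m2"]}
result = progressive_retry(task, ckpt)  # level 0 fails, level 1 succeeds
```

Because the checkpoint is deep-copied before each attempt, a failed replay leaves the saved state untouched, mirroring how rollback recovery preserves the last consistent checkpoint.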

  19. Fault diagnosis system of electromagnetic valve using neural network filter

    International Nuclear Information System (INIS)

    Hayashi, Shoji; Odaka, Tomohiro; Kuroiwa, Jousuke; Ogura, Hisakazu

    2008-01-01

This paper is concerned with the detection of gas leakage faults in an electromagnetic valve using a neural network filter. In modern plants, the ability to detect and identify gas leakage faults is becoming increasingly important. The main difficulty in detecting gas leakage faults from sound signals lies in the fact that practical plants are usually very noisy. To overcome this difficulty, a neural network filter is used to eliminate background noise and raise the signal-to-noise ratio of the sound signal. The background noise is modeled as a dynamic system, and an accurate mathematical model of this system can be established using a neural network filter. The prediction error between predicted and measured values constitutes the output of the filter. If the prediction error is zero, there is no leakage; if the prediction error is greater than a certain value, there is a leakage fault. Through application to practical pneumatic systems, it is verified that the neural network filter is effective in gas leakage detection. (author)
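The residual-threshold detection logic can be sketched as below, with a simple analytic function standing in for the trained neural-network background model; the threshold and signal values are illustrative assumptions.

```python
import math

def detect_leak(signal, predictor, threshold):
    """Flag sample indices whose prediction residual exceeds the threshold."""
    return [i for i, x in enumerate(signal)
            if abs(x - predictor(i)) > threshold]

def background(i):
    # Stand-in for the trained neural-network model of the background noise.
    return math.sin(0.1 * i)

signal = [background(i) for i in range(100)]
signal[60] += 0.8  # superimpose a hypothetical leak signature on the noise
leaks = detect_leak(signal, background, threshold=0.5)  # -> [60]
```

Where the model predicts the background accurately, residuals stay near zero and only the superimposed leak component crosses the threshold.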

  20. Nurse prescribing ethics and medical marketing.

    Science.gov (United States)

    Adams, J

    This article suggests that nurse prescribers require an awareness of key concepts in ethics, such as deontology and utilitarianism to reflect on current debates and contribute to them. The principles of biomedical ethics have also been influential in the development of professional codes of conduct. Attention is drawn to the importance of the Association of the British Pharmaceutical Industry's code of practice for the pharmaceutical industry in regulating marketing aimed at prescribers.

  1. Prevalence of inappropriate prescribing in primary care

    DEFF Research Database (Denmark)

    Bregnhøj, Lisbeth; Thirstrup, Steffen; Kristensen, Mogens Brandt

    2007-01-01

    to the patients. Topical, dermatological medications and medications not used regularly were excluded. RESULTS: 212 patients were prescribed 1621 medications by their GPs at baseline. In all, 640 (39.5%) of the medications had one or more inappropriate ratings in the 10 criteria making up the MAI. The main part...... is good. However, the majority of patients used one or more medications with inappropriate ratings. The inappropriate prescribing relates to specific therapeutic groups and criteria, which should be targeted in future interventions....

  2. Faults in Linux

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Thomas, Gaël; Saha, Suman

    2011-01-01

    In 2001, Chou et al. published a study of faults found by applying a static analyzer to Linux versions 1.0 through 2.4.1. A major result of their work was that the drivers directory contained up to 7 times more of certain kinds of faults than other directories. This result inspired a number...... of development and research efforts on improving the reliability of driver code. Today Linux is used in a much wider range of environments, provides a much wider range of services, and has adopted a new development and release model. What has been the impact of these changes on code quality? Are drivers still...... a major problem? To answer these questions, we have transported the experiments of Chou et al. to Linux versions 2.6.0 to 2.6.33, released between late 2003 and early 2010. We find that Linux has more than doubled in size during this period, but that the number of faults per line of code has been...

  3. Analyzing temozolomide medication errors: potentially fatal.

    Science.gov (United States)

    Letarte, Nathalie; Gabay, Michael P; Bressler, Linda R; Long, Katie E; Stachnik, Joan M; Villano, J Lee

    2014-10-01

    The EORTC-NCIC regimen for glioblastoma requires different dosing of temozolomide (TMZ) during radiation and maintenance therapy. This complexity is exacerbated by the availability of multiple TMZ capsule strengths. TMZ is an alkylating agent and the major toxicity of this class is dose-related myelosuppression. Inadvertent overdose can be fatal. The websites of the Institute for Safe Medication Practices (ISMP), and the Food and Drug Administration (FDA) MedWatch database were reviewed. We searched the MedWatch database for adverse events associated with TMZ and obtained all reports including hematologic toxicity submitted from 1st November 1997 to 30th May 2012. The ISMP describes errors with TMZ resulting from the positioning of information on the label of the commercial product. The strength and quantity of capsules on the label were in close proximity to each other, and this has been changed by the manufacturer. MedWatch identified 45 medication errors. Patient errors were the most common, accounting for 21 or 47% of errors, followed by dispensing errors, which accounted for 13 or 29%. Seven reports or 16% were errors in the prescribing of TMZ. Reported outcomes ranged from reversible hematological adverse events (13%), to hospitalization for other adverse events (13%) or death (18%). Four error reports lacked detail and could not be categorized. Although the FDA issued a warning in 2003 regarding fatal medication errors and the product label warns of overdosing, errors in TMZ dosing occur for various reasons and involve both healthcare professionals and patients. Overdosing errors can be fatal.

  4. Automatic fault tracing of active faults in the Sutlej valley (NW-Himalayas, India)

    Science.gov (United States)

    Janda, C.; Faber, R.; Hager, C.; Grasemann, B.

    2003-04-01

In the Sutlej Valley the Lesser Himalayan Crystalline Sequence (LHCS) is actively extruding between the Munsiari Thrust (MT) at the base and the Karcham Normal Fault (KNF) at the top. The clear evidence for ongoing deformation comprises brittle faults in Holocene lake deposits, hot spring activity near the faults, and dramatically younger cooling ages within the LHCS (Vannay and Grasemann, 2001). Because these brittle fault zones obviously influence the morphology in the field, we developed a new method for automatically tracing the intersections of planar fault geometries with digital elevation models (Faber, 2002). Traditional mapping techniques use structure contours (i.e. lines or curves connecting points of equal elevation on a geological structure) in order to construct intersections of geological structures with topographic maps. However, even if the geological structure is approximated by a plane, so that its structure contours are equally spaced lines, this technique is rather time consuming and inaccurate, because errors are cumulative. Drawing structure contours by hand also makes it impossible to slightly change the azimuth and dip direction of the favoured plane without redrawing everything from the beginning. Yet small variations in the fault position, easily produced by inaccuracies of measurement in the field or by small local variations in the trend and/or dip of the fault planes, can have large effects on the intersection with topography. The developed method allows intersections to be viewed interactively in 2D and 3D modes. An unlimited number of planes can be moved separately in three dimensions (translation and rotation), and intersections with the topography, possibly following morphological features, can be mapped. Besides the increase in efficiency, this method underlines the shortcoming of classical lineament extraction, which ignores the dip of planar structures. Using this method, areas of active faulting influencing the morphology, can be

  5. The Sorong Fault Zone, Indonesia: Mapping a Fault Zone Offshore

    Science.gov (United States)

    Melia, S.; Hall, R.

    2017-12-01

    The Sorong Fault Zone is a left-lateral strike-slip fault zone in eastern Indonesia, extending westwards from the Bird's Head peninsula of West Papua towards Sulawesi. It is the result of interactions between the Pacific, Caroline, Philippine Sea, and Australian Plates and much of it is offshore. Previous research on the fault zone has been limited by the low resolution of available data offshore, leading to debates over the extent, location, and timing of movements, and the tectonic evolution of eastern Indonesia. Different studies have shown it north of the Sula Islands, truncated south of Halmahera, continuing to Sulawesi, or splaying into a horsetail fan of smaller faults. Recently acquired high resolution multibeam bathymetry of the seafloor (with a resolution of 15-25 meters), and 2D seismic lines, provide the opportunity to trace the fault offshore. The position of different strands can be identified. On land, SRTM topography shows that in the northern Bird's Head the fault zone is characterised by closely spaced E-W trending faults. NW of the Bird's Head offshore there is a fold and thrust belt which terminates some strands. To the west of the Bird's Head offshore the fault zone diverges into multiple strands trending ENE-WSW. Regions of Riedel shearing are evident west of the Bird's Head, indicating sinistral strike-slip motion. Further west, the ENE-WSW trending faults turn to an E-W trend and there are at least three fault zones situated immediately south of Halmahera, north of the Sula Islands, and between the islands of Sanana and Mangole where the fault system terminates in horsetail strands. South of the Sula islands some former normal faults at the continent-ocean boundary with the North Banda Sea are being reactivated as strike-slip faults. The fault zone does not currently reach Sulawesi. The new fault map differs from previous interpretations concerning the location, age and significance of different parts of the Sorong Fault Zone. Kinematic

  6. A framework for software fault tolerance in real-time systems

    Science.gov (United States)

    Anderson, T.; Knight, J. C.

    1983-01-01

A classification scheme for errors and a technique for the provision of software fault tolerance in cyclic real-time systems are presented. The technique requires that the process structure of a system be represented by a synchronization graph, which is used by an executive as a specification of the relative times at which processes will communicate during execution. Communication between concurrent processes is severely limited and may only take place between processes engaged in an exchange. A history of error occurrences is maintained by an error handler. When an error is detected, the error handler classifies it using the error history information and then initiates appropriate recovery action.

  7. The social act of electronic medication prescribing.

    Science.gov (United States)

    Aarts, Jos

    2013-01-01

Prescribing medication is embedded in social norms and cultures. In modern Western health care, professionals and policy makers have attempted to rationalize medicine by addressing the cost-effectiveness of diagnostic and therapeutic treatments and the development of guidelines and protocols based on the outcomes of clinical studies. These notions of cost-effectiveness and evidence-based medicine have also been embedded in technology such as electronic prescribing systems. Such constraining systems may clash with the reality of clinical practice, where formal boundaries of responsibility and authorization are often blurred. Such systems may therefore even impede patient care. Medication is seen as the essence of medical practice. Prescribing is a social act. In a hospital, medications may be aimed at treating a patient for a specific condition; in primary care, the professional often meets the patient with his or her social and cultural notions of a health problem. The author argues that the design and implementation of electronic prescribing systems should address the social and cultural context of prescribing. Especially in primary care, where health problems are often ill defined and evidence-based medicine guidelines do not always work as intended, studies need to take into account the sociotechnical character of electronic prescribing systems.

  8. Cooperative Fault Tolerant Tracking Control for Multiagent Systems: An Intermediate Estimator-Based Approach.

    Science.gov (United States)

    Zhu, Jun-Wei; Yang, Guang-Hong; Zhang, Wen-An; Yu, Li

    2017-10-17

This paper studies the observer based fault tolerant tracking control problem for linear multiagent systems with multiple faults and mismatched disturbances. A novel distributed intermediate estimator based fault tolerant tracking protocol is presented. The leader's input is nonzero and unavailable to the followers. By applying a projection technique, the mismatched disturbances are separated into matched and unmatched components. For each node, a tracking error system is established, for which an intermediate estimator driven by the relative output measurements is constructed to estimate the sensor faults and a combined signal of the leader's input, process faults, and matched disturbance component. Based on the estimation, a fault tolerant tracking protocol is designed to eliminate the effects of the combined signal. Besides, the effect of the unmatched disturbance component can be attenuated by directly adjusting some specified parameters. Finally, a simulation example of aircraft demonstrates the effectiveness of the designed tracking protocol.
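The projection step that separates a disturbance into matched and unmatched components can be sketched for a single-column input direction B: the matched part is the orthogonal projection of d onto the span of B, and the unmatched remainder is orthogonal to B. This is a generic illustration of the projection technique, not the paper's multiagent formulation.

```python
def split_disturbance(B, d):
    """Split d into matched and unmatched parts relative to the input
    direction B (a single column): d = d_m + d_u, where
    d_m = B (B^T d) / (B^T B) and d_u is orthogonal to B."""
    btb = sum(b * b for b in B)
    btd = sum(b * x for b, x in zip(B, d))
    d_m = [b * btd / btb for b in B]              # matched component
    d_u = [x - m for x, m in zip(d, d_m)]         # unmatched component
    return d_m, d_u

# Example: input acts along the first state only; the second disturbance
# channel is unmatched and cannot be cancelled through the input.
d_m, d_u = split_disturbance([1.0, 0.0], [3.0, 4.0])
```

Only the matched part can be rejected exactly through the control input; the unmatched part can at best be attenuated, as the abstract notes.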

  9. Fault-tolerant search algorithms reliable computation with unreliable information

    CERN Document Server

    Cicalese, Ferdinando

    2013-01-01

    Why a book on fault-tolerant search algorithms? Searching is one of the fundamental problems in computer science. Time and again algorithmic and combinatorial issues originally studied in the context of search find application in the most diverse areas of computer science and discrete mathematics. On the other hand, fault-tolerance is a necessary ingredient of computing. Due to their inherent complexity, information systems are naturally prone to errors, which may appear at any level - as imprecisions in the data, bugs in the software, or transient or permanent hardware failures. This book pr
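One classical fault-tolerant search device is to repeat each comparison 2k+1 times and take a majority vote, which tolerates up to k erroneous answers per decision. This is a simplified sketch: the book's Rényi-Ulam-style analyses bound lies globally over the whole game, not per query.

```python
def reliable_compare(query, x, mid, k):
    """Ask the (possibly lying) oracle 2k+1 times and majority-vote;
    at most k wrong answers per decision cannot flip the outcome."""
    votes = sum(query(x, mid) for _ in range(2 * k + 1))
    return votes > k

def fault_tolerant_search(arr, x, query, k=1):
    """Binary search over sorted arr using an unreliable 'x > value?' oracle."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == x:
            return mid
        if reliable_compare(query, x, arr[mid], k):  # is x > arr[mid]?
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# A hypothetical oracle that lies on every third call:
calls = {"n": 0}
def flaky_query(x, mid):
    calls["n"] += 1
    truth = x > mid
    return (not truth) if calls["n"] % 3 == 0 else truth

result = fault_tolerant_search(list(range(16)), 11, flaky_query, k=1)  # -> 11
```

Every group of three repeated queries contains at most one lie from this oracle, so each majority vote is correct and the search proceeds as if the oracle were truthful.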

  10. ESR dating of fault rocks

    International Nuclear Information System (INIS)

    Lee, Hee Kwon

    2003-02-01

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below critical size; these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Gori nuclear reactor. Most of the ESR signals of fault rocks collected from the basement are saturated. This indicates that the last movement of the faults had occurred before the Quaternary period. However, ESR dates from the Oyong fault zone range from 370 to 310 ka. Results of this research suggest that long-term cyclic fault activity of the Oyong fault zone continued into the Pleistocene
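The age computation and the grain-size plateau described above can be sketched as follows; the relative tolerance and the sample data are illustrative, not the Oyong fault measurements.

```python
def esr_age(equivalent_dose_gy, dose_rate_gy_per_ka):
    """ESR age (ka) = equivalent dose (Gy) / dose rate (Gy/ka)."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

def plateau_age(ages_by_grain_size, tol=0.05):
    """Return the plateau age: starting from the finest grains (presumed
    fully reset at the last fault movement), average the ages that agree
    with the finest-fraction age within the relative tolerance."""
    ages = [age for _, age in sorted(ages_by_grain_size)]
    plateau = [ages[0]]
    for age in ages[1:]:
        if abs(age - plateau[0]) / plateau[0] <= tol:
            plateau.append(age)
        else:
            break  # coarser grains were incompletely reset
    return sum(plateau) / len(plateau)

# Hypothetical data: (grain size in micrometres, apparent ESR age in ka).
samples = [(25, 370.0), (45, 365.0), (100, 520.0), (150, 610.0)]
last_movement = plateau_age(samples)  # -> 367.5 ka
```

Coarser fractions retain an inherited signal and give older apparent ages, which is why only the sub-critical grain sizes define the plateau used for dating.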

  11. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  12. ESR dating of fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2003-02-15

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below critical size; these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Gori nuclear reactor. Most of the ESR signals of fault rocks collected from the basement are saturated. This indicates that the last movement of the faults had occurred before the Quaternary period. However, ESR dates from the Oyong fault zone range from 370 to 310 ka. Results of this research suggest that long-term cyclic fault activity of the Oyong fault zone continued into the Pleistocene.
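
    The age relation used in this record is simple enough to state in code. Below is a minimal sketch of the dose-ratio age and the plateau test across grain-size fractions; all dose values, grain sizes, and the 10% agreement tolerance are invented for illustration and are not taken from the record itself.

```python
# Hedged sketch: ESR age = equivalent dose / dose rate, plus a simple
# plateau check across grain-size fractions. All numbers are illustrative.

def esr_age_ka(equivalent_dose_gy, dose_rate_gy_per_ka):
    """Age in ka from the ratio of equivalent dose to dose rate."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

def plateau_ages(ages_by_grain_size, tolerance=0.1):
    """Return the grain sizes (in microns) whose ages agree with the finest
    fraction within `tolerance` (relative) -- the presumed fully reset plateau."""
    sizes = sorted(ages_by_grain_size)          # finest fraction first
    reference = ages_by_grain_size[sizes[0]]    # finest grains reset most completely
    return [s for s in sizes
            if abs(ages_by_grain_size[s] - reference) / reference <= tolerance]

# Illustrative data: fine fractions cluster near ~350 ka, while coarse grains
# retain an inherited (older) signal because they were only partially reset.
ages = {25: 340.0, 45: 355.0, 75: 362.0, 150: 520.0, 250: 780.0}
print(esr_age_ka(1400.0, 4.0))   # 350.0 ka for a 1400 Gy dose at 4 Gy/ka
print(plateau_ages(ages))        # [25, 45, 75]
```

    Only the sub-plateau fractions would then be averaged into the quoted fault age.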

  13. Evaluation of drug administration errors in a teaching hospital

    OpenAIRE

    Berdot, Sarah; Sabatier, Brigitte; Gillaizeau, Florence; Caruba, Thibaut; Prognon, Patrice; Durieux, Pierre

    2012-01-01

    Abstract Background Medication errors can occur at any of the three steps of the medication use process: prescribing, dispensing and administration. We aimed to determine the incidence, type and clinical importance of drug administration errors and to identify risk factors. Methods Prospective study based on disguised observation technique in four wards in a teaching hospital in Paris, France (800 beds). A pharmacist accompanied nurses and witnessed the preparation and administration of drugs...

  14. Real-time fault diagnosis and fault-tolerant control

    OpenAIRE

    Gao, Zhiwei; Ding, Steven X.; Cecati, Carlo

    2015-01-01

    This "Special Section on Real-Time Fault Diagnosis and Fault-Tolerant Control" of the IEEE Transactions on Industrial Electronics is motivated to provide a forum for academic and industrial communities to report recent theoretic/application results in real-time monitoring, diagnosis, and fault-tolerant design, and exchange the ideas about the emerging research direction in this field. Twenty-three papers were eventually selected through a strict peer-reviewed procedure, which represent the mo...

  15. Imaging of Subsurface Faults using Refraction Migration with Fault Flooding

    KAUST Repository

    Metwally, Ahmed Mohsen Hassan

    2017-05-31

    We propose a novel method for imaging shallow faults by migration of transmitted refraction arrivals. The assumption is that there is a significant velocity contrast across the fault boundary that is underlain by a refracting interface. This procedure, denoted as refraction migration with fault flooding, largely overcomes the difficulty in imaging shallow faults with seismic surveys. Numerical results successfully validate this method on three synthetic examples and two field-data sets. The first field-data set is next to the Gulf of Aqaba and the second example is from a seismic profile recorded in Arizona. The faults detected by refraction migration in the Gulf of Aqaba data were in agreement with those indicated in a P-velocity tomogram. However, a new fault is detected at the end of the migration image that is not clearly seen in the traveltime tomogram. This result is similar to that for the Arizona data where the refraction image showed faults consistent with those seen in the P-velocity tomogram, except it also detected an antithetic fault at the end of the line. This fault cannot be clearly seen in the traveltime tomogram due to the limited ray coverage.

  16. Imaging of Subsurface Faults using Refraction Migration with Fault Flooding

    KAUST Repository

    Metwally, Ahmed Mohsen Hassan; Hanafy, Sherif; Guo, Bowen; Kosmicki, Maximillian Sunflower

    2017-01-01

    We propose a novel method for imaging shallow faults by migration of transmitted refraction arrivals. The assumption is that there is a significant velocity contrast across the fault boundary that is underlain by a refracting interface. This procedure, denoted as refraction migration with fault flooding, largely overcomes the difficulty in imaging shallow faults with seismic surveys. Numerical results successfully validate this method on three synthetic examples and two field-data sets. The first field-data set is next to the Gulf of Aqaba and the second example is from a seismic profile recorded in Arizona. The faults detected by refraction migration in the Gulf of Aqaba data were in agreement with those indicated in a P-velocity tomogram. However, a new fault is detected at the end of the migration image that is not clearly seen in the traveltime tomogram. This result is similar to that for the Arizona data where the refraction image showed faults consistent with those seen in the P-velocity tomogram, except it also detected an antithetic fault at the end of the line. This fault cannot be clearly seen in the traveltime tomogram due to the limited ray coverage.

  17. Field error lottery

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, C.J.; McVey, B. (Los Alamos National Lab., NM (USA)); Quimby, D.C. (Spectra Technology, Inc., Bellevue, WA (USA))

    1990-01-01

    The level of field errors in an FEL is an important determinant of its performance. We have computed the 3D performance of a large laser subsystem subjected to field errors of various types. These calculations have been guided by simple models such as SWOOP. The technique of choice is the FELEX free-electron laser code, which now possesses extensive engineering capabilities. Modeling includes the ability to establish tolerances of various types: fast- and slow-scale field bowing, field error level, beam position monitor error level, gap errors, defocusing errors, energy slew, and displacement and pointing errors. Many effects of these errors on relative gain and relative power extraction are displayed and are the essential elements of determining an error budget. The random errors also depend on the particular random-number seed used in the calculation. The simultaneous display of performance versus error level for cases with multiple seeds illustrates the variations attributable to the stochasticity of this model. All these errors are evaluated numerically for comprehensive engineering of the system. In particular, gap errors are found to place requirements beyond mechanical tolerances of ±25 μm, and amelioration of these may occur by a procedure utilizing direct measurement of the magnetic fields at assembly time. 4 refs., 12 figs.
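
    The "lottery" in the title — the spread of computed performance across random seeds at a fixed error level — can be mimicked with a toy model. Everything below is invented for illustration: the quadratic penalty is not real FEL physics, and the gain model stands in for what FELEX computes from full 3D simulation.

```python
# Hedged sketch of seed-to-seed variation: the same rms field-error level,
# evaluated with different random seeds, yields a spread of relative gains.
import random

def relative_gain(error_rms, seed, n_periods=100):
    """Toy model: each undulator period draws a random field error; gain
    degrades with the accumulated squared error (illustrative only)."""
    rng = random.Random(seed)               # fixed seed -> reproducible draw
    penalty = sum(rng.gauss(0.0, error_rms) ** 2 for _ in range(n_periods))
    return 1.0 / (1.0 + penalty)

# Scan error level with several seeds, as in the multi-seed display.
for error_rms in (0.01, 0.03, 0.1):
    gains = [relative_gain(error_rms, seed) for seed in range(5)]
    print(error_rms, min(gains), max(gains))   # spread widens with error level
```

    The min/max spread at each level is the "lottery": a tolerance set from one seed alone would understate the risk.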

  18. Wilshire fault: Earthquakes in Hollywood?

    Science.gov (United States)

    Hummon, Cheryl; Schneider, Craig L.; Yeats, Robert S.; Dolan, James F.; Sieh, Kerry E.; Huftile, Gary J.

    1994-04-01

    The Wilshire fault is a potentially seismogenic, blind thrust fault inferred to underlie and cause the Wilshire arch, a Quaternary fold in the Hollywood area, just west of downtown Los Angeles, California. Two inverse models, based on the Wilshire arch, allow us to estimate the location and slip rate of the Wilshire fault, which may be illuminated by a zone of microearthquakes. A fault-bend fold model indicates a reverse-slip rate of 1.5-1.9 mm/yr, whereas a three-dimensional elastic-dislocation model indicates a right-reverse slip rate of 2.6-3.2 mm/yr. The Wilshire fault is a previously unrecognized seismic hazard directly beneath Hollywood and Beverly Hills, distinct from the faults under the nearby Santa Monica Mountains.

  19. What is Fault Tolerant Control

    DEFF Research Database (Denmark)

    Blanke, Mogens; Frei, C. W.; Kraus, K.

    2000-01-01

    Faults in automated processes will often cause undesired reactions and shut-down of a controlled plant, and the consequences could be damage to the plant, to personnel or the environment. Fault-tolerant control is the synonym for a set of recent techniques that were developed to increase plant...... availability and reduce the risk of safety hazards. Its aim is to prevent simple faults from developing into serious failures. Fault-tolerant control merges several disciplines to achieve this goal, including on-line fault diagnosis, automatic condition assessment and calculation of remedial actions when a fault...... is detected. The envelope of the possible remedial actions is wide. This paper introduces tools to analyze and explore structure and other fundamental properties of an automated system such that any redundancy in the process can be fully utilized to enhance safety and availability....

  20. Preventing treatment errors in radiotherapy by identifying and evaluating near misses and actual incidents

    LENUS (Irish Health Repository)

    Holmberg, Ola

    2002-06-01

    When preparing radiation treatment, the prescribed dose and irradiation geometry must be translated into physical machine parameters. An error in the calculations or machine settings can negatively affect the intended treatment outcome. Analysing incidents originating in the treatment preparation chain makes it possible to find weak links and prevent treatment errors. The aim of this work is to study the effectiveness of a multilayered error prevention system by analysing both near misses and actual treatment errors.

  1. Medication errors in the Middle East countries: a systematic review of the literature.

    Science.gov (United States)

    Alsulami, Zayed; Conroy, Sharon; Choonara, Imti

    2013-04-01

    Medication errors are a significant global concern and can cause serious medical consequences for patients. Little is known about medication errors in Middle Eastern countries. The objectives of this systematic review were to review studies of the incidence and types of medication errors in Middle Eastern countries and to identify the main contributory factors involved. A systematic review of the literature related to medication errors in Middle Eastern countries was conducted in October 2011 using the following databases: Embase, Medline, Pubmed, the British Nursing Index and the Cumulative Index to Nursing & Allied Health Literature. The search strategy included all ages and languages. Inclusion criteria were that the studies assessed or discussed the incidence of medication errors and contributory factors to medication errors during the medication treatment process in adults or in children. Forty-five studies from 10 of the 15 Middle Eastern countries met the inclusion criteria. Nine (20%) studies focused on medication errors in paediatric patients. Twenty-one focused on prescribing errors, 11 measured administration errors, 12 were interventional studies and one assessed transcribing errors. Dispensing and documentation errors were inadequately evaluated. Error rates varied from 7.1% to 90.5% for prescribing and from 9.4% to 80% for administration. The most common types of prescribing errors reported were incorrect dose (with an incidence rate from 0.15% to 34.8% of prescriptions), wrong frequency and wrong strength. Computerised physician order entry and clinical pharmacist input were the main interventions evaluated. Poor knowledge of medicines was identified as a contributory factor for errors by both doctors (prescribers) and nurses (when administering drugs). Most studies did not assess the clinical severity of the medication errors. Studies related to medication errors in the Middle Eastern countries were relatively few in number and of poor quality.

  2. Prescription Errors in Psychiatry

    African Journals Online (AJOL)

    Arun Kumar Agnihotri

    clinical pharmacists in detecting errors before they have a (sometimes serious) clinical impact should not be underestimated. Research on medication error in mental health care is limited. .... participation in ward rounds and adverse drug.

  3. Evaluation of drug administration errors in a teaching hospital

    Directory of Open Access Journals (Sweden)

    Berdot Sarah

    2012-03-01

    Background Medication errors can occur at any of the three steps of the medication use process: prescribing, dispensing and administration. We aimed to determine the incidence, type and clinical importance of drug administration errors and to identify risk factors. Methods Prospective study based on a disguised observation technique in four wards in a teaching hospital in Paris, France (800 beds). A pharmacist accompanied nurses and witnessed the preparation and administration of drugs to all patients during the three drug rounds on each of six days per ward. Main outcomes were the number, type and clinical importance of errors and associated risk factors. The drug administration error rate was calculated with and without wrong time errors. Relationships between the occurrence of errors and potential risk factors were investigated using logistic regression models with random effects. Results Twenty-eight nurses caring for 108 patients were observed. Among 1501 opportunities for error, 415 administrations with one or more errors (430 errors in total) were detected (27.6%). There were 312 wrong time errors, ten simultaneously with another type of error, resulting in an error rate without wrong time errors of 7.5% (113/1501). The most frequently administered drugs were cardiovascular drugs (425/1501, 28.3%). The highest risk of error in a drug administration was for dermatological drugs. No potentially life-threatening errors were witnessed and 6% of errors were classified as having a serious or significant impact on patients (mainly omission). In multivariate analysis, the occurrence of errors was associated with the drug administration route, drug classification (ATC) and the number of patients under the nurse's care. Conclusion Medication administration errors are frequent. The identification of their determinants helps in designing targeted interventions.

  4. A Method to Simultaneously Detect the Current Sensor Fault and Estimate the State of Energy for Batteries in Electric Vehicles.

    Science.gov (United States)

    Xu, Jun; Wang, Jing; Li, Shiying; Cao, Binggang

    2016-08-19

    Recently, state of energy (SOE) has become one of the most fundamental parameters for battery management systems in electric vehicles. Accurate current information is critical for SOE estimation, and a current sensor is usually used to obtain it. If the current sensor fails, however, the SOE estimate may suffer large errors. This paper therefore makes the following contributions: current sensor fault detection and SOE estimation are realized simultaneously. Using a proportional integral observer (PIO) based method, the current sensor fault can be accurately estimated. By taking advantage of the accurately estimated current sensor fault, the influence of the fault can be eliminated and compensated, so the SOE estimate is affected little by the fault. In addition, a simulation and experimental workbench was established to verify the proposed method. The results indicate that the current sensor fault can be estimated accurately, and that the SOE is likewise estimated accurately, with an estimation error little influenced by the fault: the maximum SOE estimation error is less than 2%, even in the presence of the large current error caused by the sensor fault.
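
    The scheme described — estimating an additive current-sensor fault with a proportional-integral observer, then subtracting the estimate before energy integration — can be sketched on a toy plant. All dynamics, gains, and the fault size below are invented; this illustrates the general PIO idea, not the paper's battery model.

```python
# Hedged sketch: a PI observer reconstructs an additive current-sensor bias f
# from the state residual, so the bias can be removed before integration.
dt, a, b = 0.01, 0.1, 1.0          # toy first-order plant x' = -a x + b u
Lp, Li = 5.0, 50.0                 # proportional / integral observer gains

x = xhat = fhat = 0.0
e_true = e_comp = e_raw = 0.0      # integrals of true / compensated / raw current
for k in range(2000):              # 20 s of simulation
    t = k * dt
    u = 1.0                        # true current
    f = 0.5 if t >= 5.0 else 0.0   # sensor develops a +0.5 bias at t = 5 s
    u_meas = u + f

    r = x - xhat                   # state residual drives the observer
    xhat += dt * (-a * xhat + b * (u_meas - fhat) + Lp * r)
    fhat += dt * (-Li * r)         # integral action reconstructs the bias

    x += dt * (-a * x + b * u)
    e_true += dt * u
    e_comp += dt * (u_meas - fhat) # fault-compensated coulomb counting
    e_raw += dt * u_meas           # naive counting drifts with the fault

print(fhat)                        # converges near the injected bias 0.5
print(abs(e_comp - e_true), abs(e_raw - e_true))   # small vs. ~7.5
```

    The compensated integral stays close to the truth because, once `fhat` converges, the residual bias in `u_meas - fhat` vanishes, mirroring the paper's claim that the SOE error is little influenced by the fault.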

  5. Advanced cloud fault tolerance system

    Science.gov (United States)

    Sumangali, K.; Benny, Niketa

    2017-11-01

    Cloud computing has become a prevalent on-demand service on the internet to store, manage and process data. A pitfall that accompanies cloud computing is the failures that can be encountered in the cloud. To overcome these failures, we require a fault tolerance mechanism to abstract faults from users. We have proposed a fault tolerant architecture, which is a combination of proactive and reactive fault tolerance. This architecture essentially increases the reliability and the availability of the cloud. In the future, we would like to compare evaluations of our proposed architecture with existing architectures and further improve it.

  6. Final Technical Report: PV Fault Detection Tool.

    Energy Technology Data Exchange (ETDEWEB)

    King, Bruce Hardison [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Christian Birk [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The PV Fault Detection Tool project plans to demonstrate that the FDT can (a) detect catastrophic and degradation faults and (b) identify the type of fault. This will be accomplished by collecting fault signatures using different instruments and integrating this information to establish a logical controller for detecting, diagnosing and classifying each fault.

  7. Air Pollution Episodes Associated with Prescribed Burns

    Science.gov (United States)

    Hart, M.; Di Virgilio, G.; Jiang, N.

    2017-12-01

    Air pollution events caused by wildfires have been linked to extreme health impacts. Prescribed burns are an important tool to reduce the severity of wildfires. However, if undertaken during unfavourable meteorological conditions, they too have the capacity to trigger extreme air pollution events. The Australian state of New South Wales has increased the annual average area treated by prescribed burn activities by 45%, in order to limit wildfire activity. Prescribed burns need to be undertaken during meteorological conditions that allow the fuel load to burn, while still allowing the burn to remain under control. These conditions are similar to those that inhibit atmospheric dispersion, resulting in a fine balance between managing fire risk and managing ambient air pollution. During prescribed burns, the Sydney air shed can experience elevated particulate matter concentrations, especially fine particulates (PM2.5) that occasionally exceed national air quality standards. Using pollutant and meteorological data from sixteen monitoring stations in Sydney, we applied generalized additive models and CART analyses to profile the meteorological conditions influencing air quality during planned burns. The insights gained from this study will help improve prescribed burn scheduling in order to reduce the pollution risk to the community, while allowing fire agencies to conduct this important work.

  8. Control of invasive weeds with prescribed burning

    Science.gov (United States)

    DiTomaso, Joseph M.; Brooks, Matthew L.; Allen, Edith B.; Minnich, Ralph; Rice, Peter M.; Kyser, Guy B.

    2006-01-01

    Prescribed burning has primarily been used as a tool for the control of invasive late-season annual broadleaf and grass species, particularly yellow starthistle, medusahead, barb goatgrass, and several bromes. However, timely burning of a few invasive biennial broadleaves (e.g., sweetclover and garlic mustard), perennial grasses (e.g., bluegrasses and smooth brome), and woody species (e.g., brooms and Chinese tallow tree) also has been successful. In many cases, the effectiveness of prescribed burning can be enhanced when incorporated into an integrated vegetation management program. Although there are some excellent examples of successful use of prescribed burning for the control of invasive species, a limited number of species have been evaluated. In addition, few studies have measured the impact of prescribed burning on the long-term changes in plant communities, impacts to endangered plant species, effects on wildlife and insect populations, and alterations in soil biology, including nutrition, mycorrhizae, and hydrology. In this review, we evaluate the current state of knowledge on prescribed burning as a tool for invasive weed management.

  9. Accounting for uncertain fault geometry in earthquake source inversions - I: theory and simplified application

    Science.gov (United States)

    Ragon, Théa; Sladen, Anthony; Simons, Mark

    2018-05-01

    The ill-posed nature of earthquake source estimation derives from several factors including the quality and quantity of available observations and the fidelity of our forward theory. Observational errors are usually accounted for in the inversion process. Epistemic errors, which stem from our simplified description of the forward problem, are rarely dealt with despite their potential to bias the estimate of a source model. In this study, we explore the impact of uncertainties related to the choice of a fault geometry in source inversion problems. The geometry of a fault structure is generally reduced to a set of parameters, such as position, strike and dip, for one or a few planar fault segments. While some of these parameters can be solved for, more often they are fixed to an uncertain value. We propose a practical framework to address this limitation by following a previously implemented method exploring the impact of uncertainties on the elastic properties of our models. We develop a sensitivity analysis to small perturbations of fault dip and position. The uncertainties in fault geometry are included in the inverse problem under the formulation of the misfit covariance matrix that combines both prediction and observation uncertainties. We validate this approach with the simplified case of a fault that extends infinitely along strike, using both Bayesian and optimization formulations of a static inversion. If epistemic errors are ignored, predictions are overconfident in the data and source parameters are not reliably estimated. In contrast, inclusion of uncertainties in fault geometry allows us to infer a robust posterior source model. Epistemic uncertainties can be many orders of magnitude larger than observational errors for great earthquakes (Mw > 8). Not accounting for uncertainties in fault geometry may partly explain observed shallow slip deficits for continental earthquakes. Similarly, ignoring the impact of epistemic errors can also bias estimates of
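
    The record's key device — folding prediction uncertainty into the misfit covariance — can be illustrated with a deliberately tiny example. The sketch below is not the paper's method: it uses an invented one-datum problem (a sine-of-dip "Green's function" with made-up dip and slip values) to show how combining observational and prediction variances, in the spirit of C_chi = C_d + C_p, turns an overconfident estimate into one whose stated uncertainty covers the truth.

```python
# Hedged 1-D illustration: predictions depend on an uncertain fault dip;
# propagating dip perturbations into a prediction std and adding it to the
# data std widens the slip uncertainty honestly. All numbers are invented.
import math

def greens(dip_deg):
    # toy "Green's function": surface displacement per unit slip
    return math.sin(math.radians(dip_deg))

true_slip, true_dip, assumed_dip = 2.0, 40.0, 45.0
d_obs = true_slip * greens(true_dip)          # noise-free datum, wrong dip assumed
sigma_d = 0.01                                # observational std

# prediction uncertainty from the dip being known only to +/- 5 degrees
g0 = greens(assumed_dip)
dg = (greens(assumed_dip + 5) - greens(assumed_dip - 5)) / 2
sigma_p = abs(dg) * true_slip                 # first-order prediction std

for sigma in (sigma_d, math.hypot(sigma_d, sigma_p)):
    slip = d_obs / g0                         # least-squares point estimate
    std = sigma / abs(g0)                     # 1-sigma posterior width
    covered = abs(slip - true_slip) <= 2 * std
    # ignoring C_p: 1.818 0.014 False (overconfident)
    # with C_p:     1.818 0.175 True  (robust, as the abstract argues)
    print(round(slip, 3), round(std, 3), covered)
```

    The point estimate is biased either way; only the inflated covariance makes the error bar admit it.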

  10. Fault current limiter

    Science.gov (United States)

    Darmann, Francis Anthony

    2013-10-08

    A fault current limiter (FCL) includes a series of high-permeability posts that collectively define a core for the FCL. A DC coil, for the purpose of saturating a portion of the high-permeability posts, surrounds the complete structure outside an enclosure in the form of a vessel. The vessel contains a dielectric insulation medium. AC coils, for transporting AC current, are wound on insulating formers and electrically interconnected to each other in a manner such that the senses of the magnetic field produced by each AC coil in the corresponding high-permeability core are opposing. There are insulation barriers between phases to improve the dielectric withstand properties of the dielectric medium.

  11. Fault-tolerant sub-lithographic design with rollback recovery

    International Nuclear Information System (INIS)

    Naeimi, Helia; DeHon, Andre

    2008-01-01

    Shrinking feature sizes and energy levels coupled with high clock rates and decreasing node capacitance lead us into a regime where transient errors in logic cannot be ignored. Consequently, several recent studies have focused on feed-forward spatial redundancy techniques to combat these high transient fault rates. To complement these studies, we analyze fine-grained rollback techniques and show that they can offer lower spatial redundancy factors with no significant impact on system performance for fault rates up to one fault per device per ten million cycles of operation (P_f = 10^-7) in systems with 10^12 susceptible devices. Further, we concretely demonstrate these claims on nanowire-based programmable logic arrays. Despite expensive rollback buffers and general-purpose, conservative analysis, we show that the area overhead factor of our technique is roughly an order of magnitude lower than that of a gate-level feed-forward redundancy scheme.
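
    The trade-off described here — buffering inputs so a detected transient fault triggers local re-execution rather than paying for feed-forward spatial redundancy — can be caricatured in a few lines. The model below is entirely illustrative (fault injection and detection are idealized), not the paper's nanowire PLA analysis.

```python
# Hedged toy model of fine-grained rollback: each stage keeps its input in a
# small buffer; a detected transient upset replays just that stage instead of
# requiring triplicated (TMR-style) hardware.
import random

def run_with_rollback(values, stage, fault_rate, seed=0):
    rng = random.Random(seed)
    out, retries = [], 0
    for v in values:
        while True:                       # rollback loop: retry from buffered input
            result = stage(v)
            if rng.random() < fault_rate: # a transient fault corrupted the result...
                retries += 1              # ...perfect detection (assumed) forces replay
                continue
            out.append(result)            # fault-free pass commits the result
            break
    return out, retries

values = list(range(1000))
out, retries = run_with_rollback(values, lambda v: v * v, fault_rate=1e-2)
print(out[:4], retries)   # correct results throughout; only a few replays
```

    At a 1% per-operation fault rate the expected time overhead is about 1%, whereas feed-forward redundancy would pay its area factor on every operation, faulted or not.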

  12. The use of prescribed and non-prescribed medication by Dutch children.

    NARCIS (Netherlands)

    Dijk, L. van; Lindert, H. van

    2002-01-01

    Background: Most research on the use of medication focuses on adults. Children, however, use medication too, most of which is prescribed by GPs. Children also use non-prescribed medication (e.g. bought in the drugstore), but to what extent is not known. Moreover, it is not known to what extent

  13. Auditing GPs' prescribing habits : Cardiovascular prescribing frequently continues medication initiated by specialists

    NARCIS (Netherlands)

    de Vries, C.S; van Diepen, N.M; de Jong-van den Berg, L T W

    Objective: To determine to what extent general practitioners' (GPs) prescribing behaviour is a result of repeat prescribing of medication which has been initiated by specialists. Method: During a 4-week period, pharmacists identified GPs' prescriptions for a large group of cardiovascular drugs.

  14. On Round-off Error for Adaptive Finite Element Methods

    KAUST Repository

    Alvarez-Aramberri, J.

    2012-06-02

    Round-off error analysis has historically been studied by analyzing the condition number of the associated matrix. By controlling the size of the condition number, it is possible to guarantee a prescribed round-off error tolerance. However, the opposite is not true, since it is possible to have a system of linear equations with an arbitrarily large condition number that still delivers a small round-off error. In this paper, we perform a round-off error analysis in the context of 1D and 2D hp-adaptive Finite Element simulations for the case of the Poisson equation. We conclude that boundary conditions play a fundamental role in the round-off error analysis, especially for the so-called ‘radical meshes’. Moreover, we illustrate the importance of the right-hand side when analyzing the round-off error, which is independent of the condition number of the matrix.
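
    The abstract's central caveat — that a huge condition number need not imply a large round-off error — has a classic concrete instance in a diagonal system, where each unknown is a single floating-point division computed to near machine precision regardless of kappa(A). A minimal sketch (the matrix and numbers are invented for illustration, not taken from the paper):

```python
# A system with an astronomically large condition number whose floating-point
# solution is nevertheless exact: the conditioning bound is only worst-case.
eps = 1e-30
diag = [1.0, eps]                  # A = diag(1, eps): kappa(A) = 1/eps
b = [3.0, 2.0 * eps]               # chosen so the exact solution is [3, 2]
x = [bi / di for bi, di in zip(b, diag)]   # per-component division, no mixing
kappa = max(diag) / min(diag)
print(kappa)                       # ~1e+30: hopelessly ill-conditioned on paper
print(x)                           # [3.0, 2.0]: zero round-off error here
```

    The divisions are exact because each right-hand-side entry is a small multiple of its diagonal entry; the condition number never enters the computation.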

  15. On Round-off Error for Adaptive Finite Element Methods

    KAUST Repository

    Alvarez-Aramberri, J.; Pardo, David; Paszynski, Maciej; Collier, Nathan; Dalcin, Lisandro; Calo, Victor M.

    2012-01-01

    Round-off error analysis has historically been studied by analyzing the condition number of the associated matrix. By controlling the size of the condition number, it is possible to guarantee a prescribed round-off error tolerance. However, the opposite is not true, since it is possible to have a system of linear equations with an arbitrarily large condition number that still delivers a small round-off error. In this paper, we perform a round-off error analysis in the context of 1D and 2D hp-adaptive Finite Element simulations for the case of the Poisson equation. We conclude that boundary conditions play a fundamental role in the round-off error analysis, especially for the so-called ‘radical meshes’. Moreover, we illustrate the importance of the right-hand side when analyzing the round-off error, which is independent of the condition number of the matrix.

  16. Pharmaceutical marketing research and the prescribing physician.

    Science.gov (United States)

    Greene, Jeremy A

    2007-05-15

    Surveillance of physicians' prescribing patterns and the accumulation and sale of these data for pharmaceutical marketing are currently the subjects of legislation in several states and action by state and national medical associations. Contrary to common perception, the growth of the health care information organization industry has not been limited to the past decade but has been building slowly over the past 50 years, beginning in the 1940s when growth in the prescription drug market fueled industry interest in understanding and influencing prescribing patterns. The development of this surveillance system was not simply imposed on the medical profession by the pharmaceutical industry but was developed through the interactions of pharmaceutical salesmen, pharmaceutical marketers, academic researchers, individual physicians, and physician organizations. Examination of the role of physicians and physician organizations in the development of prescriber profiling is directly relevant to the contemporary policy debate surrounding this issue.

  17. The Quality of Prescribing for Psychiatric Patients

    DEFF Research Database (Denmark)

    Soerensen, A L; Nielsen, L P; Poulsen, B K

    2014-01-01

    The Quality of Prescribing for Psychiatric Patients Soerensen AL1,2, Nielsen LP3,4, Poulsen BK3, Lisby M3,5, Mainz J6,7 1Danish Center for Healthcare Improvements, Faculty of Social Sciences and Faculty of Health Sciences, Aalborg University, Denmark; 2University College of Northern Denmark; 3......, Aalborg; Denmark OBJECTIVES: Prescribing for adult psychiatric patients is often highly complex due to the nature of psychiatric conditions, but also due to somatic comorbidity. Therefore, the aim of this study was to identify prevalence and types of potential inappropriate prescribing (PIP), assess...... the severity of potential clinical consequences and identify possible predictive factors of PIP. METHODS: The study was designed as a prospective study of PIP using medication reviews. Patients who were admitted during a 4 month period (August 2013 - November 2013) to a psychiatric university hospital were...

  18. The Quality of Prescribing for Psychiatric Patients

    DEFF Research Database (Denmark)

    Sørensen, Ann Lykkegaard; Nielsen, Lars Peter; Poulsen, Birgitte Klindt

    2014-01-01

    The Quality of Prescribing for Psychiatric Patients Soerensen AL1,2, Nielsen LP3,4, Poulsen BK3, Lisby M3,5, Mainz J6,7 1Danish Center for Healthcare Improvements, Faculty of Social Sciences and Faculty of Health Sciences, Aalborg University, Denmark; 2University College of Northern Denmark; 3......, Aalborg; Denmark OBJECTIVES: Prescribing for adult psychiatric patients is often highly complex due to the nature of psychiatric conditions, but also due to somatic comorbidity. Therefore, the aim of this study was to identify prevalence and types of potential inappropriate prescribing (PIP), assess...... the severity of potential clinical consequences and identify possible predictive factors of PIP. METHODS: The study was designed as a prospective study of PIP using medication reviews. Patients who were admitted during a 4 month period (August 2013 - November 2013) to a psychiatric university hospital were...

  19. Errors in otology.

    Science.gov (United States)

    Kartush, J M

    1996-11-01

    Practicing medicine successfully requires that errors in diagnosis and treatment be minimized. Malpractice laws encourage litigators to ascribe all medical errors to incompetence and negligence. There are, however, many other causes of unintended outcomes. This article describes common causes of errors and suggests ways to minimize mistakes in otologic practice. Widespread dissemination of knowledge about common errors and their precursors can reduce the incidence of their occurrence. Consequently, laws should be passed to allow for a system of non-punitive, confidential reporting of errors and "near misses" that can be shared by physicians nationwide.

  20. Near field communications technology and the potential to reduce medication errors through multidisciplinary application.

    Science.gov (United States)

    O'Connell, Emer; Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J; Tabirca, Sabin; O'Driscoll, Aoife; Corrigan, Mark

    2016-01-01

    Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with a paper system to a mean of 0.80 errors per round using NFC. An NFC-based medication system may be used to effectively reduce medication errors in a simulated ward environment.

  1. System tuning and measurement error detection testing

    International Nuclear Information System (INIS)

    Krejci, Petr; Machek, Jindrich

    2008-09-01

    The project includes the use of the PEANO (Process Evaluation and Analysis by Neural Operators) system to verify the monitoring of the status of dependent measurements with a view to early measurement fault detection and estimation of selected signal levels. At the present stage, the system's capabilities of detecting measurement errors were assessed and the quality of the estimates was evaluated for various system configurations and formations of empiric models, and rules were sought for system training at chosen process data recording parameters and operating modes. The aim was to find a suitable system configuration and to document the quality of the tuned system on artificial failures.

  2. Non-binary unitary error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E.

    1996-06-01

    Error operator bases for systems of any dimension are defined and natural generalizations of the bit-flip/sign-change error basis for qubits are given. These bases allow generalizing the construction of quantum codes based on eigenspaces of Abelian groups. As a consequence, quantum codes can be constructed from linear codes over {ital Z}{sub {ital n}} for any {ital n}. The generalization of the punctured code construction leads to many codes which permit transversal (i.e. fault-tolerant) implementations of certain operations compatible with the error basis.
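    For dimension n, the generalization of the qubit bit-flip/sign-change basis described in this abstract is generated by the cyclic shift ("clock and shift") matrices. The following is a minimal numerical sketch of that construction (written for this summary, not taken from the paper), checking the Weyl commutation relation ZX = ωXZ that makes the n² products X^a Z^b an error basis:

```python
import numpy as np

def shift_clock(n):
    """Return the generalized bit-flip X (cyclic shift) and
    sign-change Z (clock) matrices for a dimension-n system."""
    omega = np.exp(2j * np.pi / n)      # primitive n-th root of unity
    X = np.roll(np.eye(n), 1, axis=0)   # X|j> = |j+1 mod n>
    Z = np.diag(omega ** np.arange(n))  # Z|j> = omega^j |j>
    return X, Z

n = 3
X, Z = shift_clock(n)
omega = np.exp(2j * np.pi / n)

# Weyl commutation relation: Z X = omega * X Z, the qudit analogue of
# the anticommutation of the qubit Pauli X and Z.
assert np.allclose(Z @ X, omega * (X @ Z))
```

The n² operators X^a Z^b (a, b in Z_n) are pairwise orthogonal under the Hilbert-Schmidt inner product, which is what qualifies them as a unitary error basis.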

  3. Automated Classification of Phonological Errors in Aphasic Language

    Science.gov (United States)

    Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.

    1984-01-01

    Using heuristically-guided state space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically-impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represent a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification, it provides a prototype simulation tool for neurolinguistic research, and it forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.

  4. e-Learning initiatives to support prescribing.

    Science.gov (United States)

    Maxwell, Simon; Mucklow, John

    2012-10-01

    Preparing medical students to prescribe is a major challenge of undergraduate education. They must develop an understanding of clinical pharmacology and acquire knowledge about drugs and therapeutics, as well as the skills to prescribe for individual patients in the face of multiple variables. The task of delivering the learning required to achieve these attributes relies upon limited numbers of teachers, who have increasingly busy clinical commitments. There is evidence that training is currently insufficient to meet the demands of the workplace. e-Learning provides an opportunity to improve the learning experience. The advantages for teachers are improved distribution of learning content, ease of update, standardization and tracking of learner activities. The advantages for learners are ease of access, greater interactivity and individual choice concerning the pace and mix of learning. Important disadvantages are the considerable resource required to develop e-Learning projects and difficulties in simulating some aspects of the real world prescribing experience. Pre-requisites for developing an e-Learning programme to support prescribing include academic expertise, institutional support, learning technology services and an effective virtual learning environment. e-Learning content might range from complex interactive learning sessions through to static web pages with links. It is now possible to simulate and provide feedback on prescribing decisions and this will improve with advances in virtual reality. Other content might include a student formulary, self-assessment exercises (e.g. calculations), a glossary and an on-line library. There is some evidence for the effectiveness of e-Learning but better research is required into its potential impact on prescribing. © 2012 The Authors. British Journal of Clinical Pharmacology © 2012 The British Pharmacological Society.

  5. e-Learning initiatives to support prescribing

    Science.gov (United States)

    Maxwell, Simon; Mucklow, John

    2012-01-01

    Preparing medical students to prescribe is a major challenge of undergraduate education. They must develop an understanding of clinical pharmacology and acquire knowledge about drugs and therapeutics, as well as the skills to prescribe for individual patients in the face of multiple variables. The task of delivering the learning required to achieve these attributes relies upon limited numbers of teachers, who have increasingly busy clinical commitments. There is evidence that training is currently insufficient to meet the demands of the workplace. e-Learning provides an opportunity to improve the learning experience. The advantages for teachers are improved distribution of learning content, ease of update, standardization and tracking of learner activities. The advantages for learners are ease of access, greater interactivity and individual choice concerning the pace and mix of learning. Important disadvantages are the considerable resource required to develop e-Learning projects and difficulties in simulating some aspects of the real world prescribing experience. Pre-requisites for developing an e-Learning programme to support prescribing include academic expertise, institutional support, learning technology services and an effective virtual learning environment. e-Learning content might range from complex interactive learning sessions through to static web pages with links. It is now possible to simulate and provide feedback on prescribing decisions and this will improve with advances in virtual reality. Other content might include a student formulary, self-assessment exercises (e.g. calculations), a glossary and an on-line library. There is some evidence for the effectiveness of e-Learning but better research is required into its potential impact on prescribing. PMID:22509885

  6. Fault Management Design Strategies

    Science.gov (United States)

    Day, John C.; Johnson, Stephen B.

    2014-01-01

    Development of dependable systems relies on the ability of the system to determine and respond to off-nominal system behavior. Specification and development of these fault management capabilities must be done in a structured and principled manner to improve our understanding of these systems, and to make significant gains in dependability (safety, reliability and availability). Prior work has described a fundamental taxonomy and theory of System Health Management (SHM), and of its operational subset, Fault Management (FM). This conceptual foundation provides a basis for developing a framework to design and implement FM design strategies that protect mission objectives and account for system design limitations. Selection of an SHM strategy has implications for the functions required to perform the strategy, and it places constraints on the set of possible design solutions. The framework developed in this paper provides a rigorous and principled approach to classifying SHM strategies, as well as methods for the determination and implementation of SHM strategies. An illustrative example is used to describe the application of the framework and the resulting benefits to system and FM design and dependability.

  7. Medication reconciliation and prescribing reviews by pharmacy technicians in a geriatric ward

    DEFF Research Database (Denmark)

    Buck, Thomas Croft; Gronkjaer, Louise Smed; Duckert, Marie-Louise

    2013-01-01

    OBJECTIVE: Incomplete medication histories obtained on hospital admission are responsible for more than 25% of prescribing errors. This study aimed to evaluate whether pharmacy technicians can assist hospital physicians in obtaining medication histories by performing medication reconciliation and prescribing reviews. A secondary aim was to evaluate whether the interventions made by pharmacy technicians could reduce the time spent by the nurses on administration of medications to the patients. METHODS: This observational study was conducted over a 7 week period in the geriatric ward at Odense University Hospital, Denmark. Two pharmacy technicians conducted medication reconciliation and prescribing reviews at the time of patients' admission to the ward. The reviews were conducted according to standard operating procedures developed by a clinical pharmacist and approved by the Head of the Geriatric … reconciliation and focused medication reviews. Further randomized, controlled studies including a larger number of patients are required to elucidate whether these observations are of significance and of importance for securing patient safety.

  8. Evoking prescribed spike times in stochastic neurons

    Science.gov (United States)

    Doose, Jens; Lindner, Benjamin

    2017-09-01

    Single cell stimulation in vivo is a powerful tool to investigate the properties of single neurons and their functionality in neural networks. We present a method to determine a cell-specific stimulus that reliably evokes a prescribed spike train with high temporal precision of action potentials. We test the performance of this stimulus in simulations for two different stochastic neuron models. For a broad range of parameters and a neuron firing with intermediate firing rates (20-40 Hz) the reliability in evoking the prescribed spike train is close to its theoretical maximum that is mainly determined by the level of intrinsic noise.

  9. Blueprint for prescriber continuing education program.

    Science.gov (United States)

    2012-06-01

    On October 25, 2011, the Center for Drug Evaluation and Research (CDER) of the Food and Drug Administration (FDA) posted online this Blueprint for Prescriber Continuing Education, labeled "final," relating to extended-release and long-acting opioids. The pending FDA Risk Evaluation and Mitigation Strategy (REMS) requires prescriber education. This document provides guidance to sponsors of these dosage forms in developing the prescriber education component of their REMS. The report was posted by the federal agency at: http://www.fda.gov/downloads/drugs/drugsafety/informationbydrugclass/ucm277916.pdf. It is in the public domain.

  10. Approximate dynamic fault tree calculations for modelling water supply risks

    International Nuclear Information System (INIS)

    Lindhe, Andreas; Norberg, Tommy; Rosén, Lars

    2012-01-01

    Traditional fault tree analysis is not always sufficient when analysing complex systems. To overcome the limitations dynamic fault tree (DFT) analysis is suggested in the literature as well as different approaches for how to solve DFTs. For added value in fault tree analysis, approximate DFT calculations based on a Markovian approach are presented and evaluated here. The approximate DFT calculations are performed using standard Monte Carlo simulations and do not require simulations of the full Markov models, which simplifies model building and in particular calculations. It is shown how to extend the calculations of the traditional OR- and AND-gates, so that information is available on the failure probability, the failure rate and the mean downtime at all levels in the fault tree. Two additional logic gates are presented that make it possible to model a system's ability to compensate for failures. This work was initiated to enable correct analyses of water supply risks. Drinking water systems are typically complex with an inherent ability to compensate for failures that is not easily modelled using traditional logic gates. The approximate DFT calculations are compared to results from simulations of the corresponding Markov models for three water supply examples. For the traditional OR- and AND-gates, and one gate modelling compensation, the errors in the results are small. For the other gate modelling compensation, the error increases with the number of compensating components. The errors are, however, in most cases acceptable with respect to uncertainties in input data. The approximate DFT calculations improve the capabilities of fault tree analysis of drinking water systems since they provide additional and important information and are simple and practically applicable.
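    For orientation, the static OR- and AND-gate combinations that the approximate DFT calculations extend have simple closed forms for independent components, and these can be cross-checked against a plain Monte Carlo simulation of the kind the paper builds on. A minimal sketch with made-up component failure probabilities (not the water supply data of the study):

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.05, 0.10, 0.02])   # hypothetical component failure probabilities

# Closed-form gate probabilities for independent components
p_or = 1 - np.prod(1 - p)          # OR gate: system fails if any component fails
p_and = np.prod(p)                 # AND gate: system fails only if all fail

# Monte Carlo estimate of the same quantities
N = 200_000
fails = rng.random((N, len(p))) < p   # Bernoulli failure indicators per trial
mc_or = fails.any(axis=1).mean()
mc_and = fails.all(axis=1).mean()

assert abs(mc_or - p_or) < 0.01    # simulation agrees with the closed form
```

The added value of the dynamic gates in the paper is precisely what this static picture lacks: failure rates, mean downtimes, and components that compensate for each other over time.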

  11. A Retrospective Analysis of Prescribing Practice

    African Journals Online (AJOL)

    A Retrospective Analysis of Prescribing Practice Based on WHO Prescribing Indicators at Four Selected Hospitals of West ... Key words: World Health Organization, prescribing indicators, rational drug use. INTRODUCTION. Indicators of ... factors, the risk of irrational prescribing could rise several fold. Irrational use of ...

  12. Antimalarial prescribing patterns in state hospitals and selected ...

    African Journals Online (AJOL)

    slowdown of progression to resistance could be achieved by improving prescribing practice, drug quality, and patient compliance. Objective: To determine the antimalarial prescribing pattern and to assess rational prescribing of chloroquine by prescribers in government hospitals and parastatals in Lagos State. Methods: ...

  13. Accelerometer having integral fault null

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1995-08-01

    An improved accelerometer is introduced. It comprises a transducer responsive to vibration in machinery, which produces an electrical signal related to the magnitude and frequency of the vibration, and a decoding circuit responsive to the transducer signal, which produces a first fault signal and from it a second fault signal in which ground-shift effects are nullified.

  14. Sliding Mode Fault Tolerant Control with Adaptive Diagnosis for Aircraft Engines

    Science.gov (United States)

    Xiao, Lingfei; Du, Yanbin; Hu, Jixiang; Jiang, Bin

    2018-03-01

    In this paper, a novel sliding mode fault tolerant control method is presented for aircraft engine systems with uncertainties and disturbances on the basis of an adaptive diagnostic observer. Taking both sensor faults and actuator faults into account, the general model of aircraft engine control systems subjected to uncertainties and disturbances is considered. The corresponding augmented dynamic model is then established in order to facilitate the fault diagnosis and fault tolerant controller design. Next, a suitable detection observer is designed to detect the faults effectively. Through creating an adaptive diagnostic observer and based on a sliding mode strategy, the sliding mode fault tolerant controller is constructed. Robust stabilization is discussed and the closed-loop system can be stabilized robustly. It is also proven that the adaptive diagnostic observer output errors and the estimations of faults converge exponentially to a set, with a convergence rate greater than a value that can be adjusted by choosing the design parameters properly. Simulation on a twin-shaft aircraft engine verifies the applicability of the proposed fault tolerant control method.

  15. Incipient Fault Detection and Isolation of Field Devices in Nuclear Power Systems Using Principal Component Analysis

    International Nuclear Information System (INIS)

    Kaistha, Nitin; Upadhyaya, Belle R.

    2001-01-01

    An integrated method for the detection and isolation of incipient faults in common field devices, such as sensors and actuators, using plant operational data is presented. The approach is based on the premise that data for normal operation lie on a surface and abnormal situations lead to deviations from the surface in a particular way. Statistically significant deviations from the surface result in the detection of faults, and the characteristic directions of deviations are used for isolation of one or more faults from the set of typical faults. Principal component analysis (PCA), a multivariate data-driven technique, is used to capture the relationships in the data and fit a hyperplane to the data. The fault direction for each of the scenarios is obtained using the singular value decomposition on the state and control function prediction errors, and fault isolation is then accomplished from projections on the fault directions. This approach is demonstrated for a simulated pressurized water reactor steam generator system and for a laboratory process control system under single device fault conditions. Enhanced fault isolation capability is also illustrated by incorporating realistic nonlinear terms in the PCA data matrix
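    The detection step described in this abstract — fitting a hyperplane to normal-operation data and flagging statistically significant deviations from it — is commonly implemented with a squared prediction error (Q) statistic on the PCA residual. A toy sketch on synthetic data (illustrative only; not the steam generator model of the study):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "normal operation" data: 3 correlated sensors driven by 1 latent variable
t = rng.normal(size=(500, 1))
W = np.array([[1.0, 0.8, -0.5]])
X = t @ W + 0.05 * rng.normal(size=(500, 3))

# PCA via SVD: retain the single dominant principal direction
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:1].T                       # loading vector, shape (3, 1)

def spe(x):
    """Squared prediction error (Q statistic): squared distance to the PCA plane."""
    r = x - (x @ P) @ P.T          # residual after projecting onto the plane
    return float(np.sum(r ** 2))

# Detection threshold from the empirical distribution of normal-data SPEs
threshold = np.quantile([spe(row) for row in Xc], 0.99)

# A bias fault on sensor 2 breaks the correlation structure and is flagged
faulty_sample = Xc[0] + np.array([0.0, 1.0, 0.0])
assert spe(faulty_sample) > threshold
```

Isolation, as in the paper, would then project the residual onto characteristic fault directions rather than merely thresholding its magnitude.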

  16. About problematic peculiarities of Fault Tolerance digital regulation organization

    Science.gov (United States)

    Rakov, V. I.; Zakharova, O. V.

    2018-05-01

    Approaches to assessing the serviceability of regulation loops and to preventing its loss are proposed in three directions. The first direction is to develop methods of representing the regulation loop as a union of diffuse components, and algorithmic tools for building serviceability predicates, separately for the components and for the regulation loop as a whole. The second direction is to create methods of fault-tolerant redundancy in the complex assessment of current values of control actions, closure errors and the regulated parameters. The third direction is to create methods of comparing the processes of change of control actions, closure errors and regulated parameters with their reference models or their neighbourhoods. This direction allows one to develop methods and algorithmic tools aimed at preventing the loss of serviceability and effectiveness not only of a separate digital regulator, but of the whole fault-tolerant regulation complex.

  17. Fault-tolerant clock synchronization validation methodology. [in computer systems

    Science.gov (United States)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.
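    The one stochastic parameter in the abstract above — the probability that the clock read error exceeds its assumed upper bound — reduces to a simple exceedance estimate over measured data. A hedged sketch with synthetic measurements (the distribution and bound here are invented for illustration; in the validation they come from instrumented system runs):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic clock-read-error measurements (e.g. microseconds)
read_errors = np.abs(rng.normal(0.0, 2.0, size=10_000))

# Assumed upper bound on the read error used in the design proof
epsilon = 8.0

# Empirical probability that the bound is violated; when no violations are
# observed, a rule-of-three style upper limit (~3/n at 95% confidence) is a
# common conservative substitute.
n = len(read_errors)
violations = int((read_errors > epsilon).sum())
p_hat = violations / n
p_upper = 3.0 / n if violations == 0 else p_hat

# p_upper then feeds the system-level reliability analysis as the
# probability that the maximum-skew theorem's assumption fails.
```

The point of the method is that everything else in the skew bound is deterministic, so this single estimated probability carries all of the experimental uncertainty into the reliability analysis.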

  18. Reducing diagnostic errors in medicine: what's the goal?

    Science.gov (United States)

    Graber, Mark; Gordon, Ruthanna; Franklin, Nancy

    2002-10-01

    This review considers the feasibility of reducing or eliminating the three major categories of diagnostic errors in medicine: "No-fault errors" occur when the disease is silent, presents atypically, or mimics something more common. These errors will inevitably decline as medical science advances, new syndromes are identified, and diseases can be detected more accurately or at earlier stages. These errors can never be eradicated, unfortunately, because new diseases emerge, tests are never perfect, patients are sometimes noncompliant, and physicians will inevitably, at times, choose the most likely diagnosis over the correct one, illustrating the concept of necessary fallibility and the probabilistic nature of choosing a diagnosis. "System errors" play a role when diagnosis is delayed or missed because of latent imperfections in the health care system. These errors can be reduced by system improvements, but can never be eliminated because these improvements lag behind and degrade over time, and each new fix creates the opportunity for novel errors. Tradeoffs also guarantee system errors will persist, when resources are just shifted. "Cognitive errors" reflect misdiagnosis from faulty data collection or interpretation, flawed reasoning, or incomplete knowledge. The limitations of human processing and the inherent biases in using heuristics guarantee that these errors will persist. Opportunities exist, however, for improving the cognitive aspect of diagnosis by adopting system-level changes (e.g., second opinions, decision-support systems, enhanced access to specialists) and by training designed to improve cognition or cognitive awareness. Diagnostic error can be substantially reduced, but never eradicated.

  19. Simultaneous Robust Fault and State Estimation for Linear Discrete-Time Uncertain Systems

    Directory of Open Access Journals (Sweden)

    Feten Gannouni

    2017-01-01

    We consider the problem of robust simultaneous fault and state estimation for linear uncertain discrete-time systems with unknown faults which affect both the state and the observation matrices. Using a transformation of the original system, a new robust proportional integral filter (RPIF), whose error variance has an optimized guaranteed upper bound for any allowed uncertainty, is proposed to improve the robust estimation of unknown time-varying faults and the robustness against uncertainties. In this study, the minimization of the upper bound of the estimation error variance is formulated as a convex optimization problem subject to linear matrix inequalities (LMIs) for all admissible uncertainties. The proportional and integral gains are optimally chosen by solving this convex optimization problem. Simulation results illustrate the performance of the proposed filter, in particular for the problem of joint fault and state estimation.

  20. Accident Fault Trees for Defense Waste Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Sarrack, A.G.

    1999-06-22

    The purpose of this report is to document fault tree analyses which have been completed for the Defense Waste Processing Facility (DWPF) safety analysis. Logic models for equipment failures and human error combinations that could lead to flammable gas explosions in various process tanks, or failure of critical support systems, were developed for internal initiating events and for earthquakes. These fault trees provide frequency estimates for support system failures and accidents that could lead to radioactive and hazardous chemical releases both on-site and off-site. Top event frequency results from these fault trees will be used in further APET analyses to calculate accident risk associated with DWPF facility operations. This report lists and explains important underlying assumptions, provides references for failure data sources, and briefly describes the fault tree method used. Specific commitments from DWPF to provide new procedural/administrative controls or system design changes are listed in the "Facility Commitments" section. The purpose of the "Assumptions" section is to clarify the basis for fault tree modeling, and is not necessarily a list of items required to be protected by Technical Safety Requirements (TSRs).

  1. Assessing pediatrics residents' mathematical skills for prescribing medication: a need for improved training.

    Science.gov (United States)

    Glover, Mark L; Sussmane, Jeffrey B

    2002-10-01

    To evaluate residents' skills in performing basic mathematical calculations used for prescribing medications to pediatric patients. In 2001, a test of ten questions on basic calculations was given to first-, second-, and third-year residents at Miami Children's Hospital in Florida. Four additional questions were included to obtain the residents' levels of training, specific pediatric intensive care unit (PICU) experience, and whether or not they routinely double-checked doses and adjusted them for each patient's weight. The test was anonymous and calculators were permitted. The overall score and the score for each resident class were calculated. Twenty-one residents participated. The overall average test score and the mean test score of each resident class were less than 70%. Second-year residents had the highest mean test scores, although there was no significant difference between the classes of residents (p = .745), nor any relationship between the residents' PICU experience and their exam scores (p = .766). There was no significant difference between residents' levels of training and whether they double-checked their calculations (p = .633) or considered each patient's weight relative to the dose prescribed (p = .869). Seven residents committed tenfold dosing errors, and one resident committed a 1,000-fold dosing error. Pediatrics residents need additional education in performing the calculations required to prescribe medications, and should be required to demonstrate these mathematical skills before being allowed to prescribe.
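    The tenfold errors reported in this study are classic decimal-point slips, exactly the kind a simple weight-based calculation check can catch. A hypothetical sketch (the mg/kg rule, cap, and weights below are illustrative only, not clinical guidance):

```python
def weight_based_dose(weight_kg, mg_per_kg, max_single_dose_mg):
    """Compute a single dose from a mg/kg rule, capped at a maximum dose."""
    dose = weight_kg * mg_per_kg
    return min(dose, max_single_dose_mg)

def tenfold_check(prescribed_mg, calculated_mg):
    """Flag prescriptions that differ from the calculated dose by ~10x or
    more in either direction: the decimal-slip pattern seen in the study."""
    if calculated_mg == 0:
        return False
    ratio = prescribed_mg / calculated_mg
    return ratio >= 10 or ratio <= 0.1

# Hypothetical 12 kg child, 15 mg/kg rule, 1000 mg cap
dose = weight_based_dose(12, 15, 1000)     # 180 mg
assert dose == 180
assert tenfold_check(1800, dose)           # tenfold slip: flagged
assert not tenfold_check(200, dose)        # reasonable rounding: not flagged
```

Such a check is a safety net, not a substitute for the mathematical competence the study argues residents must demonstrate.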

  2. Fault isolatability conditions for linear systems

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, Henrik

    2006-01-01

    In this paper, we shall show that an unlimited number of additive single faults can be isolated under mild conditions if a general isolation scheme is applied. Multiple faults are also covered. The approach is algebraic and is based on a set representation of faults, where all faults within a set...... the faults have occurred. The last step is a fault isolation (FI) of the faults occurring in a specific fault set, i.e. equivalent with the standard FI step. A simple example demonstrates how to turn the algebraic necessary and sufficient conditions into explicit algorithms for designing filter banks, which...

  3. ESR dating of the fault rocks

    International Nuclear Information System (INIS)

    Lee, Hee Kwon

    2005-01-01

    We carried out ESR dating of fault rocks collected near the nuclear reactor. The Upcheon fault zone is exposed close to the Ulzin nuclear reactor. The space-time pattern of fault activity on the Upcheon fault deduced from ESR dating of fault gouge can be summarised as follows: this fault zone was reactivated between fault breccia derived from Cretaceous sandstone and Tertiary volcanic sedimentary rocks about 2 Ma, 1.5 Ma and 1 Ma ago. After those movements, the Upcheon fault was reactivated between Cretaceous sandstone and the fault breccia zone about 800 ka ago. This fault zone was reactivated again between fault breccia derived from Cretaceous sandstone and Tertiary volcanic sedimentary rocks about 650 ka and after 125 ka ago. These data suggest that the long-term (200-500 k.y.) cyclic fault activity of the Upcheon fault zone continued into the Pleistocene. In the Ulzin area, ESR dates from the NW and EW trend faults range from 800 ka to 600 ka; NE and EW trend faults were reactivated between about 200 ka and 300 ka ago. On the other hand, ESR dates of the NS trend fault are about 400 ka and 50 ka. The results of this research suggest that fault activity near the Ulzin nuclear reactor continued into the Pleistocene. One ESR date near the Youngkwang nuclear reactor is 200 ka.

  4. Learning from escaped prescribed fire reviews

    Science.gov (United States)

    Anne E. Black; Dave Thomas; James Saveland; Jennifer D. Ziegler

    2011-01-01

    The U.S. wildland fire community has developed a number of innovative methods for conducting a review following escape of a prescribed fire (expanding on the typical regional or local reviews, to include more of a learning focus - expanded After Action Reviews, reviews that incorporate High Reliability Organizing, Facilitated Learning Analyses, etc). The stated purpose...

  5. Prescribing Behavior of General Practitioners : Competition Matters!

    NARCIS (Netherlands)

    Schaumans, C.B.C.

    2014-01-01

    Background: General Practitioners have limited means to compete. As quality is hard to observe by patients, GPs have incentives to signal quality by using instruments patients perceive as quality. Objectives: We investigate whether GPs exhibit different prescribing behavior (volume and value of

  6. Prescribing behavior of general practitioners : Competition matters!

    NARCIS (Netherlands)

    Schaumans, C.B.C.

    Background: General Practitioners (GPs) have limited means to compete. As quality is hard to observe by patients, GPs have incentives to signal quality by using instruments patients perceive as quality. Objectives: I investigate whether GPs prescribe more units when confronted with more competition. As

  7. Antimalarial Drugs for Pediatrics - Prescribing and Dispensing ...

    African Journals Online (AJOL)

    Purpose: To assess dispensing and prescribing practices with regard to antimalarial drugs for pediatrics in private pharmacies and public hospitals in Dar es Salaam, Tanzania. Methods: This was a cross-sectional, descriptive study that assessed the knowledge and practice of 200 drug dispensers in the private community ...

  8. Cost Evaluation of Commonly Prescribed Antihypertensive Drugs ...

    African Journals Online (AJOL)

    It was also concluded that generic prescribing should be encouraged among prescribers to lessen the financial burden on patients, because drugs marketed under generic names are usually cheaper than those with brand names. Key words: Brand, Generic, Prescription, Antihypertensives, Cost. [Nig. Jnl Health & Biomedical ...

  9. PRESCRIBING PATTERN OF NON-STEROIDAL ANTI ...

    African Journals Online (AJOL)

    2015-03-01

    Design: A total of 3800 prescriptions containing NSAIDs were analyzed for information on drug name, the number of NSAIDs per prescription, the presence of ACE inhibitors and diuretics alongside NSAIDs and NSAIDs prescribed in generic or brand names. Results: The results showed that Aspirin was ...

  10. Prescribing Patterns of Methylphenidate and Atomoxetine for ...

    African Journals Online (AJOL)

    Purpose: To determine the prescribing pattern of methylphenidate and atomoxetine to patients with. Attention-Deficit/Hyperactivity Disorder (ADHD) in South Africa. Methods: A retrospective, cross-sectional pharmacoepidemiological study was conducted based on the data from a medical aid administrator in South Africa for ...

  11. Prescribing Patterns of Methylphenidate and Atomoxetine for ...

    African Journals Online (AJOL)

    Purpose: To determine the prescribing pattern of methylphenidate and atomoxetine to patients with Attention-Deficit/Hyperactivity Disorder (ADHD) in South Africa. Methods: A retrospective, cross-sectional pharmacoepidemiological study was conducted based on the data from a medical aid administrator in South Africa for ...

  12. An atmospheric dispersion index for prescribed burning

    Science.gov (United States)

    Leonidas G. Lavdas

    1986-01-01

    A numerical index that estimates the atmosphere's capacity to disperse smoke from prescribed burning is described. The physical assumptions and mathematical development of the index are given in detail. A preliminary interpretation of dispersion index values is offered. A FORTRAN subroutine package for computing the index is included.

  13. [Prescribing, the perspectives of health professionals].

    Science.gov (United States)

    Debout, Christophe; Lescot, Thomas; Loyer, Frédérique; Ambrosino, Florence

    2016-10-01

    While, in France, various health professionals are authorised to prescribe, they approach this activity in a different way, depending on the professional category to which they belong. The areas and products concerned are specific to each profession, and inevitably evolve. This article presents the different perspectives of a doctor, a midwife and a nurse. Copyright © 2016. Published by Elsevier Masson SAS.

  14. Fault Current Characteristics of the DFIG under Asymmetrical Fault Conditions

    Directory of Open Access Journals (Sweden)

    Fan Xiao

    2015-09-01

    Full Text Available During non-severe fault conditions, crowbar protection is not activated and the rotor windings of a doubly-fed induction generator (DFIG) are excited by the AC/DC/AC converter. Meanwhile, under asymmetrical fault conditions, the electrical variables oscillate at twice the grid frequency in the synchronous dq frame. In engineering practice, notch filters are usually used to extract the positive and negative sequence components. In these cases, the dynamic response of the rotor-side converter (RSC) and the notch filters have a large influence on the fault current characteristics of the DFIG. In this paper, the influence of the notch filters on the proportional integral (PI) parameters is discussed and simplified calculation models of the rotor current are established. Then, the dynamic performance of the stator flux linkage under asymmetrical fault conditions is analyzed. On this basis, the fault characteristics of the stator current under asymmetrical fault conditions are studied and the corresponding analytical expressions of the stator fault current are obtained. Finally, digital simulation results validate the analytical results. The research results are helpful for practical short-circuit calculations and for the construction of relaying protection for power grids with penetration of DFIGs.

  15. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
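
    The detector-synthesis idea can be sketched in a few lines: fit a cheap linear model that predicts a stencil point from its neighbours, then flag values whose residual is improbably large. This is a toy stand-in, not SORREL itself; the 4-point stencil, noise scale and threshold are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: each sample is 4 neighbouring stencil values and the
# centre value they should predict (a smooth field plus small noise).
def make_samples(n):
    x = rng.normal(size=(n, 4))
    y = x.mean(axis=1) + rng.normal(scale=1e-3, size=n)
    return x, y

X, y = make_samples(5000)
# Least-squares fit of centre value from neighbours plus an intercept.
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

def predict(neigh):
    return np.r_[neigh, 1.0] @ coef

def is_suspect(neigh, value, threshold=0.05):
    """Flag a centre value whose residual exceeds the threshold."""
    return abs(predict(neigh) - value) > threshold

clean = np.array([0.1, 0.2, 0.15, 0.18])
ok_value = clean.mean()
flipped_value = ok_value + 10.0   # injected bit-flip-like corruption
```

    The same structure scales to richer feature sets; the paper's contribution is in choosing features and training data so the detector stays cheap and accurate.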

  16. The error in total error reduction.

    Science.gov (United States)

    Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R

    2014-02-01

    Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modeling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. Copyright © 2013 Elsevier Inc. All rights reserved.
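
    The two hypotheses correspond to different weight-update rules. In a minimal sketch (learning rate and trial counts are illustrative, and salience parameters are omitted):

```python
# Rescorla-Wagner-style total error reduction (TER) versus local error
# reduction (LER) for a two-cue compound trained toward outcome 1.0.
def ter_update(w, cues, outcome, lr=0.1):
    prediction = sum(w[c] for c in cues)      # compound prediction
    for c in cues:
        w[c] += lr * (outcome - prediction)   # shared (total) error term
    return w

def ler_update(w, cues, outcome, lr=0.1):
    for c in cues:
        w[c] += lr * (outcome - w[c])         # per-cue (local) error term
    return w

w_ter = {"A": 0.0, "B": 0.0}
w_ler = {"A": 0.0, "B": 0.0}
for _ in range(200):
    ter_update(w_ter, ["A", "B"], 1.0)
    ler_update(w_ler, ["A", "B"], 1.0)
# Under TER the cues share the outcome (weights sum to about 1);
# under LER each cue independently approaches the outcome.
```

    The divergent asymptotes are the behavioural signature the paper's model comparison exploits.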

  17. Fault tolerant control of multivariable processes using auto-tuning PID controller.

    Science.gov (United States)

    Yu, Ding-Li; Chang, T K; Yu, Ding-Wen

    2005-02-01

    Fault tolerant control of dynamic processes is investigated in this paper using an auto-tuning PID controller. A fault tolerant control scheme is proposed, comprising an auto-tuning PID controller based on an adaptive neural network model. The model is trained online using the extended Kalman filter (EKF) algorithm to learn system post-fault dynamics. Based on this model, the PID controller adjusts its parameters to compensate for the effects of the faults, so that the control performance recovers from degradation. The auto-tuning algorithm for the PID controller is derived with the Lyapunov method; therefore, the model-predicted tracking error is guaranteed to converge asymptotically. The method is applied to a simulated two-input two-output continuous stirred tank reactor (CSTR) with various faults, which demonstrates the applicability of the developed scheme to industrial processes.
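
    The EKF-trained neural model and Lyapunov-based tuner are beyond a short sketch; the fragment below shows only the discrete PID loop that such a tuner would adjust, with fixed illustrative gains on an assumed first-order plant:

```python
# Discrete PID controller driving a first-order plant dy/dt = -y + u
# to a unit setpoint. Gains, time step and plant are illustrative;
# the paper's scheme would adapt kp, ki, kd online after a fault.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

dt = 0.01
pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=dt)
y = 0.0
for _ in range(3000):                # 30 s of simulated time
    u = pid.step(1.0, y)
    y += dt * (-y + u)               # Euler step of the plant
```

    After the loop the output has settled at the setpoint; a fault that altered the plant dynamics would degrade this tracking until the gains were re-tuned.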

  18. A Framework For Evaluating Comprehensive Fault Resilience Mechanisms In Numerical Programs

    Energy Technology Data Exchange (ETDEWEB)

    Chen, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Peng, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-09

    As HPC systems approach Exascale, their circuit features will shrink while their overall size grows, all at a fixed power limit. These trends imply that soft faults in electronic circuits will become an increasingly significant problem for applications that run on these systems, causing them to occasionally crash or, worse, silently return incorrect results. This is motivating extensive work on application resilience to such faults, ranging from generic techniques such as replication or checkpoint/restart to algorithm-specific error detection and resilience techniques. Effective use of such techniques requires a detailed understanding of (1) which vulnerable parts of the application are most worth protecting, and (2) the performance and resilience impact of fault resilience mechanisms on the application. This paper presents FaultTelescope, a tool that combines these two and generates actionable insights by presenting, in an intuitive way, application vulnerabilities and the impact of fault resilience mechanisms on applications.

  19. Arc fault detection system

    Science.gov (United States)

    Jha, K.N.

    1999-05-18

    An arc fault detection system for use on ungrounded or high-resistance-grounded power distribution systems is provided which can be retrofitted outside electrical switchboard circuits having limited space constraints. The system includes a differential current relay that senses a current differential between current flowing from secondary windings located in a current transformer coupled to a power supply side of a switchboard, and a total current induced in secondary windings coupled to a load side of the switchboard. When such a current differential is experienced, a current travels through an operating coil of the differential current relay, which in turn opens an upstream circuit breaker located between the switchboard and a power supply to remove the supply of power to the switchboard. 1 fig.
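
    The trip logic reduces to a comparison of supply-side and load-side currents. A minimal sketch (the pickup threshold and current values are invented):

```python
# Differential relay logic: current entering the switchboard should
# equal current leaving it; a mismatch beyond the pickup threshold
# implies fault current leaking out (e.g. an arc to ground), so the
# upstream breaker is tripped.
def differential_relay(i_supply_amps, i_load_amps, pickup_amps=0.1):
    """Return True (trip the upstream breaker) on excess differential."""
    return abs(i_supply_amps - i_load_amps) > pickup_amps
```

    In the patented system this comparison is done magnetically by the relay's operating coil rather than in software; the sketch only captures the decision rule.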

  1. Errors in Neonatology

    OpenAIRE

    Antonio Boldrini; Rosa T. Scaramuzzo; Armando Cuttano

    2013-01-01

    Introduction: Danger and errors are inherent in human activities. In medical practice, errors can lead to adverse events for patients. The mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main err...

  2. The impact of pharmacy services on opioid prescribing in dental practice.

    Science.gov (United States)

    Stewart, Autumn; Zborovancik, Kelsey J; Stiely, Kara L

    To compare rates of dental opioid prescribing between periods of full and partial integration of pharmacy services and periods of no integration. This observational study used a retrospective chart review of opioid prescriptions written by dental providers practicing in a free dental clinic for the medically underserved over a period of 74 months. Pharmacy services were fully integrated into the practice model for 48 of the 74 months under study. During this time frame, all dental opioid orders required review by the pharmacy department before prescribing. Outcomes related to prescribing rates and errors were compared between groups, which were defined by the level of integrated pharmacy services. Demographic and prescription-specific data (drug name, dose, quantity, directions, professional designation of individual entering order) and clinic appointment data were collected and analyzed with the use of descriptive and inferential statistics. A total of 102 opioids were prescribed to 89 patients; hydrocodone-acetaminophen combination products were the most frequently used. Opioid prescribing rates were 5 times greater when pharmacy services were not integrated (P dental practice. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  3. How doctors diagnose diseases and prescribe treatments: an fMRI study of diagnostic salience

    OpenAIRE

    Melo, Marcio; Gusso, Gustavo D. F.; Levites, Marcelo; Amaro Jr., Edson; Massad, Eduardo; Lotufo, Paulo A.; Zeidman, Peter; Price, Cathy J.; Friston, Karl J.

    2017-01-01

    Understanding the brain mechanisms involved in diagnostic reasoning may contribute to the development of methods that reduce errors in medical practice. In this study we identified similar brain systems for diagnosing diseases, prescribing treatments, and naming animals and objects using written information as stimuli. Employing time resolved modeling of blood oxygen level dependent (BOLD) responses enabled time resolved (400 milliseconds epochs) analyses. With this approach it was possible t...

  4. Effect of Pharmacist Participation During Physician Rounds and Prescription Error in the Intensive Care Unit

    Directory of Open Access Journals (Sweden)

    Marlina A. Turnodihardjo

    2016-09-01

    Full Text Available Patient safety is now a prominent issue in pharmaceutical care because adverse drug events are common in hospitalized patients. The majority of errors occur during prescribing, the first stage of the pharmacy process. Prescription errors occur most often in the Intensive Care Unit (ICU), owing to the severity of illness of its patients as well as the large number of medications prescribed. Pharmacist participation can reduce prescribing errors made by doctors. The main objective of this study was to determine the effect of pharmacist participation during physician rounds on prescription errors in the ICU. This study was a quasi-experimental design with a one-group pre-post test. A prospective study was conducted from April to May 2015 by screening 110 samples of orders. Screening was done to identify types of prescription errors. A prescription error was defined as an error in the prescription writing process: incomplete information, or writing not according to agreed conventions. The Mann-Whitney test was used to analyze differences in prescribing errors. The results showed a difference between prescription errors before and during pharmacist participation (p<0.05). There was also a significant negative correlation between the frequency of pharmacist recommendations on drug ordering and prescription errors (r= –0.638; p<0.05). This means pharmacist participation is one strategy that can be adopted to prevent prescribing errors, through collaboration between doctors and pharmacists. In other words, a supportive hospital management system that encourages interpersonal communication among health care professionals is needed.
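
    The study's statistical comparison can be reproduced in outline with synthetic data (the error counts below are invented for the sketch and are not the study's data):

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Per-order prescription error counts before versus during pharmacist
# participation; the Poisson rates are illustrative stand-ins chosen
# so the "during" group has fewer errors.
rng = np.random.default_rng(1)
errors_before = rng.poisson(lam=2.0, size=55)
errors_during = rng.poisson(lam=0.8, size=55)

stat, p_value = mannwhitneyu(errors_before, errors_during,
                             alternative="two-sided")
```

    The Mann-Whitney test is appropriate here because error counts are discrete and skewed, so a t-test's normality assumption would be doubtful.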

  5. Experimental magic state distillation for fault-tolerant quantum computing.

    Science.gov (United States)

    Souza, Alexandre M; Zhang, Jingfu; Ryan, Colm A; Laflamme, Raymond

    2011-01-25

    Any physical quantum device for quantum information processing (QIP) is subject to errors in implementation. In order to be reliable and efficient, quantum computers will need error-correcting or error-avoiding methods. Fault-tolerance achieved through quantum error correction will be an integral part of quantum computers. Of the many methods that have been discovered to implement it, a highly successful approach has been to use transversal gates and specific initial states. A critical element for its implementation is the availability of high-fidelity initial states, such as |0〉 and the 'magic state'. Here, we report an experiment, performed in a nuclear magnetic resonance (NMR) quantum processor, showing sufficient quantum control to improve the fidelity of imperfect initial magic states by distilling five of them into one with higher fidelity.

  6. Medical Errors in Cyprus: The 2005 Eurobarometer Survey

    Directory of Open Access Journals (Sweden)

    Andreas Pavlakis

    2012-01-01

    Full Text Available Background: Medical errors have been highlighted in recent years by different agencies, scientific bodies and research teams alike. We sought to explore the issue of medical errors in Cyprus using data from the Eurobarometer survey. Methods: Data from the special Eurobarometer survey conducted in 2005 across all European Union countries (EU-25) and the acceding countries were obtained from the corresponding EU office. Statistical analyses including logistic regression models were performed using SPSS. Results: A total of 502 individuals participated in the Cyprus survey. About 90% reported that they had often or sometimes heard about medical errors, while 22% reported that they or a family member had suffered a serious medical error in a local hospital. In addition, 9.4% reported a serious problem from a prescribed medicine. We also found statistically significant differences across ages and gender, and between rural and urban residents. Finally, using multivariable-adjusted logistic regression models, we found that residents in rural areas were more likely to have suffered a serious medical error in a local hospital or from a prescribed medicine. Conclusion: Our study shows that the vast majority of residents in Cyprus, in parallel with other Europeans, worry about medical errors, and a significant percentage report having suffered a serious medical error at a local hospital or from a prescribed medicine. The results of our study could help the medical community in Cyprus and society at large to enhance vigilance with respect to medical errors in order to improve medical care.
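
    The survey's multivariable analysis was done in SPSS; the same kind of model can be sketched with a hand-rolled logistic regression on synthetic stand-in data (sample size, coefficients and the generated rural effect are all assumptions, not the survey's values):

```python
import numpy as np

# Synthetic respondents: rural residence is generated to raise the
# log-odds of having suffered a medical error, roughly mimicking the
# reported direction of effect.
rng = np.random.default_rng(2)
n = 2000
rural = rng.integers(0, 2, n).astype(float)
age = rng.uniform(18, 80, n)
female = rng.integers(0, 2, n).astype(float)
true_logit = -2.0 + 1.2 * rural + 0.01 * (age - 45)
suffered = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Design matrix: intercept, rural, centred/scaled age, gender.
X = np.column_stack([np.ones(n), rural, (age - 45) / 10, female])
w = np.zeros(X.shape[1])
for _ in range(5000):                 # gradient ascent on mean log-lik.
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.1 * X.T @ (suffered - p) / n

rural_log_odds = w[1]                 # recovered rural coefficient
```

    A positive `rural_log_odds` corresponds to an odds ratio above 1 for rural residents, the pattern the study reports.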

  7. Response to "Improving Patient Safety With Error Identification in Chemotherapy Orders by Verification Nurses".

    Science.gov (United States)

    Zhu, Ling-Ling; Lv, Na; Zhou, Quan

    2016-12-01

    We read, with great interest, the study by Baldwin and Rodriguez (2016), which described the role of the verification nurse and details the verification process in identifying errors related to chemotherapy orders. We strongly agree with their findings that a verification nurse, collaborating closely with the prescribing physician, pharmacist, and treating nurse, can better identify errors and maintain safety during chemotherapy administration.

  8. Modeling and Measurement Constraints in Fault Diagnostics for HVAC Systems

    Energy Technology Data Exchange (ETDEWEB)

    Najafi, Massieh; Auslander, David M.; Bartlett, Peter L.; Haves, Philip; Sohn, Michael D.

    2010-05-30

    Many studies have shown that energy savings of five to fifteen percent are achievable in commercial buildings by detecting and correcting building faults and optimizing building control systems. However, in spite of good progress in developing tools for HVAC diagnostics, methods to detect faults in HVAC systems are still generally undeveloped. Most approaches use numerical filtering or parameter estimation methods to compare data from energy meters and building sensors to predictions from mathematical or statistical models. They are effective when models are relatively accurate and data contain few errors. In this paper, we address the case where models are imperfect and data are variable, uncertain, and can contain error. We apply a Bayesian updating approach that is systematic in managing and accounting for most forms of model and data errors. The proposed method uses both knowledge of first-principles modeling and empirical results to analyze system performance within the boundaries defined by practical constraints. We demonstrate the approach by detecting faults in commercial building air handling units. We find that the limitations that exist in air handling unit diagnostics due to practical constraints can generally be addressed effectively through the proposed approach.
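
    The Bayesian updating step can be illustrated with a discrete hypothesis set (the fault hypotheses, priors and likelihoods below are invented, not the paper's models):

```python
# One Bayes update over competing air-handling-unit fault hypotheses.
def bayes_update(prior, likelihoods):
    """prior: {hypothesis: P(h)}; likelihoods: {hypothesis: P(data|h)}."""
    posterior = {h: prior[h] * likelihoods[h] for h in prior}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

prior = {"no_fault": 0.90, "stuck_damper": 0.05, "leaking_valve": 0.05}
# Observed: supply air temperature well above setpoint. Such a reading
# is assumed far more probable under a leaking heating valve.
likelihoods = {"no_fault": 0.02, "stuck_damper": 0.20,
               "leaking_valve": 0.75}
posterior = bayes_update(prior, likelihoods)
```

    Repeating the update as sensor data stream in concentrates probability on the hypothesis that best explains the readings, even when each individual reading is noisy.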

  9. A simulation of the San Andreas fault experiment

    Science.gov (United States)

    Agreen, R. W.; Smith, D. E.

    1974-01-01

    The San Andreas fault experiment (SAFE), which employs two laser tracking systems for measuring the relative motion of two points on opposite sides of the fault, has been simulated for an 8-yr observation period. The two tracking stations are located near San Diego on the western side of the fault and near Quincy on the eastern side; they are roughly 900 km apart. Both will simultaneously track laser-reflector-equipped satellites as they pass near the stations. Tracking of the Beacon Explorer C spacecraft has been simulated for these two stations during August and September for 8 consecutive years. An error analysis of the recovery of the relative location of Quincy from the data has been made, allowing for model errors in the mass of the earth, the gravity field, solar radiation pressure, atmospheric drag, errors in the position of the San Diego site, and biases and noise in the laser systems. The results of this simulation indicate that the distance of Quincy from San Diego will be determined each year with a precision of about 10 cm. Projected improvements in these model parameters and in the laser systems over the next few years will bring the precision to about 1-2 cm by 1980.

  10. Systematic Procedural Error

    National Research Council Canada - National Science Library

    Byrne, Michael D

    2006-01-01

    .... This problem has received surprisingly little attention from cognitive psychologists. The research summarized here examines such errors in some detail both empirically and through computational cognitive modeling...

  11. Human errors and mistakes

    International Nuclear Information System (INIS)

    Wahlstroem, B.

    1993-01-01

    Human errors make a major contribution to the risk of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. To avoid human errors it is necessary to adapt systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but the models are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research on human error and concludes with suggestions for further work. (orig.)

  12. Reliability of Measured Data for pH Sensor Arrays with Fault Diagnosis and Data Fusion Based on LabVIEW

    OpenAIRE

    Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi

    2013-01-01

    Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system; therefore, in this study, we use fault diagn...

  13. DYNAMIC SOFTWARE TESTING MODELS WITH PROBABILISTIC PARAMETERS FOR FAULT DETECTION AND ERLANG DISTRIBUTION FOR FAULT RESOLUTION DURATION

    Directory of Open Access Journals (Sweden)

    A. D. Khomonenko

    2016-07-01

    Full Text Available Subject of Research. Software reliability and test planning models are studied taking into account the probabilistic nature of fault detection and resolution. Modeling of software testing enables planning of resources and final quality at early stages of project execution. Methods. Two dynamic models of testing processes (strategies) are suggested, using an error detection probability for each software module. The Erlang distribution is used to approximate an arbitrary distribution of fault resolution duration, and the exponential distribution to approximate fault detection. For each strategy, modified labeled graphs are built, along with differential equation systems and their numerical solutions. The latter make it possible to compute probabilistic characteristics of the test processes and states: state probabilities, distribution functions for fault detection and elimination, mathematical expectations of random variables, and the number of detected or fixed errors. Evaluation of Results. Probabilistic characteristics for software development projects were calculated using the suggested models. The strategies were compared by their quality indexes, and the debugging time required to achieve the specified quality goals was calculated. The calculation results are used for time and resource planning in new projects. Practical Relevance. The proposed models make it possible to use reliability estimates for each individual module. The Erlang approximation removes restrictions on the use of an arbitrary time distribution for fault resolution duration. It improves the accuracy of software test process modeling and helps to take into account the viability (power) of the tests. With these models, ways can be sought to improve software reliability by generating tests which detect errors with the highest probability.
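
    The Erlang approximation is easy to work with numerically because an Erlang(k) variable is the sum of k independent exponential stages. A sketch with invented parameters:

```python
import numpy as np

# Fault-resolution time modeled as Erlang(k): the sum of k exponential
# stages. Parameters are illustrative, not from the paper.
rng = np.random.default_rng(3)
k, mean_hours = 3, 6.0
stage_mean = mean_hours / k

# Each row: k stage durations; their sum is one Erlang sample.
resolution_times = rng.exponential(stage_mean, size=(100_000, k)).sum(axis=1)
detection_times = rng.exponential(2.0, size=100_000)  # exponential detection

# Erlang with k stages is less dispersed than an exponential with the
# same mean: std = mean / sqrt(k) rather than std = mean.
```

    This reduced dispersion is what lets the Erlang family fit realistic resolution-time data that an exponential (memoryless) model cannot.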

  14. Absolute age determination of quaternary faults

    International Nuclear Information System (INIS)

    Cheong, Chang Sik; Lee, Seok Hoon; Choi, Man Sik

    2000-03-01

    To constrain the age of neotectonic fault movement, Rb-Sr, K-Ar, U-series disequilibrium, C-14 and Be-10 methods were applied to fault gouges, fracture infillings and sediments from the Malbang, Ipsil and Wonwonsa faults in the Ulsan fault zone, the Yangsan fault in the Yeongdeog area, and the southeastern coastal area. Rb-Sr and K-Ar data imply that fault movement in the Ulsan fault zone initiated at around 30 Ma, and a preliminary dating result for the Yangsan fault in the Yeongdeog area is around 70 Ma. K-Ar and U-series disequilibrium dating results for fracture infillings in the Ipsil fault are consistent with reported ESR ages. Radiocarbon ages of quaternary sediments from the Jeongjari area are discordant with the stratigraphic sequence. Carbon isotope data indicate a difference in sedimentary environment among those samples. Be-10 dating results for the Suryum fault area are consistent with reported OSL results.

  16. Comparison of Cenozoic Faulting at the Savannah River Site to Fault Characteristics of the Atlantic Coast Fault Province: Implications for Fault Capability

    International Nuclear Information System (INIS)

    Cumbest, R.J.

    2000-01-01

    This study compares the faulting observed on the Savannah River Site and vicinity with the faults of the Atlantic Coastal Fault Province and concludes that both sets of faults exhibit the same general characteristics and are closely associated. Based on the strength of this association, it is concluded that the faults observed on the Savannah River Site and vicinity are in fact part of the Atlantic Coastal Fault Province. Inclusion in this group means that the historical precedent established by decades of previous studies on the seismic hazard potential of the Atlantic Coastal Fault Province is relevant to faulting at the Savannah River Site. That is, since these faults are genetically related, the conclusion of ''not capable'' reached in past evaluations applies. In addition, this study establishes a set of criteria by which individual faults may be evaluated in order to assess their inclusion in the Atlantic Coastal Fault Province and the related association of the ''not capable'' conclusion.

  17. Design and Evaluation of an Electronic Override Mechanism for Medication Alerts to Facilitate Communication Between Prescribers and Pharmacists.

    Science.gov (United States)

    Russ, Alissa L; Chen, Siying; Melton, Brittany L; Saleem, Jason J; Weiner, Michael; Spina, Jeffrey R; Daggy, Joanne K; Zillich, Alan J

    2015-07-01

    Computerized medication alerts can often be bypassed by entering an override rationale, but prescribers' override reasons are frequently ambiguous to pharmacists who review orders. To develop and evaluate a new override mechanism for adverse reaction and drug-drug interaction alerts. We hypothesized that the new mechanism would improve usability for prescribers and increase the clinical appropriateness of override reasons. A counterbalanced, crossover study was conducted with 20 prescribers in a simulated prescribing environment. We modified the override mechanism timing, navigation, and text entry. Instead of free-text entry, the new mechanism presented prescribers with a predefined set of override reasons. We assessed usability (learnability, perceived efficiency, and usability errors) and used a priori criteria to evaluate the clinical appropriateness of override reasons entered. Prescribers rated the new mechanism as more efficient (Wilcoxon signed-rank test, P = 0.032). When first using the new design, 5 prescribers had difficulty finding the new mechanism, and 3 interpreted the navigation to mean that the alert could not be overridden. The number of appropriate override reasons significantly increased with the new mechanism compared with the original mechanism (median change of 3.0; interquartile range = 3.0; P < 0.0001). When prescribers were given a menu-based choice for override reasons, clinical appropriateness of these reasons significantly improved. Further enhancements are necessary, but this study is an important first step toward a more standardized menu of override choices. Findings may be used to improve communication through e-prescribing systems between prescribers and pharmacists. © The Author(s) 2015.
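
    The efficiency comparison uses a Wilcoxon signed-rank test on paired ratings. A sketch with invented ratings (not the study's data):

```python
import numpy as np
from scipy.stats import wilcoxon

# Each of 20 prescribers rates the original and the new override
# mechanism on a 1-5 scale; the synthetic "new" ratings are generated
# to be at least as high, mimicking a perceived-efficiency gain.
rng = np.random.default_rng(4)
original = rng.integers(2, 5, size=20)
new = np.clip(original + rng.integers(0, 3, size=20), 1, 5)

# The signed-rank test discards zero differences ("wilcox" handling)
# and ranks the rest, respecting the paired design.
stat, p = wilcoxon(new, original, zero_method="wilcox")
```

    A paired test is the right choice here because each prescriber rates both designs, so between-subject variability cancels out.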

  18. Subaru FATS (fault tracking system)

    Science.gov (United States)

    Winegar, Tom W.; Noumaru, Junichi

    2000-07-01

    The Subaru Telescope requires a fault tracking system to record the problems and questions that staff experience during their work, and the solutions provided by technical experts to these problems and questions. The system records each fault and routes it to a pre-selected 'solution-provider' for each type of fault. The solution provider analyzes the fault and writes a solution that is routed back to the fault reporter and recorded in a 'knowledge-base' for future reference. The specifications of our fault tracking system were unique. (1) Dual language capacity -- Our staff speak both English and Japanese. Our contractors speak Japanese. (2) Heterogeneous computers -- Our computer workstations are a mixture of SPARCstations, Macintosh and Windows computers. (3) Integration with prime contractors -- Mitsubishi and Fujitsu are primary contractors in the construction of the telescope. In many cases, our 'experts' are our contractors. (4) Operator scheduling -- Our operators spend 50% of their work-month operating the telescope, the other 50% is spent working day shift at the base facility in Hilo, or day shift at the summit. We plan for 8 operators, with a frequent rotation. We need to keep all operators informed on the current status of all faults, no matter the operator's location.
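
    The routing and knowledge-base behaviour described in the specifications can be sketched as a small class (the fault categories, record fields, and provider names other than the contractors mentioned above are invented):

```python
# Minimal fault tracker: route each fault type to its pre-selected
# solution provider, and archive resolved faults in a searchable
# knowledge base for future reference.
class FaultTracker:
    def __init__(self, providers):
        self.providers = providers          # fault type -> provider
        self.knowledge_base = []            # resolved fault records

    def report(self, fault_type, description):
        provider = self.providers.get(fault_type, "duty_engineer")
        return {"type": fault_type, "description": description,
                "assigned_to": provider, "solution": None}

    def resolve(self, fault, solution):
        fault["solution"] = solution
        self.knowledge_base.append(fault)

    def search(self, keyword):
        return [f for f in self.knowledge_base
                if keyword in f["description"] or keyword in f["solution"]]

tracker = FaultTracker({"dome": "facility_team", "instrument": "Fujitsu"})
fault = tracker.report("dome", "shutter stalls at 80 percent open")
tracker.resolve(fault, "replace worn limit switch")
```

    The real system adds the dual-language, multi-platform and scheduling requirements on top of this basic report/route/resolve/search cycle.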

  19. Objective Function and Learning Algorithm for the General Node Fault Situation.

    Science.gov (United States)

    Xiao, Yi; Feng, Rui-Bin; Leung, Chi-Sing; Sum, John

    2016-04-01

    Fault tolerance is one interesting property of artificial neural networks. However, the existing fault models describe only limited node fault situations, such as stuck-at-zero and stuck-at-one; there is no general model able to describe a large class of node fault situations. This paper studies the performance of faulty radial basis function (RBF) networks for the general node fault situation. We first propose a general node fault model that is able to describe a large class of node fault situations, such as stuck-at-zero, stuck-at-one, and stuck-at levels with an arbitrary distribution. Afterward, we derive an expression to describe the performance of faulty RBF networks. An objective function is then identified from the formula. With the objective function, a training algorithm for the general node fault situation is developed. Finally, a mean prediction error (MPE) formula that is able to estimate the test set error of faulty networks is derived. The application of the MPE formula in the selection of basis width is elucidated. Simulation experiments are then performed to demonstrate the effectiveness of the proposed method.
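
    The general node fault model can be exercised on a toy RBF network by forcing hidden-node outputs to a stuck-at level and measuring the degradation (the network, data and faulty-node choice are invented for the sketch, and training here is plain least squares rather than the paper's fault-aware objective):

```python
import numpy as np

# Tiny RBF network fitting sin(x), then evaluated with stuck-at faults
# injected into two hidden nodes.
rng = np.random.default_rng(5)
centers = np.linspace(-3, 3, 10)
width = 1.0

def rbf_hidden(x):
    return np.exp(-((x[:, None] - centers) ** 2) / (2 * width ** 2))

x_train = rng.uniform(-3, 3, 200)
y_train = np.sin(x_train)
H = rbf_hidden(x_train)
w = np.linalg.lstsq(H, y_train, rcond=None)[0]   # output weights

def mse(x, y, stuck_mask=None, stuck_value=0.0):
    h = rbf_hidden(x)
    if stuck_mask is not None:
        h[:, stuck_mask] = stuck_value           # stuck-at node fault
    return np.mean((h @ w - y) ** 2)

x_test = np.linspace(-3, 3, 500)
y_test = np.sin(x_test)
fault_free = mse(x_test, y_test)
mask = np.zeros(10, dtype=bool)
mask[[2, 7]] = True                              # two faulty nodes
stuck_at_zero = mse(x_test, y_test, mask, 0.0)
```

    A fault-aware training objective of the kind the paper derives would trade a little fault-free accuracy for a much smaller gap between `fault_free` and `stuck_at_zero`.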

  20. Why the 2002 Denali fault rupture propagated onto the Totschunda fault: implications for fault branching and seismic hazards

    Science.gov (United States)

    Schwartz, David P.; Haeussler, Peter J.; Seitz, Gordon G.; Dawson, Timothy E.

    2012-01-01

    The propagation of the rupture of the Mw7.9 Denali fault earthquake from the central Denali fault onto the Totschunda fault has provided a basis for dynamic models of fault branching in which the angle of the regional or local prestress relative to the orientation of the main fault and branch plays a principal role in determining which fault branch is taken. GeoEarthScope LiDAR and paleoseismic data allow us to map the structure of the Denali-Totschunda fault intersection and evaluate controls of fault branching from a geological perspective. LiDAR data reveal the Denali-Totschunda fault intersection is structurally simple with the two faults directly connected. At the branch point, 227.2 km east of the 2002 epicenter, the 2002 rupture diverges southeast to become the Totschunda fault. We use paleoseismic data to propose that differences in the accumulated strain on each fault segment, which express differences in the elapsed time since the most recent event, was one important control of the branching direction. We suggest that data on event history, slip rate, paleo offsets, fault geometry and structure, and connectivity, especially on high slip rate-short recurrence interval faults, can be used to assess the likelihood of branching and its direction. Analysis of the Denali-Totschunda fault intersection has implications for evaluating the potential for a rupture to propagate across other types of fault intersections and for characterizing sources of future large earthquakes.

  1. Architecture of thrust faults with along-strike variations in fault-plane dip: anatomy of the Lusatian Fault, Bohemian Massif

    Czech Academy of Sciences Publication Activity Database

    Coubal, Miroslav; Adamovič, Jiří; Málek, Jiří; Prouza, V.

    2014-01-01

    Vol. 59, No. 3 (2014), pp. 183-208 ISSN 1802-6222 Institutional support: RVO:67985831 ; RVO:67985891 Keywords: fault architecture * fault plane geometry * drag structures * thrust fault * sandstone * Lusatian Fault Subject RIV: DB - Geology ; Mineralogy Impact factor: 1.405, year: 2014

  2. Model-based monitoring of rotors with multiple coexisting faults

    International Nuclear Information System (INIS)

    Rossner, Markus

    2015-01-01

    Monitoring systems are applied to many rotors, but only a few monitoring systems can separate coexisting faults and quantify them. This research project solves this problem using a combination of signal-based and model-based monitoring. The signal-based part performs a pre-selection of possible faults; these faults are further separated with model-based methods. This approach is demonstrated for unbalance, bow, stator-fixed misalignment, rotor-fixed misalignment and roundness errors. For the model-based part, unambiguous fault definitions and models are set up. The Ritz approach reduces the model order and therefore speeds up the diagnosis. Identification algorithms are developed for the different rotor faults. To this end, reliable damage indicators and proper sub-steps of the diagnosis have to be defined. For several monitoring problems, measuring both deflection and bearing force is very useful. The monitoring system is verified by experiments on an academic rotor test rig. The interpretation of the measurements requires detailed knowledge of the dynamics of the rotor. Due to the model-based approach, the system can separate faults with similar signal patterns and identify bow and roundness errors online at operation speed.

  3. Inappropriate prescribing and prescribing omissions among drug-related problems using STOPP-START criteria

    NARCIS (Netherlands)

    Verdoorn, M.A.; Kwint, H.-F.; Faber, A.; Bouvy, M.L.

    2013-01-01

    Background and objectives: Medication review has been suggested as a way to prevent drug-related problems (DRPs). Screening tools have been formulated to identify potentially inappropriate medicines (PIMs) and potential prescribing omissions (PPOs), respectively called the Screening Tool of Older Persons' Prescriptions (STOPP) and the Screening Tool to Alert to Right Treatment (START).

  4. Automated vehicle for railway track fault detection

    Science.gov (United States)

    Bhushan, M.; Sujay, S.; Tushar, B.; Chitra, P.

    2017-11-01

    For safety reasons, railroad tracks need to be inspected on a regular basis to detect physical defects or design non-compliances. Such track defects and non-compliances, if not detected within a certain interval of time, may eventually lead to severe consequences such as train derailments. With hundreds of thousands of miles of railroad track, inspection must happen twice weekly by a human inspector to maintain safety standards. Manual inspection of this kind has many drawbacks, however, which may result in poor inspection of the track and, in turn, future accidents. The automated system presented here is designed to avoid such errors and severe accidents. Such a concept would introduce automation into the inspection of railway track and help to avoid mishaps and severe accidents due to faults in the track.

  5. Learning from Errors

    Science.gov (United States)

    Metcalfe, Janet

    2017-01-01

    Although error avoidance during learning appears to be the rule in American classrooms, laboratory studies suggest that it may be a counterproductive strategy, at least for neurologically typical students. Experimental investigations indicate that errorful learning followed by corrective feedback is beneficial to learning. Interestingly, the…

  6. Are we setting about improving the safety of computerised prescribing in the right way? A workshop report

    Directory of Open Access Journals (Sweden)

    Arash Vaziri

    2009-09-01

    Conclusion Prescribing errors remain a major source of unnecessary morbidity and mortality and current systems do not appear to have significantly reduced this problem; nor has the extensive literature about how to reduce unnecessary alerts been taken into account. We need a new and more rational basis for the selection and presentation of alerts that would help, not hinder, the clinician's performance.

  7. Analysis of error-correction constraints in an optical disk

    Science.gov (United States)

    Roberts, Jonathan D.; Ryley, Alan; Jones, David M.; Burke, David

    1996-07-01

    The compact disk read-only memory (CD-ROM) is a mature storage medium with complex error control. It comprises four levels of Reed Solomon codes allied to a sequence of sophisticated interleaving strategies and 8:14 modulation coding. New storage media are being developed and introduced that place still further demands on signal processing for error correction. It is therefore appropriate to explore thoroughly the limit of existing strategies to assess future requirements. We describe a simulation of all stages of the CD-ROM coding, modulation, and decoding. The results of decoding the burst error of a prescribed number of modulation bits are discussed in detail. Measures of residual uncorrected error within a sector are displayed by C1, C2, P, and Q error counts and by the status of the final cyclic redundancy check (CRC). Where each data sector is encoded separately, it is shown that error-correction performance against burst errors depends critically on the position of the burst within a sector. The C1 error measures the burst length, whereas C2 errors reflect the burst position. The performance of Reed Solomon product codes is shown by the P and Q statistics. It is shown that synchronization loss is critical near the limits of error correction. An example is given of miscorrection that is identified by the CRC check.
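
The final CRC stage discussed above can be illustrated in isolation: a cyclic redundancy check over a sector flags residual errors that survive, or are miscorrected by, the inner Reed-Solomon stages. CRC-32 below stands in for the CD-ROM EDC polynomial (which differs in detail), and the Reed-Solomon and interleaving stages are omitted.

```python
import binascii

def attach_crc(sector: bytes) -> bytes:
    # append a 4-byte CRC-32 checksum to the sector data
    return sector + binascii.crc32(sector).to_bytes(4, "little")

def crc_ok(block: bytes) -> bool:
    # recompute the CRC over the data and compare with the stored value
    data, stored = block[:-4], block[-4:]
    return binascii.crc32(data).to_bytes(4, "little") == stored

sector = bytes(range(256)) * 8                 # 2048 bytes of user data
block = attach_crc(sector)
assert crc_ok(block)

# a short burst error: flip 4 consecutive data bytes (32 bits); a degree-32
# CRC is guaranteed to detect any burst of length <= 32 bits
corrupted = bytearray(block)
for i in range(100, 104):
    corrupted[i] ^= 0xFF
assert not crc_ok(bytes(corrupted))
```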

  8. Action errors, error management, and learning in organizations.

    Science.gov (United States)

    Frese, Michael; Keith, Nina

    2015-01-03

    Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.

  9. Fault Features Extraction and Identification based Rolling Bearing Fault Diagnosis

    International Nuclear Information System (INIS)

    Qin, B; Sun, G D; Zhang, L Y; Wang, J G; Hu, J

    2017-01-01

    For the fault classification model based on the extreme learning machine (ELM), the diagnosis accuracy and stability for rolling bearings are greatly influenced by a critical parameter: the number of nodes in the hidden layer of the ELM. An adaptive adjustment strategy is proposed, based on variational mode decomposition, permutation entropy, and the kernel extreme learning machine, to determine this tunable parameter. First, the vibration signals are measured and then decomposed into different fault feature modes based on variational mode decomposition. Then, the fault features of each mode are assembled into a high-dimensional feature vector set based on permutation entropy. Second, the ELM output function is expressed by the inner product of a Gauss kernel function to adaptively determine the number of hidden layer nodes. Finally, the high-dimensional feature vector set is used as the input to establish the kernel ELM rolling bearing fault classification model, and the classification and identification of different fault states of rolling bearings are carried out. In comparison with fault classification methods based on the support vector machine and ELM, the experimental results show that the proposed method has higher classification accuracy and better generalization ability.
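
The permutation-entropy feature used to build the fault-feature vectors can be sketched on its own (order m=3, delay 1 are assumed parameters; the variational mode decomposition and kernel-ELM stages are not reproduced here).

```python
import math
from itertools import permutations

def permutation_entropy(signal, m=3):
    # count the ordinal pattern of every length-m window
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(signal) - m + 1):
        window = signal[i:i + m]
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(m))   # normalised to [0, 1]

# a monotone ramp has a single ordinal pattern -> entropy 0
assert permutation_entropy(list(range(100))) == 0.0
# an irregular signal spreads over many patterns -> entropy well above 0
noisy = [math.sin(17.1 * n * n) for n in range(500)]
assert permutation_entropy(noisy) > 0.5
```

In a bearing-diagnosis pipeline a value like this would be computed per decomposed mode, so a regular (healthy) vibration component scores low and an impulsive fault component scores higher.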

  10. Modeling and Experimental Study of Soft Error Propagation Based on Cellular Automaton

    OpenAIRE

    He, Wei; Wang, Yueke; Xing, Kefei; Yang, Jianwei

    2016-01-01

    Aiming to estimate SEE soft error performance of complex electronic systems, a soft error propagation model based on cellular automaton is proposed and an estimation methodology based on circuit partitioning and error propagation is presented. Simulations indicate that different fault grade jamming and different coupling factors between cells are the main parameters influencing the vulnerability of the system. Accelerated radiation experiments have been developed to determine the main paramet...
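
The propagation idea can be illustrated with a toy one-dimensional cellular automaton in which a soft error in one cell spreads to its neighbours with a coupling factor. All details here (grid shape, probabilities, update rule) are assumptions for illustration, not the authors' model.

```python
import random

def propagate(n_cells, seed_cell, coupling, steps, rng):
    # state[i] is True if cell i has been corrupted by the soft error
    state = [False] * n_cells
    state[seed_cell] = True                      # initial SEE upset
    for _ in range(steps):
        nxt = state[:]
        for i, faulty in enumerate(state):
            if faulty:
                for j in (i - 1, i + 1):         # coupled neighbour cells
                    if 0 <= j < n_cells and rng.random() < coupling:
                        nxt[j] = True
        state = nxt
    return sum(state)                            # corrupted cell count

weak = propagate(101, 50, coupling=0.1, steps=30, rng=random.Random(42))
strong = propagate(101, 50, coupling=0.9, steps=30, rng=random.Random(42))
assert weak < strong   # stronger cell coupling -> wider error propagation
```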

  11. Indirect adaptive fuzzy fault-tolerant tracking control for MIMO nonlinear systems with actuator and sensor failures.

    Science.gov (United States)

    Bounemeur, Abdelhamid; Chemachema, Mohamed; Essounbouli, Najib

    2018-05-10

    In this paper, an active fuzzy fault tolerant tracking control (AFFTTC) scheme is developed for a class of multi-input multi-output (MIMO) unknown nonlinear systems in the presence of unknown actuator faults, sensor failures and external disturbance. The developed control scheme deals with four kinds of faults for both sensors and actuators. The bias, drift, and loss of accuracy additive faults are considered along with the loss of effectiveness multiplicative fault. A fuzzy adaptive controller based on back-stepping design is developed to deal with actuator failures and unknown system dynamics. However, an additional robust control term is added to deal with sensor faults, approximation errors, and external disturbances. Lyapunov theory is used to prove the stability of the closed loop system. Numerical simulations on a quadrotor are presented to show the effectiveness of the proposed approach. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
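
The four fault kinds named above can be written down directly as transformations of an actuator command u(t). The coefficients below are arbitrary illustration values, not taken from the paper, and the controller itself is not reproduced.

```python
import numpy as np

def apply_fault(u, t, kind):
    if kind == "bias":                   # constant additive offset
        return u + 0.5
    if kind == "drift":                  # additive offset growing with time
        return u + 0.05 * t
    if kind == "loss_of_accuracy":       # additive noise-like term
        return u + 0.1 * np.sin(20 * t)
    if kind == "loss_of_effectiveness":  # multiplicative: 60% delivered
        return 0.6 * u
    return u                             # healthy actuator

t = np.linspace(0, 10, 1001)
u = np.ones_like(t)                      # nominal unit command
assert np.allclose(apply_fault(u, t, "bias"), 1.5)
assert np.isclose(apply_fault(u, t, "loss_of_effectiveness")[0], 0.6)
drift = apply_fault(u, t, "drift")
assert drift[-1] > drift[0]              # drift grows with time
```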

  12. 20 CFR 410.561b - Fault.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Fault. 410.561b Section 410.561b Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL COAL MINE HEALTH AND SAFETY ACT OF 1969, TITLE IV-BLACK LUNG BENEFITS (1969- ) Payment of Benefits § 410.561b Fault. Fault as used in without fault (see § 410...

  13. Fault Detection for Diesel Engine Actuator

    DEFF Research Database (Denmark)

    Blanke, M.; Bøgh, S.A.; Jørgensen, R.B.

    1994-01-01

    Feedback control systems are vulnerable to faults in control loop sensors and actuators, because feedback actions may cause abrupt responses and process damage when faults occur.

  14. 22 CFR 17.3 - Fault.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Fault. 17.3 Section 17.3 Foreign Relations...) § 17.3 Fault. A recipient of an overpayment is without fault if he or she performed no act of... agency may have been at fault in initiating an overpayment will not necessarily relieve the individual...

  15. Active fault diagnosis by temporary destabilization

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2006-01-01

    An active fault diagnosis method for parametric or multiplicative faults is proposed. The method periodically adds a term to the controller that for a short period of time renders the system unstable if a fault has occurred, which facilitates rapid fault detection. An illustrative example is given....

  16. From fault classification to fault tolerance for multi-agent systems

    CERN Document Server

    Potiron, Katia; Taillibert, Patrick

    2013-01-01

    Faults are a concern for Multi-Agent Systems (MAS) designers, especially if the MAS are built for industrial or military use because there must be some guarantee of dependability. Some fault classification exists for classical systems, and is used to define faults. When dependability is at stake, such fault classification may be used from the beginning of the system's conception to define fault classes and specify which types of faults are expected. Thus, one may want to use fault classification for MAS; however, From Fault Classification to Fault Tolerance for Multi-Agent Systems argues that

  17. Optimization of Second Fault Detection Thresholds to Maximize Mission POS

    Science.gov (United States)

    Anzalone, Evan

    2018-01-01

    In order to support manned spaceflight safety requirements, the Space Launch System (SLS) has defined program-level requirements for key systems to ensure successful operation under single fault conditions. To accommodate this with regards to Navigation, the SLS utilizes an internally redundant Inertial Navigation System (INS) with built-in capability to detect, isolate, and recover from first failure conditions and still maintain adherence to performance requirements. The unit utilizes multiple hardware- and software-level techniques to enable detection, isolation, and recovery from these events in terms of its built-in Fault Detection, Isolation, and Recovery (FDIR) algorithms. Successful operation is defined in terms of sufficient navigation accuracy at insertion while operating under worst case single sensor outages (gyroscope and accelerometer faults at launch). In addition to first fault detection and recovery, the SLS program has also levied requirements relating to the capability of the INS to detect a second fault, tracking any unacceptable uncertainty in knowledge of the vehicle's state. This detection functionality is required in order to feed abort analysis and ensure crew safety. Increases in navigation state error and sensor faults can drive the vehicle outside of its operational as-designed environments and outside of its performance envelope causing loss of mission, or worse, loss of crew. The criteria for operation under second faults allows for a larger set of achievable missions in terms of potential fault conditions, due to the INS operating at the edge of its capability. As this performance is defined and controlled at the vehicle level, it allows for the use of system level margins to increase probability of mission success on the operational edges of the design space. 
Due to the implications of the vehicle response to abort conditions (such as a potentially failed INS), it is important to consider a wide range of failure scenarios in terms of

  18. Differential Fault Analysis on CLEFIA

    Science.gov (United States)

    Chen, Hua; Wu, Wenling; Feng, Dengguo

    CLEFIA is a new 128-bit block cipher recently proposed by Sony Corporation. The fundamental structure of CLEFIA is a generalized Feistel structure consisting of 4 data lines. In this paper, the strength of CLEFIA against the differential fault attack is explored. Our attack adopts the byte-oriented model of random faults. By inducing one random byte fault in one round, four bytes of faults can be obtained simultaneously in the next round, which efficiently reduces the total number of fault inductions needed in the attack. After attacking the encryptions of the last several rounds, the original secret key can be recovered based on some analysis of the key schedule. The data complexity analysis and experiments show that only about 18 faulty ciphertexts are needed to recover the entire 128-bit secret key and about 54 faulty ciphertexts for 192/256-bit keys.

  19. Fault Tolerant External Memory Algorithms

    DEFF Research Database (Denmark)

    Jørgensen, Allan Grønlund; Brodal, Gerth Stølting; Mølhave, Thomas

    2009-01-01

    Algorithms dealing with massive data sets are usually designed for I/O-efficiency, often captured by the I/O model by Aggarwal and Vitter. Another aspect of dealing with massive data is how to deal with memory faults, e.g. captured by the adversary-based faulty memory RAM by Finocchi and Italiano. However, current fault tolerant algorithms do not scale beyond the internal memory. In this paper we investigate for the first time the connection between I/O-efficiency in the I/O model and fault tolerance in the faulty memory RAM, and we assume that both memory and disk are unreliable. We show a lower bound on the number of I/Os required for any deterministic dictionary that is resilient to memory faults. We design a static and a dynamic deterministic dictionary with optimal query performance as well as an optimal sorting algorithm and an optimal priority queue. Finally, we consider scenarios where...
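
A standard building block in the faulty memory RAM is to keep 2δ+1 copies of a critical value so that up to δ adversarial corruptions cannot change the result of a majority-vote read. The sketch below illustrates that idea only; it is not the paper's dictionary, sorting, or priority queue construction.

```python
from collections import Counter

def resilient_write(value, delta):
    # store 2*delta + 1 replicas of the value in (unreliable) memory
    return [value] * (2 * delta + 1)

def resilient_read(copies):
    # a majority vote survives up to delta corrupted replicas
    return Counter(copies).most_common(1)[0][0]

cell = resilient_write(7, delta=2)   # 5 copies tolerate 2 corruptions
cell[0] = 99                         # adversary corrupts two copies...
cell[3] = 42
assert resilient_read(cell) == 7     # ...but the read is still correct
```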

  20. Cell boundary fault detection system

    Science.gov (United States)

    Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN

    2009-05-05

    A method determines a nodal fault along the boundary, or face, of a computing cell. Nodes on adjacent cell boundaries communicate with each other, and the communications are analyzed to determine if a node or connection is faulty.

  1. Adaptive Fault-Tolerant Synchronization Control of a Class of Complex Dynamical Networks With General Input Distribution Matrices and Actuator Faults.

    Science.gov (United States)

    Li, Xiao-Jian; Yang, Guang-Hong

    2017-03-01

    This paper is concerned with the problem of adaptive fault-tolerant synchronization control of a class of complex dynamical networks (CDNs) with actuator faults and unknown coupling weights. The considered input distribution matrix is assumed to be an arbitrary matrix, instead of a unit one. Within this framework, an adaptive fault-tolerant controller is designed to achieve synchronization for the CDN. Moreover, a convex combination technique and an important graph theory result are developed, such that the rigorous convergence analysis of synchronization errors can be conducted. In particular, it is shown that the proposed fault-tolerant synchronization control approach is valid for the CDN with both time-invariant and time-varying coupling weights. Finally, two simulation examples are provided to validate the effectiveness of the theoretical results.

  2. 75 deaths in asthmatics prescribed home nebulisers.

    Science.gov (United States)

    Sears, M R; Rea, H H; Fenwick, J; Gillies, A J; Holst, P E; O'Donnell, T V; Rothwell, R P

    1987-02-21

    The circumstances surrounding the deaths of 75 asthmatic patients who had been prescribed a domiciliary nebuliser driven by an air compressor pump for administration of high dose beta sympathomimetic drugs were investigated as part of the New Zealand national asthma mortality study. Death was judged unavoidable in 19 patients who seemed to have precipitous attacks despite apparently good long term management. Delays in seeking medical help because of overreliance on beta agonist delivered by nebuliser were evident in 12 cases and possible in a further 11, but these represented only 8% of the 271 verified deaths from asthma in New Zealanders aged under 70 during the period. Evidence for direct toxicity of high dose beta agonist was not found. Nevertheless, the absence of serum potassium and theophylline concentrations and of electrocardiographic monitoring in the period immediately preceding death precluded firm conclusions whether arrhythmias might have occurred due to these factors rather than to hypoxia alone. In most patients prescribed domiciliary nebulisers death was associated with deficiencies in long term and short term care similar to those seen in patients without nebulisers. Discretion in prescribing home nebulisers, greater use of other appropriate drugs, including adequate corticosteroids, and careful supervision and instruction of patients taking beta agonist by nebuliser should help to reduce the mortality from asthma.

  3. Best available control measures for prescribed burning

    International Nuclear Information System (INIS)

    Smith, A.M.; Stoneman, C.S.

    1992-01-01

    Section 190 of the Clean Air Act (CAA) as amended in 1990 requires the US Environmental Protection Agency (EPA) to issue guidance on Best Available Control Measures (BACM) of PM 10 (particulate matter with a nominal aerodynamic diameter less than or equal to 10 micrometers) from urban fugitive dust, residential wood combustion, and prescribed silvicultural and agricultural burning (prescribed burning). The purpose of this guidance is to assist states (especially, but not exclusively, those with PM 10 nonattainment areas which have been classified as serious) in developing a control measure for these three source categories. This guidance is to be issued no later than May 15, 1992 as required under the CAA. The guidance will be issued in the form of a policy guidance generic to all three BACM and in the form of Technical Information Documents (TIDs) for each of the three source categories. The policy guidance will provide the analytical approach for determining BACM and the TID will provide the technical information. The purpose of this paper is to present some insight from the forthcoming TID on what BACM might entail for prescribed burning in a serious PM 10 nonattainment area

  4. Deformation around basin scale normal faults

    International Nuclear Information System (INIS)

    Spahic, D.

    2010-01-01

    Faults in the earth crust occur within a large range of scales, from microscale over mesoscopic to large basin scale faults. Frequently, deformation associated with faulting is not limited to the fault plane alone, but rather forms a combination with continuous near-field deformation in the wall rock, a phenomenon that is generally called fault drag. The correct interpretation and recognition of fault drag is fundamental for the reconstruction of the fault history and determination of fault kinematics, as well as prediction in areas of limited exposure or beyond comprehensive seismic resolution. Based on fault analyses derived from 3D visualization of natural examples of fault drag, the importance of fault geometry for the deformation of marker horizons around faults is investigated. The complex 3D structural models presented here are based on a combination of geophysical datasets and geological fieldwork. On an outcrop scale example of fault drag in the hanging wall of a normal fault, located at St. Margarethen, Burgenland, Austria, data from Ground Penetrating Radar (GPR) measurements, detailed mapping and terrestrial laser scanning were used to construct a high-resolution structural model of the fault plane, the deformed marker horizons and associated secondary faults. In order to obtain geometrical information about the largely unexposed master fault surface, a standard listric balancing dip domain technique was employed. The results indicate that for this normal fault a listric shape can be excluded, as the constructed fault has a geologically meaningless shape cutting upsection into the sedimentary strata. This kinematic modeling result is additionally supported by the observation of deformed horizons in the footwall of the structure. Alternatively, a planar fault model with reverse drag of markers in the hanging wall and footwall is proposed.
A second part of this thesis investigates a large scale normal fault

  5. An Overview of Optical Network Bandwidth and Fault Management

    Directory of Open Access Journals (Sweden)

    J.A. Zubairi

    2010-09-01

    This paper discusses the optical network management issues and identifies potential areas for focused research. A general outline of the main components in optical network management is given and specific problems in the GMPLS-based model are explained. Later, protection and restoration issues are discussed in the broader context of fault management and the tools developed for fault detection are listed. Optical networks need efficient and reliable protection schemes that restore the communications quickly on the occurrence of faults without causing failure of real-time applications using the network. A holistic approach is required that provides mechanisms for fault detection, rapid restoration and reversion in case of fault resolution. Since the role of SDH/SONET is diminishing, the modern optical networks are poised towards the IP-centric model where high performance IP-MPLS routers manage a core intelligent network of IP over WDM. Fault management schemes are developed for both the IP layer and the WDM layer. Faults can be detected and repaired locally and also through a centralized network controller. A hybrid approach works best in detecting faults, where the domain controller verifies the established LSPs in addition to the link tests at the node level. On detecting a fault, rapid restoration can perform localized routing of traffic away from the affected port and link. The traffic may be directed to pre-assigned backup paths that are established as shared or dedicated resources. We examine the protection issues in detail including the choice of layer for protection, implementing protection or restoration, backup path routing, backup resource efficiency, subpath protection, QoS traffic survival and multilayer protection triggers and alarm propagation. The complete protection cycle is described and mechanisms incorporated into RSVP-TE and other protocols for detecting and recording path errors are outlined. In addition, MPLS testbed

  6. Enhanced fault-tolerant quantum computing in d-level systems.

    Science.gov (United States)

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  7. Qademah Fault Passive Data

    KAUST Repository

    Hanafy, Sherif M.

    2014-01-01

    OBJECTIVE: In this field trip we collect passive data to 1. Convert passive data to surface waves 2. Locate the Qademah fault using surface wave migration INTRODUCTION: In this field trip we collected passive data for several days. This data will be used to find the surface waves using interferometry, and the result will then be compared to active-source seismic data collected at the same location. A total of 288 receivers are used. A 3D layout with 5 m inline intervals and 10 m cross-line intervals is used, with 12 lines of 24 receivers each. You will need to download the file (rec_times.mat), which contains important information about 1. Field record no 2. Record day 3. Record month 4. Record hour 5. Record minute 6. Record second 7. Record length P.S. 1. All files are converted from the original format (SEG-2) to MATLAB format P.S. 2. Overlaps between records (10 to 1.5 sec.) are already removed from these files
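
The receiver layout described above (12 cross lines of 24 receivers, 5 m inline and 10 m cross-line spacing) can be reconstructed as coordinates. This is a sketch; the origin and axis orientation are arbitrary assumptions.

```python
def receiver_grid(n_lines=12, n_per_line=24, dx_inline=5.0, dy_cross=10.0):
    # (x, y) positions: x along the inline direction, y across lines
    return [(r * dx_inline, l * dy_cross)
            for l in range(n_lines) for r in range(n_per_line)]

grid = receiver_grid()
assert len(grid) == 288            # total receivers in the survey
assert grid[1] == (5.0, 0.0)       # 5 m inline interval
assert grid[24] == (0.0, 10.0)     # 10 m cross-line interval
```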

  8. Exposing the faults

    International Nuclear Information System (INIS)

    Richardson, P.J.

    1989-01-01

    UK NIREX, the body with responsibility for finding an acceptable strategy for deposition of radioactive waste has given the impression throughout its recent public consultation that the problem of nuclear waste is one of public and political acceptability, rather than one of a technical nature. However the results of the consultation process show that it has no mandate from the British public to develop a single, national, deep repository for the burial of radioactive waste. There is considerable opposition to this method of managing radioactive waste and suspicion of the claims by NIREX concerning the supposed integrity and safety of this deep burial option. This report gives substance to those suspicions and details the significant areas of uncertainty in the concept of effective geological containment of hazardous radioactive elements, which remain dangerous for tens of thousands of years. Because the science of geology is essentially retrospective rather than predictive, NIREX's plans for a single, national, deep 'repository' depend heavily upon a wide range of assumptions about the geological and hydrogeological regimes in certain areas of the UK. This report demonstrates that these assumptions are based on a limited understanding of UK geology and on unvalidated and simplistic theoretical models of geological processes, the performance of which can never be directly tested over the long time-scales involved. NIREX's proposals offer no guarantees for the safe and effective containment of radioactivity. They are deeply flawed. This report exposes the faults. (author)

  9. Prescriber and staff perceptions of an electronic prescribing system in primary care: a qualitative assessment

    Directory of Open Access Journals (Sweden)

    Sittig Dean F

    2010-11-01

    Abstract Background The United States (US) Health Information Technology for Economic and Clinical Health Act of 2009 has spurred adoption of electronic health records. The corresponding meaningful use criteria proposed by the Centers for Medicare and Medicaid Services mandate use of computerized provider order entry (CPOE) systems. Yet, adoption in the US and other Western countries is low, and descriptions of successful implementations are primarily from the inpatient setting, less frequently the ambulatory setting. We describe prescriber and staff perceptions of implementation of a CPOE system for medications (an electronic- or e-prescribing system) in the ambulatory setting. Methods Using a cross-sectional study design, we conducted eight focus groups at three primary care sites in an independent medical group. Each site represented a unique stage of e-prescribing implementation - pre/transition/post. We used a theoretically based, semi-structured questionnaire to elicit physician (n = 17) and staff (n = 53) perceptions of implementation of the e-prescribing system. We conducted a thematic analysis of focus group discussions using formal qualitative analytic techniques (i.e. deductive framework and grounded theory). Two coders independently coded to theoretical saturation and resolved discrepancies through discussions.
Results Ten themes emerged that describe perceptions of e-prescribing implementation: 1) improved availability of clinical information resulted in prescribing efficiencies and more coordinated care; 2) improved documentation resulted in safer care; 3) efficiencies were gained by using fewer paper charts; 4) organizational support facilitated adoption; 5) the transition required time and shifted workload to staff; 6) hardware configurations and network stability were important in facilitating workflow; 7) e-prescribing was time-neutral or time-saving; 8) changes in patient interactions enhanced patient care but required education; 9) pharmacy

  10. Uncertainties related to the fault tree reliability data

    International Nuclear Information System (INIS)

    Apostol, Minodora; Nitoi, Mirela; Farcasiu, M.

    2003-01-01

    Uncertainty analyses related to fault trees evaluate the system variability that arises from the uncertainties of the basic event probabilities. Given a logical model that describes a system, obtaining results means evaluating the model using estimates for each of its basic events. If the model has basic events that incorporate uncertainties, then the results of the model should incorporate the uncertainties of those events. Estimating the uncertainty in the final result of a fault tree means first evaluating the uncertainties of the basic event probabilities and then combining these uncertainties to calculate the top event uncertainty. To calculate the propagated uncertainty, knowledge of the probability density function as well as of the range of possible values of the basic event probabilities is required. The following data are defined using suitable probability density functions: the component failure rates; the human error probabilities; the initiating event frequencies. It was assumed that the distribution of possible values of the basic event probabilities is given by the lognormal probability density function. To establish the range of possible values of the basic event probabilities, the error factor (or uncertainty factor) is required. The aim of this paper is to estimate the error factors for the failure rates and for the human error probabilities in the reliability data base used in the Cernavoda Probabilistic Safety Evaluation. The top event chosen as an example is FEED3, from the Pressure and Inventory Control System. The quantitative evaluation of this top event was made using the EDFT code, developed at the Institute for Nuclear Research Pitesti (INR). It was assumed that the error factors for the component failures are the same as for the failure rates. Uncertainty analysis was performed with the INCERT application, which uses the moment method and the Monte Carlo method. 
The reliability data base used at INR Pitesti does not contain the error factors (ef
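The lognormal error-factor convention and the Monte Carlo propagation described above can be sketched in a few lines. This is an illustrative toy tree, not the FEED3 model: the event medians, error factors and gate structure below are invented, and the common convention EF = p95/median is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def lognormal_samples(median, ef, n):
    """Sample a basic-event probability from a lognormal distribution.

    Assumes the usual convention that the error factor EF = p95 / median,
    so sigma = ln(EF) / 1.645 (1.645 is the 95th-percentile z-score).
    Samples are clipped at 1.0 because they represent probabilities.
    """
    sigma = np.log(ef) / 1.645
    return np.minimum(rng.lognormal(np.log(median), sigma, n), 1.0)

n = 100_000
# illustrative basic events (all medians and error factors are invented)
pump_fails  = lognormal_samples(1e-3, 3.0, n)    # component failure
valve_fails = lognormal_samples(5e-4, 3.0, n)
op_error    = lognormal_samples(2e-3, 10.0, n)   # human error probability

# toy fault tree: TOP = (pump AND valve) OR operator error
top = 1.0 - (1.0 - pump_fails * valve_fails) * (1.0 - op_error)

p5, p50, p95 = np.percentile(top, [5, 50, 95])
print(f"top event median {p50:.2e}, 90% interval [{p5:.2e}, {p95:.2e}]")
```

Each Monte Carlo trial draws one probability per basic event and evaluates the tree, so the spread of `top` directly reflects the basic-event error factors.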

  11. Fault-tolerant rotary actuator

    Science.gov (United States)

    Tesar, Delbert

    2006-10-17

    A fault-tolerant actuator module, in a single containment shell, containing two actuator subsystems that are either asymmetrically or symmetrically laid out is provided. Fault tolerance in the actuators of the present invention is achieved by the employment of dual sets of equal resources. Dual resources are integrated into single modules, with each having the external appearance and functionality of a single set of resources.

  12. Static Decoupling in fault detection

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    1998-01-01

    An algebraic approach is given for a design of a static residual weighting factor in connection with fault detection. A complete parameterization is given of the weighting factor which will minimize a given performance index.
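The abstract gives no formulas, but one common static residual weighting scheme can be sketched as a least-squares problem. The fault-signature matrix `F` below is hypothetical, and taking the weighting factor as the pseudoinverse of `F` (minimizing the Frobenius-norm index ||WF - I||) is only one plausible reading of "a given performance index", not necessarily the paper's.

```python
import numpy as np

# Hypothetical static fault-signature matrix: column i is the direction in
# which fault i appears in the (3-dimensional) residual vector.
F = np.array([[1.0, 0.5],
              [0.0, 1.0],
              [2.0, 0.0]])

# Least-squares static weighting factor: W minimizes ||W F - I||_F, so the
# weighted residual W @ r decouples the two fault directions.
W = np.linalg.pinv(F)

r_fault1 = F[:, 0] * 0.7          # residual produced by fault 1 of size 0.7
print(np.round(W @ r_fault1, 6))  # first component ~0.7, second ~0
```

Because `F` has full column rank, `W @ F` is exactly the identity, so each weighted residual component responds to only one fault direction.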

  13. Diagnosis and fault-tolerant control

    CERN Document Server

    Blanke, Mogens; Lunze, Jan; Staroswiecki, Marcel

    2016-01-01

    Fault-tolerant control aims at a gradual shutdown response in automated systems when faults occur. It satisfies the industrial demand for enhanced availability and safety, in contrast to traditional reactions to faults, which bring about sudden shutdowns and loss of availability. The book presents effective model-based analysis and design methods for fault diagnosis and fault-tolerant control. Architectural and structural models are used to analyse the propagation of the fault through the process, to test the fault detectability and to find the redundancies in the process that can be used to ensure fault tolerance. It also introduces design methods suitable for diagnostic systems and fault-tolerant controllers for continuous processes that are described by analytical models and for discrete-event systems represented by automata. The book is suitable for engineering students, engineers in industry and researchers who wish to get an overview of the variety of approaches to process diagnosis and fault-tolerant control...

  14. ASCS online fault detection and isolation based on an improved MPCA

    Science.gov (United States)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

    Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of subspaces and difficult fault isolation are the common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage requirement of the subspace information. The MPCA model and the knowledge base are built on the new subspace. Then, fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling (T2) statistic are also realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation of a subspace based on the T2 statistic, the relationship between the statistic indicator and the state variables is constructed, and constraint conditions are presented to check the validity of fault isolation. Then, to improve the robustness of fault isolation to unexpected disturbances, a statistical method is adopted to relate single subspaces to multiple subspaces and increase the correct rate of fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method to reduce the required storage capacity and to improve the robustness of the principal component model, and establishes the relationship between the state variables and fault detection indicators for fault isolation.
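The SPE/T2 detection and residual-contribution isolation steps can be sketched with an ordinary (non-multi-way) PCA model on synthetic data; the kernel-density subspace construction and the ASCS data themselves are beyond this sketch, and all numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "normal operation" data: 200 samples of 5 correlated variables.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) \
    + 0.05 * rng.normal(size=(200, 5))
mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd

# Principal component model with k retained components
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
P = Vt[:k].T                          # loadings (5 x k)
lam = S[:k] ** 2 / (len(Xs) - 1)      # variances of the retained scores

def t2_spe(x):
    """Hotelling T2 and squared prediction error (SPE) for one sample."""
    xs = (x - mu) / sd
    t = xs @ P                        # scores
    resid = xs - t @ P.T              # part not explained by the model
    return np.sum(t ** 2 / lam), resid @ resid

x_fault = X[0].copy()
x_fault[3] += 8 * sd[3]               # inject a sensor fault on variable 3

t2_n, spe_n = t2_spe(X[0])
t2_f, spe_f = t2_spe(x_fault)

# Residual contribution analysis: the per-variable share of the SPE
# points at the variable most likely responsible for the fault.
xs = (x_fault - mu) / sd
contrib = (xs - (xs @ P) @ P.T) ** 2
print(spe_f > spe_n, contrib.argmax())
```

In practice the SPE and T2 alarm thresholds are set from the training data (e.g. from their empirical distribution) rather than by direct comparison with a single normal sample.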

  15. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is high among children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  16. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is high among children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  17. Periodic Application of Concurrent Error Detection in Processor Array Architectures. Ph.D. Thesis

    Science.gov (United States)

    Chen, Paul Peichuan

    1993-01-01

    Processor arrays can provide an attractive architecture for some applications. Featuring modularity, regular interconnection and high parallelism, such arrays are well-suited for VLSI/WSI implementations, and applications with high computational requirements, such as real-time signal processing. Preserving the integrity of results can be of paramount importance for certain applications. In these cases, fault tolerance should be used to ensure reliable delivery of a system's service. One aspect of fault tolerance is the detection of errors caused by faults. Concurrent error detection (CED) techniques offer the advantage that transient and intermittent faults may be detected with greater probability than with off-line diagnostic tests. Applying time-redundant CED techniques can reduce hardware redundancy costs. However, most time-redundant CED techniques degrade a system's performance.

  18. Aeromagnetic anomalies over faulted strata

    Science.gov (United States)

    Grauch, V.J.S.; Hudson, Mark R.

    2011-01-01

    High-resolution aeromagnetic surveys are now an industry standard and they commonly detect anomalies that are attributed to faults within sedimentary basins. However, detailed studies identifying geologic sources of magnetic anomalies in sedimentary environments are rare in the literature. Opportunities to study these sources have come from well-exposed sedimentary basins of the Rio Grande rift in New Mexico and Colorado. High-resolution aeromagnetic data from these areas reveal numerous, curvilinear, low-amplitude (2–15 nT at 100-m terrain clearance) anomalies that consistently correspond to intrasedimentary normal faults (Figure 1). Detailed geophysical and rock-property studies provide evidence for the magnetic sources at several exposures of these faults in the central Rio Grande rift (summarized in Grauch and Hudson, 2007, and Hudson et al., 2008). A key result is that the aeromagnetic anomalies arise from the juxtaposition of magnetically differing strata at the faults as opposed to chemical processes acting at the fault zone. The studies also provide (1) guidelines for understanding and estimating the geophysical parameters controlling aeromagnetic anomalies at faulted strata (Grauch and Hudson), and (2) observations on key geologic factors that are favorable for developing similar sedimentary sources of aeromagnetic anomalies elsewhere (Hudson et al.).

  19. Passive fault current limiting device

    Science.gov (United States)

    Evans, Daniel J.; Cha, Yung S.

    1999-01-01

    A passive current limiting device and isolator is particularly adapted for use at high power levels for limiting excessive currents in a circuit in a fault condition such as an electrical short. The current limiting device comprises a magnetic core wound with two magnetically opposed, parallel connected coils of copper, a high temperature superconductor or other electrically conducting material, and a fault element connected in series with one of the coils. Under normal operating conditions, the magnetic flux densities produced by the two coils cancel each other. Under a fault condition, the fault element is triggered to cause an imbalance in the magnetic flux density between the two coils which results in an increase in the impedance in the coils. While the fault element may be a separate current limiter, switch, fuse, bimetal strip or the like, it preferably is a superconductor current limiter conducting one-half of the current load compared to the same limiter wired to carry the total current of the circuit. The major voltage during a fault condition is in the coils wound on the common core in a preferred embodiment.

  20. RECENT GEODYNAMICS OF FAULT ZONES: FAULTING IN REAL TIME SCALE

    Directory of Open Access Journals (Sweden)

    Yu. O. Kuzmin

    2014-01-01

    Full Text Available Recent deformation processes taking place in real time are analyzed on the basis of data on fault zones which were collected by long-term detailed geodetic surveys using field methods and satellite monitoring. A new category of recent crustal movements is described and termed parametrically induced tectonic strain in fault zones. It is shown that in fault zones located in seismically active and aseismic regions, super-intensive displacements of the crust (5 to 7 cm per year, i.e. (5 to 7)·10–5 per year) occur due to very small external impacts of natural or technogenic/industrial origin. The spatial discreteness of anomalous deformation processes is established along the strike of the regional Rechitsky fault in the Pripyat basin. It is concluded that recent anomalous activity of fault zones needs to be taken into account in defining regional regularities of geodynamic processes on the basis of real-time measurements. The paper presents results of analyses of data collected by long-term (20 to 50 years) geodetic surveys in the highly seismically active regions of Kopetdag, Kamchatka and California. Instrumental geodetic measurements of recent vertical and horizontal displacements in fault zones show that deformations 'paradoxically' deviate from the inherited movements of past geological periods. In terms of recent geodynamics, the 'paradoxes' of high and low strain velocities are related to the reliable empirical fact that extremely high local velocities of deformation occur in fault zones (about 10–5 per year and above) against the background of slow regional deformations whose velocities are lower by 2 to 3 orders of magnitude. Very low average annual velocities of horizontal deformation are recorded in the seismic regions of Kopetdag and Kamchatka and in the San Andreas fault zone; they amount to only 3 to 5 amplitudes of the earth tidal deformations per year. A 'fault

  1. Preventing Errors in Laterality

    OpenAIRE

    Landau, Elliot; Hirschorn, David; Koutras, Iakovos; Malek, Alexander; Demissie, Seleshie

    2014-01-01

    An error in laterality is the reporting of a finding that is present on the right side as on the left or vice versa. While different medical and surgical specialties have implemented protocols to help prevent such errors, very few studies have been published that describe these errors in radiology reports and ways to prevent them. We devised a system that allows the radiologist to view reports in a separate window, displayed in a simple font and with all terms of laterality highlighted in sep...

  2. Errors and violations

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    This paper is in three parts. The first part summarizes the human failures responsible for the Chernobyl disaster and argues that, in considering the human contribution to power plant emergencies, it is necessary to distinguish between: errors and violations; and active and latent failures. The second part presents empirical evidence, drawn from driver behavior, which suggests that errors and violations have different psychological origins. The concluding part outlines a resident pathogen view of accident causation, and seeks to identify the various system pathways along which errors and violations may be propagated

  3. What are incident reports telling us? A comparative study at two Australian hospitals of medication errors identified at audit, detected by staff and reported to an incident system.

    Science.gov (United States)

    Westbrook, Johanna I; Li, Ling; Lehnbom, Elin C; Baysari, Melissa T; Braithwaite, Jeffrey; Burke, Rosemary; Conn, Chris; Day, Richard O

    2015-02-01

    To (i) compare medication errors identified at audit and observation with medication incident reports; (ii) identify differences between two hospitals in incident report frequency and medication error rates; (iii) identify prescribing error detection rates by staff. Audit of 3291 patient records at two hospitals to identify prescribing errors and evidence of their detection by staff. Medication administration errors were identified from a direct observational study of 180 nurses administering 7451 medications. Severity of errors was classified. Those likely to lead to patient harm were categorized as 'clinically important'. Two major academic teaching hospitals in Sydney, Australia. Rates of medication errors identified from audit and from direct observation were compared with reported medication incident reports. A total of 12 567 prescribing errors were identified at audit. Of these, 1.2/1000 errors (95% CI: 0.6-1.8) had incident reports. Clinically important prescribing errors (n = 539) were detected by staff at a rate of 218.9/1000 (95% CI: 184.0-253.8), but only 13.0/1000 (95% CI: 3.4-22.5) were reported. 78.1% (n = 421) of clinically important prescribing errors were not detected. A total of 2043 drug administrations (27.4%; 95% CI: 26.4-28.4%) contained ≥1 error; none had an incident report. Hospital A had a higher frequency of incident reports than Hospital B, but a lower rate of errors at audit. Prescribing errors with the potential to cause harm frequently go undetected. Reported incidents do not reflect the profile of medication errors which occur in hospitals or the underlying rates. This demonstrates the inaccuracy of using incident frequency to compare patient risk or quality performance within or across hospitals. New approaches including data mining of electronic clinical information systems are required to support more effective medication error detection and mitigation. © The Author 2015. Published by Oxford University Press in association
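The reported detection rate can be reproduced with a standard normal-approximation confidence interval. The detected count of 118 is inferred here from the reported rate (218.9/1000 of n = 539), not stated in the abstract.

```python
import math

def rate_per_1000_ci(events, n, z=1.96):
    """Rate per 1000 with a normal-approximation 95% confidence interval."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return 1000 * p, 1000 * (p - z * se), 1000 * (p + z * se)

# 539 clinically important prescribing errors; 118 detected by staff
rate, lo, hi = rate_per_1000_ci(118, 539)
print(f"{rate:.1f}/1000 (95% CI: {lo:.1f}-{hi:.1f})")
# → 218.9/1000 (95% CI: 184.0-253.8), matching the abstract
```

For rates this far from 0 and 1 with n in the hundreds, the normal approximation agrees closely with exact binomial intervals.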

  4. Fault Modeling and Testing for Analog Circuits in Complex Space Based on Supply Current and Output Voltage

    Directory of Open Access Journals (Sweden)

    Hongzhi Hu

    2015-01-01

    Full Text Available This paper deals with the modeling of faults for analog circuits. A two-dimensional (2D) fault model is first proposed based on collaborative analysis of supply current and output voltage. This model is a family of circle loci on the complex plane, and it greatly simplifies the algorithms for test point selection and potential fault simulations, which are primary difficulties in fault diagnosis of analog circuits. Furthermore, in order to reduce the difficulty of fault location, an improved fault model in three-dimensional (3D) complex space is proposed, which achieves a far better fault detection ratio (FDR) against measurement error and parametric tolerance. To address the problem of fault masking in both 2D and 3D fault models, this paper proposes an effective design-for-testability (DFT) method. By adding redundant bypassing components in the circuit under test (CUT), this method achieves an excellent fault isolation ratio (FIR) in ambiguity group isolation. The efficacy of the proposed model and testing method is validated through experimental results provided in this paper.

  5. Fault diagnosis for the heat exchanger of the aircraft environmental control system based on the strong tracking filter.

    Science.gov (United States)

    Ma, Jian; Lu, Chen; Liu, Hongmei

    2015-01-01

    The aircraft environmental control system (ECS) is a critical aircraft system, which provides the appropriate environmental conditions to ensure the safe transport of air passengers and equipment. The functionality and reliability of ECS have received increasing attention in recent years. The heat exchanger is a particularly significant component of the ECS, because its failure decreases the system's efficiency, which can lead to catastrophic consequences. Fault diagnosis of the heat exchanger is necessary to prevent risks. However, two problems hinder the implementation of the heat exchanger fault diagnosis in practice. First, the actual measured parameter of the heat exchanger cannot effectively reflect the fault occurrence, whereas the heat exchanger faults are usually depicted by utilizing the corresponding fault-related state parameters that cannot be measured directly. Second, both the traditional Extended Kalman Filter (EKF) and the EKF-based Double Model Filter have certain disadvantages, such as sensitivity to modeling errors and difficulties in selection of initialization values. To solve the aforementioned problems, this paper presents a fault-related parameter adaptive estimation method based on strong tracking filter (STF) and Modified Bayes classification algorithm for fault detection and failure mode classification of the heat exchanger, respectively. Heat exchanger fault simulation is conducted to generate fault data, through which the proposed methods are validated. The results demonstrate that the proposed methods are capable of providing accurate, stable, and rapid fault diagnosis of the heat exchanger.
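A full strong tracking filter adapts a fading factor online from the innovation sequence; the sketch below uses a crude constant fading factor on a scalar random-walk state only to illustrate the underlying idea, namely that inflating the predicted covariance keeps the gain high, so the filter tracks an abrupt change in a fault-related parameter faster than a standard Kalman filter. All numbers are invented and this is not the paper's STF or its Modified Bayes classifier.

```python
import numpy as np

rng = np.random.default_rng(3)

# scalar "fault-related parameter": constant, then an abrupt jump at k = 50
x_true = np.concatenate([np.zeros(50), 5.0 * np.ones(50)])
z = x_true + rng.normal(0.0, 0.5, 100)   # noisy measurements

q, r = 1e-4, 0.25                        # process / measurement noise variances

def run_filter(fading):
    """Scalar Kalman filter; fading > 1 inflates the predicted covariance."""
    x, p, est = 0.0, 1.0, []
    for zk in z:
        p = fading * p + q               # prediction step with fading factor
        k = p / (p + r)                  # gain stays larger when fading > 1
        x = x + k * (zk - x)             # measurement update
        p = (1.0 - k) * p
        est.append(x)
    return np.array(est)

kf = run_filter(1.0)    # standard Kalman filter: sluggish after the jump
stf = run_filter(1.2)   # constant-fading stand-in for a strong tracking filter
print(abs(kf[60] - 5.0), abs(stf[60] - 5.0))
```

Ten steps after the jump the faded filter is much closer to the new parameter value, at the cost of a noisier estimate during steady operation, which is exactly the trade-off an adaptive fading factor manages.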

  6. Fault diagnosis for the heat exchanger of the aircraft environmental control system based on the strong tracking filter.

    Directory of Open Access Journals (Sweden)

    Jian Ma

    Full Text Available The aircraft environmental control system (ECS) is a critical aircraft system, which provides the appropriate environmental conditions to ensure the safe transport of air passengers and equipment. The functionality and reliability of ECS have received increasing attention in recent years. The heat exchanger is a particularly significant component of the ECS, because its failure decreases the system's efficiency, which can lead to catastrophic consequences. Fault diagnosis of the heat exchanger is necessary to prevent risks. However, two problems hinder the implementation of the heat exchanger fault diagnosis in practice. First, the actual measured parameter of the heat exchanger cannot effectively reflect the fault occurrence, whereas the heat exchanger faults are usually depicted by utilizing the corresponding fault-related state parameters that cannot be measured directly. Second, both the traditional Extended Kalman Filter (EKF) and the EKF-based Double Model Filter have certain disadvantages, such as sensitivity to modeling errors and difficulties in selection of initialization values. To solve the aforementioned problems, this paper presents a fault-related parameter adaptive estimation method based on strong tracking filter (STF) and Modified Bayes classification algorithm for fault detection and failure mode classification of the heat exchanger, respectively. Heat exchanger fault simulation is conducted to generate fault data, through which the proposed methods are validated. The results demonstrate that the proposed methods are capable of providing accurate, stable, and rapid fault diagnosis of the heat exchanger.

  7. Help prevent hospital errors

    Science.gov (United States)


  8. Pedal Application Errors

    Science.gov (United States)

    2012-03-01

    This project examined the prevalence of pedal application errors and the driver, vehicle, roadway and/or environmental characteristics associated with pedal misapplication crashes based on a literature review, analysis of news media reports, a panel ...

  9. Rounding errors in weighing

    International Nuclear Information System (INIS)

    Jeach, J.L.

    1976-01-01

    When rounding error is large relative to weighing error, it cannot be ignored when estimating scale precision and bias from calibration data. Further, if the data grouping is coarse, rounding error is correlated with weighing error and may also have a mean quite different from zero. These facts are taken into account in a moment estimation method. A copy of the program listing for the MERDA program that provides moment estimates is available from the author. Experience suggests that if the data fall into four or more cells or groups, it is not necessary to apply the moment estimation method. Rather, the estimate given by equation (3) is valid in this instance. 5 tables
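The coupling between rounding error and weighing error under coarse grouping is easy to demonstrate by simulation. The increment and error values below are invented, and the MERDA moment-estimation method itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

sigma = 0.05        # weighing (scale) error standard deviation
d = 1.0             # rounding increment: coarse relative to sigma
true = 100.2        # true mass, deliberately not on the rounding grid

obs = true + rng.normal(0.0, sigma, 100_000)  # readings before rounding
shown = np.round(obs / d) * d                 # displayed (rounded) values
rnd_err = shown - obs

# With coarse grouping essentially every reading falls in one rounding cell,
# so the rounding error is almost perfectly anti-correlated with the weighing
# error, and its mean (about -0.2 here) is far from zero.
print(np.corrcoef(obs - true, rnd_err)[0, 1], rnd_err.mean())
```

With a fine increment (d much smaller than sigma) the same simulation gives a rounding error that is nearly uncorrelated with the weighing error and has mean near zero, which is why the moment correction matters only for coarse grouping.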

  10. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  11. Errors in energy bills

    International Nuclear Information System (INIS)

    Kop, L.

    2001-01-01

    On request, the Dutch Association for Energy, Environment and Water (VEMW) checks the energy bills of its customers. It appeared that in the year 2000 many errors, both small and large, were discovered in the bills of 42 businesses

  12. Medical Errors Reduction Initiative

    National Research Council Canada - National Science Library

    Mutter, Michael L

    2005-01-01

    The Valley Hospital of Ridgewood, New Jersey, is proposing to extend a limited but highly successful specimen management and medication administration medical errors reduction initiative on a hospital-wide basis...

  13. Smartphone apps to support hospital prescribing and pharmacology education: a review of current provision.

    Science.gov (United States)

    Haffey, Faye; Brady, Richard R W; Maxwell, Simon

    2014-01-01

    Junior doctors write the majority of hospital prescriptions but many indicate they feel underprepared to assume this responsibility and around 10% of prescriptions contain errors. Medical smartphone apps are now widely used in clinical practice and present an opportunity to provide support to inexperienced prescribers. This study assesses the contemporary range of smartphone apps with prescribing or related content. Six smartphone app stores were searched for apps aimed at the healthcare professional with drug, pharmacology or prescribing content. Three hundred and six apps were identified. 34% appeared to be for use within the clinical environment in order to aid prescribing, 14% outside the clinical setting, and 51% of apps were deemed appropriate for both clinical and non-clinical use. Apps with drug reference material, such as textbooks, manuals or medical apps with drug information, were the commonest apps found (51%), followed by apps offering drug or infusion rate dose calculation (26%). 68% of apps charged for download, with a mean price of £14.25 per app and a range of £0.62-£101.90. A diverse range of pharmacology-themed apps are available and there is further potential for the development of contemporary apps to improve prescribing performance. Personalized app stores may help universities/healthcare organizations offer high quality apps to students to aid in pharmacology education. Users of prescribing apps must be aware of the lack of information regarding the medical expertise of app developers. This will enable them to make informed choices about the use of such apps in their clinical practice. © 2013 The British Pharmacological Society.

  14. The surveillance error grid.

    Science.gov (United States)

    Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris

    2014-07-01

    Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. 
Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to

  15. Design for Error Tolerance

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1983-01-01

    An important aspect of the optimal design of computer-based operator support systems is the sensitivity of such systems to operator errors. The author discusses how a system might allow for human variability with the use of reversibility and observability.

  16. Fault Management Guiding Principles

    Science.gov (United States)

    Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

    2011-01-01

    Regardless of mission type (deep space or low Earth orbit, robotic or human spaceflight), Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of the supporting FM systems increases in turn. Data from recent NASA missions show that development of FM capabilities is a common driver of significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as from the maturity of FM as an engineering discipline, which lags behind that of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles range from considerations for integrating FM into the project and SE organizational structure, to the relationship between FM designs and mission risk, to the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

  17. Apologies and Medical Error

    Science.gov (United States)

    2008-01-01

    One way in which physicians can respond to a medical error is to apologize. Apologies—statements that acknowledge an error and its consequences, take responsibility, and communicate regret for having caused harm—can decrease blame, decrease anger, increase trust, and improve relationships. Importantly, apologies also have the potential to decrease the risk of a medical malpractice lawsuit and can help settle claims by patients. Patients indicate they want and expect explanations and apologies after medical errors and physicians indicate they want to apologize. However, in practice, physicians tend to provide minimal information to patients after medical errors and infrequently offer complete apologies. Although fears about potential litigation are the most commonly cited barrier to apologizing after medical error, the link between litigation risk and the practice of disclosure and apology is tenuous. Other barriers might include the culture of medicine and the inherent psychological difficulties in facing one’s mistakes and apologizing for them. Despite these barriers, incorporating apology into conversations between physicians and patients can address the needs of both parties and can play a role in the effective resolution of disputes related to medical error. PMID:18972177

  18. Thermodynamics of Error Correction

    Directory of Open Access Journals (Sweden)

    Pablo Sartori

    2015-12-01

    Full Text Available Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  19. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Full Text Available Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction, and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the
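The order-of-magnitude estimate P/µ ≈ 0.01 at 10 km depth can be reproduced with a couple of lines of arithmetic. The density and shear modulus below are assumed typical crustal values, not figures taken from the abstract.

```python
rho = 2700.0   # crustal rock density, kg/m^3 (assumed typical value)
g = 9.8        # gravitational acceleration, m/s^2
depth = 10e3   # depth, m
mu = 3e10      # shear modulus, Pa (assumed ~30 GPa for crustal rock)

P = rho * g * depth   # lithostatic confining pressure, Pa
ratio = P / mu
print(f"P = {P/1e6:.0f} MPa, P/mu = {ratio:.3f}")  # → P = 265 MPa, P/mu = 0.009
```

The ratio rounds to the 0.01 quoted in the abstract, confirming that junction energy absorption is small compared with the elastic energy released.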

  20. Fault Analysis in Solar Photovoltaic Arrays

    Science.gov (United States)

    Zhao, Ye

    Fault analysis in solar photovoltaic (PV) arrays is a fundamental task to increase reliability, efficiency and safety in PV systems. Conventional fault protection methods usually add fuses or circuit breakers in series with PV components. But these protection devices are only able to clear faults and isolate faulty circuits if they carry a large fault current. However, this research shows that faults in PV arrays may not be cleared by fuses under some fault scenarios, due to the current-limiting nature and non-linear output characteristics of PV arrays. First, this thesis introduces new simulation and analytic models that are suitable for fault analysis in PV arrays. Based on the simulation environment, this thesis studies a variety of typical faults in PV arrays, such as ground faults, line-line faults, and mismatch faults. The effect of a maximum power point tracker on fault current is discussed and shown, at times, to prevent the fault-current protection devices from tripping. A small-scale experimental PV benchmark system has been developed at Northeastern University to further validate the simulation conclusions. Additionally, this thesis examines two types of unique faults found in a PV array that have not been studied in the literature. One is a fault that occurs under low-irradiance conditions. The other is a fault evolving in a PV array during the night-to-day transition. Our simulation and experimental results show that overcurrent protection devices are unable to clear faults under "low irradiance" and "night-to-day transition" conditions. However, the overcurrent protection devices may work properly when the same PV fault occurs in daylight. As a result, a fault under "low irradiance" or "night-to-day transition" might remain hidden in the PV array and become a potential hazard for system efficiency and reliability.
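The current-limiting behaviour described above can be illustrated with a first-order sketch: PV short-circuit current scales roughly linearly with irradiance, so a fuse sized for daylight fault levels never sees enough current at low irradiance. The current ratings and string count below are invented for illustration, not taken from the thesis.

```python
def string_fault_current(irradiance_w_m2, isc_stc=8.0):
    """Short-circuit (fault) current of one PV string, assuming Isc scales
    linearly with irradiance, a common first-order approximation.
    isc_stc is the short-circuit current at 1000 W/m^2 (assumed 8 A)."""
    return isc_stc * irradiance_w_m2 / 1000.0

fuse_rating = 12.0  # A, hypothetical series fuse
for g_irr in (1000.0, 200.0):
    i_fault = 2 * string_fault_current(g_irr)  # assume two parallel strings back-feed the fault
    print(f"{g_irr:.0f} W/m2: {i_fault:.1f} A, fuse clears: {i_fault > fuse_rating}")
# → 1000 W/m2: 16.0 A, fuse clears: True
# → 200 W/m2: 3.2 A, fuse clears: False
```

The same line-line fault is cleared at full sun but stays hidden at low irradiance, matching the "hidden fault" scenario the abstract describes.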

  1. Radial basis function neural network in fault detection of automotive ...

    African Journals Online (AJOL)

    Radial basis function neural network in fault detection of automotive engines. ... Five faults have been simulated on the MVEM, including three sensor faults, one component fault and one actuator fault. The three sensor faults ... Keywords: Automotive engine, independent RBFNN model, RBF neural network, fault detection
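As a sketch of the technique named in this record, a minimal radial basis function network can be built from Gaussian basis activations and a least-squares linear readout. The toy "sensor residual" data, centre placement, and width below are invented for illustration; the paper's MVEM-based engine models are not reproduced.

```python
import numpy as np

def rbf_features(X, centers, width=1.0):
    """Gaussian radial basis activations for each sample/centre pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(0)
# toy data: a 1-D sensor residual; faulty samples sit far from zero
normal = rng.normal(0.0, 0.1, size=(50, 1))
faulty = rng.normal(2.0, 0.1, size=(50, 1))
X = np.vstack([normal, faulty])
y = np.array([0.0] * 50 + [1.0] * 50)   # 0 = healthy, 1 = fault

centers = np.linspace(-1, 3, 9).reshape(-1, 1)
Phi = rbf_features(X, centers)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear output layer

pred = (rbf_features(X, centers) @ w > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```

In the independent-model scheme the abstract describes, such a network would be trained on healthy engine data and residuals between model and sensors would drive the fault decision.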

  2. Does non-medical prescribing make a difference to patients?

    Science.gov (United States)

    Carey, Nicola; Stenner, Karen

    This article examines the literature on non-medical prescribing to establish its impact on UK healthcare. It discusses how better access to medication through non-medical prescribing can improve patient safety and patient-centred care, and how nurse prescribing can help ensure quality of care in the NHS during the current financial crisis.

  3. Antibiotic Utilization and Prescribing Patterns in a Nigerian ...

    African Journals Online (AJOL)

    The study of prescribing pattern seeks to monitor, evaluate and suggest a modification in prescriber's prescribing habits so as to make medical care rational and cost effective. Information about antibiotic use pattern is necessary for a constructive approach to problems that arise from multiple antibiotics available. To identify ...

  4. Out-Patient Prescribing Practices at Mbagathi District Hospital ...

    African Journals Online (AJOL)

    On average, each patient was prescribed 3.85 types of drugs. A total of 835 drugs were prescribed by generic name, accounting for 25.6% of the total number of drugs prescribed (1,506). Out of 391 sampled prescriptions, 266 (68.0%) contained antibiotics. A relatively small proportion of the prescriptions, 9.5%, had an ...

  5. Using relative humidity to predict spotfire probability on prescribed burns

    Science.gov (United States)

    John R. Weir

    2007-01-01

    Spotfires have and always will be a problem that burn bosses and fire crews will have to contend with on prescribed burns. Weather factors (temperature, wind speed and relative humidity) are the main variables burn bosses can use to predict and monitor prescribed fire behavior. At the Oklahoma State University Research Range, prescribed burns are conducted during...

  6. Nature and frequency of medication errors in a geriatric ward: an Indonesian experience

    Directory of Open Access Journals (Sweden)

    Ernawati DK

    2014-06-01

    Full Text Available Desak Ketut Ernawati,1,2 Ya Ping Lee,2 Jeffery David Hughes2 1Faculty of Medicine, Udayana University, Denpasar, Bali, Indonesia; 2School of Pharmacy and Curtin Health Innovation and Research Institute, Curtin University, Perth, WA, Australia
    Purpose: To determine the nature and frequency of medication errors during medication delivery processes in a public teaching hospital geriatric ward in Bali, Indonesia.
    Methods: A 20-week prospective study on medication errors occurring during the medication delivery process was conducted in a geriatric ward in a public teaching hospital in Bali, Indonesia. Participants selected were inpatients aged more than 60 years. Patients were excluded if they had a malignancy, were undergoing surgery, or were receiving chemotherapy treatment. The occurrence of medication errors in prescribing, transcribing, dispensing, and administration was detected by the investigator providing in-hospital clinical pharmacy services.
    Results: Seven hundred and seventy drug orders and 7,662 drug doses were reviewed as part of the study. There were 1,563 medication errors detected among the 7,662 drug doses reviewed, representing an error rate of 20.4%. Administration errors were the most frequent medication errors identified (59%), followed by transcription errors (15%), dispensing errors (14%), and prescribing errors (7%). Errors in documentation were the most common form of administration errors. Of these errors, 2.4% were classified as potentially serious and 10.3% as potentially significant.
    Conclusion: Medication errors occurred in every stage of the medication delivery process, with administration errors being the most frequent. The majority of errors identified in the administration stage were related to documentation. Provision of in-hospital clinical pharmacy services could potentially play a significant role in detecting and preventing medication errors.
    Keywords: geriatric, medication errors, inpatients, medication delivery process

  7. Universal Fault-Tolerant Gates on Concatenated Stabilizer Codes

    Directory of Open Access Journals (Sweden)

    Theodore J. Yoder

    2016-09-01

    Full Text Available It is an oft-cited fact that no quantum code can support a set of fault-tolerant logical gates that is both universal and transversal. This no-go theorem is generally responsible for the interest in alternative universality constructions including magic state distillation. Widely overlooked, however, is the possibility of nontransversal, yet still fault-tolerant, gates that work directly on small quantum codes. Here, we demonstrate precisely the existence of such gates. In particular, we show how the limits of nontransversality can be overcome by performing rounds of intermediate error correction to create logical gates on stabilizer codes that use no ancillas other than those required for syndrome measurement. Moreover, the logical gates we construct, the most prominent examples being Toffoli and controlled-controlled-Z, often complete universal gate sets on their codes. We detail such universal constructions for the smallest quantum codes, the 5-qubit and 7-qubit codes, and then proceed to generalize the approach. One remarkable result of this generalization is that any nondegenerate stabilizer code with a complete set of fault-tolerant single-qubit Clifford gates has a universal set of fault-tolerant gates. Another is the interaction of logical qubits across different stabilizer codes, which, for instance, implies a broadly applicable method of code switching.
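The 5-qubit code mentioned above is small enough to check directly. The snippet below verifies, via the standard symplectic criterion (two Pauli strings commute iff they anticommute on an even number of positions), that the code's four stabilizer generators, the cyclic shifts of XZZXI, mutually commute. This is a consistency check on textbook facts about the code, not a reproduction of the paper's constructions.

```python
def commute(p, q):
    """n-qubit Pauli strings commute iff they anticommute
    (both non-identity and different) on an even number of positions."""
    anti = sum(1 for a, b in zip(p, q)
               if a != 'I' and b != 'I' and a != b)
    return anti % 2 == 0

# standard stabilizer generators of the [[5,1,3]] code
gens = ["XZZXI", "IXZZX", "XIXZZ", "ZXIXZ"]
print(all(commute(g, h) for g in gens for h in gens))  # → True
```

The transversal logical operator XXXXX also commutes with every generator, which the test below confirms.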

  8. A γ-ray survey along Hanaore fault

    International Nuclear Information System (INIS)

    Mino, Kazuo

    1978-01-01

    The γ-ray survey was carried out with a scintillation survey meter in the O-hara area near the Hanaore Fault Zone in the northern part of Kyoto. The survey was repeated several times along the same observational line. The static pattern of γ-ray intensity was similar from run to run, with only small differences. Strong γ-ray intensity indicates the existence of a crushed-rock zone, and a huge fault such as Hanaore consists of a structure made up of these weak zones. Fortunately for the study, a relatively large earthquake among the local microearthquakes occurred during the survey period. A γ-ray survey had been made on January 6, 1978, just one day before the earthquake. The observations before the earthquake did not show large variations in γ-ray intensity. But 5 days after the earthquake, on January 11, the γ-ray intensity had decreased to a low value, beyond observational error, at almost all stations. The γ-ray intensity recovered 2 weeks after the earthquake. A large fault such as Hanaore is ordinarily one of the boundaries around a crustal block, and the fault zone is more sensitive to geophysical activity in the crust. Continuous observation of γ-rays may help resolve the correlation with earthquakes and contribute to earthquake prediction. (author)

  9. Links between N-modular redundancy and the theory of error-correcting codes

    Science.gov (United States)

    Bobin, V.; Whitaker, S.; Maki, G.

    1992-01-01

    N-Modular Redundancy (NMR) is one of the best known fault tolerance techniques. Replication of a module to achieve fault tolerance is in some ways analogous to the use of a repetition code where an information symbol is replicated as parity symbols in a codeword. Linear Error-Correcting Codes (ECC) use linear combinations of information symbols as parity symbols which are used to generate syndromes for error patterns. These observations indicate links between the theory of ECC and the use of hardware redundancy for fault tolerance. In this paper, we explore some of these links and show examples of NMR systems where identification of good and failed elements is accomplished in a manner similar to error correction using linear ECC's.
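The analogy drawn above can be made concrete for triple modular redundancy: majority voting corrects the output, while the pattern of pairwise disagreements acts exactly like a repetition-code syndrome and identifies which module failed. This is a minimal sketch of the general idea, not the paper's NMR constructions.

```python
def tmr_vote(a, b, c):
    """Majority vote over three replicated module outputs (bits)."""
    return (a & b) | (b & c) | (a & c)

def locate_fault(a, b, c):
    """Pairwise disagreements form a repetition-code syndrome that
    points to the failed module (None means all modules agree)."""
    s1, s2 = a ^ b, b ^ c
    return {(0, 0): None, (1, 0): 'a', (1, 1): 'b', (0, 1): 'c'}[(s1, s2)]

print(tmr_vote(1, 1, 0), locate_fault(1, 1, 0))  # → 1 c
```

A repetition code decoder does the same thing with parity checks: the syndrome (s1, s2) singles out the errored position without knowing the correct value in advance.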

  10. Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on chip

    Energy Technology Data Exchange (ETDEWEB)

    Du, Xuecheng; He, Chaohui; Liu, Shuhuan, E-mail: liushuhuan@mail.xjtu.edu.cn; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang

    2016-09-21

    Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. In order to evaluate the system-on-chip's reliability and soft errors, the fault tree analysis method was used in this work. The system fault tree was constructed for the Xilinx Zynq-7010 All Programmable SoC. Moreover, the soft error rates of different components in the Zynq-7010 SoC were tested with an americium-241 alpha radiation source. Furthermore, parameters used to evaluate the system's reliability and safety were calculated using Isograph Reliability Workbench 11.0, such as failure rate, unavailability, and mean time to failure (MTTF). Through qualitative and quantitative fault tree analysis of the system-on-chip, the critical blocks and the system reliability were evaluated.
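The quantities named above (failure rate, unavailability, MTTF) follow from a few standard reliability formulas, which the sketch below works through for a hypothetical two-branch fault tree; the failure rate, mission time, and tree structure are invented for illustration and are not the Zynq-7010 figures.

```python
import math

lam = 1e-5   # assumed constant failure rate, per hour
t = 1000.0   # mission time, hours

mttf = 1.0 / lam                 # mean time to failure for constant rate
q = 1 - math.exp(-lam * t)       # component failure probability at time t

# toy fault tree: top event = OR of two subsystems,
# each subsystem = AND of two identical components
q_and = q * q                    # both components of a subsystem fail
q_top = 1 - (1 - q_and) ** 2     # either subsystem fails
print(f"MTTF = {mttf:.0f} h, q = {q:.5f}, top-event prob = {q_top:.2e}")
# → MTTF = 100000 h, q = 0.00995, top-event prob = 1.98e-04
```

Tools such as Isograph evaluate the same gate algebra (AND multiplies probabilities, OR complements the product of complements) over much larger trees.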

  11. A theoretical basis for the analysis of multiversion software subject to coincident errors

    Science.gov (United States)

    Eckhardt, D. E., Jr.; Lee, L. D.

    1985-01-01

    Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
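The role of the intensity function can be shown with a toy calculation: hold the average per-version failure probability fixed and vary only how it is distributed over inputs. The model below (independent failures conditional on an input-dependent intensity, 2-out-of-3 voting) is an illustrative simplification, not the paper's full framework.

```python
import random
random.seed(1)

def majority_fail_prob(thetas):
    """Failure probability of a 2-out-of-3 voted system when, on each
    input, all three versions fail independently with the same
    input-dependent probability theta (the coincident-error intensity)."""
    total = 0.0
    for th in thetas:
        total += 3 * th**2 * (1 - th) + th**3   # >= 2 of 3 versions fail
    return total / len(thetas)

# same mean failure probability (0.1), different input-to-input spread
uniform = [0.1] * 10000
bursty = [0.9 if random.random() < 1/9 else 0.0 for _ in range(10000)]
print(majority_fail_prob(uniform), majority_fail_prob(bursty))
```

With a flat intensity the voter fails with probability 0.028; concentrating the same average intensity on a few "hard" inputs pushes system failure towards 0.1, showing why coincident errors undercut the benefit of multiversion redundancy.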

  12. Soil heating and impact of prescribed burning

    Science.gov (United States)

    Stoof, Cathelijne

    2016-04-01

    Prescribed burning is highly uncommon in the Netherlands, where wildfire awareness is increasing but risk management does not yet include fuel management strategies. A major exception is two military bases that need to burn their fields in winter and spring to prevent wildfires during summer shooting practice. Research on these very frequent burns has so far been limited to effects on biodiversity, yet site managers and policy makers have questions regarding the soil temperatures reached during these burns because of the potential impact on soil properties and soil-dwelling fauna. In March 2015, I therefore measured soil and litter temperatures under heath and grass vegetation during a prescribed burn on military terrain in the Netherlands. Soil and litter moisture were sampled pre- and post-fire, ash was collected, and fireline intensity was estimated from flame length. While the standing vegetation was dry (0.13 g water/g biomass for grass and 0.6 g/g for heather), the soil and litter were moist (0.21 cm3/cm3 and 1.6 g/g, respectively). Soil heating was therefore very limited, with the maximum temperature at the soil-litter interface remaining as low as 6.5 to 11.5°C, and litter temperatures reaching a maximum of 77.5°C at the top of the litter layer. As a result, any changes in physical properties like soil organic matter content and bulk density were not significant. These results are a first step towards a database of soil heating in relation to fuel load and fire intensity in this temperate country, which is not only valuable to increase understanding of the relationships between fire intensity and severity, but also instrumental in the policy debate regarding the sustainability of prescribed burns.
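Fireline intensity was estimated from flame length above; one widely used empirical relation for doing so is Byram's (1959) flame-length equation, sketched below. The specific flame lengths are illustrative, not measurements from this study.

```python
def fireline_intensity(flame_length_m):
    """Byram's (1959) empirical flame-length relation, inverted:
    L = 0.0775 * I**0.46  =>  I = (L / 0.0775)**(1 / 0.46),
    with flame length L in metres and intensity I in kW/m."""
    return (flame_length_m / 0.0775) ** (1 / 0.46)

for L in (0.5, 1.0, 2.0):
    print(L, round(fireline_intensity(L)), "kW/m")
```

Because the relation is strongly nonlinear, a doubling of observed flame length implies roughly a fourfold or greater increase in intensity, which is why flame length is such a sensitive field proxy.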

  13. The incidence and types of medication errors in patients receiving antiretroviral therapy in resource-constrained settings.

    Directory of Open Access Journals (Sweden)

    Kenneth Anene Agu

    Full Text Available This study assessed the incidence and types of medication errors, interventions, and outcomes in patients on antiretroviral therapy (ART) in selected HIV treatment centres in Nigeria. Of 69 health facilities that had a program for active screening of medication errors, 14 were randomly selected for prospective cohort assessment. All patients who filled/refilled their antiretroviral medications between February 2009 and March 2011 were screened for medication errors using a study-specific pharmaceutical care daily worksheet (PCDW). All potential or actual medication errors identified, the interventions provided, and the outcomes were documented in the PCDW. Interventions included pharmaceutical care in HIV training for pharmacists, amongst others. Chi-square was used for inferential statistics, with P < 0.05 considered statistically significant. The major medication errors identified were: incorrect ART regimen prescribed (26.4%); potential drug-drug interaction or contraindication present (19.8%); and inappropriate duration and/or frequency of medication (16.6%). Interventions provided included contacting the prescriber to clarify/resolve errors (67.1% of cases) and patient counselling and education (14.7% of cases); 97.4% of potential/actual medication errors were resolved. The incidence rate of medication errors was somewhat high, and the majority of identified errors were related to prescription of incorrect ART regimens and potential drug-drug interactions; the prescriber was contacted and the errors were resolved in the majority of cases. Active screening for medication errors is feasible in resource-limited settings following a capacity building intervention.

  14. Bounding quantum gate error rate based on reported average fidelity

    International Nuclear Information System (INIS)

    Sanders, Yuval R; Wallman, Joel J; Sanders, Barry C

    2016-01-01

    Remarkable experimental advances in quantum computing are exemplified by recent announcements of impressive average gate fidelities exceeding 99.9% for single-qubit gates and 99% for two-qubit gates. Although these high numbers engender optimism that fault-tolerant quantum computing is within reach, the connection of average gate fidelity with fault-tolerance requirements is not direct. Here we use reported average gate fidelity to determine an upper bound on the quantum-gate error rate, which is the appropriate metric for assessing progress towards fault-tolerant quantum computation, and we demonstrate that this bound is asymptotically tight for general noise. Although this bound is unlikely to be saturated by experimental noise, we demonstrate using explicit examples that the bound indicates a realistic deviation between the true error rate and the reported average fidelity. We introduce the Pauli distance as a measure of this deviation, and we show that knowledge of the Pauli distance enables tighter estimates of the error rate of quantum gates. (fast track communication)
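One standard ingredient in relating reported fidelities to error rates is Nielsen's relation between average gate fidelity and entanglement fidelity, sketched below. This conversion is a well-known textbook identity, not the paper's (stronger) worst-case bound; the fidelity figures are the headline numbers quoted in the abstract.

```python
def entanglement_infidelity(f_avg, d):
    """Nielsen's relation between average gate fidelity and entanglement
    fidelity for a d-dimensional system:
        F_e = ((d + 1) * F_avg - 1) / d.
    Returns the entanglement infidelity 1 - F_e."""
    return 1 - ((d + 1) * f_avg - 1) / d

print(round(entanglement_infidelity(0.999, 2), 6))  # single-qubit gate → 0.0015
print(round(entanglement_infidelity(0.99, 4), 6))   # two-qubit gate → 0.0125
```

Note the gap already visible here: a reported 99.9% average fidelity corresponds to a 1.5e-3 entanglement infidelity, and the worst-case (diamond-norm) error rate relevant to fault-tolerance thresholds can be larger still, which is the deviation the paper quantifies.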

  15. An advanced SEU tolerant latch based on error detection

    Science.gov (United States)

    Xu, Hui; Zhu, Jianwei; Lu, Xiaoping; Li, Jingzhao

    2018-05-01

    This paper proposes a latch that can mitigate SEUs via an error detection circuit. The error detection circuit is hardened by a C-element and a stacked PMOS. In the hold state, a particle strike on the latch or the error detection circuit may cause a faulty logic state in the circuit. The error detection circuit can detect the upset node in the latch, and the faulty output will be corrected. An upset node in the error detection circuit itself can be corrected by the C-element. The power dissipation and propagation delay of the proposed latch are analyzed by HSPICE simulations. The proposed latch consumes about 77.5% less energy and has 33.1% less propagation delay than the triple modular redundancy (TMR) latch. Simulation results demonstrate that the proposed latch can mitigate SEUs effectively. Project supported by the National Natural Science Foundation of China (Nos. 61404001, 61306046), the Anhui Province University Natural Science Research Major Project (No. KJ2014ZD12), the Huainan Science and Technology Program (No. 2013A4011), and the National Natural Science Foundation of China (No. 61371025).

  16. [Prescribed drugs - a new crime field?].

    Science.gov (United States)

    Schwarzenbrunner, Thomas

    2014-12-01

    The first chapter of this article discusses measures relating to substitution treatment within a program of the Austrian Minister of the Interior. The relevance of psychosocial measures and the aims of substitution treatment for opioid-dependent patients are examined. Abstinence as the sole treatment goal is reconsidered, and the results of the PREMOS study are used to illustrate a differentiation of treatment targets in addiction work. The second chapter addresses the misuse of prescribed drugs: police report data are analyzed and the market situation for opioids is outlined.

  17. How to prescribe physical exercise in rheumatology

    Directory of Open Access Journals (Sweden)

    S. Maddali Bongi

    2011-06-01

    Full Text Available Physical exercise, aiming to improve range of movement, muscle strength, and physical well-being, has lately replaced the immobilization previously prescribed in rheumatic diseases. International guidelines, recommendations of scientific societies, and structured reviews regard physical exercise as of pivotal importance in treating rheumatoid arthritis, ankylosing spondylitis, osteoarthritis, fibromyalgia syndrome, and osteoporosis, and as worth considering in connective tissue diseases. Therapeutic exercise should: aim to improve first local symptoms and then general health; respect the pain threshold; be part of a treatment plan that includes pharmacological therapies and other rehabilitation techniques; be administered by a skilled physiotherapist under the guidance of a rheumatologist; and be tailored to the specific disease, disease phase, and patient expectations.

  18. Prescribing tests must have curriculum support

    Directory of Open Access Journals (Sweden)

    Lemon TI

    2013-05-01

    Full Text Available Rupali D Shah, Thomas I Lemon, School of Medicine, Cardiff University, University Hospital of Wales, Cardiff, Wales. Gordon, Catchpole and Baker1 have discussed and investigated a very interesting and currently relevant subject in medical education, particularly with the introduction of the prescribing test for undergraduates trialled in the UK this year and set to become a fully-fledged part of the curriculum and assessment criteria for 2014 graduates.2 It would of course be of great interest to compare the themes discussed in this paper with how they apply to recent graduates in late 2014. View original paper by Gordon and colleagues.

  19. A New Method for Weak Fault Feature Extraction Based on Improved MED

    Directory of Open Access Journals (Sweden)

    Junlin Li

    2018-01-01

    Full Text Available Because of its weak signal and strong noise, fault feature extraction from low-speed vibration signals has been a hot topic and a difficult problem in the field of equipment fault diagnosis. The traditional minimum entropy deconvolution (MED) method has been shown to detect such fault signals. MED designs the filter coefficients via an objective-function method, and an appropriate threshold value must be set during the calculation to achieve the optimal iteration effect. It should be pointed out that an improper threshold setting causes the objective function to be recalculated, and the resulting error ultimately distorts the objective function against a background of strong noise. This paper presents an improved MED-based method for extracting fault features from rolling bearing vibration signals that originate in high-noise environments. The method uses the shuffled frog leaping algorithm (SFLA) to find the set of optimal filter coefficients, thereby avoiding the artificial error introduced by selecting the threshold parameter. Faulty bearings at the two rotating speeds of 60 rpm and 70 rpm were selected for verification, with a typical low-speed faulty bearing as the research object; the results show that SFLA-MED extracts more obvious bearing fault features and achieves a higher signal-to-noise ratio than the prior MED method.
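Deconvolution methods of this family typically score candidate filters by an impulsiveness measure such as normalized kurtosis. The toy example below (invented data, not from the paper) shows why maximizing kurtosis favours bearing-fault-like periodic impacts over Gaussian noise.

```python
import random
random.seed(0)

def kurtosis(x):
    """Normalized fourth moment; an MED-style objective chooses filter
    coefficients that maximize this impulsiveness measure."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    return sum((v - m) ** 4 for v in x) / (n * var ** 2)

noise = [random.gauss(0, 1) for _ in range(5000)]
impulsive = list(noise)
for i in range(0, 5000, 250):       # periodic fault-like impacts
    impulsive[i] += 10.0
print(kurtosis(noise), kurtosis(impulsive))
```

Gaussian noise has kurtosis near 3, while the impact train pushes it far higher, so any optimizer over filter coefficients, gradient-based as in classical MED or population-based as in SFLA, is driven towards filters that reveal the impacts.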

  20. Application of fault current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Neumann, A.

    2007-11-30

    This report presents the results of a study commissioned by the Department for Business, Enterprise and Industry (BERR; formerly the Department of Trade and Industry) into the application of fault current limiters in the UK. The study reviewed the current state of fault current limiter (FCL) technology and regulatory position in relation to all types of current limiters. It identified significant research and development work with respect to medium voltage FCLs and a move to high voltage. Appropriate FCL technologies being developed include: solid state breakers; superconducting FCLs (including superconducting transformers); magnetic FCLs; and active network controllers. Commercialisation of these products depends on successful field tests and experience, plus material development in the case of high temperature superconducting FCL technologies. The report describes FCL techniques, the current state of FCL technologies, practical applications and future outlook for FCL technologies, distribution fault level analysis and an outline methodology for assessing the materiality of the fault level problem. A roadmap is presented that provides an 'action agenda' to advance the fault level issues associated with low carbon networks.
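The fault level analysis mentioned above rests on simple network arithmetic: prospective fault current is set by system voltage and the impedance of the fault path, and an FCL works by inserting extra impedance only under fault conditions. The voltage and impedance figures below are illustrative assumptions, not values from the report.

```python
import math

def fault_current_ka(v_kv, z_ohms):
    """Symmetrical three-phase fault current at a busbar:
    I = V_line / (sqrt(3) * Z), in kA for V in kV and Z in ohms."""
    return v_kv / (math.sqrt(3) * z_ohms)

z_source = 0.25   # assumed source/network impedance, ohms
z_fcl = 0.25      # assumed impedance inserted by the FCL under fault
without = fault_current_ka(11.0, z_source)
with_fcl = fault_current_ka(11.0, z_source + z_fcl)
print(f"{without:.1f} kA without FCL, {with_fcl:.1f} kA with FCL")
```

Doubling the fault-path impedance halves the prospective fault current, which is how an FCL can keep existing switchgear within its rated breaking capacity as distributed generation raises fault levels.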

  1. Fault trees for diagnosis of system fault conditions

    International Nuclear Information System (INIS)

    Lambert, H.E.; Yadigaroglu, G.

    1977-01-01

    Methods for generating repair checklists on the basis of fault tree logic and probabilistic importance are presented. A one-step-ahead optimization procedure, based on the concept of component criticality, minimizing the expected time to diagnose system failure is outlined. Options available to the operator of a nuclear power plant when system fault conditions occur are addressed. A low-pressure emergency core cooling injection system, a standby safeguard system of a pressurized water reactor power plant, is chosen as an example illustrating the methods presented
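A repair checklist ordered by component criticality, as described above, can be sketched on a toy fault tree. The tree structure and probabilities below are hypothetical, and Birnbaum importance is used as the criticality measure (an assumption about the paper's exact measure).

```python
# Hypothetical fault tree: TOP = OR(AND(A, B), C)
def top_event(p):
    ab = p['A'] * p['B']                   # AND gate (independent components)
    return 1 - (1 - ab) * (1 - p['C'])     # OR gate

def birnbaum(p, comp):
    """Birnbaum importance: sensitivity of the top event to this component."""
    return top_event({**p, comp: 1.0}) - top_event({**p, comp: 0.0})

p = {'A': 0.1, 'B': 0.2, 'C': 0.05}        # assumed failure probabilities
checklist = sorted(p, key=lambda c: birnbaum(p, c), reverse=True)
# Inspect components in order of how strongly they drive system failure.
```

Here C sits directly under the OR gate, so it dominates the checklist even though its failure probability is the lowest.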

  2. Identifying Conventionally Sub-Seismic Faults in Polygonal Fault Systems

    Science.gov (United States)

    Fry, C.; Dix, J.

    2017-12-01

    Polygonal Fault Systems (PFS) are prevalent in hydrocarbon basins globally and represent potential fluid pathways. However, the characterization of these pathways is subject to the limitations of conventional 3D seismic imaging, which is only capable of resolving features on a decametre scale horizontally and a metre scale vertically. While outcrop and core examples can identify smaller features, they are limited by the extent of the exposures. The disparity between these scales can allow smaller faults to be lost in a resolution gap, which could mean potential pathways are left unseen. Here the focus is upon PFS from within the London Clay, a common bedrock that is tunnelled into and bears construction foundations for much of London. It is a continuation of the Ieper Clay, where PFS were first identified, and is found to approach the seafloor within the Outer Thames Estuary. This allows for the direct analysis of PFS surface expressions, via the use of high resolution 1 m bathymetric imaging in combination with high resolution seismic imaging. Through use of these datasets, surface expressions of over 1500 faults within the London Clay have been identified, with the smallest fault measuring 12 m and the largest 612 m in length. The displacements over these faults established from both bathymetric and seismic imaging range from 30 cm to a couple of metres, scales that would typically be sub-seismic for conventional basin seismic imaging. The orientations and dimensions of the faults within this network have been directly compared to 3D seismic data of the Ieper Clay from the offshore Dutch sector, where it exists approximately 1 km below the seafloor. These have typical PFS attributes, with lengths of hundreds of metres to kilometres and throws of tens of metres, a magnitude larger than those identified in the Outer Thames Estuary. The similar orientations and polygonal patterns within both locations indicate that the smaller faults exist within the typical PFS structure but are

  3. EXPERIMENT BASED FAULT DIAGNOSIS ON BOTTLE FILLING PLANT WITH LVQ ARTIFICIAL NEURAL NETWORK ALGORITHM

    Directory of Open Access Journals (Sweden)

    Mustafa DEMETGÜL

    2008-01-01

    Full Text Available In this study, an artificial neural network is developed to detect faults rapidly in a pneumatic system and thereby protect the system against failure. Faults in the experimental bottle filling plant can be identified without any interference, using analog values taken from pressure sensors and linear potentiometers placed at different points in the plant. The neural network diagnoses the following plant faults: no bottle present, cap-closing cylinder B not working, cap-closing cylinder C not working, insufficient air pressure, water not filling, and low air pressure. The faults are diagnosed by an artificial neural network with LVQ (learning vector quantization). While a failure could also be found using conventional programming or a PLC, the reason for using an artificial neural network is that it indicates where the fault is, and the same approach can be applied to different systems. The aim is to find the fault with the ANN in real time; the signals from the pneumatic system are collected by a data acquisition card. It is observed that the algorithm is a very capable program for many industrial plants that contain mechatronic systems.
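The LVQ classification step used above can be sketched as follows. The prototype initialization, learning rate, and synthetic two-feature sensor readings are assumptions for illustration, not the study's actual data.

```python
import numpy as np

def lvq1_fit(X, y, lr=0.05, epochs=40, seed=0):
    """LVQ1 with one prototype per class: the winning prototype is pulled
    toward same-class samples and pushed away from other-class samples."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    W = np.array([X[y == c][0] for c in classes], dtype=float)  # init prototypes
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = np.argmin(np.linalg.norm(W - X[i], axis=1))     # winner
            sign = 1.0 if classes[j] == y[i] else -1.0
            W[j] += sign * lr * (X[i] - W[j])
    return W, classes

def lvq_predict(W, classes, X):
    d = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Two synthetic sensor-reading clusters: 0 = normal, 1 = low-pressure fault
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W, classes = lvq1_fit(X, y)
acc = np.mean(lvq_predict(W, classes, X) == y)
```

Because each class keeps a labeled prototype, the nearest prototype not only classifies a sample but points to which fault condition the plant is in.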

  4. Learning from Errors

    Directory of Open Access Journals (Sweden)

    MA. Lendita Kryeziu

    2015-06-01

    Full Text Available “Errare humanum est”, a well-known and widespread Latin proverb, states that to err is human and that people make mistakes all the time. What counts, however, is that people learn from their mistakes. On these grounds Steve Jobs stated: “Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations.” Similarly, in learning a new language, learners make mistakes; it is therefore important to accept them, learn from them, discover the reasons why they are made, improve and move on. The significance of studying errors is described by Corder: “There have always been two justifications proposed for the study of learners' errors: the pedagogical justification, namely that a good understanding of the nature of error is necessary before a systematic means of eradicating them could be found, and the theoretical justification, which claims that a study of learners' errors is part of the systematic study of the learners' language which is itself necessary to an understanding of the process of second language acquisition” (Corder, 1982: 1). Thus the aim of this paper is to analyse errors in the process of second language acquisition and the ways we teachers can benefit from mistakes to help students improve while giving them proper feedback.

  5. Compact disk error measurements

    Science.gov (United States)

    Howe, D.; Harriman, K.; Tehranchi, B.

    1993-01-01

    The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross-Interleaved Reed-Solomon Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.
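The error-burst and good-data-gap statistics described above reduce to run lengths over a stream of single-byte error flags. A minimal sketch, with a made-up flag array:

```python
import numpy as np

def burst_gap_stats(flags):
    """Run lengths of 1s (error bursts) and interior 0s (good-data gaps)."""
    padded = np.r_[0, flags, 0]
    edges = np.flatnonzero(np.diff(padded))   # burst start/end positions
    starts, ends = edges[::2], edges[1::2]
    bursts = ends - starts                    # error-burst lengths (bytes)
    gaps = starts[1:] - ends[:-1]             # good-data gaps between bursts
    return bursts, gaps

# 1 = byte flagged in error by the read channel, 0 = good byte (illustrative)
flags = np.array([0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 1, 0])
bursts, gaps = burst_gap_stats(flags)
# bursts -> [2, 1, 3], gaps -> [3, 1]
```

Histograms of these two arrays are exactly the burst/gap statistics a decoder model would consume.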

  6. Fault-tolerant architecture: Evaluation methodology

    International Nuclear Information System (INIS)

    Battle, R.E.; Kisner, R.A.

    1992-08-01

    The design and reliability of four fault-tolerant architectures that may be used in nuclear power plant control systems were evaluated. Two architectures are variations of triple-modular-redundant (TMR) systems, and two are variations of dual redundant systems. The evaluation includes a review of methods of implementing fault-tolerant control, the importance of automatic recovery from failures, methods of self-testing diagnostics, block diagrams of typical fault-tolerant controllers, review of fault-tolerant controllers operating in nuclear power plants, and fault tree reliability analyses of fault-tolerant systems
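The TMR variants evaluated above rest on two standard building blocks: a majority voter and the textbook TMR reliability formula. A minimal sketch (the channel values are hypothetical):

```python
def tmr_vote(a, b, c):
    """Majority vote over three redundant channel outputs."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("no two channels agree")

def tmr_reliability(r):
    """P(at least 2 of 3 channels work), assuming a perfect voter."""
    return 3 * r**2 - 2 * r**3

# A single failed channel is outvoted; note TMR beats a single channel
# only when each channel is itself reliable (r > 0.5), which is why the
# automatic-recovery and self-test diagnostics reviewed above matter.
```

For example, with per-channel reliability 0.95, the voted output reaches about 0.993.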

  7. Fault Isolation for Shipboard Decision Support

    DEFF Research Database (Denmark)

    Lajic, Zoran; Blanke, Mogens; Nielsen, Ulrik Dam

    2010-01-01

    Fault detection and fault isolation for in-service decision support systems for marine surface vehicles will be presented in this paper. The stochastic wave elevation and the associated ship responses are modeled in the frequency domain. The paper takes as an example fault isolation of a containe... ...to the quality of decisions given to navigators.

  8. An architecture for fault tolerant controllers

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2005-01-01

    A general architecture for fault tolerant control is proposed. The architecture is based on the (primary) YJBK parameterization of all stabilizing compensators and uses the dual YJBK parameterization to quantify the performance of the fault tolerant system. The approach suggested can be applied... ...degradation in the sense of guaranteed degraded performance. A number of fault diagnosis problems, fault tolerant control problems, and feedback control with fault rejection problems are formulated/considered, mainly from a fault modeling point of view. The method is illustrated on a servo example including...

  9. Fault estimation - A standard problem approach

    DEFF Research Database (Denmark)

    Stoustrup, J.; Niemann, Hans Henrik

    2002-01-01

    This paper presents a range of optimization based approaches to fault diagnosis. A variety of fault diagnosis problems are reformulated in the so-called standard problem set-up introduced in the literature on robust control. Once the standard problem formulations are given, the fault diagnosis problems can be solved by standard optimization techniques. The proposed methods include (1) fault diagnosis (fault estimation, (FE)) for systems with model uncertainties; (2) FE for systems with parametric faults; and (3) FE for a class of nonlinear systems.

  10. The Combined Application of Fault Trees and Turbine Cycle Simulation in Generation Risk Assessment

    International Nuclear Information System (INIS)

    Heo, Gyun Young; Park, Jin Kyun

    2009-01-01

    The paper describes ideas developed for a framework to quantify human errors occurring during test and maintenance (T and M) in the secondary system of nuclear power plants, which was presented at the previous meeting. GRA-HRE (Generation Risk Assessment for Human Related Events) is composed of four essential components: the human error interpreter, the frequency estimator, the risk estimator, and the derate estimator. The proposed GRA emphasizes explicitly considering human errors, performing fault tree analysis that includes the entire balance-of-plant side, and quantifying electricity loss under abnormal plant configurations. Regarding the consideration of human errors, it was hard to distinguish their effects from other failure modes in conventional GRA because human errors were implicitly folded into mechanical failure modes. Since the risk estimator in GRA-HRE separately handles the basic events representing human error modes such as control failure, wrong object, omission, and wrong action, their relative importance can be recognized in comparison with other types of mechanical failures. The other distinctive features of GRA-HRE come from the combined application of fault tree analysis and turbine cycle simulation. The previous study suggested using fault tree analysis with top events designated by system malfunctions, such as 'feedwater system failure', to develop the risk estimator. However, this approach could not clearly show the propagation path of human errors, and in some cases it was difficult to present the failure logic. To overcome these bottlenecks, the paper proposes a modified way to set up top events and explains how turbine cycle simulation can be used to complete the fault trees in a cooperative manner

  11. Comprehensive analysis of a medication dosing error related to CPOE.

    Science.gov (United States)

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  12. Comprehensive emission measurements from prescribed ...

    Science.gov (United States)

    Simultaneous aerial- and ground-based emission sampling was conducted during prescribed burns at Eglin Air Force Base in November 2012 on a short grass/shrub field and a pine forest. Cumulative emission samples for volatile organic compounds, elemental carbon, organic carbon, chlorinated dioxins and furans, and PM2.5, and continuous samples for black carbon, particle size, and CO2 were taken. Aerial instruments were lofted using a 5 m diameter, helium-filled aerostat that was maneuvered with two remotely-controlled tethers mounted on all-terrain vehicles. A parallel set of instruments on the ground made simultaneous measurements, allowing for a comparison of ground-level versus elevated measurements. Ground instruments were supplemented by additional measurements of polycyclic aromatic hydrocarbons and particle aerosol absorption and light scattering. Raw biomass was also gathered on site and tested in a laboratory combustion facility using the same array of instruments. This work compares emissions derived from aerial and ground sampling as well as field and laboratory results. This is likely the first prescribed burn study to compare laboratory and field emission results with results from both aerial and ground sampling. As such, it will inform sampling methods for future events and determine the ability of laboratory simulations to mimic events in the field.

  13. Data-Driven Method for Wind Turbine Yaw Angle Sensor Zero-Point Shifting Fault Detection

    Directory of Open Access Journals (Sweden)

    Yan Pei

    2018-03-01

    Full Text Available Wind turbine yaw control plays an important role in increasing wind turbine production and also in protecting the wind turbine. Accurate measurement of the yaw angle is the basis of an effective wind turbine yaw controller, and the accuracy of yaw angle measurement is affected significantly by the problem of zero-point shifting. Hence, it is essential to evaluate the zero-point shifting error on wind turbines on-line in order to improve the reliability of yaw angle measurement in real time. In particular, qualitative evaluation of the zero-point shifting error could help wind farm operators carry out prompt and cost-effective maintenance on yaw angle sensors. With the aim of qualitatively evaluating the zero-point shifting error, this paper first defines the yaw angle sensor zero-point shifting fault. A data-driven method is then proposed to detect the zero-point shifting fault based on Supervisory Control and Data Acquisition (SCADA) data. The proposed method detects the zero-point shifting fault by analyzing the power performance under different yaw angles. The SCADA data are partitioned into different bins according to both wind speed and yaw angle in order to evaluate the power performance in depth. An indicator is proposed for power performance evaluation under each yaw angle, and the yaw angle with the largest indicator is taken as the yaw angle measurement error. A zero-point shifting fault triggers an alarm if the error is larger than a predefined threshold. Case studies from several actual wind farms demonstrated the effectiveness of the proposed method in detecting the zero-point shifting fault and in improving wind turbine performance. The results could help wind farm operators make prompt adjustments when a large yaw angle measurement error exists.
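The bin-and-score procedure described above can be sketched as follows. The power model, bin widths, indicator (mean power per yaw bin averaged across wind-speed bins), and the 4-degree alarm threshold are assumptions for illustration only.

```python
import numpy as np

def estimate_yaw_shift(wind, yaw, power, yaw_edges, ws_edges, threshold=4.0):
    """Score each yaw-offset bin by mean power within wind-speed bins; the
    best-scoring yaw bin estimates the sensor's zero-point shift."""
    scores = []
    for lo, hi in zip(yaw_edges[:-1], yaw_edges[1:]):
        in_yaw = (yaw >= lo) & (yaw < hi)
        vals = [power[in_yaw & (wind >= wlo) & (wind < whi)].mean()
                for wlo, whi in zip(ws_edges[:-1], ws_edges[1:])
                if (in_yaw & (wind >= wlo) & (wind < whi)).any()]
        scores.append(np.mean(vals) if vals else -np.inf)
    centers = 0.5 * (yaw_edges[:-1] + yaw_edges[1:])
    shift = centers[int(np.argmax(scores))]
    return shift, abs(shift) > threshold       # (estimate, alarm)

# Synthetic SCADA records with a true zero-point shift of +6 degrees:
# power peaks when the *measured* yaw offset equals the shift.
wind, yaw = np.meshgrid(np.linspace(5, 12, 40), np.linspace(-15, 15, 121))
wind, yaw = wind.ravel(), yaw.ravel()
power = wind**3 * np.cos(np.radians(yaw - 6.0))**3
yaw_edges = np.arange(-16.5, 16.6, 3.0)        # bin centers at -15, ..., 15
ws_edges = np.arange(5.0, 12.1, 1.0)
shift, alarm = estimate_yaw_shift(wind, yaw, power, yaw_edges, ws_edges)
# shift -> 6.0, alarm raised since |6.0| exceeds the 4-degree threshold
```

Binning by wind speed first removes the dominant wind-speed dependence of power, so the residual variation across yaw bins isolates the misalignment.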

  14. Using UAVSAR to Estimate Creep Along the Superstition Hills Fault, Southern California

    Science.gov (United States)

    Donnellan, A.; Parker, J. W.; Pierce, M.; Wang, J.

    2012-12-01

    UAVSAR data were first acquired over the Salton Trough region, just north of the Mexican border, in October 2009. Second passes of data were acquired on 12 and 13 April 2010, about one week following the 5 April 2010 M 7.2 El Mayor-Cucapah earthquake. The earthquake resulted in creep on several faults north of the main rupture, including the Yuha, Imperial, and Superstition Hills faults. The UAVSAR platform acquires data about every six meters, in swaths about 15 km wide. Tropospheric effects and residual aircraft motion contribute to error in the estimation of surface deformation in the Repeat Pass Interferometry products. The Superstition Hills fault shows clearly in the associated radar interferogram; however, error in the data product makes it difficult to infer deformation from long profiles that cross the fault. Using the QuakeSim InSAR Profile tool, we extracted line-of-sight profiles on either side of the fault delineated in the interferogram. We were able to remove much of the correlated error by differencing profiles 250 m on either side of the fault. The result shows right-lateral creep of 1.5±0.4 mm along the northern 7 km of the fault in the interferogram. The amount of creep abruptly changes to 8.4±0.4 mm of right-lateral creep along at least 9 km of the fault covered in the image to the south. The transition occurs within less than 100 m along the fault. We also extracted 2 km long line-of-sight profiles perpendicular to this section of the fault. Averaging these profiles shows a step across the fault of 14.9±0.3 mm, with greater creep on the order of 20 mm on the northern two profiles and lower creep of about 10 mm on the southern two profiles. Nearby GPS stations P503 and P493 are consistent with this result. They also confirm that the creep event occurred at the time of the El Mayor-Cucapah earthquake. By removing regional deformation resulting from the main rupture we were able to invert for the depth of creep from the surface. Results indicate
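The profile-differencing step above can be sketched in a few lines: subtracting parallel line-of-sight profiles taken about 250 m on either side of the fault cancels error (troposphere, residual aircraft motion) that is correlated across that short baseline, leaving the fault-parallel step. The synthetic numbers below are illustrative assumptions, not the UAVSAR data.

```python
import numpy as np

def creep_step(los_east, los_west):
    """Mean LOS step across the fault and its standard error (mm)."""
    diff = los_east - los_west
    return diff.mean(), diff.std(ddof=1) / np.sqrt(len(diff))

# Synthetic profiles: a shared long-wavelength correlated error, an 8.4 mm
# creep offset on one side, and independent measurement noise.
rng = np.random.default_rng(2)
along = np.linspace(0, 9000, 300)                 # metres along the fault
correlated_error = 5.0 * np.sin(along / 2000.0)   # troposphere/motion (mm)
los_west = correlated_error + rng.normal(0, 0.5, 300)
los_east = correlated_error + 8.4 + rng.normal(0, 0.5, 300)
step, sem = creep_step(los_east, los_west)
# step recovers ~8.4 mm even though the correlated error is comparable in size
```

The key point is that the 5 mm correlated error never enters the estimate because it appears identically in both profiles.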

  15. Personal and professional challenges of nurse prescribing in Ireland.

    Science.gov (United States)

    McBrien, Barry

    This article presents the challenges regarding the development of a collaborative practice agreement in order to undertake nurse prescribing in an emergency department in a large teaching hospital. Nurse prescribing has been introduced quite recently in Ireland. Although there is a plethora of knowledge regarding the topic, there are many personal and professional challenges in relation to this emerging role. The nurse prescribing initiative in Ireland is continually developing, and many nurses now have the authority to prescribe from almost the same range of medicines as doctors. Prescribing has the potential to improve job satisfaction and autonomy and, ultimately, patient outcomes. However, nurses need to be cognisant of the impact it can have on the dynamics of the healthcare team. An analysis of some complexities of nurse prescribing is given, in conjunction with reflective thoughts on a clinical incident in the area of morphine prescribing.

  16. Integrated fault tree development environment

    International Nuclear Information System (INIS)

    Dixon, B.W.

    1986-01-01

    Probabilistic Risk Assessment (PRA) techniques are utilized in the nuclear industry to perform safety analyses of complex defense-in-depth systems. A major effort in PRA development is fault tree construction. The Integrated Fault Tree Environment (IFTREE) is an interactive, graphics-based tool for fault tree design. IFTREE provides integrated building, editing, and analysis features on a personal workstation. The design philosophy of IFTREE is presented, and the interface is described. IFTREE utilizes a unique rule-based solution algorithm founded in artificial intelligence (AI) techniques. The impact of the AI approach on the program design is stressed. IFTREE has been developed to handle the design and maintenance of full-size living PRAs and is currently in use

  17. A description of medication errors reported by pharmacists in a neonatal intensive care unit.

    Science.gov (United States)

    Pawluk, Shane; Jaam, Myriam; Hazi, Fatima; Al Hail, Moza Sulaiman; El Kassem, Wessam; Khalifa, Hanan; Thomas, Binny; Abdul Rouf, Pallivalappila

    2017-02-01

    Background Patients in the Neonatal Intensive Care Unit (NICU) are at an increased risk for medication errors. Objective The objective of this study is to describe the nature and setting of medication errors occurring in patients admitted to an NICU in Qatar based on a standard electronic system reported by pharmacists. Setting Neonatal intensive care unit, Doha, Qatar. Method This was a retrospective cross-sectional study on medication errors reported electronically by pharmacists in the NICU between January 1, 2014 and April 30, 2015. Main outcome measure Data collected included patient information, and incident details including error category, medications involved, and follow-up completed. Results A total of 201 NICU pharmacists-reported medication errors were submitted during the study period. All reported errors did not reach the patient and did not cause harm. Of the errors reported, 98.5% occurred in the prescribing phase of the medication process with 58.7% being due to calculation errors. Overall, 53 different medications were documented in error reports with the anti-infective agents being the most frequently cited. The majority of incidents indicated that the primary prescriber was contacted and the error was resolved before reaching the next phase of the medication process. Conclusion Medication errors reported by pharmacists occur most frequently in the prescribing phase of the medication process. Our data suggest that error reporting systems need to be specific to the population involved. Special attention should be paid to frequently used medications in the NICU as these were responsible for the greatest numbers of medication errors.

  18. Errors in Neonatology

    Directory of Open Access Journals (Sweden)

    Antonio Boldrini

    2013-06-01

    Full Text Available Introduction: Danger and errors are inherent in human activities. In medical practice, errors can lead to adverse events for patients, and the mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results from the literature with our specific experience at the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main error domains are: medication and total parenteral nutrition, resuscitation and respiratory care, invasive procedures, nosocomial infections, patient identification, and diagnostics. Risk factors include patients' size, prematurity, vulnerability and underlying disease conditions, but also multidisciplinary teams, working conditions that produce fatigue, and the large variety of treatment and investigative modalities needed. Discussion and Conclusions: In our opinion, it is hardly possible to change human beings, but it is likely possible to change the conditions under which they work. Voluntary error reporting systems can help in preventing adverse events. Education and re-training by means of simulation can be an effective strategy too. In Pisa (Italy), Nina (ceNtro di FormazIone e SimulazioNe NeonAtale) is a simulation centre that offers continuous retraining in technical and non-technical skills to optimize neonatological care strategies. Furthermore, we have been working on a novel skill trainer for mechanical ventilation (MEchatronic REspiratory System SImulator for Neonatal Applications, MERESSINA). Finally, in our opinion, national health policy indirectly influences the risk of errors. Proceedings of the 9th International Workshop on Neonatology · Cagliari (Italy) · October 23rd-26th, 2013 · Learned lessons, changing practice and cutting-edge research

  19. Research on bearing fault diagnosis of large machinery based on mathematical morphology

    Science.gov (United States)

    Wang, Yu

    2018-04-01

    To study the automatic diagnosis of large machinery faults based on support vector machines, the support vector machine is used to classify and identify four common faults of large machinery. The extracted feature vectors are used as input and are trained and identified with a multi-class method. The optimal parameters of the support vector machine are found by trial and error and by cross-validation. The support vector machine is then compared with a BP neural network. The results show that the support vector machine trains in a short time and achieves high classification accuracy, making it well suited to fault diagnosis research for large machinery. It can therefore be concluded that the training of support vector machines (SVM) is fast and their performance is good.
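The trial-and-error plus cross-validation parameter search described above can be sketched generically. To keep the sketch self-contained, a ridge-regularized least-squares classifier stands in for the SVM, and the regularization grid and synthetic fault features are assumptions.

```python
import numpy as np

def cv_accuracy(X, y, lam, k=5, seed=0):
    """k-fold cross-validated accuracy for one candidate parameter."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), k)
    accs = []
    for i in range(k):
        te = folds[i]
        tr = np.hstack([folds[j] for j in range(k) if j != i])
        # Ridge least-squares classifier: stand-in for the paper's SVM
        A = X[tr].T @ X[tr] + lam * np.eye(X.shape[1])
        w = np.linalg.solve(A, X[tr].T @ y[tr])
        accs.append(np.mean(np.sign(X[te] @ w) == y[te]))
    return float(np.mean(accs))

def grid_search(X, y, grid):
    """Trial-and-error over the parameter grid, scored by cross-validation."""
    return max(grid, key=lambda lam: cv_accuracy(X, y, lam))

# Two separable synthetic fault classes, labels in {-1, +1}
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2, 1, (60, 4)), rng.normal(2, 1, (60, 4))])
y = np.r_[-np.ones(60), np.ones(60)]
best = grid_search(X, y, [0.01, 0.1, 1.0, 10.0])
```

The same loop applies unchanged to an SVM's penalty and kernel-width parameters; only the inner model-fitting lines would differ.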

  20. Narrowing the scope of failure prediction using targeted fault load injection

    Science.gov (United States)

    Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.

    2018-05-01

    As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.

  1. Fault Detection of Aircraft Cable via Spread Spectrum Time Domain Reflectometry

    Directory of Open Access Journals (Sweden)

    Xudong SHI

    2014-03-01

    Full Text Available Airplane cable fault detection based on TDR (time domain reflectometry) is easily affected by various noise signals, which heavily attenuate and distort the reflected signal and make the fault impossible to locate. To solve these problems, a method of spread spectrum time domain reflectometry (SSTDR) is introduced in this paper, which takes advantage of the sharp peak of the correlation function. The test signal is generated from a maximal-length sequence (MLS) modulated by a sine wave of the same frequency. In theory, the test signal has very high noise immunity, so it can be applied with excellent precision to fault location on aircraft cable. In this paper, the SSTDR method was simulated in MATLAB, and an experimental setup based on LabVIEW was then organized to detect and locate faults on the aircraft cable. It has been demonstrated that SSTDR has high noise immunity and effectively reduces detection errors.
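The correlation step at the heart of SSTDR can be sketched numerically (the paper's simulation is in MATLAB; this is an equivalent sketch). A 7-bit LFSR generates the ML sequence; the chip rate, delay, amplitudes, and noise level are assumptions.

```python
import numpy as np

def mls7():
    """127-chip maximal-length sequence from the LFSR x^7 + x^6 + 1."""
    state = [1] * 7
    out = []
    for _ in range(127):
        out.append(state[-1])
        state = [state[6] ^ state[5]] + state[:-1]
    return np.array(out) * 2.0 - 1.0           # map bits {0,1} -> {-1,+1}

sps = 8                                         # samples per chip (assumption)
n = np.arange(127 * sps)
probe = np.repeat(mls7(), sps) * np.sin(2 * np.pi * n / sps)  # sine-modulated MLS

# Simulated cable response: incident wave plus an attenuated reflection
# from a fault at a 220-sample round-trip delay, in heavy noise.
rng = np.random.default_rng(4)
delay = 220
line = probe + 0.4 * np.roll(probe, delay) + 0.5 * rng.standard_normal(len(probe))

# Circular cross-correlation with the probe: a sharp peak at each arrival
corr = np.fft.ifft(np.fft.fft(line) * np.conj(np.fft.fft(probe))).real
corr[:40] = 0.0                                 # mask the incident-wave peak
corr[-40:] = 0.0                                # ...and its wrap-around lags
est_delay = int(np.argmax(np.abs(corr)))
# distance to fault = est_delay * propagation_velocity / (2 * sample_rate)
```

The MLS autocorrelation is nearly a delta function, which is why the reflection peak survives noise that would bury a plain TDR step response.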

  2. Adaptive Fault Tolerance for Many-Core Based Space-Borne Computing

    Science.gov (United States)

    James, Mark; Springer, Paul; Zima, Hans

    2010-01-01

    This paper describes an approach to providing software fault tolerance for future deep-space robotic NASA missions, which will require a high degree of autonomy supported by an enhanced on-board computational capability. Such systems have become possible as a result of the emerging many-core technology, which is expected to offer 1024-core chips by 2015. We discuss the challenges and opportunities of this new technology, focusing on introspection-based adaptive fault tolerance that takes into account the specific requirements of applications, guided by a fault model. Introspection supports runtime monitoring of the program execution with the goal of identifying, locating, and analyzing errors. Fault tolerance assertions for the introspection system can be provided by the user, domain-specific knowledge, or via the results of static or dynamic program analysis. This work is part of an on-going project at the Jet Propulsion Laboratory in Pasadena, California.

  3. Implementing nurse prescribing: a case study in diabetes.

    Science.gov (United States)

    Stenner, Karen; Carey, Nicola; Courtenay, Molly

    2010-03-01

    This paper is a report of a study exploring the views of nurses and team members on the implementation of nurse prescribing in diabetes services. Nurse prescribing is adopted as a means of improving service efficiency, particularly where demand outstrips resources. Although factors that support nurse prescribing have been identified, it is not known how these function within specific contexts. This is important as its uptake and use vary according to mode of prescribing and area of practice. A case study was undertaken in nine practice settings across England where nurses prescribed medicines for patients with diabetes. Thematic analysis was conducted on qualitative data from 31 semi-structured interviews undertaken between 2007 and 2008. Participants were qualified nurse prescribers, administrative staff, physicians and non-nurse prescribers. Nurses prescribed more often following the expansion of nurse independent prescribing rights in 2006. Initial implementation problems had been resolved and few current problems were reported. As nurses' roles were well-established, no major alterations to service provision were required to implement nurse prescribing. Access to formal and informal resources for support and training were available. Participants were accepting and supportive of this initiative to improve the efficiency of diabetes services. The main factors that promoted implementation of nurse prescribing in this setting were the ability to prescribe independently, acceptance of the prescribing role, good working relationships between doctors and nurses, and sound organizational and interpersonal support. The history of established nursing roles in diabetes care, and increasing service demand, meant that these diabetes services were primed to assimilate nurse prescribing.

  4. Growth of nurse prescribing competence: facilitators and barriers during education.

    Science.gov (United States)

    Hopia, Hanna; Karhunen, Anne; Heikkilä, Johanna

    2017-10-01

    To describe facilitators and barriers in relation to the growth of nurse prescribing competence from the perspective of the nurses studying in a prescribing programme. The number of nurses enrolled in a nurse prescribing programme is rapidly increasing in Finland. However, few studies on nurse prescribing education are available and therefore research is needed, particularly from the point of view of nurses studying in the programme. The descriptive, qualitative study used the text of student online learning diaries as data during a 14-month prescribing programme. The sample consisted of 31 nurses, public health nurses or midwives enrolled in a prescribing programme at a university of applied sciences. The data were analysed using the inductive analysis method. The growth of nurses' prescribing competence was facilitated by learning clinical examination of the patient, networking with peers, receiving support from the workplace and supervisors, doctors' positive attitude towards nurse prescribing and being able to apply competencies directly to nursing practice. The barriers to the growth of nurses' prescribing competence were unclear job description, incomplete care plans and concerns about how consultation with doctors will be organised and realised. The results show that, for the purpose of developing the new role and position of nurse prescribers, educators and nursing managers must invest more in staff awareness of nurse prescribing education and also offer more support to nurse prescribers in their workplaces. The results of this study can be used especially in countries where nurse prescribing education is only in the process of being planned or has just been started. Heads of nursing and educators in prescribing education will benefit from the results when creating expanded job descriptions for nurses and supporting networking between students during the period of training. © 2016 John Wiley & Sons Ltd.

  5. LIBERTARISMO & ERROR CATEGORIAL

    Directory of Open Access Journals (Sweden)

    Carlos G. Patarroyo G.

    2009-01-01

Full Text Available This article offers a defence of libertarianism against two accusations that it commits a category mistake. Gilbert Ryle's philosophy is used as a tool to explain the reasons underlying these accusations and to show why, although certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that seeks the basis for the possibility of human freedom in physicalist indeterminism cannot necessarily be accused of committing them.

  6. Libertarismo & Error Categorial

    OpenAIRE

    PATARROYO G, CARLOS G

    2009-01-01

This article offers a defence of libertarianism against two accusations that it commits a category mistake. Gilbert Ryle's philosophy is used as a tool to explain the reasons underlying these accusations and to show why, although certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that seeks in physicalist indeterminism the basis for the possibili...

  7. Error Free Software

    Science.gov (United States)

    1985-01-01

A mathematical theory for the development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software that is logically error-free and that, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES: a user can write in English and the system converts it to computer languages. It is employed by several large corporations.

  8. Technology and medication errors: impact in nursing homes.

    Science.gov (United States)

    Baril, Chantal; Gascon, Viviane; St-Pierre, Liette; Lagacé, Denis

    2014-01-01

The purpose of this paper is to study the impact of a medication distribution technology (MDT) on medication errors reported in public nursing homes in Québec Province. The work was carried out in six nursing homes (800 patients). Medication error data were collected from nursing staff through a voluntary reporting process before and after MDT was implemented. The errors were analysed by total number of errors, medication error type, severity, and patient consequences. A statistical analysis verified whether there was a significant difference between the variables before and after introducing MDT. The results show that the MDT detected medication errors. The authors' analysis also indicates that errors are detected more rapidly, resulting in less severe consequences for patients. MDT is a step towards safer and more efficient medication processes. Our findings should convince healthcare administrators to implement technologies such as electronic prescribing or bar-code medication administration systems to improve medication processes and to provide better healthcare to patients. Few studies have been carried out in long-term healthcare facilities such as nursing homes; the authors' study extends what is known about MDT's impact on medication errors in nursing homes.

  9. Update: San Andreas Fault experiment

    Science.gov (United States)

    Christodoulidis, D. C.; Smith, D. E.

    1984-01-01

Satellite laser ranging techniques are used to monitor the broad motion of the tectonic plates comprising the San Andreas fault system. The San Andreas Fault Experiment (SAFE) has progressed through upgrades to laser system hardware and improvements in the modeling capabilities of the spaceborne laser targets. Of special note is the 1976 launch of the Laser Geodynamic Satellite (LAGEOS), NASA's only completely dedicated laser satellite. The results of plate motion projected onto this 896 km measured line over the past eleven years are summarized and intercompared.

  10. Support vector machine based fault classification and location of a long transmission line

    Directory of Open Access Journals (Sweden)

    Papia Ray

    2016-09-01

Full Text Available This paper investigates a support vector machine based fault-type and distance estimation scheme for a long transmission line. The proposed technique uses the post-fault single-cycle current waveform, and the samples are pre-processed by the wavelet packet transform. Energy and entropy are obtained from the decomposed coefficients and a feature matrix is prepared. Redundant features are then removed from the matrix by the forward feature selection method, and the remainder are normalized. Test and training data are developed by varying simulation conditions such as fault type, fault resistance, inception angle, and distance. Ten different types of short-circuit fault are analyzed. The test data are examined by a support vector machine whose parameters are optimized by the particle swarm optimization method. The proposed method is verified on a 400 kV, 300 km long transmission line with a voltage source at each end. Two cases were examined: faults very near to either source end (front and rear), and the support vector machine with and without optimized parameters. Simulation results indicate that the proposed method gives high fault-classification accuracy (99.21%) and low fault-distance estimation error (0.29%).
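The feature-extraction step described in this record, energy and entropy computed per wavelet-packet sub-band, can be sketched as follows. The wavelet decomposition itself is omitted: `subbands` stands in for already-decomposed coefficients, and the function names are illustrative.

```python
import math

def energy(coeffs):
    # Energy of a coefficient vector: sum of squared magnitudes.
    return sum(c * c for c in coeffs)

def shannon_entropy(coeffs):
    # Shannon entropy of the normalised energy distribution of the coefficients.
    e = [c * c for c in coeffs]
    total = sum(e)
    if total == 0:
        return 0.0
    p = [x / total for x in e if x > 0]
    return -sum(pi * math.log2(pi) for pi in p)

def feature_vector(subbands):
    # One (energy, entropy) pair per wavelet-packet sub-band.
    feats = []
    for band in subbands:
        feats.extend([energy(band), shannon_entropy(band)])
    return feats

# Hypothetical decomposed coefficients for one post-fault current cycle.
subbands = [[0.5, -0.5, 0.1], [2.0, 0.0, -1.0]]
fv = feature_vector(subbands)
```

The resulting vectors would then go through feature selection and into the SVM classifier, as the abstract describes.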

  11. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques.

    Directory of Open Access Journals (Sweden)

    Hazlee Azil Illias

Full Text Available It is important to predict incipient faults in transformer oil accurately so that maintenance can be performed correctly, reducing its cost and minimising error. Dissolved gas analysis (DGA) has been widely used to predict incipient faults in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions because each method is suitable only for certain conditions. Many previous works have reported the use of intelligent methods to predict transformer faults; however, it is believed that the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have never been used in the previously reported work, this work proposes a combination of ANN and various PSO techniques to predict transformer incipient faults. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the actual fault diagnoses, an existing diagnosis method, and ANN alone. The results of the proposed methods were also compared with previously reported work to show the improvement. The proposed ANN-Evolutionary PSO method was found to yield a higher percentage of correct transformer fault-type identification than the existing diagnosis method and previously reported works.

  12. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques.

    Science.gov (United States)

    Illias, Hazlee Azil; Chai, Xin Rui; Abu Bakar, Ab Halim; Mokhlis, Hazlie

    2015-01-01

It is important to predict incipient faults in transformer oil accurately so that maintenance can be performed correctly, reducing its cost and minimising error. Dissolved gas analysis (DGA) has been widely used to predict incipient faults in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions because each method is suitable only for certain conditions. Many previous works have reported the use of intelligent methods to predict transformer faults; however, it is believed that the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have never been used in the previously reported work, this work proposes a combination of ANN and various PSO techniques to predict transformer incipient faults. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the actual fault diagnoses, an existing diagnosis method, and ANN alone. The results of the proposed methods were also compared with previously reported work to show the improvement. The proposed ANN-Evolutionary PSO method was found to yield a higher percentage of correct transformer fault-type identification than the existing diagnosis method and previously reported works.

  13. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques

    Science.gov (United States)

    2015-01-01

It is important to predict incipient faults in transformer oil accurately so that maintenance can be performed correctly, reducing its cost and minimising error. Dissolved gas analysis (DGA) has been widely used to predict incipient faults in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions because each method is suitable only for certain conditions. Many previous works have reported the use of intelligent methods to predict transformer faults; however, it is believed that the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have never been used in the previously reported work, this work proposes a combination of ANN and various PSO techniques to predict transformer incipient faults. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the actual fault diagnoses, an existing diagnosis method, and ANN alone. The results of the proposed methods were also compared with previously reported work to show the improvement. The proposed ANN-Evolutionary PSO method was found to yield a higher percentage of correct transformer fault-type identification than the existing diagnosis method and previously reported works. PMID:26103634
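The particle swarm optimisation used above to tune the ANN can be illustrated with a minimal, self-contained PSO. A simple sphere function stands in for the network's training loss; the parameter values (inertia 0.7, acceleration coefficients 1.5) are common textbook defaults, not the paper's settings.

```python
import random

def pso(objective, dim, n_particles=20, iters=60, seed=1):
    # Minimal particle swarm optimisation: each particle remembers its own best
    # position, and the swarm tracks a global best that attracts all particles.
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5           # inertia and acceleration coefficients
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in for the ANN training loss: a simple sphere function with minimum 0.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

In the papers' setting, `objective` would evaluate the ANN's classification error for a given weight vector, so the swarm searches weight space instead of this toy landscape.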

  14. Faulting at Mormon Point, Death Valley, California: A low-angle normal fault cut by high-angle faults

    Science.gov (United States)

    Keener, Charles; Serpa, Laura; Pavlis, Terry L.

    1993-04-01

    New geophysical and fault kinematic studies indicate that late Cenozoic basin development in the Mormon Point area of Death Valley, California, was accommodated by fault rotations. Three of six fault segments recognized at Mormon Point are now inactive and have been rotated to low dips during extension. The remaining three segments are now active and moderately to steeply dipping. From the geophysical data, one active segment appears to offset the low-angle faults in the subsurface of Death Valley.

  15. Prescribed Performance Fuzzy Adaptive Output-Feedback Control for Nonlinear Stochastic Systems

    Directory of Open Access Journals (Sweden)

    Lili Zhang

    2014-01-01

Full Text Available A prescribed performance fuzzy adaptive output-feedback control approach is proposed for a class of single-input single-output nonlinear stochastic systems with unmeasured states. Fuzzy logic systems are used to identify the unknown nonlinear system, and a fuzzy state observer is designed to estimate the unmeasured states. Based on the backstepping recursive design technique and the predefined performance technique, a new fuzzy adaptive output-feedback control method is developed. It is shown that all the signals of the resulting closed-loop system are bounded in probability and that the tracking error remains within an adjustable neighborhood of the origin, satisfying the prescribed performance bounds. A simulation example is provided to show the effectiveness of the proposed approach.
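In prescribed performance control, the tracking error is required to evolve inside a decaying envelope. A common formulation from the literature (not necessarily this paper's exact definition) is:

```latex
\rho(t) = (\rho_0 - \rho_\infty)\, e^{-\kappa t} + \rho_\infty,
\qquad
-\delta\,\rho(t) < e(t) < \rho(t), \quad \forall t \ge 0,
```

where ρ₀ > ρ∞ > 0 set the initial and steady-state error bounds, κ > 0 sets the convergence rate, and δ ∈ [0, 1] limits the allowed undershoot. The "adjustable neighborhood of the origin" in the abstract corresponds to the steady-state band (-δρ∞, ρ∞).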

  16. Fault-tolerant system for catastrophic faults in AMR sensors

    NARCIS (Netherlands)

    Zambrano Constantini, A.C.; Kerkhoff, Hans G.

Anisotropic magnetoresistance (AMR) angle sensors are widely used in automotive applications that are considered safety-critical. Dependability is therefore an important requirement, and fault-tolerant strategies must be used to guarantee the correct operation of the sensors even in case of

  17. Fault-tolerant quantum computation for local non-Markovian noise

    International Nuclear Information System (INIS)

    Terhal, Barbara M.; Burkard, Guido

    2005-01-01

We derive a threshold result for fault-tolerant quantum computation for local non-Markovian noise models. The role of the error amplitude in our analysis is played by the product of the elementary gate time t₀ and the spectral width of the interaction Hamiltonian between system and bath. We discuss extensions of our model and the applicability of our analysis.

  18. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    Bishop, P.G.; Esp, D.G.

    1988-08-01

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are that: The scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely-defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful, through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of systems constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)
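The report's suggestion of "reporting outliers to patterns" for automatically catching typographical errors in system constants can be sketched as a simple shape-based check. This is a hypothetical illustration, not part of SPADE or STEM; the constant names and values are invented.

```python
from collections import Counter

def pattern(value_text):
    # Reduce a literal to its shape: digits -> 'd', letters -> 'a', rest kept.
    return "".join("d" if ch.isdigit() else "a" if ch.isalpha() else ch
                   for ch in value_text)

def flag_outliers(constants):
    # constants: {name: literal-as-written}. Report names whose shape is
    # unique within the group -- a possible transcription error.
    shapes = {name: pattern(text) for name, text in constants.items()}
    counts = Counter(shapes.values())
    return sorted(name for name, s in shapes.items() if counts[s] == 1)

consts = {
    "T_MAX": "1000", "T_MIN": "0100", "T_NOM": "0500",
    "T_TRIP": "10O0",   # letter 'O' typed instead of zero
}
suspects = flag_outliers(consts)
```

A real tool would combine several such shape and range heuristics, but even this one-line "shape" function catches the letter-for-digit substitution that manual inspection found in the study.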

  19. Drug use evaluation of antibiotics prescribed in a Jordanian hospital outpatient and emergency clinics using WHO prescribing indicators

    International Nuclear Information System (INIS)

    Al-Niemat, Sahar I.; Bloukh, Diana T.; Al-Harasis, Manal D.; Al-Fanek, Alen F.; Salah, Rehab K.

    2008-01-01

The objective was to evaluate the use of antibiotics prescribed in hospital outpatient and emergency clinics at King Hussein Medical Centre (KHMC), using WHO prescribing indicators, in an attempt to rationalize the use of antibiotics in the Royal Medical Services. We retrospectively surveyed a sample of 187,822 prescriptions obtained from 5 outpatient pharmacies in KHMC, written over 3 consecutive months (May 2007 to July 2007). The percentage of encounters in which an antibiotic was prescribed was calculated using the methodology recommended by the WHO. An additional indicator, the percentage share of different antibiotics, was also included to identify how frequently each antibiotic class was prescribed. The average percentage of prescriptions involving antibiotics was 35.6% of the 187,822 prescriptions surveyed; from these, 65,500 antibiotic prescriptions were observed. Penicillins (most frequently amoxicillin) and quinolones (most frequently ciprofloxacin and norfloxacin) were the most commonly prescribed antibiotics, with average percentages of 31.8% and 27.5%, respectively. The average prescribing rates for the other antibiotic categories were: macrolides 5.2%, cephalosporins 16%, and amoxicillin/clavulanate 5.4%. The high percentage of prescriptions involving antibiotics observed in KHMC pharmacies calls for rational use of antibiotics and judicious prescribing by Military prescribers. An insight into the factors influencing antibiotic prescribing patterns and adherence to antibiotic prescribing guidelines by the Military prescribers is warranted. (author)
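The WHO prescribing indicators used in this study are simple ratios, sketched below. The figures are illustrative only, chosen to reproduce the reported 31.8% penicillin share, and are not the study's raw counts.

```python
def pct_encounters_with_antibiotic(n_with_antibiotic, n_encounters):
    # WHO prescribing indicator:
    # (encounters with >= 1 antibiotic / total encounters) * 100
    return 100.0 * n_with_antibiotic / n_encounters

def pct_share(count_by_class, drug_class):
    # Share of one antibiotic class among all antibiotic prescriptions.
    return 100.0 * count_by_class[drug_class] / sum(count_by_class.values())

# Illustrative counts only (hypothetical, scaled to give a 31.8% share).
share = pct_share({"penicillins": 318, "quinolones": 275, "other": 407},
                  "penicillins")
```

The "percentage share" indicator in the abstract is exactly this second ratio, computed per antibiotic class over the 65,500 observed antibiotic prescriptions.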

  20. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  1. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...
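Reed-Solomon decoding requires finite-field arithmetic, but the core idea of these articles (adding redundancy so that errors can be located and corrected) can be shown with the much smaller Hamming(7,4) code. This single-error-correcting code is an illustration of the principle, not the Reed-Solomon code the articles describe.

```python
def hamming74_encode(d):
    # d: 4 data bits. Parity bits occupy positions 1, 2, 4 (1-indexed).
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    # Recompute parities; the syndrome gives the 1-indexed error position.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    c = c[:]
    if pos:
        c[pos - 1] ^= 1   # correct the single flipped bit
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[5] ^= 1                  # simulate a single-bit storage error
recovered = hamming74_decode(code)
```

Reed-Solomon generalises the same encode/locate/correct cycle to byte symbols over GF(256), which is how an RS(255, 223) code protects 223 data bytes with 32 parity bytes.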

  2. Fault Management Assistant (FMA), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — S&K Aerospace (SKA) proposes to develop the Fault Management Assistant (FMA) to aid project managers and fault management engineers in developing better and more...

  3. SDEM modelling of fault-propagation folding

    DEFF Research Database (Denmark)

    Clausen, O.R.; Egholm, D.L.; Poulsen, Jane Bang

    2009-01-01

Understanding the dynamics and kinematics of fault-propagation folding is important for evaluating the associated hydrocarbon play, for accomplishing reliable section balancing (structural reconstruction), and for assessing seismic hazards. Accordingly, the deformation style of fault-propagation... Using SDEM modelling, we have mapped the propagation of the tip-line of the fault, as well as the evolution of the fold geometry across sedimentary layers of contrasting rheological parameters, as a function of the increased offset and of variations in Mohr-Coulomb parameters, including internal friction... a precise indication of when faults develop and hence also the sequential evolution of secondary faults. Here we focus on the generation of a fault-propagated fold with a reverse sense of motion at the master fault, varying only the dip of the master fault and the mechanical behaviour of the deformed...

  4. A summary of the active fault investigation in the extension sea area of Kikugawa fault and the Nishiyama fault , N-S direction fault in south west Japan

    Science.gov (United States)

    Abe, S.

    2010-12-01

In this study, at the request of the Ministry of Education, Culture, Sports, Science and Technology, we carried out two active fault investigations in the offshore extensions of the Kikugawa fault and the Nishiyama fault. Based on the results, we aimed to clarify the following matters for both active faults: (1) fault continuity between land and sea; (2) the length of each active fault; (3) segment division; (4) activity characteristics. In this investigation, we carried out a digital single-channel seismic reflection survey over the whole area of both active faults. In addition, a high-resolution multichannel seismic reflection survey was carried out to image the detailed structure of the shallow strata, and vibrocoring samples were collected to obtain information on sedimentation ages. The reflection profiles of both active faults were extremely clear. Characteristics of lateral (strike-slip) faulting, such as flower structures and dispersion of the fault traces, were recognized. In addition, age analysis of the strata showed that the Holocene sediment on the continental shelf in this sea area is extremely thin. The investigation confirmed that the Kikugawa fault extends further offshore than existing research had indicated, and the fault zone appears to widen seaward while dispersing. At present, we think the Kikugawa fault can be divided into several segments based on the distribution of its traces. For the Nishiyama fault, reflection profiles showing the existence of the active fault were acquired in the sea between Ooshima and Kyushu. From this result and existing topographical research on Ooshima, the Nishiyama fault and the Ooshima offing active fault are thought to form a continuous structure. Along the Ooshima offing active fault, the uplifted side changes, and its direction changes too.
Therefore, we

  5. Challenge and Error: Critical Events and Attention-Related Errors

    Science.gov (United States)

    Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel

    2011-01-01

Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ↔ attention lapse; Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…

  6. 31 CFR 29.522 - Fault.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Fault. 29.522 Section 29.522 Money... Overpayments § 29.522 Fault. (a) General rule. A debtor is considered to be at fault if he or she, or any other... requirement. (3) The following factors may affect the decision as to whether the debtor is or is not at fault...

  7. [Prescribing medication in 2013: legal aspects].

    Science.gov (United States)

    Berland-Benhaïm, C; Bartoli, C; Karsenty, G; Piercecchi-Marti, M-D

    2013-11-01

To describe the legal framework of medicine prescription in France in 2013. With the assistance of a lawyer and a forensic pathologist, we consulted (legifrance.gouv.fr), analysed, and summarised the French laws and rules governing drug prescription to humans for medical purposes. Free prescription of medicines is an essential feature of a doctor's practice. Prescribing engages the doctor's responsibility at three levels: deontological, civil, and penal. The rules of medicine prescription aim to preserve patients' safety and health. Doctors are encouraged to refer to recommendations and peer-reviewed publications whenever a prescription falls outside the cases planned by law. Knowledge of and respect for the legal rules of prescription are essential for good-quality practice. Medical societies have a major role in improving medicine use among practitioners. Copyright © 2013. Published by Elsevier Masson SAS.

  8. Designing magnets with prescribed magnetic fields

    International Nuclear Information System (INIS)

    Liu Liping

    2011-01-01

    We present a novel design method capable of finding the magnetization densities that generate prescribed magnetic fields. The method is based on the solution to a simple variational inequality and the resulting designs have simple piecewise-constant magnetization densities. By this method, we obtain new designs of magnets that generate commonly used magnetic fields: uniform magnetic fields, self-shielding fields, quadrupole fields and sextupole fields. Further, it is worth noting that this method is not limited to the presented examples, and in particular, three-dimensional designs can be constructed in a similar manner. In conclusion, this novel design method is anticipated to have broad applications where specific magnetic fields are important for the performance of the devices.

  9. Ibuprofen in paediatrics: pharmacology, prescribing and controversies.

    Science.gov (United States)

    Moriarty, Camilla; Carroll, Will

    2016-12-01

    Ibuprofen, a propionic acid derivative, is a non-steroidal anti-inflammatory drug. The oral formulation is widely used in paediatric practice and after paracetamol it is one of the most common drugs prescribed for children in hospital. The treatment of fever with antipyretics such as ibuprofen is controversial as fever is the normal response of the body to infection and unless the child becomes distressed or symptomatic, fever alone should not be routinely treated. Combined treatment with paracetamol and ibuprofen is commonly undertaken but almost certainly is not helpful. This article aims to describe the indications and mode of action of the drug, outline its pharmacokinetics and highlight the important key messages regarding its use in clinical practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  10. Team errors: definition and taxonomy

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Reason, James

    1999-01-01

In error analysis and error management, the focus is usually on individuals who have made errors. In large complex systems, however, most people work in teams or groups. Given this working environment, insufficient emphasis has been given to 'team errors'. This paper discusses the definition of team errors and their taxonomy. These notions are also applied to events that have occurred in the nuclear power, aviation, and shipping industries. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). As a result, the proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication, deficiencies in resource/task management, an excessive authority gradient, and excessive professional courtesy cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors.

  11. Generic medicine and prescribing: A quick assessment

    Directory of Open Access Journals (Sweden)

    Mainul Haque

    2017-01-01

Full Text Available Generic drugs are copies of brand-name drugs that have exactly the same dosage, intended use, effects, side effects, route of administration, risks, safety, and strength as the original drug. In other words, their pharmacological effects are exactly the same as those of their brand-name counterparts. The Food and Drug Administration (FDA) describes generic drugs as essential options that allow better access to healthcare for all Americans. They are replicas of brand-name drugs, identical in dosage form, safety, strength, route of administration, quality, performance characteristics, and intended use. Healthcare authorities and users can be assured that FDA-approved generic drug products have met the same strict standards as the innovator drug. In the 1920s, the company that made Bayer aspirin fought enthusiastically in court to keep generic versions off the shelves; the company lost, and consumers suddenly had an array of choices in generic aspirin. In India, it has been said that 'the Supreme Court's ruling will prevent companies from further seeking unwarranted patents on HIV and other essential medicines.' A generic medicine cannot be sold at a price higher than the branded medicine, so it is regularly a lower-priced option. Both the end user and the government, which pays part of the price of the medicine under the Pharmaceutical Benefits Scheme in Australia, therefore benefit. The treatment of diseases using essential drugs, prescribed by their generic names, has been emphasised by the WHO and many national health policies. Although there have been some improvements in generic prescribing, the WHO has advised that 'countries should intensify efforts to measure and regularly monitor medicine prices and availability, and adopt policy measures to address the issues identified.'

  12. Modeling and Experimental Study of Soft Error Propagation Based on Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Wei He

    2016-01-01

Full Text Available To estimate the single-event effect (SEE) soft error performance of complex electronic systems, a soft error propagation model based on a cellular automaton is proposed, together with an estimation methodology based on circuit partitioning and error propagation. Simulations indicate that the grade of fault jamming and the coupling factors between cells are the main parameters influencing the vulnerability of the system. Accelerated radiation experiments were conducted to determine the main parameters for the raw soft error vulnerability of the module and the coupling factors. The results indicate that the proposed method is feasible.
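The cell-to-cell propagation idea can be sketched with a toy cellular automaton in which a faulty cell infects its four neighbours with a probability playing the role of the coupling factor. The grid size, coupling value, and step count below are arbitrary illustrations, not parameters from the study.

```python
import random

def propagate(grid, coupling, steps, seed=0):
    # Each faulty cell (1) infects each of its four neighbours per step with
    # the given coupling probability; higher coupling -> wider error spread.
    rng = random.Random(seed)
    n = len(grid)
    g = [row[:] for row in grid]
    for _ in range(steps):
        nxt = [row[:] for row in g]
        for i in range(n):
            for j in range(n):
                if g[i][j]:
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        x, y = i + di, j + dj
                        if 0 <= x < n and 0 <= y < n and rng.random() < coupling:
                            nxt[x][y] = 1
        g = nxt
    return g

grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1                          # single injected soft error
spread = propagate(grid, coupling=0.8, steps=3)
n_faulty = sum(sum(row) for row in spread)
```

In the paper's methodology, the cells would correspond to circuit partitions and the coupling factors would be calibrated from the accelerated radiation experiments rather than chosen by hand.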

  13. Factors Influencing Patterns of Antibiotic Prescribing in Primary Health Care Centers in the Savodjbolaq District During 2012-13: A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Gh. Karimi

    2015-08-01

    Full Text Available Background and Objective: Inappropriate prescribing of antibiotics is one of the main reasons for antibiotic resistance worldwide, placing increasing pressure and cost on health systems and on household economies. The present study aimed to determine the pattern of antibiotic prescribing and its related factors in health centers. Materials and Methods: In a cross-sectional design, 1068 randomly selected prescriptions of General Physicians (GPs) working in Savodjbolaq Health Centers were studied. Variables included age and gender of patients and physicians, frequency of antibiotic prescribing, rate of combination therapy, methods of prescribing, type of patient’s insurance booklet, and season. Statistical analysis was performed with SPSS version 18. Results: More than half of the prescriptions (56.8%) included at least one antibiotic. One in every four prescriptions involved some form of antibiotic combination therapy. According to the scientific criteria, 57.1% of antibiotics were prescribed inappropriately. Among these criteria, the highest error rate concerned doses per day, at 67.72%. The frequency of antibiotic prescribing differed significantly by age, gender, type of patient’s insurance booklet, physician experience, and season (p<0.05). Conclusions: Combination therapy and unscientific prescribing of antibiotics for young people are a concern for public health and household economies. Reviewing protocols and methods of supervision, changing the purchasing of medical services, designing and implementing targeted operational educational interventions, and training physicians with emphasis on the rational aspects of antibiotic prescription and prescribing skills are recommended.

  14. Cell boundary fault detection system

    Science.gov (United States)

    Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN

    2011-04-19

    An apparatus and program product determine a nodal fault along the boundary, or face, of a computing cell. Nodes on adjacent cell boundaries communicate with each other, and the communications are analyzed to determine if a node or connection is faulty.
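    A minimal sketch of the boundary-exchange idea follows. The function names, the node-id strings, and the `send` callback are hypothetical illustrations; the patented apparatus operates at the hardware and messaging level of the machine, not in application code.

    ```python
    # Illustrative sketch of cell-boundary fault detection: nodes on two
    # adjacent cell faces exchange messages pairwise, and any pair whose
    # exchange fails in either direction is flagged as a suspect node or link.

    def check_boundary(face_a, face_b, send):
        """Ping each opposing node pair; return the list of suspect pairs.

        face_a, face_b: node ids on the two adjacent cell faces, in matching order.
        send(src, dst): returns True when dst acknowledges src's message.
        """
        suspects = []
        for a, b in zip(face_a, face_b):
            # Require a successful round trip in both directions; otherwise
            # either a node or the connection between them may be faulty.
            if not (send(a, b) and send(b, a)):
                suspects.append((a, b))
        return suspects
    ```

    Analyzing which pairs fail (one pair vs. a whole row of pairs) is what lets the analysis distinguish a single faulty node from a broken boundary connection.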

  15. RESULTS, RESPONSIBILITY, FAULT AND CONTROL

    Directory of Open Access Journals (Sweden)

    Evgeniy Stoyanov

    2016-09-01

    Full Text Available The paper focuses on the responsibility arising from the registered financial results. The analysis of this responsibility presupposes its evaluation and determination of the role of fault in the formation of negative results. The search for efficiency in this whole process is justified by the understanding of the mechanisms that regulate the behavior of economic actors.

  16. Fault detection using (PI) observers

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, J.; Shafai, B.

    1997-01-01

    The fault detection and isolation (FDI) problem in connection with Proportional Integral (PI) Observers is considered in this paper. A compact formulation of the FDI design problem using PI observers is given. An analysis of the FDI design problem is derived with respect to the time domain...
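    As a hedged, scalar illustration of how a PI observer yields a fault-indicating residual: the plant, the gains, and the fault signal below are all made up for the example, whereas the paper treats the general multivariable design problem.

    ```python
    # Scalar discrete-time PI observer sketch (illustrative, not the paper's
    # formulation). The observer tracks the plant via proportional (kp) and
    # integral (ki) action on the output residual r = y - yhat; an additive
    # actuator fault drives r away from zero, which is the detection signal.

    def pi_observer_residuals(a, b, kp, ki, u_seq, fault_seq, x0=0.0):
        """Run plant and PI observer in parallel; return the residual sequence."""
        x, xhat, z = x0, x0, 0.0   # plant state, observer state, integral state
        residuals = []
        for u, f in zip(u_seq, fault_seq):
            y = x                  # full-state measurement for simplicity
            r = y - xhat           # residual: near zero in the fault-free case
            residuals.append(r)
            # PI observer update: proportional + integral action on the residual
            xhat = a * xhat + b * u + kp * r + z
            z = z + ki * r
            # true plant update, with an additive actuator fault f
            x = a * x + b * (u + f)
        return residuals
    ```

    Thresholding the residual sequence (exactly zero, almost zero, or delayed, as in the related record below) then turns the observer into a detector.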

  17. Investigation of the applicability of a functional programming model to fault-tolerant parallel processing for knowledge-based systems

    Science.gov (United States)

    Harper, Richard

    1989-01-01

    In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.
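    The recovery benefit of the functional model can be sketched in a few lines. This is illustrative only: `run_with_retry`, the node abstraction, and the use of RuntimeError as a stand-in for a detected node fault are assumptions, not the FTPP interface.

    ```python
    # Why a functional programming model eases recovery (illustrative sketch):
    # tasks are pure functions of their inputs, so a task lost to a node fault
    # can simply be re-executed on another node from its recorded arguments --
    # no rollback of shared mutable state is needed.

    def run_with_retry(task, args, nodes):
        """Try the pure task on each node in turn until one succeeds."""
        for node in nodes:
            try:
                return node(task, args)
            except RuntimeError:   # stand-in for a detected node fault
                continue           # pure task: safe to re-execute elsewhere
        raise RuntimeError("all nodes failed")
    ```

    The same property underlies the graceful degradation the abstract describes: as faulty processors are masked out, pending tasks are redistributed over the remaining nodes without application-level recovery code.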

  18. A novel Lagrangian approach for the stable numerical simulation of fault and fracture mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Franceschini, Andrea; Ferronato, Massimiliano, E-mail: massimiliano.ferronato@unipd.it; Janna, Carlo; Teatini, Pietro

    2016-06-01

    The simulation of the mechanics of geological faults and fractures is of paramount importance in several applications, such as ensuring the safety of the underground storage of wastes and hydrocarbons or predicting the possible seismicity triggered by the production and injection of subsurface fluids. However, the stable numerical modeling of ground ruptures is still an open issue. The present work introduces a novel formulation based on the use of the Lagrange multipliers to prescribe the constraints on the contact surfaces. The variational formulation is modified in order to take into account the frictional work along the activated fault portion according to the principle of maximum plastic dissipation. The numerical model, developed in the framework of the Finite Element method, provides stable solutions with a fast convergence of the non-linear problem. The stabilizing properties of the proposed model are emphasized with the aid of a realistic numerical example dealing with the generation of ground fractures due to groundwater withdrawal in arid regions. - Highlights: • A numerical model is developed for the simulation of fault and fracture mechanics. • The model is implemented in the framework of the Finite Element method and with the aid of Lagrange multipliers. • The proposed formulation introduces a new contribution due to the frictional work on the portion of activated fault. • The resulting algorithm is highly non-linear as the portion of activated fault is itself unknown. • The numerical solution is validated against analytical results and proves to be stable also in realistic applications.
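    In generic terms (not the paper's exact formulation), enforcing a non-penetration constraint on a fault surface with Lagrange multipliers leads to a saddle-point problem; the symbols below are standard contact-mechanics notation, assumed for illustration.

    ```latex
    % Generic saddle-point form: \Pi(u) is the elastic potential energy,
    % \Gamma_f the fault surface, g_N(u) the normal gap, and \lambda the
    % multiplier, which plays the role of the contact traction.
    \mathcal{L}(u,\lambda) \;=\; \Pi(u) \;+\; \int_{\Gamma_f} \lambda\, g_N(u)\, \mathrm{d}\Gamma,
    \qquad
    \delta_u \mathcal{L} = 0, \qquad \delta_\lambda \mathcal{L} = 0 .
    ```

    The frictional-work contribution the authors add along the activated fault portion would enter as an additional dissipation term; since the activated portion is itself unknown, the resulting algorithm is strongly non-linear, as the highlights note.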

  19. A novel Lagrangian approach for the stable numerical simulation of fault and fracture mechanics

    International Nuclear Information System (INIS)

    Franceschini, Andrea; Ferronato, Massimiliano; Janna, Carlo; Teatini, Pietro

    2016-01-01

    The simulation of the mechanics of geological faults and fractures is of paramount importance in several applications, such as ensuring the safety of the underground storage of wastes and hydrocarbons or predicting the possible seismicity triggered by the production and injection of subsurface fluids. However, the stable numerical modeling of ground ruptures is still an open issue. The present work introduces a novel formulation based on the use of the Lagrange multipliers to prescribe the constraints on the contact surfaces. The variational formulation is modified in order to take into account the frictional work along the activated fault portion according to the principle of maximum plastic dissipation. The numerical model, developed in the framework of the Finite Element method, provides stable solutions with a fast convergence of the non-linear problem. The stabilizing properties of the proposed model are emphasized with the aid of a realistic numerical example dealing with the generation of ground fractures due to groundwater withdrawal in arid regions. - Highlights: • A numerical model is developed for the simulation of fault and fracture mechanics. • The model is implemented in the framework of the Finite Element method and with the aid of Lagrange multipliers. • The proposed formulation introduces a new contribution due to the frictional work on the portion of activated fault. • The resulting algorithm is highly non-linear as the portion of activated fault is itself unknown. • The numerical solution is validated against analytical results and proves to be stable also in realistic applications.

  20. Exact, almost and delayed fault detection

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Saberi, Ali; Stoorvogel, Anton A.

    1999-01-01

    Considers the problem of fault detection and isolation while using zero or almost zero threshold. A number of different fault detection and isolation problems using exact or almost exact disturbance decoupling are formulated. Solvability conditions are given for the formulated design problems. The l-step delayed fault detection problem is also considered for discrete-time systems.