WorldWideScience

Sample records for reliably detected future

  1. Transmission reliability faces future challenges

    International Nuclear Information System (INIS)

    Beaty, W.

    1993-01-01

The recently published Washington International Energy Group's 1993 Electric Utility Outlook states that nearly one-third (31 percent) of U.S. utility executives expect reliability to decrease in the near future. Electric power system stability is crucial to reliability. Stability analysis determines whether a system will stay intact under normal operating conditions, during minor disturbances such as load fluctuations, and during major disturbances when one or more parts of the system fail. All system elements contribute to reliability or the lack of it. However, this report centers on the transmission segment of the electric system. The North American Electric Reliability Council (NERC) says the transmission systems as planned will be adequate over the next 10 years. However, delays in building new lines and increasing demands for transmission services are serious concerns. Reliability concerns exist in the Mid-Continent Area Power Pool and the Mid-America Interconnected Network regions, where transmission facilities have not been allowed to be constructed as planned. Portions of the transmission systems in other regions are loaded at or near their limits. NERC further states that utilities must be allowed to complete planned generation and transmission as scheduled. A reliable supply of electricity also depends on adhering to established operating criteria. Factors that could complicate operations include more interchange schedules resulting from increased transmission services, increased line loadings in portions of the transmission systems, and the proliferation of non-utility generators.

  2. Reliability of leak detection systems in LWRs

    International Nuclear Information System (INIS)

    Kupperman, D.S.

    1986-10-01

    In this paper, NRC guidelines for leak detection will be reviewed, current practices described, potential safety-related problems discussed, and potential improvements in leak detection technology (with emphasis on acoustic methods) evaluated

  3. New Multiplexing Tools for Reliable GMO Detection

    NARCIS (Netherlands)

    Pla, M.; Nadal, A.; Baeten, V.; Bahrdt, C.; Berben, G.; Bertheau, Y.; Coll, A.; Dijk, van J.P.; Dobnik, D.; Fernandez-Pierna, J.A.; Gruden, K.; Hamels, S.; Holck, A.; Holst-Jensen, A.; Janssen, E.; Kok, E.J.; Paz, La J.L.; Laval, V.; Leimanis, S.; Malcevschi, A.; Marmiroli, N.; Morisset, D.; Prins, T.W.; Remacle, J.; Ujhelyi, G.; Wulff, D.

    2012-01-01

    Among the available methods for GMO detection, enforcement and routine laboratories use in practice PCR, based on the detection of transgenic DNA. The cost required for GMO analysis is constantly increasing due to the progress of GMO commercialization, with inclusion of higher diversity of species,

  4. Reliably detectable flaw size for NDE methods that use calibration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-04-01

Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have artificial flaws of known size, such as electro-discharge machined (EDM) notches and flat-bottom hole (FBH) reflectors, which are used to set instrument sensitivity for detection of real flaws. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process to determine the reliably detectable flaw size.
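
    The idea can be illustrated with a minimal signal-response ("a-hat versus a") POD sketch. All data, the response model, and the decision threshold below are assumptions for illustration, not values from the paper, and the sketch omits the 95% confidence bound that the full MIL-HDBK-1823 procedure applies.

    ```python
    import numpy as np

    # Synthetic signal-response data: flaw sizes and instrument responses
    # are illustrative assumptions, not measurements from the paper.
    rng = np.random.default_rng(0)
    a = np.linspace(0.2, 2.0, 30)                     # true flaw sizes (mm)
    a_hat = 10 * a * rng.lognormal(0.0, 0.1, a.size)  # noisy instrument response

    # Linear fit in log-log space, the usual signal-response POD model form.
    b1, b0 = np.polyfit(np.log(a), np.log(a_hat), 1)
    resid_sd = np.std(np.log(a_hat) - (b0 + b1 * np.log(a)), ddof=2)

    # Decision threshold set during calibration (e.g. from an EDM-notch
    # response, assumed here); a90 is the flaw size whose response exceeds
    # the threshold with 90% probability.
    threshold = 8.0     # response units, assumed
    z90 = 1.2816        # 90th percentile of the standard normal
    a90 = np.exp((np.log(threshold) + z90 * resid_sd - b0) / b1)
    print(round(float(a90), 2))
    ```

    A flaw size below `a90` would not be credited as reliably detectable in a safe-life analysis under this simplified model.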

  5. Object Detection: Current and Future Directions

    Directory of Open Access Journals (Sweden)

Rodrigo Verschae

    2015-11-01

Object detection is a key ability required by most computer and robot vision systems. The latest research on this area has been making great progress in many directions. In the current manuscript we give an overview of past research on object detection, outline the current main research directions, and discuss open problems and possible future directions.

  6. More reliable financing of future nuclear waste costs

    International Nuclear Information System (INIS)

    1994-01-01

A commission of inquiry was established by the Government in 1993 to review the management of capital funds according to the existing Act on the Financing of Future Expenses for Spent Nuclear Fuel etc. The commission proposes that: the funds which have been paid to the Swedish state to finance the costs arising in connection with the handling and final disposal of spent nuclear fuel etc. should, from the year 1995, be invested in accordance with guidelines which aim at attaining a higher return than is currently possible; that an independent government body, called the Nuclear Waste Fund, should be assigned the task of managing the funds in accordance with these guidelines; that the Swedish Nuclear Power Inspectorate should continue to examine and evaluate issues relating to the application of the funds and recommend the level of the fee to be paid; and that a system including additional measures for guaranteeing the availability of funds should be implemented from the year 1995, in order to improve the reliability of the financing system. Our proposal involves extensive amendments to the Financing Act. On the other hand, the basic stipulations concerning responsibilities under the Act on Nuclear Activities are not affected. (Seven work documents produced by consulting firms are published in a separate volume; SOU 1994:108) 5 figs., 16 tabs

  7. Future of structural reliability methodology in nuclear power plant technology

    Energy Technology Data Exchange (ETDEWEB)

    Schueeller, G I [Technische Univ. Muenchen (Germany, F.R.); Kafka, P [Gesellschaft fuer Reaktorsicherheit m.b.H. (GRS), Garching (Germany, F.R.)

    1978-10-01

This paper presents the authors' personal view as to which areas of structural reliability in nuclear power plant design most urgently need to be advanced. Aspects of simulation modeling, design rules, codification and specification of reliability, system analysis, probabilistic structural dynamics, rare events, and particularly the interaction of systems and structural reliability are discussed. As an example, some considerations of the interaction effects between the protective systems and the pressure vessel are stated. The paper concludes with recommendations for further research.

  8. Reliability evaluation of the Savannah River reactor leak detection system

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Sindelar, R.L.; Wallace, I.T.

    1991-01-01

The Savannah River reactors have been in operation since the mid-1950s. The primary degradation mode for the primary coolant loop piping is intergranular stress corrosion cracking. The leak-before-break (LBB) capability of the primary system piping has been demonstrated as part of an overall structural integrity evaluation. One element of the LBB analyses is a reliability evaluation of the leak detection system. The most sensitive element of the leak detection system is the airborne tritium monitors. The presence of small amounts of tritium in the heavy water coolant provides the basis for a very sensitive system of leak detection. The reliability of the tritium monitors to properly identify a crack leaking at a rate of either 50 or 300 lb/day (0.004 or 0.023 gpm, respectively) has been characterized. These leak rates correspond to action points for which specific operator actions are required. High reliability has been demonstrated using standard fault tree techniques. The probability of not detecting a leak within an assumed mission time of 24 hours is estimated to be approximately 5 x 10^-5 per demand. This result is obtained for both leak rates considered. The methodology and assumptions used to obtain this result are described in this paper. 3 refs., 1 fig., 1 tab
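
    The per-demand figure of a fault-tree analysis like this one comes from multiplying the failure probabilities of independent layers. The probabilities and structure below are assumed for illustration only (they are not the Savannah River model); they are chosen merely so the product lands at the same order of magnitude as the abstract's result.

    ```python
    # Toy fault tree for a missed-leak event: the leak goes undetected only
    # if every redundant tritium monitor fails on demand AND the backup
    # indication also fails.  All numbers are assumed for illustration.
    p_monitor_fail = 0.01   # per-demand failure probability of one monitor (assumed)
    n_monitors = 2          # number of redundant monitors (assumed)
    p_backup_fail = 0.5     # failure probability of the backup indication (assumed)

    # Independent failures multiply along the path to the top event.
    p_missed_leak = p_monitor_fail ** n_monitors * p_backup_fail
    print(f"{p_missed_leak:.0e}")  # 5e-05
    ```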

  9. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.

    2017-04-17

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  10. Fault detection and reliability, knowledge based and other approaches

    International Nuclear Information System (INIS)

    Singh, M.G.; Hindi, K.S.; Tzafestas, S.G.

    1987-01-01

    These proceedings are split up into four major parts in order to reflect the most significant aspects of reliability and fault detection as viewed at present. The first part deals with knowledge-based systems and comprises eleven contributions from leading experts in the field. The emphasis here is primarily on the use of artificial intelligence, expert systems and other knowledge-based systems for fault detection and reliability. The second part is devoted to fault detection of technological systems and comprises thirteen contributions dealing with applications of fault detection techniques to various technological systems such as gas networks, electric power systems, nuclear reactors and assembly cells. The third part of the proceedings, which consists of seven contributions, treats robust, fault tolerant and intelligent controllers and covers methodological issues as well as several applications ranging from nuclear power plants to industrial robots to steel grinding. The fourth part treats fault tolerant digital techniques and comprises five contributions. Two papers, one on reactor noise analysis, the other on reactor control system design, are indexed separately. (author)

  11. RELIABILITY OF THE DETECTION OF THE BARYON ACOUSTIC PEAK

    International Nuclear Information System (INIS)

Martínez, Vicent J.; Arnalte-Mur, Pablo; De la Cruz, Pablo; Saar, Enn; Tempel, Elmo; Pons-Bordería, María Jesús; Paredes, Silvestre; Fernández-Soto, Alberto

    2009-01-01

The correlation function of the distribution of matter in the universe shows, at large scales, baryon acoustic oscillations, which were imprinted prior to recombination. This feature was first detected in the correlation function of the luminous red galaxies of the Sloan Digital Sky Survey (SDSS). Recently, the final release (DR7) of the SDSS has been made available, and the useful volume is about two times bigger than in the old sample. We present here, for the first time, the redshift-space correlation function of this sample at large scales together with that for one shallower, but denser volume-limited subsample drawn from the Two-Degree Field Redshift Survey. We test the reliability of the detection of the acoustic peak at about 100 h^-1 Mpc and the behavior of the correlation function at larger scales by means of careful estimation of errors. We confirm the presence of the peak in the latest data although broader than in previous detections.

  12. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system in order to optimise the frequency of foreseen checks and to evaluate their influence on the performance of different detection topologies. Given the uncertainty in the failure rates of the components combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.
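
    The false-versus-missed trade-off for redundant detection schemes can be sketched with elementary probability. The per-channel rates below are assumptions for illustration, not the LHC values; the point is only the direction in which each voting scheme moves the two rates.

    ```python
    # Redundancy trade-off sketch: 1-out-of-2 voting reduces missed
    # quenches at the cost of more false trips; 2-out-of-2 does the
    # opposite.  Per-channel probabilities are assumed, per demand.
    p_miss = 1e-4    # single detector misses a real quench (assumed)
    p_false = 1e-3   # single detector trips falsely (assumed)

    miss_1oo2 = p_miss ** 2               # both channels must miss the quench
    false_1oo2 = 1 - (1 - p_false) ** 2   # either channel can trip falsely
    miss_2oo2 = 1 - (1 - p_miss) ** 2     # one channel missing blocks the trip
    false_2oo2 = p_false ** 2             # both channels must trip falsely
    ```

    Assuming independent channel failures, 1oo2 is the safer scheme (fewer missed quenches) while 2oo2 gives better availability (fewer false dumps), which is the kind of trade-off the cited model quantifies alongside maintenance-check frequency.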

  13. Future Trends in Reliability-Based Bridge Management

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    Future bridge management systems will be based on simple stochastic models predicting the residual strength of structural elements. The current deterministic management systems are not effective in optimizing e.g. the life cycle cost of a bridge or a system of bridges. A number of important factors...

  14. Extracting information from an ensemble of GCMs to reliably assess future global runoff change

    NARCIS (Netherlands)

    Sperna Weiland, F.C.; Beek, L.P.H. van; Weerts, A.H.; Bierkens, M.F.P.

    2011-01-01

Future runoff projections derived from different global climate models (GCMs) show large differences. Therefore, within this study, information from multiple GCMs has been combined to better assess hydrological changes. For projections of precipitation and temperature the Reliability ensemble

  15. Detecting binary black holes with efficient and reliable templates

    International Nuclear Information System (INIS)

    Damour, T.; Iyer, B.R.; Sathyaprakash, B.S.

    2001-01-01

    Detecting binary black holes in interferometer data requires an accurate knowledge of the orbital phase evolution of the system. From the point of view of data analysis one also needs fast algorithms to compute the templates that will be employed in searching for black hole binaries. Recently, there has been progress on both these fronts: On one hand, re-summation techniques have made it possible to accelerate the convergence of poorly convergent asymptotic post-Newtonian series and derive waveforms beyond the conventional adiabatic approximation. We now have a waveform model that extends beyond the inspiral regime into the plunge phase followed by the quasi-normal mode ringing. On the other hand, explicit Fourier domain waveforms have been derived that make the generation of waveforms fast enough so as not to be a burden on the computational resources required in filtering the detector data. These new developments should make it possible to efficiently and reliably search for black hole binaries in data from first interferometers. (author)

  16. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  17. The National Centre of Systems Reliability and some aspects of its future activities

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-01-01

    The National Centre of Systems Reliability (NCSR) has been set up to enhance the work of the Systems Reliability Service (SRS), which during its four years of operation by the UKAEA has offered to industry expertise in the quantification of reliability of systems in various technological applications. An outline is presented of the background to the establishment of the NCSR, including a brief summary of the work of the SRS. Certain aspects of the future activities of the NCSR particularly in relation to research and collaboration with universities are discussed. (U.K.)

  18. A Novel Reliability Enhanced Handoff Method in Future Wireless Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    Wang YuPeng

    2016-01-01

As the demand increases, future networks will follow the trends of network variety and service flexibility, which require heterogeneous network deployment and reliable communication methods. In practice, most communication failures happen due to bad radio link quality; high-speed users in particular suffer from radio link failures, which cause communication interruptions and force radio link recovery. To make communication more reliable, especially for high-mobility users, we propose a novel communication handoff mechanism to reduce the occurrence of service interruptions. Based on computer simulation, we find that service reliability is greatly improved.

  19. Keeping an eye on reliability : The organizational requirements of future renewable energy systems

    NARCIS (Netherlands)

    Scholten, D.J.

    2012-01-01

    The reliable operation of energy infrastructures is more than just a technical matter. It is also dependent upon the organizational structure that enables and constrains entities in their management of operations. Yet this lesson seems forgotten in our planning of future renewable energy systems.

  20. Automatic Student Plagiarism Detection: Future Perspectives

    Science.gov (United States)

    Mozgovoy, Maxim; Kakkonen, Tuomo; Cosma, Georgina

    2010-01-01

    The availability and use of computers in teaching has seen an increase in the rate of plagiarism among students because of the wide availability of electronic texts online. While computer tools that have appeared in recent years are capable of detecting simple forms of plagiarism, such as copy-paste, a number of recent research studies devoted to…

  1. Bubble Radiation Detection: Current and Future Capability

    International Nuclear Information System (INIS)

    Peurrung, A.J.; Craig, R.A.

    1999-01-01

Despite a number of noteworthy achievements in other fields, superheated droplet detectors (SDDs) and bubble chambers (BCs) have not been used for nuclear nonproliferation and arms control. This report examines these two radiation-detection technologies in detail and answers the question of how they can or should be "adapted" for use in national security applications. These technologies involve closely related approaches to radiation detection in which an energetic charged particle deposits sufficient energy to initiate the process of bubble nucleation in a superheated fluid. These detectors offer complete gamma-ray insensitivity when used to detect neutrons. They also provide controllable neutron-energy thresholds and excellent position resolution. SDDs are extraordinarily simple and inexpensive. BCs offer the promise of very high efficiency (∼75%). A notable drawback for both technologies is temperature sensitivity. As a result of this problem, the temperature must be controlled whenever high accuracy is required or harsh environmental conditions are encountered. The primary findings of this work are listed and briefly summarized below: (1) SDDs are ready to function as electronics-free neutron detectors on demand for arms-control applications. The elimination of electronics at the weapon's location greatly eases the negotiability of radiation-detection technologies in general. (2) As a result of their high efficiency and sharp energy threshold, current BCs are almost ready for use in the development of a next-generation active assay system. Development of an instrument based on appropriately safe materials is warranted. (3) Both kinds of bubble detectors are ready for use whenever very high gamma-ray fields must be confronted. Spent-fuel MPC&A is a good example where this need presents itself. (4) Both kinds of bubble detectors have the potential to function as low-cost replacements for conventional neutron detectors such as 3He tubes. For SDDs

  2. Reliability of leak detection systems in light water reactors

    International Nuclear Information System (INIS)

    Kupperman, D.S.

    1987-01-01

    US Nuclear Regulatory Commission Guide 1.45 recommends the use of at least three different detection methods in reactors to detect leakage. Monitoring of both sump-flow and airborne particulate radioactivity is recommended. A third method can involve either monitoring of condensate flow rate from air coolers or monitoring of airborne gaseous radioactivity. Although the methods currently used for leak detection reflect the state of the art, other techniques may be developed and used. Since the recommendations of Regulatory Guide 1.45 are not mandatory, the technical specifications for 74 operating plants have been reviewed to determine the types of leak detection methods employed. In addition, Licensee Event Report (LER) Compilations from June 1985 to June 1986 have been reviewed to help establish actual capabilities for detecting leaks and determining their source. Work at Argonne National Laboratory has demonstrated that improvements in leak detection, location, and sizing are possible with advanced acoustic leak detection technology

  3. How will climate novelty influence ecological forecasts? Using the Quaternary to assess future reliability.

    Science.gov (United States)

    Fitzpatrick, Matthew C; Blois, Jessica L; Williams, John W; Nieto-Lugilde, Diego; Maguire, Kaitlin C; Lorenz, David J

    2018-03-23

    Future climates are projected to be highly novel relative to recent climates. Climate novelty challenges models that correlate ecological patterns to climate variables and then use these relationships to forecast ecological responses to future climate change. Here, we quantify the magnitude and ecological significance of future climate novelty by comparing it to novel climates over the past 21,000 years in North America. We then use relationships between model performance and climate novelty derived from the fossil pollen record from eastern North America to estimate the expected decrease in predictive skill of ecological forecasting models as future climate novelty increases. We show that, in the high emissions scenario (RCP 8.5) and by late 21st century, future climate novelty is similar to or higher than peak levels of climate novelty over the last 21,000 years. The accuracy of ecological forecasting models is projected to decline steadily over the coming decades in response to increasing climate novelty, although models that incorporate co-occurrences among species may retain somewhat higher predictive skill. In addition to quantifying future climate novelty in the context of late Quaternary climate change, this work underscores the challenges of making reliable forecasts to an increasingly novel future, while highlighting the need to assess potential avenues for improvement, such as increased reliance on geological analogs for future novel climates and improving existing models by pooling data through time and incorporating assemblage-level information. © 2018 John Wiley & Sons Ltd.

  4. PV Systems Reliability Final Technical Report: Ground Fault Detection

    Energy Technology Data Exchange (ETDEWEB)

    Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flicker, Jack David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

We have examined ground faults in photovoltaic (PV) arrays and the efficacy of fuses, residual current detection (RCD), current sense monitoring/relays (CSM), isolation/insulation resistance (Riso) monitoring, and Ground Fault Detection and Isolation (GFDI), using simulations based on a SPICE (Simulation Program with Integrated Circuit Emphasis) ground-fault circuit model, experimental ground faults installed on real arrays, and theoretical equations.

  5. Human factors assessment of conflict resolution aid reliability and time pressure in future air traffic control.

    Science.gov (United States)

    Trapsilawati, Fitri; Qu, Xingda; Wickens, Chris D; Chen, Chun-Hsien

    2015-01-01

    Though it has been reported that air traffic controllers' (ATCos') performance improves with the aid of a conflict resolution aid (CRA), the effects of imperfect automation on CRA are so far unknown. The main objective of this study was to examine the effects of imperfect automation on conflict resolution. Twelve students with ATC knowledge were instructed to complete ATC tasks in four CRA conditions including reliable, unreliable and high time pressure, unreliable and low time pressure, and manual conditions. Participants were able to resolve the designated conflicts more accurately and faster in the reliable versus unreliable CRA conditions. When comparing the unreliable CRA and manual conditions, unreliable CRA led to better conflict resolution performance and higher situation awareness. Surprisingly, high time pressure triggered better conflict resolution performance as compared to the low time pressure condition. The findings from the present study highlight the importance of CRA in future ATC operations. Practitioner Summary: Conflict resolution aid (CRA) is a proposed automation decision aid in air traffic control (ATC). It was found in the present study that CRA was able to promote air traffic controllers' performance even when it was not perfectly reliable. These findings highlight the importance of CRA in future ATC operations.

  6. Big data analytics for the Future Circular Collider reliability and availability studies

    Science.gov (United States)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  7. Objective Methods for Reliable Detection of Concealed Depression

    Directory of Open Access Journals (Sweden)

Cynthia Solomon

    2015-04-01

Recent research has shown that it is possible to automatically detect clinical depression from audio-visual recordings. Before considering integration in a clinical pathway, a key question that must be asked is whether such systems can be easily fooled. This work explores the potential of acoustic features to detect clinical depression in adults both when acting normally and when asked to conceal their depression. Nine adults diagnosed with mild to moderate depression as per the Beck Depression Inventory (BDI-II) and Patient Health Questionnaire (PHQ-9) were asked a series of questions and to read an excerpt from a novel aloud under two different experimental conditions. In one, participants were asked to act naturally and in the other, to suppress anything that they felt would be indicative of their depression. Acoustic features were then extracted from this data and analysed using paired t-tests to determine any statistically significant differences between healthy and depressed participants. Most features that were found to be significantly different during normal behaviour remained so during concealed behaviour. In leave-one-subject-out automatic classification studies of the 9 depressed subjects and 8 matched healthy controls, an 88% classification accuracy and 89% sensitivity was achieved. Results remained relatively robust during concealed behaviour, with classifiers trained on only non-concealed data achieving 81% detection accuracy and 75% sensitivity when tested on concealed data. These results indicate there is good potential to build deception-proof automatic depression monitoring systems.
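
    The reported accuracy and sensitivity figures follow directly from a confusion matrix over the 17 subjects. The counts below are illustrative assumptions chosen to reproduce the abstract's 88% / 89% figures; the actual per-fold assignments are not given in the abstract.

    ```python
    # Confusion-matrix sketch for leave-one-subject-out classification of
    # 9 depressed subjects and 8 healthy controls (counts are assumed).
    tp, fn = 8, 1    # depressed subjects classified correctly / missed
    tn, fp = 7, 1    # healthy controls classified correctly / false alarms

    accuracy = (tp + tn) / (tp + tn + fp + fn)   # fraction of all correct
    sensitivity = tp / (tp + fn)                 # fraction of depressed detected
    print(round(accuracy, 2), round(sensitivity, 2))  # 0.88 0.89
    ```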

  8. Research Note The reliability of a field test kit for the detection and ...

    African Journals Online (AJOL)

The objectives were to test a field kit for practicality and reliability, to assess the spread of the bacteria among ...

  9. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    Science.gov (United States)

    Viswanathan, Arun

    2012-01-01

This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber-physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  10. Bedside ultrasound reliability in locating catheter and detecting complications

    Directory of Open Access Journals (Sweden)

    Payman Moharamzadeh

    2016-10-01

    Full Text Available Introduction: Central venous catheterization is one of the most common medical procedures and is associated with complications such as catheter misplacement and pneumothorax. Chest X-ray is a common means of evaluating these complications. However, because of the patient's exposure to radiation, the time it takes, and its low diagnostic value for detecting pneumothorax in the supine patient, the present study examines the diagnostic value of bedside ultrasound in locating the catheter tip and detecting pneumothorax. Materials and methods: In the present cross-sectional study, all referred patients requiring central venous catheterization were examined. Central venous catheterization was performed by a trained emergency medicine specialist, and the location of the catheter and the presence of pneumothorax were examined and compared using two modalities: ultrasound and chest X-ray (as the reference standard). Sensitivity, specificity, and positive and negative predictive values were reported. Results: A total of 200 non-trauma patients were included in the study (58% men). Cohen's kappa coefficients for catheter location and diagnosis of pneumothorax were 0.49 (95% CI: 0.43-0.55) and 0.89 (P<0.001; 95% CI: 97.8-100), respectively. Ultrasound sensitivity and specificity in diagnosing pneumothorax were 75% (95% CI: 35.6-95.5) and 100% (95% CI: 97.6-100), respectively. Conclusion: The present study showed a low diagnostic value of ultrasound in determining catheter location and in detecting pneumothorax. In light of previous studies, further research in this field is still needed.   Keywords: Central venous catheterization; complications; bedside ultrasound; radiography
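
    The diagnostic-accuracy figures above follow directly from a 2x2 contingency table. A minimal sketch of the standard definitions; the cell counts below are hypothetical (the abstract reports rates, not the raw cells):

    ```python
    # Sensitivity, specificity and predictive values from a 2x2 table.
    # tp/fp/fn/tn counts are illustrative, not the study's raw data.
    def diagnostic_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)   # true positive rate
        specificity = tn / (tn + fp)   # true negative rate
        ppv = tp / (tp + fp)           # positive predictive value
        npv = tn / (tn + fn)           # negative predictive value
        return sensitivity, specificity, ppv, npv

    sens, spec, ppv, npv = diagnostic_metrics(tp=6, fp=0, fn=2, tn=192)
    ```

    With these invented counts the sketch reproduces rates of the same form as the reported 75% sensitivity and 100% specificity.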

  11. Reliability considerations of electronics components for the deep underwater muon and neutrino detection system

    International Nuclear Information System (INIS)

    Leskovar, B.

    1980-02-01

    The reliability of some electronic components for the Deep Underwater Muon and Neutrino Detection (DUMAND) System is discussed. An introductory overview of engineering concepts and techniques for reliability assessment is given. Component reliability is discussed in the context of the major factors causing failures, particularly physical and chemical causes, process technology and testing, and screening procedures. Failure rates are presented for discrete devices and integrated circuits as well as for basic electronic components. Furthermore, the military reliability specifications and standards for semiconductor devices are reviewed

  12. Current activities and future trends in reliability analysis and probabilistic safety assessment in Hungary

    International Nuclear Information System (INIS)

    Hollo, E.; Toth, J.

    1986-01-01

    In Hungary, reliability analysis (RA) and probabilistic safety assessment (PSA) of nuclear power plants were initiated 3 years ago. First, computer codes for automatic fault tree analysis (CAT, PREP) and numerical evaluation (REMO, KITT1,2) were adapted. Two main case studies were performed: a detailed availability/reliability calculation for the diesel sets, and an analysis of the safety systems influencing event sequences induced by a large LOCA. Input failure data were taken from publications; the need for a failure and reliability data bank was revealed. Current and future activities involve: setting up a national data bank for WWER-440 units; a full-scope level-I PSA of the PAKS NPP in Hungary; and operational safety assessment of particular problems at the PAKS NPP. In the present article, the state of RA and PSA activities in Hungary, as well as the main objectives of ongoing work, are described. The need for international cooperation (for unified data collection on WWER-440 units) and for IAEA support (within Interregional Program INT/9/063) is emphasized. (author)

  13. The reliability, accuracy and minimal detectable difference of a multi-segment kinematic model of the foot-shoe complex.

    Science.gov (United States)

    Bishop, Chris; Paul, Gunther; Thewlis, Dominic

    2013-04-01

    Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when shoes are worn. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sensitive enough to describe the kinematics of the foot-shoe complex and lower leg during walking gait. To achieve this, a new marker set was established, consisting of 25 markers applied to the shoe and skin surface, which informed a four-segment kinematic model of the foot-shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe was good to excellent (ICC=0.75-0.98), indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC=0.68-0.99) than the inexperienced rater (ICC=0.38-0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint: tibiocalcaneal joint, MDD90=2.17-9.36°; tarsometatarsal joint, MDD90=1.03-9.29°; and metatarsophalangeal joint, MDD90=1.75-9.12°. The proposed thresholds are specific to the description of shod motion and can be used in future research aimed at comparing different footwear. Copyright © 2012 Elsevier B.V. All rights reserved.
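
    A minimal detectable difference is commonly derived from the standard error of measurement and a confidence multiplier. A sketch of one common formulation (MDD90 with a 90% multiplier of 1.645; the paper's exact computation may differ):

    ```python
    import math

    def mdd90(sd, icc):
        """Minimal detectable difference at 90% confidence, one common formulation:
        SEM = SD * sqrt(1 - ICC); MDD90 = SEM * 1.645 * sqrt(2).
        The sqrt(2) accounts for measurement error in both test and retest."""
        sem = sd * math.sqrt(1.0 - icc)
        return sem * 1.645 * math.sqrt(2.0)

    # Illustrative inputs: between-rater SD of 5 degrees, ICC of 0.90.
    threshold = mdd90(sd=5.0, icc=0.90)
    ```

    Any angular change smaller than the returned threshold cannot be distinguished from measurement noise at the chosen confidence level.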

  14. The Reliability and Effectiveness of a Radar-Based Animal Detection System

    Science.gov (United States)

    2017-09-22

    This document contains data on the reliability and effectiveness of an animal detection system along U.S. Hwy 95 near Bonners Ferry, Idaho. The system uses a Doppler radar to detect large mammals (e.g., deer and elk) when they approach the highway. T...

  16. Adapting to a Changing Colorado River: Making Future Water Deliveries More Reliable Through Robust Management Strategies

    Science.gov (United States)

    Groves, D.; Bloom, E.; Fischbach, J. R.; Knopman, D.

    2013-12-01

    The U.S. Bureau of Reclamation and water management agencies representing the seven Colorado River Basin States initiated the Colorado River Basin Study in January 2010 to evaluate the resiliency of the Colorado River system over the next 50 years and compare different options for ensuring successful management of the river's resources. RAND was asked to join the Basin Study Team in January 2012 to help develop an analytic approach to identify key vulnerabilities in managing the Colorado River basin over the coming decades and to evaluate different options that could reduce these vulnerabilities. Using a quantitative approach for planning under uncertainty called Robust Decision Making (RDM), the RAND team assisted the Basin Study by: identifying future vulnerable conditions that could lead to imbalances leaving the basin unable to meet its water delivery objectives; developing a computer-based tool to define 'portfolios' of management options reflecting different strategies for reducing basin imbalances; evaluating these portfolios across thousands of future scenarios to determine how much they could improve basin outcomes; and analyzing the results from the system simulations to identify key tradeoffs among the portfolios. This talk will describe RAND's contribution to the Basin Study, focusing on the methodologies used to identify vulnerabilities for Upper Basin and Lower Basin water supply reliability and to compare portfolios of options. Several key findings emerged from the study. Future streamflow and climate conditions are key: vulnerable conditions arise in a majority of scenarios where streamflows are lower than historical averages and where drought conditions persist for eight years or more. Depending on where the shortages occur, problems will arise for delivery obligations in the upper and lower river basins. The lower river basin is vulnerable to a broader range of plausible future conditions. 
Additional Investments in

  17. Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)

    Science.gov (United States)

    DeMott, Diana L.; Bigler, Mark A.

    2017-01-01

    NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration given to environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value, or placeholder, as a preliminary estimate. This preliminary estimate or screening value is used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment considers more than the time available; it includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists and internal human stresses. The more detailed assessment is expected to be more realistic than one based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. To determine what is expected of future operations, input from individuals who had relevant experience and were familiar with the systems and processes previously implemented by NASA was used to provide the best available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the

  18. Advances in developing rapid, reliable and portable detection systems for alcohol.

    Science.gov (United States)

    Thungon, Phurpa Dema; Kakoti, Ankana; Ngashangva, Lightson; Goswami, Pranab

    2017-11-15

    The development of a portable, reliable, sensitive, simple, and inexpensive detection system for alcohol has been a persistent demand not only in the traditional brewing, pharmaceutical, food and clinical industries but also in the rapidly growing alcohol-based fuel industries. Highly sensitive, selective, and reliable alcohol detection is currently achievable mainly through sophisticated instrument-based analyses confined mostly to state-of-the-art analytical laboratory facilities. With the growing demand for rapid and reliable alcohol detection systems, an all-round attempt has been made over the past decade encompassing various disciplines from the basic and engineering sciences. Of late, research into developing small-scale portable alcohol detection systems has accelerated with the advent of emerging miniaturization techniques, advanced materials and sensing platforms such as lab-on-chip, lab-on-CD and lab-on-paper. With these new interdisciplinary approaches, along with support from the parallel growth of knowledge on rapid detection systems being pursued for various targets, progress on translating proof-of-concepts into commercially viable and environmentally friendly portable alcohol detection systems is gaining pace. Here, we summarize the progress made over the years on alcohol detection systems, with a focus on recent advances towards developing portable, simple and efficient alcohol sensors. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Reliability and minimal detectable difference in multisegment foot kinematics during shod walking and running.

    Science.gov (United States)

    Milner, Clare E; Brindle, Richard A

    2016-01-01

    There has been increased interest recently in measuring kinematics within the foot during gait. While several multisegment foot models have appeared in the literature, the Oxford foot model has been used frequently for both walking and running. Several studies have reported the reliability of the Oxford foot model, but most studies to date have reported reliability for barefoot walking. The purpose of this study was to determine between-day (intra-rater) and within-session (inter-trial) reliability of the modified Oxford foot model during shod walking and running, and to calculate the minimum detectable difference for common variables of interest. Healthy adult male runners participated. Participants ran and walked in the gait laboratory for five trials of each. Three-dimensional gait analysis was conducted and foot and ankle joint angle time series data were calculated. Participants returned for a second gait analysis at least 5 days later. Intraclass correlation coefficients and minimum detectable differences were determined for walking and for running, to indicate both within-session and between-day reliability. Overall, relative variables were more reliable than absolute variables, and within-session reliability was greater than between-day reliability. Between-day intraclass correlation coefficients were comparable to those reported previously for adults walking barefoot. This study extends the use of the Oxford foot model by incorporating a shoe while maintaining marker placement directly on the skin for each segment. These reliability data for walking and running will aid in determining meaningful differences in studies that use this model during shod gait. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. The reliability of magnetic resonance imaging in traumatic brain injury lesion detection

    NARCIS (Netherlands)

    Geurts, B.H.J.; Andriessen, T.M.J.C.; Goraj, B.M.; Vos, P.E.

    2012-01-01

    Objective: This study compares inter-rater-reliability, lesion detection and clinical relevance of T2-weighted imaging (T2WI), Fluid Attenuated Inversion Recovery (FLAIR), T2*-gradient recalled echo (T2*-GRE) and Susceptibility Weighted Imaging (SWI) in Traumatic Brain Injury (TBI). Methods: Three

  1. Detecting Chemical Weapons: Threats, Requirements, Solutions, and Future Challenges

    Science.gov (United States)

    Boso, Brian

    2011-03-01

    Although chemicals have reportedly been used as weapons for thousands of years, it was not until 1915 at Ypres, France that an industrial chemical, chlorine, was used in World War I as an offensive weapon in significant quantity, causing mass casualties. From that point until today, the development, detection, production of and protection from chemical weapons has been an organized endeavor of many of the world's armed forces and, in more recent times, non-governmental terrorist organizations. The number of Chemical Warfare Agents (CWAs) steadily increased as research into more toxic substances continued for most of the 20th century. Today there are over 70 substances, including harassing agents like tear gas, incapacitating agents, and lethal agents like blister, blood, choking, and nerve agents. The requirements for detecting chemical weapons vary depending on the context in which they are encountered and the concept of operation of the organization deploying the detection equipment. The US DoD, for example, requires that US forces be able to continue their mission even in the event of a chemical attack. This places stringent requirements on detection equipment, which must be lightweight. Many detection technologies have been developed for this application, including, but not limited to: mass spectroscopy, IR spectroscopy, Raman spectroscopy, MEMS micro-cantilever sensors, surface acoustic wave sensors, differential mobility spectrometry, and amplifying fluorescent polymers. In the future, the requirements for detection equipment will become even more stringent. The continuing increase in the sheer number of threats to be detected, the development of binary agents requiring that even precursor chemicals be detected, the development of new types of agents unlike any of the current chemistries, and the expansion of the list of toxic industrial chemicals will require new techniques with higher specificity and more sensitivity.

  2. Hunting electroweakinos at future hadron colliders and direct detection experiments

    Energy Technology Data Exchange (ETDEWEB)

    Cortona, Giovanni Grilli di [SISSA - International School for Advanced Studies,Via Bonomea 265, I-34136 Trieste (Italy); INFN - Sezione di Trieste,via Valerio 2, I-34127 Trieste (Italy)

    2015-05-07

    We analyse the mass reach for electroweakinos at future hadron colliders and their interplay with direct detection experiments. Motivated by the LHC data, we focus on split supersymmetry models with different electroweakino spectra. We find, for example, that a 100 TeV collider may explore Winos up to ∼7 TeV in low-scale gauge mediation models, or thermal Wino dark matter around 3 TeV in models of anomaly mediation with long-lived Winos. We show moreover how collider searches and direct detection experiments have the potential to cover a large part of the parameter space even in scenarios where the lightest neutralino does not account for the whole dark matter relic density.

  3. Detection of GH abuse in sport: Past, present and future.

    Science.gov (United States)

    Barroso, Osquel; Schamasch, Patrick; Rabin, Olivier

    2009-08-01

    Due to its perceived performance-enhancing effects, human growth hormone (hGH) is abused as a doping agent in sport. Its misuse also carries potentially serious side effects for a person's health. Consequently, hGH and its releasing factors are prohibited in sport, as established in the Prohibited List, which is updated and published yearly by the World Anti-Doping Agency (WADA). In order to fight the threat that hGH doping poses to the spirit of sport and to the health of athletes, the sport movement and the anti-doping authorities, initially led by the International Olympic Committee (IOC) and later by WADA, have put substantial effort into developing tests for its detection. Currently, a primary analytical approach, the isoform differential immunoassay, has been implemented in WADA-accredited laboratories. In parallel, a second, indirect approach for the detection of hGH abuse, based on the quantification of hGH-associated biological markers, has been developed. The final aim is to combine both methodologies to improve the sensitivity and expand the time window for detecting doping with hGH. In addition, novel analytical procedures, based on proteomic and genomic technologies as well as the use of mass spectrometry-based methods of detection, are being investigated for future application in hGH anti-doping tests.

  4. Reliability of recordings of subgingival calculus detected using an ultrasonic device.

    Science.gov (United States)

    Corraini, Priscila; López, Rodrigo

    2015-04-01

    To assess the intra-examiner reliability of recordings of subgingival calculus detected using an ultrasonic device, and to investigate the influence of subject-, tooth- and site-level factors on the reliability of these recordings. On two occasions within a 1-week interval, 147 adult periodontitis patients received a full-mouth clinical periodontal examination by a single trained examiner. Duplicate subgingival calculus recordings, at six sites per tooth, were obtained using an ultrasonic device for calculus detection and removal. Agreement was observed in 65% of the 22,584 duplicate subgingival calculus recordings, ranging from 45% to 83% across subjects. Using hierarchical modeling, disagreements in the duplicate subgingival calculus recordings were more likely at all sites other than the mid-buccal, and at sites harboring supragingival calculus. Disagreements were less likely at sites with PD ≥ 4 mm and with furcation involvement ≥ degree 2. Bleeding on probing or suppuration did not influence the reliability of subgingival calculus recordings. At the subject level, disagreements were less likely in patients presenting with the highest and lowest extent categories of the covariate subgingival calculus. The reliability of subgingival calculus recordings using ultrasound technology is reasonable. The results of the present study suggest that the reliability of subgingival calculus recordings is not influenced by the presence of inflammation. Moreover, subgingival calculus can be more reliably detected using the ultrasound device at sites with a higher need for periodontal therapy, i.e., sites presenting with deep pockets and premolars and molars with furcation involvement.
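
    Intra-examiner agreement on dichotomous recordings like these is typically summarized with percent agreement and Cohen's kappa. A minimal sketch for two binary rating series (the data below are illustrative, not the study's):

    ```python
    def cohens_kappa(a, b):
        """Cohen's kappa for two dichotomous (0/1) rating series."""
        n = len(a)
        po = sum(x == y for x, y in zip(a, b)) / n        # observed agreement
        pa1, pb1 = sum(a) / n, sum(b) / n                 # marginal '1' rates
        pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)            # chance agreement
        return (po - pe) / (1 - pe)

    # Hypothetical duplicate 'calculus present' recordings at six sites:
    first_pass  = [1, 1, 0, 0, 1, 0]
    second_pass = [1, 0, 0, 0, 1, 1]
    ```

    Kappa discounts the agreement expected by chance from the marginal rates, which is why it can be far below the raw percent agreement.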

  5. Test-retest reliability of myofascial trigger point detection in hip and thigh areas.

    Science.gov (United States)

    Rozenfeld, E; Finestone, A S; Moran, U; Damri, E; Kalichman, L

    2017-10-01

    Myofascial trigger points (MTrPs) are a primary source of pain in patients with musculoskeletal disorders. Nevertheless, they are frequently underdiagnosed. Reliable MTrP palpation is necessary for their diagnosis and treatment. The few studies that have examined intra-tester reliability of MTrP detection in the upper body provide preliminary evidence that MTrP palpation is reliable. Reliability tests for MTrP palpation on the lower limb have not yet been performed. To evaluate inter- and intra-tester reliability of MTrP recognition in hip and thigh muscles. Reliability study. 21 patients (15 males and 6 females, mean age 21.1 years) referred to the physical therapy clinic, 10 with knee or hip pain and 11 with pain in an upper limb, low back, shin or ankle. Two experienced physical therapists performed the examinations, blinded to the subjects' identity, medical condition and the results of the previous MTrP evaluation. Each subject was evaluated four times, twice by each examiner, in a random order. Dichotomous findings included a palpable taut band, tenderness, referred pain, and relevance of referred pain to the patient's complaint. Based on these, a diagnosis of latent or active MTrPs was established. The evaluation was performed on both legs and included a total of 16 locations in the following muscles: rectus femoris (proximal), vastus medialis (middle and distal), vastus lateralis (middle and distal) and gluteus medius (anterior, posterior and distal). Inter- and intra-tester reliability (Cohen's kappa (κ)) values for single sites ranged from -0.25 to 0.77. Median intra-tester reliability was 0.45 and 0.46 for latent and active MTrPs, and median inter-tester reliability was 0.51 and 0.64 for latent and active MTrPs, respectively. The examination of the distal vastus medialis was the most reliable for latent and active MTrPs (intra-tester κ = 0.27-0.77, inter-tester κ = 0.77 and intra-tester κ = 0.53-0.72, inter-tester κ = 0.72, correspondingly)

  6. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for thickness measurements of pipe walls using probability of detection (POD). Pipe thicknesses are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by the inspectors' human factors and include some errors, because inspectors differ in experience and frequency of inspection. To ensure the reliability of the inspection results, POD is first used to evaluate experimental results of pipe-wall thickness inspection. We verify that the results differ between inspectors, including qualified inspectors. Second, two human factors that affect POD are identified. Finally, it is confirmed that POD can identify these human factors and ensure the reliability of pipe-wall thickness inspections. (author)

  7. Future developments of probabilistic structural reliability to meet the needs of risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Schnurer, H.

    1980-01-01

    The methods of structural reliability, given their benefits and their limitations, will offer an increasingly important tool for making future quality decisions in nuclear safety more rational, objective and balanced. This might make them suitable for licensing and approval decisions on components and structures, offering an alternative to the presently used deterministic practice. (orig./RW)

  8. Reliability of tensiomyography and myotonometry in detecting mechanical and contractile characteristics of the lumbar erector spinae in healthy volunteers.

    Science.gov (United States)

    Lohr, Christine; Braumann, Klaus-Michael; Reer, Ruediger; Schroeder, Jan; Schmidt, Tobias

    2018-04-20

    Tensiomyography™ (TMG) and MyotonPRO® (MMT) are two non-invasive devices for monitoring muscle contractile and mechanical characteristics. This study aimed to evaluate the test-retest reliability of TMG and MMT parameters for measuring (TMG:) muscle displacement (Dm), contraction time (Tc), and velocity (Vc), and (MMT:) frequency (F), stiffness (S), and decrement (D) of the erector spinae muscles (ES) in healthy adults. A particular focus was the establishment of reliability measures for the previously barely evaluated secondary TMG parameter Vc. Twenty-four subjects (13 female and 11 male, mean ± SD 38.0 ± 12.0 years) were measured using TMG and MMT over 2 consecutive days. Absolute and relative reliability were calculated by the standard error of measurement (SEM, SEM%), minimal detectable change (MDC, MDC%), coefficient of variation (CV%) and intraclass correlation coefficient (ICC(3,1)) with a 95% confidence interval (CI). The ICCs for all variables and test-retest intervals ranged from 0.75 to 0.99, indicating good to excellent relative reliability for both TMG and MMT, with the lowest values for TMG Tc and between-day MMT D. Reliability measures for TMG Vc could be established successfully; its further applicability needs to be confirmed in future studies. MMT was found to be more reliable on repeated testing than the two other TMG parameters Dm and Tc.
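
    The relative-reliability statistic used here, ICC(3,1) (two-way mixed, single measures, consistency), can be computed from a simple ANOVA decomposition. A sketch of the textbook formula, not the authors' own analysis code:

    ```python
    def icc_3_1(ratings):
        """ICC(3,1): two-way mixed, single measures, consistency.
        ratings: one list per subject, one value per session, e.g. [[d1, d2], ...]."""
        n = len(ratings)            # subjects
        k = len(ratings[0])         # sessions
        grand = sum(sum(r) for r in ratings) / (n * k)
        row_means = [sum(r) / k for r in ratings]
        col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
        ss_rows = k * sum((m - grand) ** 2 for m in row_means)    # between subjects
        ss_cols = n * sum((m - grand) ** 2 for m in col_means)    # between sessions
        ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
        ss_err = ss_total - ss_rows - ss_cols                     # residual
        msr = ss_rows / (n - 1)
        mse = ss_err / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse)
    ```

    Because the session effect is removed, a constant day-to-day offset does not lower this consistency-type ICC.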

  9. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    Full Text Available The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of our previously developed damage detection method is given, with a focus on damage localization. It consists of comparing a histogram derived from measurement results with a large series of histograms, namely the damage location indexes for all locations along the beam, obtained by calculation. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and histogram intersection, and found the Minkowski distance to be the method providing the best results. It was tested for numerous locations, using real measurement results as well as results artificially degraded by noise, proving its reliability.
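
    The three dissimilarity estimators compared in the paper have compact definitions. A sketch for equal-length histograms, assuming normalized, nonzero bins where the definitions require them:

    ```python
    import math

    def minkowski(p, q, r=2):
        """Minkowski-form distance between two histograms (r=1: city block, r=2: Euclidean)."""
        return sum(abs(a - b) ** r for a, b in zip(p, q)) ** (1.0 / r)

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p||q); assumes normalized histograms, q > 0."""
        return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

    def histogram_intersection(p, q):
        """Histogram intersection similarity; 1.0 for identical normalized histograms."""
        return sum(min(a, b) for a, b in zip(p, q))
    ```

    In the damage-localization scheme described above, the measured histogram would be scored against each candidate damage-location index and the closest match (smallest distance, or largest intersection) selected.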

  10. Indian program for development of technologies relevant to reliable, non-intrusive, concealed-contraband detection

    International Nuclear Information System (INIS)

    Auluck, S.K.H.

    2007-01-01

    Generating the capability for reliable, non-intrusive detection of concealed contraband, particularly organic contraband like explosives and narcotics, has become a national priority. This capability spans a spectrum of technologies. If a technology mission addressing the needs of a highly sophisticated technology like PFNA is set up, the capabilities acquired would be adequate to meet the requirements of many other sets of technologies. This forms the background of the Indian program for development of technologies relevant to reliable, non-intrusive, concealed-contraband detection. One of the central themes of the technology development programs would be modularization of the neutron source and detector technologies, so that common elements can be combined in different ways to meet a variety of application requirements. (author)

  11. Scenario based approach to structural damage detection and its value in a risk and reliability perspective

    DEFF Research Database (Denmark)

    Hovgaard, Mads Knude; Hansen, Jannick Balleby; Brincker, Rune

    2013-01-01

    A scenario- and vibration-based structural damage detection method is demonstrated through simulation. The method is Finite Element (FE) based. The value of the monitoring is calculated using structural reliability theory, with a high cycle fatigue crack propagation model assumed as the damage mechanism, and the reliability is compared with and without monitoring. Monte Carlo Sampling (MCS) is used to estimate the probabilities, and the tower of an onshore NREL 5MW wind turbine is given as a calculation case.
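
    Crude Monte Carlo sampling estimates a failure probability by counting limit-state violations over random draws. A toy sketch of the idea; the limit state and distributions below are invented for illustration, not the turbine model from the paper:

    ```python
    import random

    def mc_failure_probability(limit_state, sampler, n=100_000, seed=1):
        """Crude Monte Carlo estimate of P(g(X) <= 0) for limit state g."""
        rng = random.Random(seed)
        failures = sum(limit_state(sampler(rng)) <= 0 for _ in range(n))
        return failures / n

    # Toy limit state g = R - S with resistance R ~ N(10, 1) and load S ~ N(7, 1);
    # the exact failure probability is Phi(-3 / sqrt(2)) ~= 0.017.
    g = lambda x: x[0] - x[1]
    draw = lambda rng: (rng.gauss(10, 1), rng.gauss(7, 1))
    pf = mc_failure_probability(g, draw, n=200_000)
    ```

    The estimator's standard error scales as sqrt(pf/n), which is why very small failure probabilities push practitioners toward variance-reduction or FORM/SORM methods.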

  12. NDE reliability and probability of detection (POD) evolution and paradigm shift

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surendra [NDE Engineering, Materials and Process Engineering, Honeywell Aerospace, Phoenix, AZ 85034 (United States)

    2014-02-18

    The subject of NDE reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed "Have Cracks – Will Travel", or in short "Have Cracks", by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework of Berens and Hovey in 1981 for POD estimation, to MIL-HDBK-1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective: improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. It is therefore essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offers no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantification of human factors. Furthermore, reliability and POD are often reported as if synonymous, but POD is not NDE reliability. POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME), including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones of the last several decades and discusses the rationale for using
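
    A standard hit/miss POD analysis in the MIL-HDBK-1823 tradition fits a log-odds curve, i.e., a logistic model in ln(flaw size). A sketch of the functional form and the commonly quoted a90 flaw size (the coefficients are illustrative, not fitted to data):

    ```python
    import math

    def pod_log_odds(a, beta0, beta1):
        """POD(a) = 1 / (1 + exp(-(beta0 + beta1 * ln a))): the log-odds model
        commonly used for hit/miss POD curves. beta0, beta1 come from a fit."""
        return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * math.log(a))))

    def a90(beta0, beta1):
        """Flaw size detected with 90% probability: solve beta0 + beta1*ln(a) = ln 9."""
        return math.exp((math.log(9.0) - beta0) / beta1)
    ```

    In practice a confidence bound on the fitted curve is added to quote a90/95, the size detected with 90% probability at 95% confidence.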

  13. Reliability of high mobility SiGe channel MOSFETs for future CMOS applications

    CERN Document Server

    Franco, Jacopo; Groeseneken, Guido

    2014-01-01

    Due to the ever increasing electric fields in scaled CMOS devices, reliability is becoming a showstopper for further scaled technology nodes. Although several groups have already demonstrated functional Si channel devices with aggressively scaled Equivalent Oxide Thickness (EOT) down to 5Å, a 10 year reliable device operation cannot be guaranteed anymore due to severe Negative Bias Temperature Instability. This book focuses on the reliability of the novel (Si)Ge channel quantum well pMOSFET technology. This technology is being considered for possible implementation in next CMOS technology nodes, thanks to its benefit in terms of carrier mobility and device threshold voltage tuning. We observe that it also opens a degree of freedom for device reliability optimization. By properly tuning the device gate stack, sufficiently reliable ultra-thin EOT devices with a 10 years lifetime at operating conditions are demonstrated. The extensive experimental datasets collected on a variety of processed 300mm wafers and pr...
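    The "10-year reliable operation" criterion mentioned above is typically assessed by stressing devices at elevated gate voltage and extrapolating time-to-failure back to operating conditions. As a hedged sketch (not the authors' methodology), the snippet below fits a power-law voltage-acceleration model t_fail = A·V^(−n) to hypothetical accelerated-stress data and extrapolates to a nominal operating voltage; all numbers are illustrative.

```python
import math

# Hypothetical accelerated NBTI stress results: (gate overdrive V, time-to-failure s),
# generated here from t = A * V**-n with A = 3e9 and n = 6 (illustrative values).
stress = [(1.8, 3e9 * 1.8 ** -6), (2.0, 3e9 * 2.0 ** -6), (2.2, 3e9 * 2.2 ** -6)]

# Least-squares fit of log t = log A - n * log V.
xs = [math.log(v) for v, _ in stress]
ys = [math.log(t) for _, t in stress]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
n = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
logA = my + n * mx

def lifetime(v_op):
    """Extrapolated time-to-failure at operating voltage (seconds)."""
    return math.exp(logA - n * math.log(v_op))

ten_years = 10 * 365.25 * 24 * 3600
ok = lifetime(0.9) >= ten_years          # does a 0.9 V operating point meet 10 years?
```

Because the synthetic data follow the power law exactly, the fit recovers the acceleration exponent; real stress data would scatter, and the choice of acceleration model (power law vs. exponential) is itself a reliability-engineering judgment.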

  14. A novel approach for reliable detection of cathepsin S activities in mouse antigen presenting cells.

    Science.gov (United States)

    Steimle, Alex; Kalbacher, Hubert; Maurer, Andreas; Beifuss, Brigitte; Bender, Annika; Schäfer, Andrea; Müller, Ricarda; Autenrieth, Ingo B; Frick, Julia-Stefanie

    2016-05-01

    Cathepsin S (CTSS) is a eukaryotic protease mostly expressed in professional antigen presenting cells (APCs). Since CTSS activity regulation plays a role in the pathogenesis of various autoimmune diseases like multiple sclerosis, atherosclerosis, Sjögren's syndrome and psoriasis as well as in cancer progression, there is an ongoing interest in the reliable detection of cathepsin S activity. Various applications have been invented for specific detection of this enzyme. However, most of them have only been shown to be suitable for human samples, do not deliver quantitative results or the experimental procedure requires technical equipment that is not commonly available in a standard laboratory. We have tested a fluorogen substrate, Mca-GRWPPMGLPWE-Lys(Dnp)-DArg-NH2, that has been described to specifically detect CTSS activities in human APCs for its potential use for mouse samples. We have modified the protocol and thereby offer a cheap, easy, reproducible and quick activity assay to detect CTSS activities in mouse APCs. Since most of basic research on CTSS is performed in mice, this method closes a gap and offers a possibility for reliable and quantitative CTSS activity detection that can be performed in almost every laboratory. Copyright © 2016. Published by Elsevier B.V.

  15. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect such that new faults may be introduced. In this paper, we first show how to incorporate testing effort function and fault introduction into FDP and then develop FCP as delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
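    As a rough sketch of the class of models the paper builds on (parameter values are illustrative, not taken from the paper), the snippet below drives a detection mean-value function with a cumulative testing-effort curve and models correction as detection delayed by a constant expected correction time:

```python
import math

def effort(t, alpha=100.0, beta=0.05):
    """Cumulative testing effort W(t) consumed by time t (exponential curve)."""
    return alpha * (1.0 - math.exp(-beta * t))

def m_detect(t, a=120.0, b=0.03):
    """Expected number of faults detected by time t: a * (1 - exp(-b * W(t)))."""
    if t <= 0.0:
        return 0.0
    return a * (1.0 - math.exp(-b * effort(t)))

def m_correct(t, delay=5.0, **kw):
    """Expected corrections: detection delayed by a constant expected correction time."""
    return m_detect(t - delay, **kw)

for t in (10.0, 20.0, 40.0):
    print(t, round(m_detect(t), 1), round(m_correct(t), 1))
```

The paper's paired models generalize this in two directions the sketch omits: fault introduction during imperfect debugging (so the total fault content a grows with detections) and a random rather than constant correction effort.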

  16. Reliable Grid Condition Detection and Control of Single-Phase Distributed Power Generation Systems

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai

    standards addressed to the grid-connected systems will harmonize the combination of the DPGS and the classical power plants. Consequently, the major tasks of this thesis were to develop new grid condition detection techniques and intelligent control in order to allow the DPGS not only to deliver power...... to the utility grid but also to sustain it. This thesis was divided into two main parts, namely "Grid Condition Detection" and "Control of Single-Phase DPGS". In the first part, the main focus was on reliable Phase Locked Loop (PLL) techniques for monitoring the grid voltage and on grid impedance estimation...... techniques. Additionally, a new technique for detecting the islanding mode has been developed and successfully tested. In the second part, the main reported research was concentrated around adaptive current controllers based on the information provided by the grid condition detection techniques. To guarantee...

  17. Is sequential cranial ultrasound reliable for detection of white matter injury in very preterm infants?

    International Nuclear Information System (INIS)

    Leijser, Lara M.; Steggerda, Sylke J.; Walther, Frans J.; Wezel-Meijler, Gerda van; Bruine, Francisca T. de; Grond, Jeroen van der

    2010-01-01

    Cranial ultrasound (cUS) may not be reliable for detection of diffuse white matter (WM) injury. Our aim was to assess in very preterm infants the reliability of a classification system for WM injury on sequential cUS throughout the neonatal period, using magnetic resonance imaging (MRI) as reference standard. In 110 very preterm infants (gestational age <32 weeks), serial cUS during admission (median 8, range 4-22) and again around term equivalent age (TEA) and a single MRI around TEA were performed. cUS during admission were assessed for presence of WM changes, and contemporaneous cUS and MRI around TEA additionally for abnormality of lateral ventricles. Sequential cUS (from birth up to TEA) and MRI were classified as normal/mildly abnormal, moderately abnormal, or severely abnormal, based on a combination of findings of the WM and lateral ventricles. Predictive values of the cUS classification were calculated. Sequential cUS were classified as normal/mildly abnormal, moderately abnormal, and severely abnormal in, respectively, 22%, 65%, and 13% of infants and MRI in, respectively, 30%, 52%, and 18%. The positive predictive value of the cUS classification for the MRI classification was high for severely abnormal WM (0.79) but lower for normal/mildly abnormal (0.67) and moderately abnormal (0.64) WM. Sequential cUS during the neonatal period detects severely abnormal WM in very preterm infants but is less reliable for mildly and moderately abnormal WM. MRI around TEA seems needed to reliably detect WM injury in very preterm infants. (orig.)

  18. Engineered for the energy future. I. Moisture separator-reheaters: extreme reliability an imperative

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    A description is given of the design and development activities performed by Foster-Wheeler to insure operational reliability of sixteen moisture separator-reheaters being manufactured for eight twin-unit BWR power plants to be operated by TVA

  19. reliability reliability

    African Journals Online (AJOL)

    eobe


  20. Design for reliability in power electronics in renewable energy systems – status and future

    DEFF Research Database (Denmark)

    Wang, Huai; Blaabjerg, Frede; Ma, Ke

    2013-01-01

    Advances in power electronics enable efficient and flexible interconnection of renewable sources, loads and electric grids. While targets concerning efficiency of power converters are within reach, recent research endeavors to predict and improve their reliability to ensure high availability, low...... maintenance costs, and therefore, low Levelized-Cost-of-Energy (LCOE) of renewable energy systems. This paper presents the prior-art Design for Reliability (DFR) process for power converters and addresses the paradigm shift to Physics-of-Failure (PoF) approach and mission profile based analysis. Moreover...

  1. The European reliability data system - ERDS: a state of the art and future developments

    International Nuclear Information System (INIS)

    Mancini, G.; Amesz, J.; Bastianini, P.; Capobianchi, S.

    1982-01-01

    In the frame of the Multiannual Nuclear Safety Programme of the Joint Research Centre of the Commission of the European Communities, a project is being carried out aiming at the creation of a centralized data system collecting and organizing, at European level, information related to the operation of LWRs. The European Reliability Data System ERDS will exploit information already collected in national data systems and information deriving from single reactor sources. The paper describes the development of the four data systems constituting the ERDS: Component Event Data Bank; Abnormal Occurrences Reporting System; Operating Unit Status Report; Generic Reliability Parameter Data Bank

  2. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    Science.gov (United States)

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  3. Reliability and Minimum Detectable Change of Temporal-Spatial, Kinematic, and Dynamic Stability Measures during Perturbed Gait.

    Directory of Open Access Journals (Sweden)

    Christopher A Rábago

    Full Text Available Temporal-spatial, kinematic variability, and dynamic stability measures collected during perturbation-based assessment paradigms are often used to identify dysfunction associated with gait instability. However, it remains unclear which measures are most reliable for detecting and tracking responses to perturbations. This study systematically determined the between-session reliability and minimum detectable change values of temporal-spatial, kinematic variability, and dynamic stability measures during three types of perturbed gait. Twenty young healthy adults completed two identical testing sessions two weeks apart, comprised of an unperturbed and three perturbed (cognitive, physical, and visual walking conditions in a virtual reality environment. Within each session, perturbation responses were compared to unperturbed walking using paired t-tests. Between-session reliability and minimum detectable change values were also calculated for each measure and condition. All temporal-spatial, kinematic variability and dynamic stability measures demonstrated fair to excellent between-session reliability. Minimum detectable change values, normalized to mean values, ranged from 1% to 50%. Step width mean and variability measures demonstrated the greatest response to perturbations with excellent between-session reliability and low minimum detectable change values. Orbital stability measures demonstrated specificity to perturbation direction and sensitivity with excellent between-session reliability and low minimum detectable change values. We observed substantially greater between-session reliability and lower minimum detectable change values for local stability measures than previously described, which may be the result of averaging across trials within a session and using velocity versus acceleration data for reconstruction of state spaces. Across all perturbation types, temporal-spatial, orbital and local measures were the most reliable measures with the
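    The minimum detectable change values reported above follow from the standard test-retest formulas MDC95 = 1.96·√2·SEM with SEM = SD·√(1 − ICC). A minimal sketch (example numbers are illustrative, not the study's data):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from between-subject SD and reliability (ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd, icc):
    """Minimum detectable change at 95% confidence for a test-retest design."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

def mdc95_pct(sd, icc, mean):
    """MDC as a percentage of the sample mean (cf. the 1-50% range above)."""
    return 100.0 * mdc95(sd, icc) / mean

# Illustrative numbers only: step width SD 2.4 cm, ICC 0.90, sample mean 12 cm.
value = mdc95(2.4, 0.90)        # ~2.10 cm
```

With these invented inputs the MDC comes to roughly 2.1 cm, about 18% of the mean, i.e., an observed change smaller than that could not be distinguished from measurement noise.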

  4. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-06-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is second in a series of reports detailing this effort.

  5. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    International Nuclear Information System (INIS)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S.; Bond, I. A.; Allen, W.; Monard, L. A. G.; Albrow, M. D.; Fouqué, P.; Dominik, M.; Tsapras, Y.; Udalski, A.; Zellem, R.; Bos, M.; Christie, G. W.; DePoy, D. L.; Dong, Subo; Drummond, J.; Gorbikov, E.; Han, C.

    2013-01-01

    We analyze MOA-2010-BLG-311, a high-magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit, but with only Δχ² ≈ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.

  6. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    Energy Technology Data Exchange (ETDEWEB)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S. [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Bond, I. A. [Institute for Information and Mathematical Sciences, Massey University, Private Bag 102-904, Auckland 1330 (New Zealand); Allen, W. [Vintage Lane Observatory, Blenheim (New Zealand); Monard, L. A. G. [Bronberg Observatory, Centre for Backyard Astrophysics, Pretoria (South Africa); Albrow, M. D. [Department of Physics and Astronomy, University of Canterbury, Private Bag 4800, Christchurch 8020 (New Zealand); Fouque, P. [IRAP, CNRS, Universite de Toulouse, 14 avenue Edouard Belin, F-31400 Toulouse (France); Dominik, M. [SUPA, University of St. Andrews, School of Physics and Astronomy, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Tsapras, Y. [Las Cumbres Observatory Global Telescope Network, 6740B Cortona Drive, Goleta, CA 93117 (United States); Udalski, A. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Zellem, R. [Department of Planetary Sciences/LPL, University of Arizona, 1629 East University Boulevard, Tucson, AZ 85721 (United States); Bos, M. [Molehill Astronomical Observatory, North Shore City, Auckland (New Zealand); Christie, G. W. [Auckland Observatory, P.O. Box 24-180, Auckland (New Zealand); DePoy, D. L. [Department of Physics, Texas A and M University, 4242 TAMU, College Station, TX 77843-4242 (United States); Dong, Subo [Institute for Advanced Study, Einstein Drive, Princeton, NJ 08540 (United States); Drummond, J. [Possum Observatory, Patutahi (New Zealand); Gorbikov, E. 
[School of Physics and Astronomy, Raymond and Beverley Sackler Faculty of Exact Sciences, Tel-Aviv University, Tel Aviv 69978 (Israel); Han, C., E-mail: liweih@astro.ucla.edu, E-mail: rzellem@lpl.arizona.edu, E-mail: tim.natusch@aut.ac.nz [Department of Physics, Chungbuk National University, 410 Seongbong-Rho, Hungduk-Gu, Chongju 371-763 (Korea, Republic of); Collaboration: muFUN Collaboration; MOA Collaboration; OGLE Collaboration; PLANET Collaboration; RoboNet Collaboration; MiNDSTEp Consortium; and others

    2013-05-20

    We analyze MOA-2010-BLG-311, a high-magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit, but with only Δχ² ≈ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.

  7. A reliable cw Lyman-α laser source for future cooling of antihydrogen

    International Nuclear Information System (INIS)

    Kolbe, Daniel; Beczkowiak, Anna; Diehl, Thomas; Koglbauer, Andreas; Sattler, Matthias; Stappel, Matthias; Steinborn, Ruth; Walz, Jochen

    2012-01-01

    We demonstrate a reliable continuous-wave (cw) laser source at the 1S–2P transition in (anti)hydrogen at 121.56 nm (Lyman-α) based on four-wave sum-frequency mixing in mercury. A two-photon resonance in the four-wave mixing scheme is essential for a powerful cw Lyman-α source and is well investigated.

  8. A reliable cw Lyman-α laser source for future cooling of antihydrogen

    Energy Technology Data Exchange (ETDEWEB)

    Kolbe, Daniel, E-mail: kolbed@uni-mainz.de; Beczkowiak, Anna; Diehl, Thomas; Koglbauer, Andreas; Sattler, Matthias; Stappel, Matthias; Steinborn, Ruth; Walz, Jochen [Johannes Gutenberg-Universität, Institut für Physik (Germany)]

    2012-12-15

    We demonstrate a reliable continuous-wave (cw) laser source at the 1S–2P transition in (anti)hydrogen at 121.56 nm (Lyman-α) based on four-wave sum-frequency mixing in mercury. A two-photon resonance in the four-wave mixing scheme is essential for a powerful cw Lyman-α source and is well investigated.

  9. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.; Milligan, Michael; Brinkman, Greg; Bloom, Aaron; Clark, Kara; Denholm, Paul

    2016-12-01

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives and deny generators sufficient opportunity to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG) such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.
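    The revenue-sufficiency concern can be illustrated with a toy uniform-price merit-order dispatch: adding zero-marginal-cost generation pushes the marginal unit down the supply stack, lowering the clearing price and the infra-marginal rents that recover fixed costs. The fleet and demand below are invented for illustration and are not NREL's model:

```python
def clear(offers, demand):
    """Uniform-price merit-order clearing.
    offers: list of (marginal_cost $/MWh, capacity MW); returns (price, dispatch)."""
    price, served, dispatch = 0.0, 0.0, {}
    for mc, cap in sorted(offers):
        if served >= demand:
            break
        take = min(cap, demand - served)
        dispatch[(mc, cap)] = take
        served += take
        price = mc                      # the last dispatched (marginal) unit sets price
    return price, dispatch

fleet = [(20, 400), (35, 300), (60, 200)]     # illustrative thermal supply stack
p0, d0 = clear(fleet, 800)                    # peaker on the margin
p1, d1 = clear(fleet + [(0, 300)], 800)       # add 300 MW of zero-marginal-cost wind

# Infra-marginal rent earned by the 20 $/MWh unit shrinks when wind enters.
rent0 = (p0 - 20) * d0[(20, 400)]
rent1 = (p1 - 20) * d1[(20, 400)]
```

In this toy case the clearing price drops from 60 to 35 $/MWh and the baseload unit's hourly rent falls from 16,000 to 6,000 $, which is the mechanism behind the "missing money" question the article discusses.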

  10. Reliability of IGBT-based power devices in the viewpoint of applications in future power supply systems

    International Nuclear Information System (INIS)

    Lutz, J.

    2011-01-01

    IGBT-based high-voltage power devices will be key components of society's future renewable energy base. Wind turbines rated up to 10 MW use converters with IGBTs. HVDC systems with IGBT-based voltage source converters have the advantage of a lower level of harmonics, smaller filter requirements, and more control possibilities. The power devices need a lifetime expectation of several tens of years, and that lifetime is determined by the reliability of the packaging technology. IGBTs are offered packaged in presspacks and modules; the presentation focuses on IGBT high-power modules. Accelerated power cycling tests to determine end-of-life under given conditions, and their results, are shown, along with models to calculate lifetime and current research on systems with increased reliability.

  11. Visual acuity measures do not reliably detect childhood refractive error--an epidemiological study.

    Directory of Open Access Journals (Sweden)

    Lisa O'Donoghue

    Full Text Available PURPOSE: To investigate the utility of uncorrected visual acuity measures in screening for refractive error in white school children aged 6-7-years and 12-13-years. METHODS: The Northern Ireland Childhood Errors of Refraction (NICER study used a stratified random cluster design to recruit children from schools in Northern Ireland. Detailed eye examinations included assessment of logMAR visual acuity and cycloplegic autorefraction. Spherical equivalent refractive data from the right eye were used to classify significant refractive error as myopia of at least 1DS, hyperopia as greater than +3.50DS and astigmatism as greater than 1.50DC, whether it occurred in isolation or in association with myopia or hyperopia. RESULTS: Results are presented from 661 white 12-13-year-old and 392 white 6-7-year-old school-children. Using a cut-off of uncorrected visual acuity poorer than 0.20 logMAR to detect significant refractive error gave a sensitivity of 50% and specificity of 92% in 6-7-year-olds and 73% and 93% respectively in 12-13-year-olds. In 12-13-year-old children a cut-off of poorer than 0.20 logMAR had a sensitivity of 92% and a specificity of 91% in detecting myopia and a sensitivity of 41% and a specificity of 84% in detecting hyperopia. CONCLUSIONS: Vision screening using logMAR acuity can reliably detect myopia, but not hyperopia or astigmatism in school-age children. Providers of vision screening programs should be cognisant that where detection of uncorrected hyperopic and/or astigmatic refractive error is an aspiration, current UK protocols will not effectively deliver.
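    The sensitivity and specificity figures above come from the usual 2×2 screening table. A minimal helper (the example counts are illustrative, not the NICER data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and predictive values from a 2x2 screening table.
    tp: refractive error present and screening failed; tn: absent and passed; etc."""
    return {
        "sensitivity": tp / (tp + fn),   # proportion of true cases flagged
        "specificity": tn / (tn + fp),   # proportion of non-cases passed
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only (not the study's): 100 myopes, 600 non-myopes.
m = screening_metrics(tp=92, fp=60, fn=8, tn=540)
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with the prevalence of refractive error in the screened population, which is why screening protocols are usually specified by the first two.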

  12. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  13. A Method for Improving Reliability of Radiation Detection using Deep Learning Framework

    International Nuclear Information System (INIS)

    Chang, Hojong; Kim, Tae-Ho; Han, Byunghun; Kim, Hyunduk; Kim, Ki-duk

    2017-01-01

    Radiation detection is an essential technology for the overall field of radiation and nuclear engineering. Conventionally, radiation detection software relies on tables mapping input spectra to output spectra prepared in advance, which requires simulating numerous predicted output spectra using parameters that model the spectrum. In this paper, we propose a new technique to improve the performance of radiation detectors. The software in radiation detectors has stagnated for some time, with possible intrinsic simulation error. In the proposed method, the input source is predicted from the output spectrum measured by the radiation detector using a deep neural network. With a highly complex model, we expect that the complex relationship between the data and the labels can be captured well. Furthermore, a radiation detector should be calibrated regularly and beforehand; we propose a method to calibrate radiation detectors using a GAN. We hope that the power of deep learning may also reach radiation detectors and bring major improvement to the field. With an improved radiation detector, the reliability of detection would be assured, and many tasks remain to be solved using deep learning in the nuclear engineering community.
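    The paper's deep-network approach predicts the input source from the measured output spectrum. As a much-simplified stand-in (a single softmax layer rather than a deep network; the 8-bin "template" spectra and isotope labels are invented for illustration, not real spectra), the sketch below trains a multinomial logistic classifier on noisy synthetic spectra:

```python
import math, random

random.seed(0)

# Illustrative 8-bin "template" output spectra for three hypothetical sources.
TEMPLATES = {
    "Cs-137": [0, 1, 8, 2, 0, 0, 0, 0],
    "Co-60":  [0, 0, 0, 1, 6, 1, 5, 1],
    "Ba-133": [5, 7, 1, 0, 0, 0, 0, 0],
}
LABELS = sorted(TEMPLATES)

def sample(label):
    """Noisy measured spectrum: Gaussian jitter on the template counts."""
    return [max(0.0, c + random.gauss(0, 0.3)) for c in TEMPLATES[label]]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def train(data, lr=0.1, epochs=300):
    """SGD on multinomial logistic regression: weights W[k][j] for class k, bin j."""
    nbins = len(data[0][0])
    W = [[0.0] * nbins for _ in LABELS]
    b = [0.0] * len(LABELS)
    for _ in range(epochs):
        for x, y in data:
            z = [sum(wj * xj for wj, xj in zip(W[k], x)) + b[k] for k in range(len(LABELS))]
            p = softmax(z)
            for k in range(len(LABELS)):
                err = (1.0 if LABELS[k] == y else 0.0) - p[k]   # cross-entropy gradient
                b[k] += lr * err
                for j in range(nbins):
                    W[k][j] += lr * err * x[j]
    return W, b

def predict(x, W, b):
    z = [sum(wj * xj for wj, xj in zip(W[k], x)) + b[k] for k in range(len(LABELS))]
    return LABELS[max(range(len(LABELS)), key=lambda k: z[k])]

data = [(sample(lab), lab) for lab in LABELS for _ in range(30)]
W, b = train(data)
acc = sum(predict(x, W, b) == y for x, y in data) / len(data)
```

A deep network replaces the single linear layer with stacked nonlinear layers, which is what lets it capture overlapping photopeaks and detector-response effects that a linear classifier cannot.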

  14. Are the Projections of Future Climate Change Reliable in the IPCC Reports?

    Institute of Scientific and Technical Information of China (English)

    Zongci Zhao

    2011-01-01

    As we know, the projections of future climate change, including impacts and strategies, in the IPCC Assessment Reports were based on global climate models driven by scenarios of various human activities. Global climate model simulations provide key inputs for climate change assessments. In this study, the main objective is to analyze whether the projections of future climate change by global climate models are reliable. Several workshops have been held on this issue, such as the IPCC expert meeting on assessing and combining multi-model climate projections in January 2010 (chaired by the co-chairs of the IPCC WGI and WGII AR5), and the workshop of the combined global climate model group held by NCAR in June 2010.

  15. Designing a reliable leak bio-detection system for natural gas pipelines

    International Nuclear Information System (INIS)

    Batzias, F.A.; Siontorou, C.G.; Spanidis, P.-M.P.

    2011-01-01

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece.
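    Fuzzy Multicriteria Analysis of the kind referenced above can be sketched as a weighted sum of triangular fuzzy ratings followed by centroid defuzzification. The design alternatives, criteria, ratings, and weights below are hypothetical, not the paper's actual data:

```python
def tfn_weighted_sum(scores, weights):
    """Weighted sum of triangular fuzzy numbers (l, m, u) with crisp weights."""
    l = sum(w * s[0] for w, s in zip(weights, scores))
    m = sum(w * s[1] for w, s in zip(weights, scores))
    u = sum(w * s[2] for w, s in zip(weights, scores))
    return (l, m, u)

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# Hypothetical biosensor designs rated on (sensitivity, stability, cost) with
# triangular fuzzy ratings on a 0-10 scale; weights sum to 1. Values illustrative.
designs = {
    "enzyme-based": [(6, 7, 8), (4, 5, 6), (7, 8, 9)],
    "whole-cell":   [(5, 6, 7), (8, 9, 10), (5, 6, 7)],
    "DNA-aptamer":  [(7, 8, 9), (5, 6, 7), (2, 3, 4)],
}
weights = [0.5, 0.3, 0.2]

ranked = sorted(designs,
                key=lambda d: defuzzify(tfn_weighted_sum(designs[d], weights)),
                reverse=True)
```

The fuzzy ratings let experts express uncertainty about how a candidate design suits the target analyte and the operational micro-environment; the final crisp score is only produced at the defuzzification step.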

  16. Designing a reliable leak bio-detection system for natural gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Batzias, F.A., E-mail: fbatzi@unipi.gr [Univ. Piraeus, Dept. Industrial Management and Technology, Karaoli and Dimitriou 80, 18534 Piraeus (Greece); Siontorou, C.G., E-mail: csiontor@unipi.gr [Univ. Piraeus, Dept. Industrial Management and Technology, Karaoli and Dimitriou 80, 18534 Piraeus (Greece); Spanidis, P.-M.P., E-mail: pspani@asprofos.gr [Asprofos Engineering S.A, El. Venizelos 284, 17675 Kallithea (Greece)

    2011-02-15

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece.

  17. Designing a reliable leak bio-detection system for natural gas pipelines.

    Science.gov (United States)

    Batzias, F A; Siontorou, C G; Spanidis, P-M P

    2011-02-15

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Diagnostic reliability of 3.0-T MRI for detecting osseous abnormalities of the temporomandibular joint.

    Science.gov (United States)

    Sawada, Kunihiko; Amemiya, Toshihiko; Hirai, Shigenori; Hayashi, Yusuke; Suzuki, Toshihiro; Honda, Masahiko; Sisounthone, Johnny; Matsumoto, Kunihito; Honda, Kazuya

    2018-01-01

    We compared the diagnostic reliability of 3.0-T magnetic resonance imaging (MRI) for detection of osseous abnormalities of the temporomandibular joint (TMJ) with that of the gold standard, cone-beam computed tomography (CBCT). Fifty-six TMJs were imaged with CBCT and MRI, and images of condyles and fossae were independently assessed for the presence of osseous abnormalities. The accuracy, sensitivity, and specificity of 3.0-T MRI were 0.88, 1.0, and 0.73, respectively, in condyle evaluation and 0.91, 0.75, and 0.95 in fossa evaluation. The McNemar test showed no significant difference (P > 0.05) between MRI and CBCT in the evaluation of osseous abnormalities in condyles and fossae. The present results indicate that 3.0-T MRI is equal to CBCT in the diagnostic evaluation of osseous abnormalities of the mandibular condyle.
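The accuracy/sensitivity/specificity figures and the McNemar comparison used in this record are standard 2x2 computations; a small sketch follows. The confusion-matrix and discordant-pair counts are hypothetical, not the study's data.

```python
import math

def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity and specificity from a 2x2 confusion matrix."""
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

def mcnemar_p(b, c):
    """McNemar chi-square p-value (1 df) from discordant-pair counts:
    b = positive on one modality only, c = positive on the other only."""
    chi2 = (b - c) ** 2 / (b + c)
    # Survival function of chi-square with 1 df: P = erfc(sqrt(x / 2))
    return math.erfc(math.sqrt(chi2 / 2.0))

# Hypothetical counts: MRI vs. a reference standard.
m = diagnostic_metrics(tp=30, fp=6, tn=16, fn=0)
p = mcnemar_p(5, 1)  # 5 discordant one way, 1 the other
```

With so few discordant pairs the McNemar p-value stays above 0.05, mirroring the "no significant difference" conclusion reported above.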

  19. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1997-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  20. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  1. Recent and future evolution of the conception of French PWR facing safety and reliability

    International Nuclear Information System (INIS)

    Vignon, D.; Morin, R.; Brisbois, J.

    1987-11-01

    The French program of PWR construction (REP, 54 units) led to an original approach to safety. This approach is now complete, and all of the adopted provisions have been taken into account in the design of the new standardized plant series N4. For the future, following this rationalization of the safety approach, research on simplification of the design is planned. The new design is based on operating experience feedback and on the results of the probabilistic studies of the 900 and 1300 MWe reactors. This rationalization, the new concepts and the search for simplification are illustrated by concrete examples in this presentation [fr]

  2. Photogrammetry: an accurate and reliable tool to detect thoracic musculoskeletal abnormalities in preterm infants.

    Science.gov (United States)

    Davidson, Josy; dos Santos, Amelia Miyashiro N; Garcia, Kessey Maria B; Yi, Liu C; João, Priscila C; Miyoshi, Milton H; Goulart, Ana Lucia

    2012-09-01

    To analyse the accuracy and reproducibility of photogrammetry in detecting thoracic abnormalities in infants born prematurely. Cross-sectional study. The Premature Clinic at the Federal University of São Paulo. Fifty-eight infants born prematurely in their first year of life. Measurement of the manubrium/acromion/trapezius angle (degrees) and the deepest thoracic retraction (cm). Digitised photographs were analysed by two blinded physiotherapists using a computer program (SAPO; http://SAPO.incubadora.fapesp.br) to detect shoulder elevation and thoracic retraction. Physical examinations performed independently by two physiotherapists were used to assess the accuracy of the new tool. Thoracic alterations were detected in 39 (67%) and in 40 (69%) infants by Physiotherapists 1 and 2, respectively (kappa coefficient=0.80). Using a receiver operating characteristic curve, measurement of the manubrium/acromion/trapezius angle and the deepest thoracic retraction indicated accuracies of 0.79 and 0.91, respectively. For measurement of the manubrium/acromion/trapezius angle, the Bland and Altman limits of agreement were -6.22 to 7.22° [mean difference (d)=0.5] for repeated measures by one physiotherapist, and -5.29 to 5.79° (d=0.75) between two physiotherapists. For thoracic retraction, the intra-rater limits of agreement were -0.14 to 0.18cm (d=0.02) and the inter-rater limits of agreement were -0.20 to -0.17cm (d=0.02). SAPO provided an accurate and reliable tool for the detection of thoracic abnormalities in preterm infants. Copyright © 2011 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
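The Bland-Altman limits of agreement quoted in this record are the mean paired difference plus or minus 1.96 standard deviations of the differences; a minimal sketch, using invented paired angle measurements rather than the study's data:

```python
import statistics

def bland_altman_limits(a, b):
    """Mean difference and 95% limits of agreement (d ± 1.96 SD)
    between two raters' paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    d = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return d, d - 1.96 * sd, d + 1.96 * sd

# Hypothetical paired angle measurements (degrees) by two physiotherapists.
rater1 = [100.0, 95.0, 102.0, 98.0, 97.0]
rater2 = [99.0, 96.0, 100.0, 100.0, 97.0]
d, lo, hi = bland_altman_limits(rater1, rater2)
```

Narrow limits around a mean difference near zero indicate that the two raters can be used interchangeably, which is the interpretation made above.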

  3. Reliability of the exercise ECG in detecting silent ischemia in patients with prior myocardial infarction

    International Nuclear Information System (INIS)

    Yamagishi, Takashi; Matsuda, Yasuo; Satoh, Akira

    1991-01-01

    To assess the reliability of the exercise ECG in detecting silent ischemia, ECG results were compared with those of stress-redistribution thallium-201 single-photon emission computed tomography (SPECT) in 116 patients with prior myocardial infarction and in 20 normal subjects used as a control. The left ventricle (LV) was divided into 20 segmental images, which were scored blindly on a 5-point scale. The redistribution score was defined as thallium defect score of exercise subtracted by that of redistribution image and was used as a measure of amount of ischemic but viable myocardium. The upper limit of normal redistribution score (=4.32) was defined as mean+2 standard deviations derived from 20 normal subjects. Of 116 patients, 47 had the redistribution score above the normal range. Twenty-five (53%) of the 47 patients showed positive ECG response. Fourteen (20%) of the 69 patients, who had the normal redistribution score, showed positive ECG response. Thus, the ECG response had a sensitivity of 53% and a specificity of 80% in detecting transient ischemia. Furthermore, the 116 patients were subdivided into 4 groups according to the presence or absence of chest pain and ECG change during exercise. Fourteen patients showed both chest pain and ECG change and all these patients had the redistribution score above the normal range. Twenty-five patients showed ECG change without chest pain and 11 (44%) of the 25 patients had the abnormal redistribution. Three (43%) of 7 patients who showed chest pain without ECG change had the abnormal redistribution score. Of 70 patients who had neither chest pain nor ECG change, 19 (27%) had the redistribution score above the normal range. Thus, limitations exist in detecting silent ischemia by ECG in patients with a prior myocardial infarction, because the ECG response to the exercise test may have a low degree of sensitivity and a high degree of false positive and false negative results in detecting silent ischemia. (author)
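The sensitivity and specificity reported in this record follow directly from the counts stated in the abstract; a quick arithmetic check on those stated numbers:

```python
# Counts stated in the abstract: 47 patients had an abnormal redistribution
# score, 25 of whom had a positive ECG; 69 patients had normal scores,
# 14 of whom were ECG-positive (leaving 55 true negatives).
sensitivity = 25 / 47          # positive ECG among truly ischemic patients
specificity = (69 - 14) / 69   # negative ECG among non-ischemic patients

assert round(sensitivity * 100) == 53
assert round(specificity * 100) == 80
```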

  4. Prevent cervical cancer by screening with reliable human papillomavirus detection and genotyping

    International Nuclear Information System (INIS)

    Ge, Shichao; Gong, Bo; Cai, Xushan; Yang, Xiaoer; Gan, Xiaowei; Tong, Xinghai; Li, Haichuan; Zhu, Meijuan; Yang, Fengyun; Zhou, Hongrong; Hong, Guofan

    2012-01-01

    The incidence of cervical cancer is expected to rise sharply in China. A reliable routine human papillomavirus (HPV) detection and genotyping test to be supplemented by the limited Papanicolaou cytology facilities is urgently needed to help identify the patients with cervical precancer for preventive interventions. To this end, we evaluated a nested polymerase chain reaction (PCR) protocol for detection of HPV L1 gene DNA in cervicovaginal cells. The PCR amplicons were genotyped by direct DNA sequencing. In parallel, split samples were subjected to a Digene HC2 HPV test which has been widely used for “cervical cancer risk” screen. Of the 1826 specimens, 1655 contained sufficient materials for analysis and 657 were truly negative. PCR/DNA sequencing showed 674 infected by a single high-risk HPV, 188 by a single low-risk HPV, and 136 by multiple HPV genotypes with up to five HPV genotypes in one specimen. In comparison, the HC2 test classified 713 specimens as infected by high-risk HPV, and 942 as negative for HPV infections. The high-risk HC2 test correctly detected 388 (57.6%) of the 674 high-risk HPV isolates in clinical specimens, mislabeled 88 (46.8%) of the 188 low-risk HPV isolates as high-risk genotypes, and classified 180 (27.4%) of the 657 “true-negative” samples as being infected by high-risk HPV. It was found to cross-react with 20 low-risk HPV genotypes. We conclude that nested PCR detection of HPV followed by short target DNA sequencing can be used for screening and genotyping to formulate a paradigm in clinical management of HPV-related disorders in a rapidly developing economy

  5. Results of the reliability benchmark exercise and the future CEC-JRC program

    International Nuclear Information System (INIS)

    Amendola, A.

    1985-01-01

    As a contribution towards identifying problem areas and assessing probabilistic safety assessment (PSA) methods and procedures of analysis, JRC has organized a wide-ranging Benchmark Exercise on systems reliability. It was executed by ten different teams involving seventeen organizations from nine European countries. The exercise was based on a real case (the Auxiliary Feedwater System of the EDF Paluel PWR 1300 MWe Unit), starting from analysis of technical specifications, logical and topological layout, and operational procedures. Terms of reference included both qualitative and quantitative analyses. The subdivision of the exercise into different phases and the rules adopted allowed assessment of the different components of the spread of the overall results. It appeared that modelling uncertainties may overwhelm data uncertainties, and major efforts must be spent in order to improve the consistency and completeness of qualitative analysis. After successful completion of the first exercise, the CEC-JRC program has planned separate exercises on the analysis of dependent failures and human factors before approaching the evaluation of a complete accident sequence

  6. Diffusion-weighted MR imaging in postoperative follow-up: Reliability for detection of recurrent cholesteatoma

    Energy Technology Data Exchange (ETDEWEB)

    Cimsit, Nuri Cagatay [Marmara University Hospital, Department of Radiology, Istanbul (Turkey); Engin Sitesi Peker Sokak No:1 D:13, 34330 Levent, Istanbul (Turkey)], E-mail: cagataycimsit@gmail.com; Cimsit, Canan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)], E-mail: ccimsit@ttmail.com; Baysal, Begumhan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)], E-mail: begumbaysal@yahoo.com; Ruhi, Ilteris Cagatay [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)], E-mail: cruhi@yahoo.com; Ozbilgen, Suha [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)], E-mail: sozbilgen@yahoo.com; Aksoy, Elif Ayanoglu [Acibadem Bakirkoy Hospital, Department of ENT, Istanbul (Turkey); Acibadem Hastanesi, KBB Boeluemue, Bakirkoey, Istanbul (Turkey)], E-mail: elifayanoglu@yahoo.com

    2010-04-15

    Introduction: Cholesteatoma is a progressively growing process that destroys the neighboring bony structures; treatment is surgical removal. Follow-up is important in the postoperative period, since further surgery is necessary if recurrence is present, but not if granulation tissue is detected. This study evaluates whether diffusion-weighted MR imaging alone can be a reliable alternative to CT, without use of a contrast agent, for follow-up of postoperative patients in detecting recurrent cholesteatoma. Materials and methods: 26 consecutive patients reporting for routine follow-up CT after mastoidectomy were included in the study if there was loss of middle ear aeration on CT examination. MR images were evaluated for loss of aeration and signal intensity changes on diffusion-weighted sequences. Surgical results were compared with imaging findings. Results: Interpretation of MR images paralleled the loss of aeration detected on CT for all 26 patients. Of the 26 patients examined, 14 were not evaluated as recurrent cholesteatoma, and this was verified with surgery (NPV: 100%). Twelve patients were diagnosed as recurrent cholesteatoma, and 11 were surgically confirmed as recurrent cholesteatoma (PPV: 91.7%). Four of these 11 patients had a loss-of-aeration area larger than the high-signal-intensity area on DWI, which was surgically confirmed as granulation tissue or fibrosis accompanying recurrent cholesteatoma. Conclusion: Diffusion-weighted MR for suspected recurrent cholesteatoma is a valuable tool to cut costs and prevent unnecessary second-look surgeries. It has the potential to become the MR sequence of choice to differentiate recurrent cholesteatoma from other causes of loss of aeration in patients with mastoidectomy.
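The predictive values quoted in this record come straight from the surgical outcome counts; a sketch using the counts reported in the abstract (12 MRI-positive patients with 11 surgically confirmed, 14 MRI-negative patients all confirmed negative):

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive values from outcome counts."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# From the abstract: 11 true positives, 1 false positive,
# 14 true negatives, 0 false negatives.
ppv, npv = predictive_values(tp=11, fp=1, tn=14, fn=0)
```

This reproduces the reported PPV of 91.7% and NPV of 100%.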

  7. Acoustic particle detection - From early ideas to future benefits

    International Nuclear Information System (INIS)

    Nahnhauer, Rolf

    2012-01-01

    The history of acoustic neutrino detection technology is briefly reviewed, from the first ideas 50 years ago to the detailed R and D programs of the last decade. The physics potential of ultra-high energy neutrino interaction studies is discussed for some examples. Ideas about the necessary detector size and suitable design are presented.

  8. Conflict Detection and Resolution for Future Air Transportation Management

    Science.gov (United States)

    Krozel, Jimmy; Peters, Mark E.; Hunter, George

    1997-01-01

    With a Free Flight policy, the emphasis for air traffic control is shifting from active control to passive air traffic management with a policy of intervention by exception. Aircraft will be allowed to fly user preferred routes, as long as safety Alert Zones are not violated. If there is a potential conflict, two (or more) aircraft must be able to arrive at a solution for conflict resolution without controller intervention. Thus, decision aid tools are needed in Free Flight to detect and resolve conflicts, and several problems must be solved to develop such tools. In this report, we analyze and solve problems of proximity management, conflict detection, and conflict resolution under a Free Flight policy. For proximity management, we establish a system based on Delaunay Triangulations of aircraft at constant flight levels. Such a system provides a means for analyzing the neighbor relationships between aircraft and the nearby free space around air traffic which can be utilized later in conflict resolution. For conflict detection, we perform both 2-dimensional and 3-dimensional analyses based on the penetration of the Protected Airspace Zone. Both deterministic and non-deterministic analyses are performed. We investigate several types of conflict warnings including tactical warnings prior to penetrating the Protected Airspace Zone, methods based on the reachability overlap of both aircraft, and conflict probability maps to establish strategic Alert Zones around aircraft.
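The deterministic 2-D conflict detection step can be illustrated with a closest-point-of-approach (CPA) check against a protected-zone radius. This is a generic sketch, not the report's actual algorithm; the positions, velocities and 5-unit radius are arbitrary assumptions.

```python
def cpa_conflict(p1, v1, p2, v2, radius):
    """Return (conflict, t_cpa, d_cpa) for two aircraft flying straight
    at constant velocity: will they penetrate the protected zone?"""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    vv = vx * vx + vy * vy
    # Time of closest approach, clamped to the future (t >= 0).
    t = 0.0 if vv == 0 else max(0.0, -(rx * vx + ry * vy) / vv)
    dx, dy = rx + vx * t, ry + vy * t
    d = (dx * dx + dy * dy) ** 0.5
    return d < radius, t, d

# Head-on encounter 10 units apart, closing at 2 units per minute:
# separation reaches zero at t = 5, so the protected zone is penetrated.
hit, t, d = cpa_conflict((0, 0), (1, 0), (10, 0), (-1, 0), radius=5.0)
```

The same routine reports no conflict for co-speed parallel tracks whose lateral offset already exceeds the protected radius.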

  9. Detection of local non-Gaussianity with future observations

    International Nuclear Information System (INIS)

    Li Hong; Liu Jie

    2012-01-01

    In this Letter we estimate the primordial non-Gaussianity (PNG) by simulating future observations. We use the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) as an example and focus on the cross-correlation signal between the galaxies and the Integrated Sachs-Wolfe (ISW) effect of the CMB. Our result is optimistic. It shows the potential of LAMOST, particularly its quasar survey, in probing the PNG via the ISW-galaxy cross-correlation. This study is particularly relevant because LAMOST is almost parallel to the timetable of the upcoming high-precision Planck satellite.

  10. Analysis of Balancing Requirements in Future Sustainable and Reliable Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Frunt, J.

    2011-06-01

    This thesis elaborates on the rules for power balancing, provides a method for quantifying balancing requirements and examines the effect of future changes on balancing. Chapter 2 elaborates on system balancing and the different actors and entities in the electricity delivery system. The necessity and implementation of power balancing are explained. Also different subsequent markets (i.e., day-ahead markets, intraday markets and imbalance settlement systems) and options to trade electricity are discussed. As the research focusses mainly on the Netherlands, properties of the Dutch imbalance settlement system are analyzed. Based on this framework an in-depth analysis of imbalances and calls for balancing capacity with the corresponding prices is given. This shows the incentives to minimize the amount of imbalance in the system and to participate in the imbalance settlement system. Chapter 3 elaborates on the level of aggregation that the entities, involved in the imbalance settlement system, in electricity markets can have. Based on current market rules, incentives to either grow or shrink and by aggregating more or less entities are discussed. The level of aggregation will directly influence the functioning of the imbalance settlement system. It is shown that larger aggregations benefit more from the canceling out of imbalances. The imbalances of the Netherlands and Belgium have been aggregated to illustrate the possible benefits of aggregating multiple national imbalance settlement systems. The increased penetration of renewable generation strongly influences the planning and operation of the power system. As many renewable energy generators have a fluctuating power output, several methods are discussed in chapter 4 that can be used to classify and quantify the balancing requirements to counteract these fluctuations. Chapter 4 discusses the multiple existing classes of balancing capacity and the corresponding methods to quantify their needs. Due to the

  11. Analysis of Balancing Requirements in Future Sustainable and Reliable Power Systems

    International Nuclear Information System (INIS)

    Frunt, J.

    2011-01-01

    This thesis elaborates on the rules for power balancing, provides a method for quantifying balancing requirements and examines the effect of future changes on balancing. Chapter 2 elaborates on system balancing and the different actors and entities in the electricity delivery system. The necessity and implementation of power balancing are explained. Also different subsequent markets (i.e., day-ahead markets, intraday markets and imbalance settlement systems) and options to trade electricity are discussed. As the research focusses mainly on the Netherlands, properties of the Dutch imbalance settlement system are analyzed. Based on this framework an in-depth analysis of imbalances and calls for balancing capacity with the corresponding prices is given. This shows the incentives to minimize the amount of imbalance in the system and to participate in the imbalance settlement system. Chapter 3 elaborates on the level of aggregation that the entities, involved in the imbalance settlement system, in electricity markets can have. Based on current market rules, incentives to either grow or shrink and by aggregating more or less entities are discussed. The level of aggregation will directly influence the functioning of the imbalance settlement system. It is shown that larger aggregations benefit more from the canceling out of imbalances. The imbalances of the Netherlands and Belgium have been aggregated to illustrate the possible benefits of aggregating multiple national imbalance settlement systems. The increased penetration of renewable generation strongly influences the planning and operation of the power system. As many renewable energy generators have a fluctuating power output, several methods are discussed in chapter 4 that can be used to classify and quantify the balancing requirements to counteract these fluctuations. Chapter 4 discusses the multiple existing classes of balancing capacity and the corresponding methods to quantify their needs. Due to the

  12. OCT4 and SOX2 are reliable markers in detecting stem cells in odontogenic lesions

    Directory of Open Access Journals (Sweden)

    Abhishek Banerjee

    2016-01-01

    Context (Background): Stem cells are a unique subpopulation of cells in the human body with a capacity to initiate differentiation into various cell lines. Tumor stem cells (TSCs) are a unique subpopulation of cells that possess the ability to initiate a neoplasm and sustain self-renewal. Epithelial stem cell (ESC) markers such as octamer-binding transcription factor 4 (OCT4) and sex-determining region Y (SRY)-box 2 (SOX2) are capable of identifying these stem cells, which are expressed during the early stages of tooth development. Aims: To detect the expression of the stem cell markers OCT4 and SOX2 in normal odontogenic tissues and in odontogenic cysts and tumors. Materials and Methods: Paraffin sections of follicular tissue, radicular cyst, dentigerous cyst, odontogenic keratocyst, ameloblastoma, adenomatoid odontogenic tumor, and ameloblastic carcinoma were obtained from the archives. The sections were subjected to immunohistochemical assay with mouse monoclonal antibodies to OCT4 and SOX2. Statistical Analysis: The results were evaluated by descriptive analysis. Results: The results show the presence of stem cells in the normal and lesional tissues with these stem-cell-identifying markers. SOX2 was found to be more consistent and reliable in the detection of stem cells. Conclusion: Stem cell expression is maintained through the tumor transformation of tissue, probably suggesting that there is no phenotypic change of stem cells in the progression from the normal embryonic state to the tumor component. The quantification and localization reveal interesting trends that indicate the probable role of these cells in the pathogenesis of the lesions.

  13. Reliability of magnetic resonance imaging for the detection of hypopituitarism in children with optic nerve hypoplasia.

    Science.gov (United States)

    Ramakrishnaiah, Raghu H; Shelton, Julie B; Glasier, Charles M; Phillips, Paul H

    2014-01-01

    It is essential to identify hypopituitarism in children with optic nerve hypoplasia (ONH) because they are at risk for developmental delay, seizures, or death. The purpose of this study is to determine the reliability of neurohypophyseal abnormalities on magnetic resonance imaging (MRI) for the detection of hypopituitarism in children with ONH. Cross-sectional study. One hundred one children with clinical ONH who underwent MRI of the brain and orbits and a detailed pediatric endocrinologic evaluation. Magnetic resonance imaging studies were performed on 1.5-Tesla scanners. The imaging protocol included sagittal T1-weighted images, axial fast fluid-attenuated inversion-recovery/T2-weighted images, and diffusion-weighted images of the brain. Orbital imaging included fat-saturated axial and coronal images and high-resolution axial T2-weighted images. The MRI studies were reviewed by 2 pediatric neuroradiologists for optic nerve hypoplasia, absent or ectopic posterior pituitary, absent pituitary infundibulum, absent septum pellucidum, migration anomalies, and hemispheric injury. Medical records were reviewed for clinical examination findings and endocrinologic status. All patients underwent a clinical evaluation by a pediatric endocrinologist and a standardized panel of serologic testing that included serum insulin-like growth factor-1, insulin-like growth factor binding protein-3, prolactin, cortisol, adrenocorticotropic hormone, thyroid-stimulating hormone, and free thyroxine levels. Radiologists were masked to patients' endocrinologic status and funduscopic findings. Sensitivity and specificity of MRI findings for the detection of hypopituitarism. Neurohypophyseal abnormalities, including absent pituitary infundibulum, ectopic posterior pituitary bright spot, and absent posterior pituitary bright spot, occurred in 33 children. Magnetic resonance imaging disclosed neurohypophyseal abnormalities in 27 of the 28 children with hypopituitarism (sensitivity, 96%). A

  14. Human reliability-based MC and A models for detecting insider theft

    International Nuclear Information System (INIS)

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-01-01

    Material control and accounting (MC and A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC and A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC and A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC and A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.
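The core quantitative idea of folding MC and A detection probabilities into a path's overall detection probability alongside sensor elements can be sketched as an independent-layers calculation. This is an illustrative assumption, not the extended path analysis methodology itself, and the layer probabilities below are invented.

```python
from functools import reduce

def path_detection_probability(layer_probs):
    """Probability that at least one protection layer along an insider's
    path detects the theft attempt, assuming independent layers."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), layer_probs, 1.0)

# Hypothetical layers along one path: a sensor (0.6), an MC&A inventory
# check (0.5) and a records audit (0.3).
p_detect = path_detection_probability([0.6, 0.5, 0.3])
```

The point of the record above is precisely that the MC and A terms (here 0.5 and 0.3, which an HRA would supply) can now enter this calculation on the same footing as sensor data.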

  15. How often should we monitor for reliable detection of atrial fibrillation recurrence? Efficiency considerations and implications for study design.

    Directory of Open Access Journals (Sweden)

    Efstratios I Charitos

    Although atrial fibrillation (AF) recurrence is unpredictable in terms of onset and duration, current intermittent rhythm monitoring (IRM) diagnostic modalities are short-termed and discontinuous. The aim of the present study was to investigate the IRM frequency required to reliably detect recurrence of various AF recurrence patterns. The rhythm histories of 647 patients (mean AF burden: 12 ± 22% of monitored time; 687 patient-years) with implantable continuous monitoring devices were reconstructed and analyzed. With the use of computationally intensive simulation, we evaluated the IRM frequency necessary to reliably detect AF recurrence of various AF phenotypes using IRM of various durations. The IRM frequency required for reliable AF detection depends on the amount and temporal aggregation of the AF recurrence (p < 0.0001). Detection of AF recurrence with >95% sensitivity required higher IRM frequencies (>12 24-hour, >6 7-day, >4 14-day, or >3 30-day IRM per year; p < 0.0001) than currently recommended. Lower IRM frequencies will under-detect AF recurrence and introduce significant bias in the evaluation of therapeutic interventions. More frequent but shorter IRMs (24-hour) are significantly more time-effective (sensitivity per monitored time) than a smaller number of longer IRM durations (p < 0.0001). Reliable AF recurrence detection requires higher IRM frequencies than currently recommended; current IRM frequency recommendations will fail to diagnose a significant proportion of patients. Shorter-duration but more frequent IRM strategies are significantly more efficient than longer IRM durations. Unique identifier: NCT00806689.
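The trade-off this study quantifies can be mimicked with a toy Monte Carlo simulation. Everything here is a simplifying assumption (one contiguous AF episode per year, 1-day monitorings evenly spaced, uniformly random episode onset), but it illustrates why more frequent short monitorings detect more recurrences than a few long ones.

```python
import random

def irm_sensitivity(episode_days, n_monitors, days=365, trials=5000, seed=7):
    """Estimated probability that n evenly spaced 1-day intermittent
    monitorings catch one contiguous AF episode of the given length,
    placed uniformly at random over the year."""
    monitors = {int(i * days / n_monitors) % days for i in range(n_monitors)}
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        start = rng.randrange(days)
        episode = {(start + d) % days for d in range(episode_days)}
        if monitors & episode:
            hits += 1
    return hits / trials

# A 30-day episode: 12 yearly monitorings catch it far more often than 4.
s12 = irm_sensitivity(30, 12)
s4 = irm_sensitivity(30, 4)
```

Under these assumptions detection probability climbs steeply with monitoring frequency, consistent with the study's conclusion that current IRM frequency recommendations under-detect recurrence.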

  16. Towards achieving a reliable leakage detection and localization algorithm for application in water piping networks: an overview

    CSIR Research Space (South Africa)

    Adedeji, KB

    2017-09-01

    Leakage detection and localization in pipelines have become an important aspect of water management systems. Since monitoring leakage in large-scale water distribution networks (WDNs) is a challenging task, the need to develop a reliable and robust...

  17. Osteoarthritis: detection, pathophysiology, and current/future treatment strategies.

    Science.gov (United States)

    Sovani, Sujata; Grogan, Shawn P

    2013-01-01

    Osteoarthritis (OA) is a disease of the joint, and age is the major risk factor for its development. Clinical manifestation of OA includes joint pain, stiffness, and loss of mobility. Currently, no pharmacological treatments are available to treat this specific joint disease; only symptom-modifying drugs are available. Improvement in imaging technology, identification of biomarkers, and increased understanding of the molecular basis of OA will aid in detecting the early stages of disease. Yet the development of interventional strategies remains elusive and will be critical for effective prevention of OA-associated joint destruction. The potential of cell-based therapies may be applicable in improving joint function in mild to more advanced cases of OA. Ongoing studies to understand the basis of this disease will eventually lead to prevention and treatment strategies and will also be a key in reducing the social and economic burden of this disease. Nurses are advised to provide an integrative approach of disease assessment and management in OA patients' care with a focus on education and implementation. Knowledge and understanding of OA and how this affects the individual patient form the basis for such an integrative approach to all-round patient care and disease management.

  18. Reliable Detection and Smart Deletion of Malassez Counting Chamber Grid in Microscopic White Light Images for Microbiological Applications.

    Science.gov (United States)

    Denimal, Emmanuel; Marin, Ambroise; Guyot, Stéphane; Journaux, Ludovic; Molin, Paul

    2015-08-01

    In biology, hemocytometers such as Malassez slides are widely used and are effective tools for counting cells manually. In a previous work, a robust algorithm was developed for grid extraction in Malassez slide images. This algorithm was evaluated on a set of 135 images and grids were accurately detected in most cases, but there remained failures for the most difficult images. In this work, we present an optimization of this algorithm that allows for 100% grid detection and a 25% improvement in grid positioning accuracy. These improvements make the algorithm fully reliable for grid detection. This optimization also allows complete erasing of the grid without altering the cells, which eases their segmentation.

  19. Reliability, validity and minimal detectable change of the Mini-BESTest in Greek participants with chronic stroke.

    Science.gov (United States)

    Lampropoulou, Sofia I; Billis, Evdokia; Gedikoglou, Ingrid A; Michailidou, Christina; Nowicky, Alexander V; Skrinou, Dimitra; Michailidi, Fotini; Chandrinou, Danae; Meligkoni, Margarita

    2018-02-23

    This study aimed to investigate the psychometric characteristics of reliability, validity and ability to detect change of a newly developed balance assessment tool, the Mini-BESTest, in Greek patients with stroke. A prospective, observational study with test-retest measures was conducted. A convenience sample of 21 Greek patients with chronic stroke (14 male, 7 female; age 63 ± 16 years) was recruited. Two independent examiners administered the scale for the inter-rater reliability, twice within 10 days for the test-retest reliability. Bland-Altman analysis for repeated measures assessed the absolute reliability, and the Standard Error of Measurement (SEM) and the Minimal Detectable Change at the 95% confidence level (MDC95%) were established. The Greek Mini-BESTest (Mini-BESTestGR) was correlated with the Greek Berg Balance Scale (BBSGR) to assess concurrent validity, and with the Timed Up and Go (TUG), the Functional Reach Test (FRT) and the Greek Falls Efficacy Scale-International (FES-IGR) to assess convergent validity. The Mini-BESTestGR demonstrated excellent inter-rater reliability (ICC (95% CI) = 0.997 (0.995-0.999), SEM = 0.46), with the scores of the two raters within the limits of agreement (mean diff = -0.143 ± 0.727, p > 0.05), and excellent test-retest reliability (ICC (95% CI) = 0.966 (0.926-0.988), SEM = 1.53). Additionally, the Mini-BESTestGR yielded very strong to moderate correlations with the BBSGR (r = 0.924) and the other measures. The excellent reliability and the equally good validity of the Mini-BESTestGR strongly support its utility in Greek people with chronic stroke. Its ability to identify clinically meaningful changes and falls risk needs further investigation.
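    The SEM and MDC statistics reported above follow standard formulas, SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM. A minimal Python sketch (the sample SD used here is an illustrative assumption, not a value taken from the study):

```python
import math

def sem(sd: float, icc: float) -> float:
    """Standard Error of Measurement from the sample SD and reliability (ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem_value: float) -> float:
    """Minimal Detectable Change at the 95% confidence level (two occasions)."""
    return 1.96 * math.sqrt(2.0) * sem_value

# Illustrative numbers: an assumed SD of 8.3 points combined with the
# reported test-retest ICC of 0.966 reproduces an SEM of ~1.53 points.
s = sem(8.3, 0.966)
print(round(s, 2))         # SEM in scale points
print(round(mdc95(s), 2))  # smallest change exceeding measurement error
```

    Any retest change smaller than the MDC95 value is indistinguishable from measurement error at the 95% confidence level.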

  20. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models.

    Science.gov (United States)

    Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E

    2018-05-10

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. This paper presents the design and implementation of a self-tuning methodology to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors, and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving assistance user scenario connecting a five-sensor LiDAR network is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrate that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds.
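    As a hedged illustration of the Q-learning step described above, the sketch below learns a sensor count in a single-state tabular setting; the reward model, cost coefficient, and action set are invented for the example and are not taken from the paper:

```python
# Toy reward: detection reliability saturates with more sensors,
# while each extra sensor adds a fixed cost (values are illustrative only).
def reward(n_sensors: int) -> float:
    detection_gain = 1.0 - 0.5 ** n_sensors
    return detection_gain - 0.08 * n_sensors

actions = [1, 2, 3, 4, 5]          # candidate sensor counts
q = {a: 0.0 for a in actions}      # tabular Q-values
alpha = 0.5                        # learning rate

# Single-state task: the Q-learning target reduces to the immediate
# reward (there is no bootstrapped next-state term).
for _ in range(50):
    for a in actions:
        q[a] += alpha * (reward(a) - q[a])

best_n = max(q, key=q.get)
print(best_n)  # sensor count with the best learned reliability/cost trade-off
```

    With this toy reward, the Q-values converge to the immediate rewards and the greedy action picks the point where adding another sensor no longer pays for its cost.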

  1. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    International Nuclear Information System (INIS)

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm² areas and ≥2% in ∼20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer-term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image-guided, adaptive, and arc therapies, to be quantified.
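    A simplified Python sketch of the detection strategy described above: relative pixel intensity deviations are median-filtered to suppress background noise, then thresholded. Gradient scaling, image registration, and ROC-tuned thresholds are omitted, and all names and values are illustrative:

```python
from statistics import median

def median_filter3(img):
    """3x3 median filter with edge clamping, to suppress background noise."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            neigh = [img[min(max(i + di, 0), h - 1)][min(max(j + dj, 0), w - 1)]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            out[i][j] = median(neigh)
    return out

def detect_anomalies(reference, measured, threshold=0.02):
    """Flag pixels whose relative intensity deviation (PID) exceeds
    `threshold` after median filtering."""
    pid = [[(m - r) / r for r, m in zip(row_r, row_m)]
           for row_r, row_m in zip(reference, measured)]
    return [[abs(v) >= threshold for v in row] for row in median_filter3(pid)]
```

    For example, a small region delivered 5% hot inside an otherwise matching field is flagged at its centre, while isolated single-pixel noise is removed by the filter before thresholding.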

  2. 2-D or 3-D Mammography?: The Future of Breast Cancer Detection | NIH MedlinePlus the Magazine

    Science.gov (United States)

    ... 2-D or 3-D Mammography?: The Future of Breast Cancer Detection NIH- ... will test two types of imaging tools—2-D and 3-D mammography. 2-D mammography takes ...

  3. Consortium for Electric Reliability Technology Solutions Grid of the Future White Paper on Review of Recent Reliability Issues and Systems Events

    Energy Technology Data Exchange (ETDEWEB)

    Hauer, John F.; Dagle, Jeffery E.

    1999-12-01

    This report is one of six reports developed under the U.S. Department of Energy (DOE) program in Power System Integration and Reliability (PSIR). The objective of this report is to review, analyze, and evaluate critical reliability issues demonstrated by recent disturbance events in the North American power system. Eleven major disturbances are examined, most occurring in this decade. The strategic challenge is that the pattern of technical need has persisted for a long period of time. For more than a decade, anticipation of market deregulation has been a major disincentive to new investments in system capacity. It has also inspired reduced maintenance of existing assets. A massive infusion of better technology is emerging as the final option for continuing reliable electrical service. If investments in better technology are not made in a timely manner, North America should plan its adjustment to a very different level of electrical service. It is apparent that technical operations staff among the utilities can be very effective at marshaling their forces in the immediate aftermath of a system emergency, and that serious disturbances often lead to improved mechanisms for coordinated operation. It is not at all apparent that such efforts can be sustained through voluntary reliability organizations in which utility personnel external to those organizations do most of the technical work. The Eastern Interconnection shows several situations in which much of the technical support has migrated from the utilities to the Independent System Operator (ISO), and the ISO staffs or shares staff with the regional reliability council. This process may be a natural and very positive consequence of utility restructuring. If so, the process should be expedited in regions where it is less advanced.

  4. A Type-2 fuzzy data fusion approach for building reliable weighted protein interaction networks with application in protein complex detection.

    Science.gov (United States)

    Mehranfar, Adele; Ghadiri, Nasser; Kouhsar, Morteza; Golshani, Ashkan

    2017-09-01

    Detecting protein complexes is an important task in analyzing protein interaction networks. Although many algorithms predict protein complexes in different ways, surveys of interaction networks indicate that about 50% of detected interactions are false positives. Consequently, the accuracy of existing methods needs to be improved. In this paper we propose a novel algorithm to detect protein complexes in 'noisy' protein interaction data. First, we integrate several biological data sources to determine the reliability of each interaction and assign more accurate weights to the interactions. A data fusion component is used for this step, based on an interval type-2 fuzzy voter that provides an efficient combination of the information sources. This fusion component detects errors and diminishes their effect on the detection of protein complexes; thus, in the first step, a reliability score is assigned to every interaction in the network. In the second step, we propose a general protein complex detection algorithm that exploits and adopts the strong points of other algorithms and existing hypotheses regarding real complexes. Finally, the proposed method is applied to yeast interaction datasets for predicting the interactions. The results show that our framework achieves better precision and F-measure than existing approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. The Threat of Uncertainty: Why Using Traditional Approaches for Evaluating Spacecraft Reliability are Insufficient for Future Human Mars Missions

    Science.gov (United States)

    Stromgren, Chel; Goodliff, Kandyce; Cirillo, William; Owens, Andrew

    2016-01-01

    Through the Evolvable Mars Campaign (EMC) study, the National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). A key aspect of these missions is the strategy employed to maintain and repair the spacecraft systems, ensuring that they continue to function and support the crew. Long-duration missions beyond LEO present unique and severe maintainability challenges due to a variety of factors, including: limited to no opportunities for resupply, the distance from Earth, mass and volume constraints of spacecraft, high sensitivity of transportation element designs to variation in mass, the lack of abort opportunities to Earth, limited hardware heritage information, and the operation of human-rated systems in a radiation environment with little to no experience. The current approach to maintainability, as implemented on ISS, which includes a large number of spares pre-positioned on ISS, a larger supply sitting on Earth waiting to be flown to ISS, and on-demand delivery of logistics from Earth, is not feasible for future deep space human missions. For missions beyond LEO, significant modifications to the maintainability approach will be required. Through the EMC evaluations, several key findings related to the reliability and safety of the Mars spacecraft have been made. The nature of random and induced failures presents significant issues for deep space missions. Because spare parts cannot be flown as needed for Mars missions, all required spares must be flown with the mission or pre-positioned. These spares must cover all anticipated failure modes and provide a level of overall reliability and safety that is satisfactory for human missions. This will require a large amount of mass and volume to be dedicated to storage and transport of spares for the mission. Further, there is, and will continue to be, a significant amount of uncertainty regarding failure rates for spacecraft

  6. Technologies for improving the availability and reliability of current and future water cooled nuclear power plants. Proceedings of a technical committee meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-11-01

    One of the activities of the IAEA is to provide all Member States with an international source of balanced, objective information on advances in technology for water cooled reactors. Since the global nuclear industry has a common interest in improving plant availability and reliability, both from the perspective of individual plants and countries and to project the image of a well-managed, competitive industry, the IAEA held a Technical Committee Meeting on Technologies for Improving the Availability and Reliability of Current and Future Water Cooled Nuclear Power Plants in September 1997. The basic aim was to identify, review and exchange information on international developments in technologies for achieving high availability and reliability, and to suggest areas where further technical advances could contribute to improved performance. Designs for future plants were presented in the context of how they can accommodate both the organizational and technical means for reaching even higher levels of performance. These proceedings contain the contributed papers presented at this Meeting, each with a separate abstract. Four sessions were concerned with: policies, practices and procedures for achieving high reliability and availability; improving availability and reliability through better use of today's technologies; recent advances in technologies for improving availability and reliability; and achieving high availability for new plants. Refs, figs, tabs

  7. Technologies for improving the availability and reliability of current and future water cooled nuclear power plants. Proceedings of a technical committee meeting

    International Nuclear Information System (INIS)

    1998-11-01

    One of the activities of the IAEA is to provide all Member States with an international source of balanced, objective information on advances in technology for water cooled reactors. Since the global nuclear industry has a common interest in improving plant availability and reliability, both from the perspective of individual plants and countries and to project the image of a well-managed, competitive industry, the IAEA held a Technical Committee Meeting on Technologies for Improving the Availability and Reliability of Current and Future Water Cooled Nuclear Power Plants in September 1997. The basic aim was to identify, review and exchange information on international developments in technologies for achieving high availability and reliability, and to suggest areas where further technical advances could contribute to improved performance. Designs for future plants were presented in the context of how they can accommodate both the organizational and technical means for reaching even higher levels of performance. These proceedings contain the contributed papers presented at this Meeting, each with a separate abstract. Four sessions were concerned with: policies, practices and procedures for achieving high reliability and availability; improving availability and reliability through better use of today's technologies; recent advances in technologies for improving availability and reliability; and achieving high availability for new plants.

  8. The National Opportunity for Interoperability and its Benefits for a Reliable, Robust, and Future Grid Realized Through Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Office of Energy Efficiency and Renewable Energy

    2016-02-01

    Today, increasing numbers of intermittent generation sources (e.g., wind and photovoltaic) and new mobile intermittent loads (e.g., electric vehicles) can significantly affect traditional utility business practices and operations. At the same time, a growing number of technologies and devices, from appliances to lighting systems, are being deployed at consumer premises that have more sophisticated controls and information that remain underused for anything beyond basic building equipment operations. The intersection of these two drivers is an untapped opportunity and underused resource that, if appropriately configured and realized in open standards, can provide significant energy efficiency and commensurate savings on utility bills, enhanced and lower cost reliability to utilities, and national economic benefits in the creation of new markets, sectors, and businesses being fueled by the seamless coordination of energy and information through device and technology interoperability. Or, as the Quadrennial Energy Review puts it, “A plethora of both consumer-level and grid-level devices are either in the market, under development, or at the conceptual stage. When tied together through the information technology that is increasingly being deployed on electric utilities’ distribution grids, they can be an important enabling part of the emerging grid of the future. However, what is missing is the ability for all of these devices to coordinate and communicate their operations with the grid, and among themselves, in a common language — an open standard.” In this paper, we define interoperability as the ability to exchange actionable information between two or more systems within a home or building, or across and within organizational boundaries. Interoperability relies on the shared meaning of the exchanged information, with agreed-upon expectations and consequences, for the response to the information exchange.

  9. Fueling our future: four steps to a new reliable, cleaner, decentralized energy supply based on Hydrogen and fuel cells

    International Nuclear Information System (INIS)

    Evers, A.

    2005-01-01

    In examining various market strategies, this presentation demonstrates the possible driving factors and necessary elements needed to move Hydrogen and Fuel Cells (H2/FC) to worldwide commercialisation. Focusing not only on the technology itself, this presentation looks at the 'bigger picture', explaining how certain trends have impacted the progress of new technology developments in the past. The presentation demonstrates how these models can be applied to our present-day situation. In this process, the consumer has played and will continue to play the key and leading role. Due to such strong influence, the consumer will ultimately fuel the future of H2/FC commercialisation by a desire for new and not yet discovered products and services. Examining different Distributed Generation scenarios, the catalyst to the Hydrogen Economy may be found through distributed generation via fuel cells. One possible step could be the use of Personal Power Cars equipped with Fuel Cells which not only drive on Hydrogen but also supply (while stationary) electricity and heat to residential and commercial buildings. The incentive for car owners driving and using these vehicles is twofold: either save (at home) or earn (at the office) money while their cars are parked and plugged into buildings via smart docking stations available at key parking sites. Cars parked in the garage at home will supply electricity to the home and additionally replace the function of the existing boiler. Car owners can earn money by selling the electricity generated (but not needed at that time) to the utilities, feeding it into the existing electricity grid. The interdependence between supply and consumer-driven demand (or better, demand and supply) and other examples are explained. The steps necessary to achieve a new, reliable, cleaner and decentralized energy supply based on H2/FC are also presented and examined. (author)

  10. Assessment of scapular positioning and function as future effect measure of shoulder interventions – an inter-examiner reliability study of the clinical assessment methods

    DEFF Research Database (Denmark)

    Larsen, Camilla Marie; Eshøj, Henrik; Ingwersen, Kim Gordon

    2015-01-01

    Assessment of scapular positioning and function as future effect measure of shoulder interventions – an inter-examiner reliability study of the clinical assessment methods Eshøj H1, Ingwersen KG1, Larsen CM1, 2, Søgaard K1, Juul-Kristensen B1, 3 1 University of Southern Denmark, Institute of Sports...... only been tested for intra-examiner reliability. The objective was to investigate the inter-examiner reliability of an extended battery of clinical tests for assessing scapular positioning and function. Methods A standardized three-phase protocol for clinical reliability studies was conducted...... coefficients (ICC) and kappa values were interpreted as: 0.0-0.40 (poor); 0.40-0.75 (fair to good); and 0.75-1.00 (good to excellent). Results A total of 41 subjects (23 males, yrs 25±9), were recruited among adult overhead athletes from the municipality of Odense, DK. Prevalence of the index condition was 54...

  11. Electronic logic to enhance switch reliability in detecting openings and closures of redundant switches

    Science.gov (United States)

    Cooper, James A.

    1986-01-01

    A logic circuit is used to enhance redundant switch reliability. Two or more switches are monitored for logical high or low output. The logic circuit produces a redundant and failsafe representation of the switch outputs. When both switch outputs are high, the output is high. Similarly, when both switch outputs are low, the logic circuit's output is low. When the output states of the two switches do not agree, the circuit resolves the conflict by memorizing the last state in which both switches simultaneously agreed and producing the logical complement of that state. Thus, the logic circuit of the present invention allows the redundant switches to be treated as if they were in parallel when the switches are open and as if they were in series when the switches are closed. A failsafe system having maximum reliability is thereby produced.
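    The memory behavior described above can be sketched in Python as a software analogue of the hardware logic (class and variable names are illustrative, not from the patent):

```python
class RedundantSwitchLogic:
    """Failsafe combination of two redundant switch outputs."""

    def __init__(self, initial_state: bool = False):
        # Last state in which both switches agreed.
        self.last_agreed = initial_state

    def output(self, sw1: bool, sw2: bool) -> bool:
        if sw1 == sw2:
            self.last_agreed = sw1   # memorize the agreed state
            return sw1
        # On disagreement, emit the complement of the last agreed state:
        # a single switch opening (from closed) or closing (from open)
        # is enough to flip the output, giving the series-when-closed /
        # parallel-when-open behavior described above.
        return not self.last_agreed

logic = RedundantSwitchLogic()
print(logic.output(True, True))    # both closed -> high
print(logic.output(False, True))   # one opens  -> low (series-like)
print(logic.output(False, False))  # both open  -> low
print(logic.output(True, False))   # one closes -> high (parallel-like)
```

    The stored "last agreed" state is what lets a single failed switch drive the combined output, so no single failure can be masked.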

  12. Reliability of ultrasonography in detecting shoulder disease in patients with rheumatoid arthritis.

    LENUS (Irish Health Repository)

    Bruyn, G A W

    2009-03-01

    To assess the intra- and interobserver reproducibility of musculoskeletal ultrasonography (US) among rheumatologists in detecting destructive and inflammatory shoulder abnormalities in patients with rheumatoid arthritis (RA) and to determine the overall agreement between US and MRI.

  13. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Taghezouit, Bilal; Saidi, Ahmed; Hamlati, Mohamed-Elkarim

    2017-01-01

    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of one

  14. A simple and reliable methodology to detect egg white in art samples

    Indian Academy of Sciences (India)

    2013-04-26

    Apr 26, 2013 ... threshold density values useful for the detection of ovalbumin in samples from ancient works of art. .... slides a mixture of a water solution of dry egg white and the .... ily, facing the problems of sample leakage, background.

  15. Reliability of ultrasonography in detecting shoulder disease in patients with rheumatoid arthritis

    NARCIS (Netherlands)

    Bruyn, G. A. W.; Naredo, E.; Moeller, I.; Moragues, C.; Garrido, J.; de Bock, G. H.; d'Agostino, M-A; Filippucci, E.; Iagnocco, A.; Backhaus, M.; Swen, W. A. A.; Balint, P.; Pineda, C.; Milutinovic, S.; Kane, D.; Kaeley, G.; Narvaez, F. J.; Wakefield, R. J.; Narvaez, J. A.; de Augustin, J.; Schmidt, W. A.; Moller, I.; Swen, N.; de Agustin, J.

    Objective: To assess the intra- and interobserver reproducibility of musculoskeletal ultrasonography (US) among rheumatologists in detecting destructive and inflammatory shoulder abnormalities in patients with rheumatoid arthritis (RA) and to determine the overall agreement between US and MRI.

  16. Implanted cardiac devices are reliably detected by commercially available metal detectors

    DEFF Research Database (Denmark)

    Holm, Katja Fiedler; Hjortshøj, Søren Pihlkjær; Pehrson, Steen

    2013-01-01

    Explosions of Cardiovascular Implantable Electronic Devices (CIEDs) (pacemakers, defibrillators, and loop recorders) are a well-recognized problem during cremation, due to lithium-iodine batteries. In addition, burial of the deceased with a CIED can present a potential risk for environmental...... contamination. Therefore, detection of CIEDs in the deceased would be of value. This study evaluated a commercially available metal detector for detecting CIEDs....

  17. Reliability, Validity, and Minimal Detectable Change of Balance Evaluation Systems Test and Its Short Versions in Older Cancer Survivors: A Pilot Study.

    Science.gov (United States)

    Huang, Min H; Miller, Kara; Smith, Kristin; Fredrickson, Kayle; Shilling, Tracy

    2016-01-01

    Cancer is primarily a disease of older adults. About 77% of all cancers are diagnosed in persons aged 55 years and older. Cancer and its treatment can cause diverse sequelae impacting body systems underlying balance control. No study has examined the psychometric properties of balance assessment tools in older cancer survivors, presenting a significant challenge in the selection of outcome measures for clinicians treating this fast-growing population. This study aimed to determine the reliability, validity, and minimal detectable change (MDC) of the Balance Evaluation Systems Test (BESTest), Mini-Balance Evaluation Systems Test (Mini-BESTest), and Brief-Balance Evaluation Systems Test (Brief-BESTest) in community-dwelling older cancer survivors. This study used a cross-sectional design. Twenty breast and 8 prostate cancer survivors participated [age (SD) = 68.4 (8.13) years]. The BESTest and Activity-specific Balance Confidence (ABC) Scale were administered during the first session. Scores of the Mini-BESTest and Brief-BESTest were extracted on the basis of the scores of the BESTest. The BESTest was repeated within 1 to 2 weeks by the same rater to determine the test-retest reliability. For the analysis of the inter-rater reliability, 21 participants were randomly selected to be evaluated by 2 raters. A primary rater administered the test. The 2 raters independently and concurrently scored the performance of the participants. Each rater recorded the ratings separately on the scoring sheet. No discussion among the raters was allowed throughout the testing. Intraclass correlation coefficients (ICCs), standard error of measurement, minimal detectable change (MDC), and Bland-Altman plots were calculated. Concurrent validity of these balance tests with the ABC Scale was examined using the Spearman correlation. The BESTest, Mini-BESTest, and Brief-BESTest had high test-retest (ICC = 0.90-0.94) and inter-rater reliability (ICC = 0.86-0.96), small standard error of measurement (0

  18. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Directory of Open Access Journals (Sweden)

    Yury Fedotov

    2016-01-01

    Full Text Available Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by a YAG:Nd laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. The results indicate that this fluorescence ratio can be considered a reliable indicator of plant stress state.

  19. Autism detection in early childhood (ADEC): reliability and validity data for a Level 2 screening tool for autistic disorder.

    Science.gov (United States)

    Nah, Yong-Hwee; Young, Robyn L; Brewer, Neil; Berlingeri, Genna

    2014-03-01

    The Autism Detection in Early Childhood (ADEC; Young, 2007) was developed as a Level 2 clinician-administered autistic disorder (AD) screening tool that was time-efficient, suitable for children under 3 years, easy to administer, and suitable for persons with minimal training and experience with AD. A best estimate clinical Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) diagnosis of AD was made for 70 children using all available information and assessment results, except for the ADEC data. A screening study compared these children on the ADEC with 57 children with other developmental disorders and 64 typically developing children. Results indicated high internal consistency (α = .91). Interrater reliability and test-retest reliability of the ADEC were also adequate. ADEC scores reliably discriminated different diagnostic groups after controlling for nonverbal IQ and Vineland Adaptive Behavior Composite scores. Construct validity (using exploratory factor analysis) and concurrent validity using performance on the Autism Diagnostic Observation Schedule (Lord et al., 2000), the Autism Diagnostic Interview-Revised (Le Couteur, Lord, & Rutter, 2003), and DSM-IV-TR criteria were also demonstrated. Signal detection analysis identified the optimal ADEC cutoff score, with the ADEC identifying all children who had an AD (N = 70, sensitivity = 1.0) but overincluding children with other disabilities (N = 13, specificity ranging from .74 to .90). Together, the reliability and validity data indicate that the ADEC has potential to be established as a suitable and efficient screening tool for infants with AD.

  20. Technical Note: The single particle soot photometer fails to reliably detect PALAS soot nanoparticles

    Directory of Open Access Journals (Sweden)

    M. Gysel

    2012-12-01

    Full Text Available The single particle soot photometer (SP2 uses laser-induced incandescence (LII for the measurement of atmospheric black carbon (BC particles. The BC mass concentration is obtained by combining quantitative detection of BC mass in single particles with a counting efficiency of 100% above its lower detection limit. It is commonly accepted that a particle must contain at least several tenths of a femtogram BC in order to be detected by the SP2.

    Here we show that most BC particles from a PALAS spark discharge soot generator remain undetected by the SP2, even if their BC mass, as independently determined with an aerosol particle mass analyser (APM), is clearly above the typical lower detection limit of the SP2. Comparison of counting efficiency and effective density data of PALAS soot with flame-generated soot (combustion aerosol standard burner, CAST), fullerene soot and carbon black particles (Cabot Regal 400R) reveals that particle morphology can affect the SP2's lower detection limit. PALAS soot particles are fractal-like agglomerates of very small primary particles with a low fractal dimension, resulting in a very low effective density. Such loosely packed particles behave like "the sum of individual primary particles" in the SP2's laser. Accordingly, most PALAS soot particles remain undetected, as the SP2's laser intensity is insufficient to heat the primary particles to their vaporisation temperature because of their small size (Dpp ≈ 5–10 nm). Previous knowledge from pulsed laser-induced incandescence indicated that particle morphology might have an effect on the SP2's lower detection limit; however, an increase of the lower detection limit by a factor of ∼5–10, as reported here for PALAS soot, was not expected.

    In conclusion, the SP2's lower detection limit at a certain laser power depends primarily on the total BC mass per particle for compact particles with sufficiently high effective density.

  1. Test-retest reliability and minimal detectable change of two simplified 3-point balance measures in patients with stroke.

    Science.gov (United States)

    Chen, Yi-Miau; Huang, Yi-Jing; Huang, Chien-Yu; Lin, Gong-Hong; Liaw, Lih-Jiun; Lee, Shih-Chieh; Hsieh, Ching-Lin

    2017-10-01

    The 3-point Berg Balance Scale (BBS-3P) and 3-point Postural Assessment Scale for Stroke Patients (PASS-3P) were simplified from the BBS and PASS to overcome the complex scoring systems. The BBS-3P and PASS-3P were more feasible in busy clinical practice and showed similarly sound validity and responsiveness to the original measures. However, the reliability of the BBS-3P and PASS-3P is unknown, limiting their utility and the interpretability of scores. We aimed to examine the test-retest reliability and minimal detectable change (MDC) of the BBS-3P and PASS-3P in patients with stroke. Cross-sectional study. The rehabilitation departments of a medical center and a community hospital. A total of 51 chronic stroke patients (64.7% male). Both balance measures were administered twice, 7 days apart. The test-retest reliability of both the BBS-3P and PASS-3P was examined by intraclass correlation coefficients (ICC). The MDC and its percentage over the total score (MDC%) of each measure were calculated to examine random measurement error. The ICC values of the BBS-3P and PASS-3P were 0.99 and 0.97, respectively. The MDC% (MDC) of the BBS-3P and PASS-3P were 9.1% (5.1 points) and 8.4% (3.0 points), respectively, indicating that both measures had small and acceptable random measurement errors. Our results showed that both the BBS-3P and the PASS-3P had good test-retest reliability, with small and acceptable random measurement error. These two simplified 3-level balance measures can provide reliable results over time. Our findings support the repeated administration of the BBS-3P and PASS-3P to monitor the balance of patients with stroke. The MDC values can help clinicians and researchers interpret change scores more precisely.
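    MDC values like those above are conventionally derived from the ICC and the standard deviation of the scores via MDC = 1.96 × √2 × SEM, where SEM = SD × √(1 − ICC). A minimal sketch of that calculation, using illustrative numbers rather than the study's raw data:

```python
import math

def minimal_detectable_change(sd_baseline, icc, z=1.96):
    """MDC at the 95% level: z * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC)."""
    sem = sd_baseline * math.sqrt(1.0 - icc)
    return z * math.sqrt(2.0) * sem

# Illustrative values only (not taken from the study):
mdc = minimal_detectable_change(sd_baseline=10.0, icc=0.99)
print(round(mdc, 2))  # 2.77
```

    Dividing the result by the scale's total score gives the MDC% figure reported in the abstract.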

  2. Three dimensional quantitative coronary angiography can detect reliably ischemic coronary lesions based on fractional flow reserve.

    Science.gov (United States)

    Chung, Woo-Young; Choi, Byoung-Joo; Lim, Seong-Hoon; Matsuo, Yoshiki; Lennon, Ryan J; Gulati, Rajiv; Sandhu, Gurpreet S; Holmes, David R; Rihal, Charanjit S; Lerman, Amir

    2015-06-01

    Conventional coronary angiography (CAG) has limitations in evaluating lesions producing ischemia. Three dimensional quantitative coronary angiography (3D-QCA) shows reconstructed images of CAG using a computer-based algorithm, the Cardio-op B system (Paieon Medical, Rosh Ha'ayin, Israel). The aim of this study was to evaluate whether 3D-QCA can reliably predict ischemia assessed by myocardial fractional flow reserve (FFR) < 0.80. 3D-QCA images were reconstructed from CAG studies that were also evaluated with FFR to assess ischemia. Minimal luminal diameter (MLD), percent diameter stenosis (%DS), minimal luminal area (MLA), and percent area stenosis (%AS) were obtained. The results of 3D-QCA and FFR were compared. A total of 266 patients were enrolled in the present study. FFR for all lesions ranged from 0.57 to 1.00 (0.85 ± 0.09). Measurements of MLD, %DS, MLA, and %AS were all significantly correlated with FFR (r = 0.569, 0.609, 0.569, 0.670, respectively, all P < 0.001). In lesions with MLA < 4.0 mm(2), a %AS of more than 65.5% had an 80% sensitivity and an 83% specificity to predict FFR < 0.80 (area under the curve, AUC, was 0.878). 3D-QCA can reliably predict coronary lesions producing ischemia and may be used to guide the therapeutic approach for coronary artery disease.
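    Signal-detection analyses of this kind typically choose the cutoff maximizing Youden's J (sensitivity + specificity − 1) along the ROC curve. A self-contained sketch with hypothetical %AS values and ischemia labels (made-up data, not the study's):

```python
def youden_optimal_cutoff(scores, labels):
    """Return (cutoff, sensitivity, specificity) maximizing Youden's J.
    labels: 1 = ischemic (e.g. FFR < 0.80), 0 = not; higher score = more stenotic."""
    best = None
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= c)
        fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < c)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < c)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= c)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best[1:]

# Hypothetical %AS measurements and ischemia labels:
scores = [40, 50, 55, 60, 66, 70, 75, 80]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
print(youden_optimal_cutoff(scores, labels))  # (66, 1.0, 1.0)
```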

  3. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    KAUST Repository

    Harrou, Fouzi

    2017-09-18

    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of the one-diode model and those of the univariate and multivariate exponentially weighted moving average (EWMA) charts to better detect faults. Specifically, we generate the array's residuals of current, voltage and power using measured temperature and irradiance. These residuals capture the difference between the measured values and the one-diode model's maximum power point (MPP) predictions of current, voltage and power, and are used as fault indicators. Then, we apply the multivariate EWMA (MEWMA) monitoring chart to the residuals to detect faults. However, a MEWMA scheme cannot identify the type of fault. Once a fault is detected in the MEWMA chart, the univariate EWMA chart based on the current and voltage indicators is used to identify the type of fault (e.g., short-circuit, open-circuit and shading faults). We applied this strategy to real data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria. Results show the capacity of the proposed strategy to monitor the DC side of PV systems and detect partial shading.
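    As a rough illustration of a univariate EWMA chart applied to a residual fault indicator, the generic sketch below flags points where the EWMA statistic leaves its time-varying control limits. This is not the authors' implementation; the smoothing constant λ = 0.2 and limit width L = 3 are common textbook choices, and the residual stream is simulated:

```python
import math

def ewma_alarm(residuals, lam=0.2, L=3.0, mu0=0.0, sigma=1.0):
    """Return 1-based indices where the EWMA statistic exceeds its control limits.
    z_t = lam*x_t + (1-lam)*z_{t-1}; the limits widen toward
    +/- L*sigma*sqrt(lam/(2-lam)) as t grows."""
    z = mu0
    alarms = []
    for t, x in enumerate(residuals, start=1):
        z = lam * x + (1 - lam) * z
        half_width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(z - mu0) > half_width:
            alarms.append(t)
    return alarms

# In-control noise followed by a sustained shift (simulating a fault):
data = [0.1, -0.2, 0.05, 0.0, -0.1] + [2.5] * 5
print(ewma_alarm(data))  # [8, 9, 10]
```

    The chart needs a few samples after the shift before the smoothed statistic crosses the limit, which is the usual trade-off between EWMA smoothing and detection delay.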

  4. Reliability and validity of the KIPPPI: an early detection tool for psychosocial problems in toddlers.

    Directory of Open Access Journals (Sweden)

    Ingrid Kruizinga

    Full Text Available BACKGROUND: The KIPPPI (Brief Instrument Psychological and Pedagogical Problem Inventory) is a Dutch questionnaire that measures psychosocial and pedagogical problems in 2-year-olds and consists of a KIPPPI Total score, Wellbeing scale, Competence scale, and Autonomy scale. This study examined the reliability, validity, screening accuracy and clinical application of the KIPPPI. METHODS: Parents of 5959 2-year-old children in the Rotterdam area, the Netherlands, were invited to participate in the study. Parents of 3164 children (53.1% of all invited parents) completed the questionnaire. The internal consistency was evaluated and, in subsamples, so were the test-retest reliability and concurrent validity with regard to the Child Behavior Checklist (CBCL). Discriminative validity was evaluated by comparing the scores of parents who worried about their child's upbringing and parents who did not. Screening accuracy of the KIPPPI was evaluated against the CBCL by calculating Receiver Operating Characteristic (ROC) curves. The clinical application was evaluated by the relation between KIPPPI scores and the clinical decisions made by child health professionals. RESULTS: Psychometric properties of the KIPPPI Total score, Wellbeing scale, Competence scale and Autonomy scale were, respectively: Cronbach's alphas: 0.88, 0.86, 0.83, 0.58; test-retest correlations: 0.80, 0.76, 0.73, 0.60. Concurrent validity was as hypothesised. The KIPPPI was able to discriminate between parents who worried about their child and parents who did not. Screening accuracy was high (>0.90) for the KIPPPI Total score and for the Wellbeing scale. The KIPPPI scale scores and the clinical decision of the child health professional were related (p<0.05), indicating good clinical application. CONCLUSION: The results in this large-scale study of a diverse general population sample support the reliability, validity and clinical application of the KIPPPI Total score, Wellbeing scale and Competence scale.

  5. Acoustic feedwater heater leak detection: Industry application of low- and high-frequency detection increases response and reliability

    International Nuclear Information System (INIS)

    Woyshner, W.S.; Bryson, T.; Robertson, M.O.

    1993-01-01

    The Electric Power Research Institute has sponsored research on acoustic Feedwater Heater Leak Detection (FWHLD) since the early 1980s. Results indicate that this technology is economically beneficial and dependable. Recent research has employed acoustic sensors and signal conditioning with wider frequency response and background noise elimination techniques to provide increased accuracy and dependability. Dual frequency sensors have been applied at a few facilities to provide information on this application of dual frequency response. Sensor mounting methods and attenuation due to various mounting configurations are now more conclusively understood; these are depicted and discussed in detail. The significance of trending certain plant parameters, such as heat cycle flows, heater vent and drain valve position, and proper relief valve operation, is also addressed. Test data were collected at various facilities to monitor the effect of varying several related operational parameters. A group of FWHLD users has been involved since the inception of the project; reports on their latest successes and failures, along with various data depicting early detection of FWHLD tube leaks, will be included. 3 refs., 12 figs., 1 tab

  6. FISHing for bacteria in food--a promising tool for the reliable detection of pathogenic bacteria?

    Science.gov (United States)

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-04-01

    Foodborne pathogens cause millions of infections every year and are responsible for considerable economic losses worldwide. The current gold standard for the detection of bacterial pathogens in food is still conventional cultivation following standardized and generally accepted protocols. However, these methods are time-consuming and do not provide fast information about food contamination, and thus are limited in their ability to protect consumers in time from potential microbial hazards. Fluorescence in situ hybridization (FISH) represents a rapid and highly specific technique for whole-cell detection. This review aims to summarize the current data on FISH testing for the detection of pathogenic bacteria in different food matrices and to evaluate its suitability for implementation in routine testing. In this context, the use of FISH in different matrices and their pretreatment will be presented, the sensitivity and specificity of FISH tests will be considered, and the need for automation will be discussed, as well as the use of technological improvements to overcome current hurdles to a broad application in monitoring food safety. In addition, the overall economic feasibility will be assessed in a rough calculation of costs, and the strengths and weaknesses of FISH are considered in comparison with traditional and well-established detection methods. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Multivariate normative comparison, a novel method for more reliably detecting cognitive impairment in HIV infection

    NARCIS (Netherlands)

    Su, Tanja; Schouten, Judith; Geurtsen, Gert J.; Wit, Ferdinand W.; Stolte, Ineke G.; Prins, Maria; Portegies, Peter; Caan, Matthan W. A.; Reiss, Peter; Majoie, Charles B.; Schmand, Ben A.

    2015-01-01

    The objective of this study is to assess whether multivariate normative comparison (MNC) improves detection of HIV-1-associated neurocognitive disorder (HAND) as compared with Frascati and Gisslén criteria. One-hundred and three HIV-1-infected men with suppressed viremia on combination

  8. Reliability of using retinal vascular fractal dimension as a biomarker in the diabetic retinopathy detection

    NARCIS (Netherlands)

    Huang, F.; Dashtbozorg, B.; Zhang, J.; Bekkers, E.J.; Abbasi-Sureshjani, S.; Berendschot, T.T.J.M.; ter Haar Romenij, B.M.

    2016-01-01

    The retinal fractal dimension (FD) is a measure of vasculature branching pattern complexity. FD has been considered as a potential biomarker for the detection of several diseases like diabetes and hypertension. However, conflicting findings were found in the reported literature regarding the

  9. Comparison of specificity and sensitivity of immunochemical and molecular techniques for reliable detection of Erwinia amylovora

    Czech Academy of Sciences Publication Activity Database

    Kokošková, B.; Mráz, Ivan; Hýblová, Jana

    2007-01-01

    Roč. 52, č. 2 (2007), s. 175-182 ISSN 0015-5632 R&D Projects: GA AV ČR(CZ) 1QS500510558 Institutional research plan: CEZ:AV0Z50510513 Keywords : Erwinia amylovora * detection Subject RIV: EE - Microbiology, Virology Impact factor: 0.989, year: 2007

  10. Sensitive and reliable detection of genomic imbalances in human neuroblastomas using comparative genomic hybridisation analysis

    NARCIS (Netherlands)

    van Gele, M.; van Roy, N.; Jauch, A.; Laureys, G.; Benoit, Y.; Schelfhout, V.; de Potter, C. R.; Brock, P.; Uyttebroeck, A.; Sciot, R.; Schuuring, E.; Versteeg, R.; Speleman, F.

    1997-01-01

    Deletions of the short arm of chromosome 1, extra copies of chromosome 17q and MYCN amplification are the most frequently encountered genetic changes in neuroblastomas. Standard techniques for detection of one or more of these genetic changes are karyotyping, FISH analysis and LOH analysis by

  11. Chromogenic in situ hybridization is a reliable assay for detection of ALK rearrangements in adenocarcinomas of the lung.

    Science.gov (United States)

    Schildhaus, Hans-Ulrich; Deml, Karl-Friedrich; Schmitz, Katja; Meiboom, Maren; Binot, Elke; Hauke, Sven; Merkelbach-Bruse, Sabine; Büttner, Reinhard

    2013-11-01

    Reliable detection of anaplastic lymphoma kinase (ALK) rearrangements is a prerequisite for personalized treatment of lung cancer patients, as ALK rearrangements represent a predictive biomarker for therapy with specific tyrosine kinase inhibitors. Currently, fluorescence in situ hybridization (FISH) is considered the standard method for assessing formalin-fixed and paraffin-embedded tissue for ALK inversions and translocations. However, FISH requires specialized equipment, the signals fade rapidly, and it is difficult to assess overall morphology and tumor heterogeneity. Chromogenic in situ hybridization (CISH) has been successfully introduced as an alternative test for the detection of several genetic aberrations. This study validates a newly developed ALK CISH assay by comparing FISH and CISH signal patterns in lung cancer samples with and without ALK rearrangements. One hundred adenocarcinomas of the lung were included in this study, among them 17 with known ALK rearrangement. FISH and CISH were carried out and evaluated according to the manufacturers' recommendations. For both assays, tumors were considered positive if ≥15% of tumor cells showed either isolated 3' signals or break-apart patterns or a combination of both. A subset of tumors was exemplarily examined using a novel EML4 (echinoderm microtubule-associated protein-like 4) CISH probe. Red, green and fusion CISH signals were clear-cut and the different signal patterns were easily recognized. The percentage of aberrant tumor cells was highly correlated between FISH and CISH. On the basis of 86 samples that were evaluable by ALK CISH, we found a 100% sensitivity and 100% specificity of this assay. Furthermore, EML4 rearrangements could be recognized by CISH. CISH is a highly reliable, sensitive and specific method for the detection of ALK gene rearrangements in pulmonary adenocarcinomas. Our results suggest that CISH might serve as a suitable alternative to FISH, the current gold standard.

  12. Linear SVM-Based Android Malware Detection for Reliable IoT Services

    Directory of Open Access Journals (Sweden)

    Hyo-Sik Ham

    2014-01-01

    Full Text Available Currently, many Internet of Things (IoT) services are monitored and controlled through smartphone applications. By combining IoT with smartphones, many convenient IoT services have been provided to users. However, such services have adverse underlying effects, including invasion of privacy and information leakage. In most cases, mobile devices have become cluttered with important personal user information as various services and contents are provided through them. Accordingly, attackers are expanding the scope of their attacks beyond the existing PC and Internet environment into mobile devices. In this paper, we apply a linear support vector machine (SVM) to detect Android malware and compare the malware detection performance of the SVM with that of other machine learning classifiers. Through experimental validation, we show that the SVM outperforms the other machine learning classifiers.
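    As an illustration of the linear-SVM idea, the toy sketch below trains a hinge-loss linear classifier by subgradient descent on invented feature vectors. The feature names and data are hypothetical; the paper's actual features and training setup are not shown here:

```python
def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=2000):
    """Tiny linear SVM: subgradient descent on the regularized hinge loss.
    Labels must be +1 (malware) or -1 (benign)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # inside the margin or misclassified
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # only the regularizer acts
                w = [wj * (1 - lr * lam) for wj in w]
    return w, b

def predict(w, b, x):
    """+1 = flagged as malware, -1 = benign."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical normalized features: [SMS rate, network traffic, CPU usage]
X = [[0.9, 0.8, 0.7], [0.8, 0.9, 0.6], [0.1, 0.2, 0.3], [0.2, 0.1, 0.2]]
y = [1, 1, -1, -1]
w, b = train_linear_svm(X, y)
print([predict(w, b, x) for x in X])
```

    A linear model keeps on-device inference cheap, which matters for resource-constrained mobile monitoring.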

  13. Autopiquer - a Robust and Reliable Peak Detection Algorithm for Mass Spectrometry.

    Science.gov (United States)

    Kilgour, David P A; Hughes, Sam; Kilgour, Samantha L; Mackay, C Logan; Palmblad, Magnus; Tran, Bao Quoc; Goo, Young Ah; Ernst, Robert K; Clarke, David J; Goodlett, David R

    2017-02-01

    We present a simple algorithm for robust and unsupervised peak detection by determining a noise threshold in isotopically resolved mass spectrometry data. Solving this problem will greatly reduce the subjective and time-consuming manual picking of mass spectral peaks and so will prove beneficial in many research applications. The Autopiquer approach uses autocorrelation to test for the presence of (isotopic) structure in overlapping windows across the spectrum. Within each window, a noise threshold is optimized to remove the most unstructured data, whilst keeping as much of the (isotopic) structure as possible. This algorithm has been successfully demonstrated for both peak detection and spectral compression on data from many different classes of mass spectrometer and for different sample types, and this approach should also be extendible to other types of data that contain regularly spaced discrete peaks.
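    The core idea of testing a window for regularly spaced (isotopic) structure can be illustrated with a plain autocorrelation measure. This is a simplified sketch of the principle, not the published Autopiquer code:

```python
import random

def autocorrelation(x, lag):
    """Normalized autocorrelation of a signal at a given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

random.seed(1)
# A window with regularly spaced peaks (period 10) versus pure noise:
structured = [5.0 if i % 10 == 0 else 0.1 for i in range(200)]
noise = [random.random() for _ in range(200)]
print(round(autocorrelation(structured, 10), 2))  # high: periodic structure present
print(round(autocorrelation(noise, 10), 2))       # near zero: no structure
```

    A window whose autocorrelation at the expected isotopic spacing stays high after thresholding still contains structure; the threshold is raised until only such structured content survives.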

  14. Test-retest reliability and smallest detectable change of the Bristol Impact of Hypermobility (BIoH) questionnaire.

    Science.gov (United States)

    Palmer, S; Manns, S; Cramp, F; Lewis, R; Clark, E M

    2017-12-01

    The Bristol Impact of Hypermobility (BIoH) questionnaire is a patient-reported outcome measure developed in conjunction with adults with Joint Hypermobility Syndrome (JHS). It has demonstrated strong concurrent validity with the Short Form-36 (SF-36) physical component score but other psychometric properties have yet to be established. This study aimed to determine its test-retest reliability and smallest detectable change (SDC). A test-retest reliability study. Participants were recruited from the Hypermobility Syndromes Association, a patient organisation in the United Kingdom. Recruitment packs were sent to 1080 adults who had given permission to be contacted about research. BIoH and SF-36 questionnaires were administered at baseline and repeated two weeks later. An 11-point global rating of change scale (-5 to +5) was also administered at two weeks. Test-retest analysis and calculation of the SDC was conducted on 'stable' patients (defined as global rating of change -1 to +1). 462 responses were received. 233 patients reported a 'stable' condition and were included in analysis (95% women; mean (SD) age 44.5 (13.9) years; BIoH score 223.6 (54.0)). The BIoH questionnaire demonstrated excellent test-retest reliability (ICC 0.923, 95% CI 0.900-0.940). The SDC was 42 points (equivalent to 19% of the mean baseline score). The SF-36 physical and mental component scores demonstrated poorer test-retest reliability and larger SDCs (as a proportion of the mean baseline scores). The results provide further evidence of the potential of the BIoH questionnaire to underpin research and clinical practice for people with JHS. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Test-Retest Reliability and Minimal Detectable Change of the D2 Test of Attention in Patients with Schizophrenia.

    Science.gov (United States)

    Lee, Posen; Lu, Wen-Shian; Liu, Chin-Hsuan; Lin, Hung-Yu; Hsieh, Ching-Lin

    2017-12-08

    The d2 Test of Attention (D2) is a commonly used measure of selective attention for patients with schizophrenia. However, its test-retest reliability and minimal detectable change (MDC) are unknown in patients with schizophrenia, limiting its utility in both clinical and research settings. The aim of the present study was to examine the test-retest reliability and MDC of the D2 in patients with schizophrenia. A rater administered the D2 to 108 patients with schizophrenia twice at a 1-month interval. Test-retest reliability was determined through calculation of the intra-class correlation coefficient (ICC). We also carried out Bland-Altman analysis, which included a scatter plot of the differences between test and retest against their means. Systematic biases were evaluated by use of a paired t-test. The ICCs for the D2 ranged from 0.78 to 0.94. The MDCs (MDC%) of the seven subscores were 102.3 (29.7), 19.4 (85.0), 7.2 (94.6), 21.0 (69.0), 104.0 (33.1), 105.0 (35.8), and 7.8 (47.8), which represented limited-to-acceptable random measurement error. Trends in the Bland-Altman plots of the omissions (E1), commissions (E2), and errors (E) were noted, indicating that these data were heteroscedastic. According to the results, the D2 had good test-retest reliability, especially for the TN, TN-E, and CP scores. For further research, finding a way to improve the administration procedure to reduce random measurement error would be important for the E1, E2, E, and FR subscores. © The Author(s) 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
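    The Bland-Altman analysis mentioned above reduces to the mean test-retest difference (bias) and its 95% limits of agreement. A minimal sketch with made-up scores, not the study's data:

```python
import math

def bland_altman(test, retest):
    """Return (bias, lower LoA, upper LoA): mean difference +/- 1.96 SD of differences."""
    diffs = [a - b for a, b in zip(test, retest)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical D2 total scores (TN) at test and retest:
test = [340, 355, 362, 348, 371]
retest = [338, 350, 365, 350, 368]
bias, lo, hi = bland_altman(test, retest)
print(round(bias, 1), round(lo, 1), round(hi, 1))  # 1.0 -5.6 7.6
```

    Plotting each difference against the pair's mean, as the study did, additionally reveals whether the scatter widens with score magnitude (heteroscedasticity).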

  16. A novel method for rapid and reliable detection of complex vertebral malformation and bovine leukocyte adhesion deficiency in Holstein cattle

    Directory of Open Access Journals (Sweden)

    Zhang Yi

    2012-07-01

    Full Text Available Abstract Background Complex vertebral malformation (CVM) and bovine leukocyte adhesion deficiency (BLAD) are two autosomal recessive lethal genetic defects frequently occurring in Holstein cattle, identifiable by single nucleotide polymorphisms. The objective of this study was to develop a rapid and reliable genotyping assay to screen active Holstein sires and determine the carrier frequency of CVM and BLAD in the Chinese dairy cattle population. Results We developed real-time PCR-based assays for discrimination of wild-type and defective alleles, so that carriers can be detected. Only one step was required after DNA extraction from the sample, and the assay took about 2 hours. A total of 587 Chinese Holstein bulls were assayed, and fifty-six CVM carriers and eight BLAD carriers were identified, corresponding to heterozygote carrier frequencies of 9.54% and 1.36%, respectively. The pedigree analysis showed that most of the carriers could be traced back to common ancestors, Osborndale Ivanhoe for BLAD and Pennstate Ivanhoe Star for CVM. Conclusions These results demonstrate that real-time PCR is a simple, rapid and reliable assay for BLAD and CVM defective allele detection. The high frequency of the CVM allele suggests that implementing a routine testing system is necessary to gradually eradicate the deleterious gene from the Chinese Holstein population.

  17. Nanostructured materials with plasmonic nanobiosensors for early cancer detection: A past and future prospect.

    Science.gov (United States)

    Sugumaran, Sathish; Jamlos, Mohd Faizal; Ahmad, Mohd Noor; Bellan, Chandar Shekar; Schreurs, Dominique

    2018-02-15

    Early cancer detection and treatment is an emerging and fascinating field of plasmonic nanobiosensor research. It promises diagnosis without harming living cells, improving a patient's chances of survival. This review describes past work and future prospects of an integrated research field on nanostructured metamaterials, microwave transmission, surface plasmon resonance, nanoantennas, and their versatile properties combined with nano-biosensors for early cancer detection to preserve human health. Interestingly, (i) microwave transmission shows more advantages than other electromagnetic radiation in interacting with biological tissues, (ii) nanostructured metamaterials (Au) with special properties like size and shape can stimulate plasmonic effects, (iii) plasmonic-based nanobiosensors can be explored for early cancer tumour detection or single-molecule detection, and (iv) nanoantenna wireless communication using the microwave inverse scattering nanomesh (MISN) technique, instead of conventional techniques, can be adopted to characterize the microwave signals scattered from the biomarkers. Nanostructured materials with plasmonic nanobiosensors thus provide a fascinating platform for early detection of cancer tumours and are anticipated to develop into a magnificent field in the future. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Can magnetic resonance imaging at 3.0-Tesla reliably detect patients with endometriosis? Initial results.

    Science.gov (United States)

    Thomeer, Maarten G; Steensma, Anneke B; van Santbrink, Evert J; Willemssen, Francois E; Wielopolski, Piotr A; Hunink, Myriam G; Spronk, Sandra; Laven, Joop S; Krestin, Gabriel P

    2014-04-01

    The aim of this study was to determine whether an optimized 3.0-Tesla magnetic resonance imaging (MRI) protocol is sensitive and specific enough to detect patients with endometriosis. This was a prospective cohort study of consecutive patients. Forty consecutive patients with clinical suspicion of endometriosis underwent 3.0-Tesla MRI, including a T2-weighted high-resolution fast spin echo sequence (spatial resolution = 0.75 × 1.2 × 1.5 mm³) and a 3D T1-weighted high-resolution gradient echo sequence (spatial resolution = 0.75 × 1.2 × 2.0 mm³). Two radiologists reviewed the dataset with consensus reading. During laparoscopy, which was used as the reference standard, all lesions were characterized according to the revised criteria of the American Fertility Society. Patient-level and region-level sensitivities and specificities and lesion-level sensitivities were calculated. Patient-level sensitivity was 42% for stage I (5/12) and 100% for stages II, III and IV (25/25). Patient-level specificity for all stages was 100% (3/3). The region-level sensitivity and specificity were 63% and 97%, respectively. The sensitivity per lesion was 61% (90% for deep lesions, 48% for superficial lesions and 100% for endometriomata). The detection rate of obliteration of the cul-de-sac was 100% (10/10) with no false positive findings. The interreader agreement was substantial to perfect (kappa = 1 per patient, 0.65 per lesion and 0.71 for obliteration of the cul-de-sac). An optimized 3.0-Tesla MRI protocol is accurate in detecting stage II to stage IV endometriosis. © 2014 The Authors. Journal of Obstetrics and Gynaecology Research © 2014 Japan Society of Obstetrics and Gynecology.

  19. Water chemistry data acquisition, processing, evaluation and diagnostic systems in Light Water Reactors: Future improvement of plant reliability and safety

    International Nuclear Information System (INIS)

    Uchida, S.; Takiguchi, H.; Ishigure, K.

    2006-01-01

    Data acquisition, processing and evaluation systems have been applied in major Japanese PWRs and BWRs to provide (1) reliable and quick data acquisition with manpower savings in plant chemical laboratories and (2) smooth and reliable information transfer among chemists, plant operators, and supervisors. Data acquisition systems in plants consist of automatic and semi-automatic instruments for chemical analyses, e.g., X-ray fluorescence analysis and ion chromatography, while data processing systems consist of PC-based sub-systems, e.g., data storage, reliability evaluation, clear display, and document preparation, for understanding the plant's own water chemistry trends. Precise and reliable evaluations of water chemistry data are required in order to improve plant reliability and safety. For this, quality assurance of the water chemistry data acquisition system is needed. At the same time, theoretical models are being applied to bridge the gaps between measured water chemistry data and the information needed to understand the interaction of materials and cooling water in plants. The major models which have already been applied for plant evaluation are: (1) water radiolysis models for BWRs and PWRs; (2) a crevice radiolysis model for SCC in BWRs; and (3) a crevice pH model for SG tubing in PWRs. High temperature water chemistry sensors and automatic plant diagnostic systems have been applied in only restricted areas. ECP sensors are gaining popularity as tools to determine the effects of hydrogen injection in BWR systems. Automatic plant diagnostic systems based on artificial intelligence will become more popular after sufficient experience has been gained with off-line diagnostic systems. (author)

  20. Lipase-nanoporous gold biocomposite modified electrode for reliable detection of triglycerides.

    Science.gov (United States)

    Wu, Chao; Liu, Xueying; Li, Yufei; Du, Xiaoyu; Wang, Xia; Xu, Ping

    2014-03-15

    For triglycerides biosensor design, protein immobilization is necessary to create the interface between the enzyme and the electrode. In this study, a glassy carbon electrode (GCE) was modified with a lipase-nanoporous gold (NPG) biocomposite (denoted as lipase/NPG/GCE). Due to its highly conductive, porous, and biocompatible three-dimensional structure, NPG is suitable for enzyme immobilization. In cyclic voltammetry experiments, the lipase/NPG/GCE bioelectrode displayed a surface-confined reaction in a phosphate buffer solution. Linear responses were obtained for tributyrin concentrations ranging from 50 to 250 mg dl(-1) and olive oil concentrations ranging from 10 to 200 mg dl(-1). The value of the apparent Michaelis-Menten constant for tributyrin was 10.67 mg dl(-1) and the detection limit was 2.68 mg dl(-1). Further, the lipase/NPG/GCE bioelectrode had strong anti-interference ability against urea, glucose, cholesterol, and uric acid, as well as a long shelf-life. For the detection of triglycerides in human serum, the values given by the lipase/NPG/GCE bioelectrode were in good agreement with those of an automatic biochemical analyzer. These properties, along with its long shelf-life, make the lipase/NPG/GCE bioelectrode an excellent choice for the construction of a triglycerides biosensor. © 2013 Elsevier B.V. All rights reserved.

  1. Continuous AE monitoring of nuclear plants to detect flaws - status and future

    International Nuclear Information System (INIS)

    Hutton, P.H.

    1986-01-01

    This paper gives a brief commentary on the evolution of acoustic emission (AE) technology for continuous monitoring of nuclear reactors and the current status. The technical work described to support the status description has the objective of developing and validating the use of AE to detect, locate, and evaluate growing flaws in reactor pressure boundaries. The future of AE for continuous monitoring is discussed in terms of envisioned applications and further accomplishments required to achieve them. 12 refs.

  2. Delta flow: An accurate, reliable system for detecting kicks and loss of circulation during drilling

    Energy Technology Data Exchange (ETDEWEB)

    Speers, J.M.; Gehrig, G.F.

    1987-12-01

    A system to monitor drilling-fluid flow rate has been developed that detects kicks and lost returns in floating, fixed-platform, and land-based drilling operations. The system uses flowmeters that monitor the flow rates of drilling fluids entering the borehole through the standpipe and leaving the well through the return flowline. These readings are processed in a computer-based data-acquisition system to form a filtered delta-flow signal that identifies the occurrence of downhole fluid gains or losses. The system is designed to trip an alarm when a gain or loss exceeds 25 gal/min (1.6 dm³/s), even in a floating drilling environment. This sensitivity will generally keep gains or losses to less than 5 bbl (0.8 m³).
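    The filtered delta-flow logic can be sketched as a moving average of the flow-out minus flow-in signal checked against the stated 25 gal/min threshold. This is a generic illustration with an assumed filter window and simulated readings, not the authors' implementation:

```python
from collections import deque

def delta_flow_alarm(flow_in, flow_out, window=5, threshold=25.0):
    """Return one boolean per sample: True when the smoothed gain/loss
    (return flow minus input flow, gal/min) exceeds the alarm threshold."""
    buf = deque(maxlen=window)
    alarms = []
    for fin, fout in zip(flow_in, flow_out):
        buf.append(fout - fin)      # positive delta = downhole gain (kick)
        avg = sum(buf) / len(buf)   # simple moving-average filter
        alarms.append(abs(avg) > threshold)
    return alarms

# Steady circulation, then a simulated kick of ~40 gal/min influx:
flow_in  = [600] * 10
flow_out = [602, 599, 601, 600, 598] + [640] * 5
print(delta_flow_alarm(flow_in, flow_out))  # alarms only on the last two samples
```

    Filtering before thresholding is what lets the system tolerate the flow noise of a floating rig while still tripping on a sustained gain or loss.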

  3. A rapid and reliable determination of doxycycline hyclate by HPLC with UV detection in pharmaceutical samples

    Directory of Open Access Journals (Sweden)

    SNEZANA S. MITIC

    2008-06-01

    Full Text Available An accurate, sensitive and reproducible high-performance liquid chromatographic (HPLC) method for the quantification of doxycycline hyclate in pharmaceutical samples has been developed and validated. The drug and the standard were eluted from a Lichrosorb RP-8 column (250 mm × 4.6 mm, 10 μm particle size) at 20 °C with a mobile phase consisting of methanol, acetonitrile and 0.010 M aqueous oxalic acid (2:3:5, v/v/v). The flow rate was 1.25 ml min-1. A UV detector set at 350 nm was used to monitor the effluent. Each analysis required no longer than 4 min. The limits of detection and quantification were 1.15 and 3.84 μg ml-1, respectively. Recoveries for different concentrations ranged from 99.58 to 101.93%.

  4. Reliability of Using Retinal Vascular Fractal Dimension as a Biomarker in the Diabetic Retinopathy Detection.

    Science.gov (United States)

    Huang, Fan; Dashtbozorg, Behdad; Zhang, Jiong; Bekkers, Erik; Abbasi-Sureshjani, Samaneh; Berendschot, Tos T J M; Ter Haar Romeny, Bart M

    2016-01-01

    The retinal fractal dimension (FD) is a measure of vasculature branching pattern complexity. FD has been considered as a potential biomarker for the detection of several diseases like diabetes and hypertension. However, conflicting findings were found in the reported literature regarding the association between this biomarker and diseases. In this paper, we examine the stability of the FD measurement with respect to (1) different vessel annotations obtained from human observers, (2) automatic segmentation methods, (3) various regions of interest, (4) accuracy of vessel segmentation methods, and (5) different imaging modalities. Our results demonstrate that the relative errors for the measurement of FD are significant and FD varies considerably according to the image quality, modality, and the technique used for measuring it. Automated and semiautomated methods for the measurement of FD are not stable enough, which makes FD a deceptive biomarker in quantitative clinical applications.
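As a rough illustration of the measurement whose stability is questioned above, box counting is one common way to estimate FD from a binary vessel segmentation. This is a generic sketch, not the authors' pipeline; the mask is a toy example.

```python
# Illustrative box-counting estimate of fractal dimension on a square binary
# mask (a sketch of one common FD estimator, not the authors' method).

import math

def box_count(mask, box):
    """Count boxes of side `box` containing at least one foreground pixel."""
    n = len(mask)
    count = 0
    for r in range(0, n, box):
        for c in range(0, n, box):
            if any(mask[i][j]
                   for i in range(r, min(r + box, n))
                   for j in range(c, min(c + box, n))):
                count += 1
    return count

def fractal_dimension(mask, boxes=(1, 2, 4, 8)):
    """Slope of log N(s) versus log(1/s), by least squares."""
    xs = [math.log(1.0 / b) for b in boxes]
    ys = [math.log(box_count(mask, b)) for b in boxes]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Sanity check: a filled square has dimension 2.
solid = [[1] * 16 for _ in range(16)]
fd = fractal_dimension(solid)
```

The sensitivity the paper reports is easy to see here: the estimate depends on the segmentation fed into `box_count`, the box sizes chosen, and the region of interest, which is exactly why FD varies across annotators, methods, and modalities.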

  5. Detecting recurrent major depressive disorder within primary care rapidly and reliably using short questionnaire measures.

    Science.gov (United States)

    Thapar, Ajay; Hammerton, Gemma; Collishaw, Stephan; Potter, Robert; Rice, Frances; Harold, Gordon; Craddock, Nicholas; Thapar, Anita; Smith, Daniel J

    2014-01-01

    Major depressive disorder (MDD) is often a chronic disorder, with relapses usually detected and managed in primary care using a validated depression symptom questionnaire. However, for individuals with recurrent depression, the choice of which questionnaire to use, and whether a shorter measure could suffice, is not established. The aim was to compare the nine-item Patient Health Questionnaire (PHQ-9), the Beck Depression Inventory, and the Hospital Anxiety and Depression Scale against shorter PHQ-derived measures for detecting episodes of DSM-IV major depression in primary care patients with recurrent MDD. This was a diagnostic accuracy study of adults with recurrent depression in primary care, predominantly from Wales. Scores on each of the depression questionnaire measures were compared with the results of a semi-structured clinical diagnostic interview using receiver operating characteristic (ROC) curve analysis for 337 adults with recurrent MDD. Concurrent questionnaire and interview data were available for 272 participants. The one-month prevalence rate of depression was 22.2%. The area under the curve (AUC) and positive predictive value (PPV) at the derived optimal cut-off value for the three longer questionnaires were comparable (AUC = 0.86-0.90, PPV = 49.4-58.4%), but the AUC for the PHQ-9 was significantly greater than for the PHQ-2. However, by supplementing the PHQ-2 score with items on problems concentrating and feeling slowed down or restless, the AUC (0.91) and the PPV (55.3%) were comparable with those for the PHQ-9. A novel four-item PHQ-based questionnaire measure of depression performs equivalently to three longer depression questionnaires in identifying depression relapse in patients with recurrent MDD.
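The AUC values compared above have a simple probabilistic reading: the chance that a randomly chosen interview-positive patient scores higher on the questionnaire than an interview-negative one. A sketch of that computation, with hypothetical PHQ-9 scores (not the study's data):

```python
# AUC computed directly as the normalized Mann-Whitney statistic:
# P(score_pos > score_neg), with ties counted as 1/2. Scores are hypothetical.

def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

depressed = [14, 18, 11, 20, 9]       # questionnaire scores, interview-positive (hypothetical)
not_depressed = [3, 7, 5, 10, 2, 6]   # interview-negative (hypothetical)
auc_value = auc(depressed, not_depressed)
```

An AUC of 0.5 means the questionnaire is no better than chance at separating the two groups; 1.0 means perfect separation, which is the scale on which the reported 0.86-0.91 values sit.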

  6. Can a future mission detect a habitable ecosystem on Europa, or Ganymede?

    Science.gov (United States)

    Chela Flores, Julian

    2010-05-01

    orbital probes in the future exploration of Jupiter's system (Gowen et al., 2009). There are alternative views on the effect of space weather on the radiation-induced S-cycles produced in the surficial molecules, but S is common to both interpretations (Carlson et al., 1999; McCord et al., 1999). The largest known S-fractionations are due to microbial reduction, not to thermochemical processes. Moreover, abiotic sulphate reductions are generally not as large as biogenic ones (Kiyosu and Krouse, 1990). From experience with a natural population, this type of biota is able to fractionate the S-isotopes efficiently, down to delta 34S of -70 per mil (Wortmann et al., 2001). Dissimilatory sulphate reducers are ubiquitous on Earth, producing the largest fractionations in the stable sulphur isotopes. These microbes are widely distributed in terrestrial anoxic environments. Consequently, sulphate reducers are the most evident candidates for the microorganisms populating a habitable Europan ecosystem. Microbial fractionation of stable S-isotopes argues in favour of penetrators for surveying the surface not only of Europa, but also of Ganymede, where surficial sulphur has been detected (McCord et al., 1997). The Europa-Jupiter System Mission (EJSM) intends to explore both of these satellites in the 2020s (Grasset et al., 2009). According to our hypothesis, we predict that penetrators (supplied with mass spectrometry) should yield different results for fractionated sulphur. The icy patches on Europa should give substantial depletions of delta 34S, while measurements on Ganymede should give significantly lower values for the depletion of delta 34S. (Since the largest of the Galilean satellites lacks an ocean-core interface, according to our hypothesis it would not support life.) These diverging results (a large negative delta 34S for the Europan sulphur patches, and a small negative delta 34S for the Ganymede surficial sulphur) would provide a clear test for the hypothesis that a
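The delta 34S values discussed above follow standard delta notation: the per-mil deviation of a sample's 34S/32S ratio from the V-CDT reference. A sketch (the sample ratio is hypothetical; the reference ratio is the commonly cited V-CDT value):

```python
# Delta notation for sulphur isotopes, as a sketch. A microbially reduced
# sulphide depleted by 7% relative to the standard gives delta 34S = -70 ‰,
# the magnitude cited for natural sulphate-reducer populations.

VCDT_RATIO = 0.0441626  # 34S/32S of the Vienna-Canyon Diablo Troilite standard

def delta_34s(sample_ratio, standard_ratio=VCDT_RATIO):
    """delta-34S in per mil (‰)."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# Hypothetical sample 7% lighter in 34S than the standard:
depleted = delta_34s(VCDT_RATIO * 0.930)
```

A penetrator mass spectrometer would measure the two isotope abundances directly; the hypothesis above amounts to predicting a strongly negative delta 34S on Europa's icy patches and a much smaller depletion on Ganymede.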

  7. [Autism Spectrum Disorder in DSM-5 - concept, validity, and reliability, impact on clinical care and future research].

    Science.gov (United States)

    Freitag, Christine M

    2014-05-01

    Autism Spectrum Disorder (ASD) in DSM-5 comprises the former DSM-IV-TR diagnoses of Autistic Disorder, Asperger's Disorder and PDD-nos. The criteria for ASD in DSM-5 were considerably revised from those of ICD-10 and DSM-IV-TR. The present article compares the diagnostic criteria, presents studies on the validity and reliability of ASD, and discusses open questions. It ends with a clinical and research perspective.

  8. Seismic Azimuthal Anisotropy of the Lower Paleozoic Shales in Northern Poland: can we reliably detect it?

    Science.gov (United States)

    Cyz, Marta; Malinowski, Michał

    2017-04-01

    Analysis of azimuthal anisotropy is an important aspect of characterizing the Lower Paleozoic shale play in northern Poland, since it can be used to map pre-existing fracture networks or to help in the optimal placement of horizontal wells. Previous studies employed the Velocity versus Azimuth (VVAz) method and found that this anisotropy is weak, on the order of 1-2%, and only locally, close to major fault zones, is it higher (ca. 7%). This is consistent with a recent re-interpretation of the cross-dipole sonic data, which indicates an average shear-wave anisotropy of 1%. The problem with the VVAz method is that it requires a well-defined analysis interval at least 100 ms thick. In our case, the target intervals are thin: the upper reservoir (Lower Silurian Jantar formation) is 15 m thick and the lower reservoir (Upper Ordovician Sasino formation) is 25 m thick. Therefore, we prefer the Amplitude versus Azimuth (AVAz) method, which can be applied to a single horizon (e.g., the base of the reservoir). However, the AVAz method depends critically on the quality of the seismic data and the preservation of amplitudes during processing. In addition, the physical properties of the Lower Paleozoic shales from Poland seem unfavourable for detecting azimuthal anisotropy. For example, for both target formations the parameter g = (Vs/Vp)2 is close to 0.32, which implies that the anisotropy expressed by the anisotropic gradient in the dry (i.e., gas-filled fractures) case is close to zero. In the case of, e.g., the Bakken Shale, g is much higher (0.38-0.4), leading to a detectable anisotropic signature even in the dry case. Modelling of the synthetic AVAz response performed using available well data suggested that the anisotropic gradient in the wet (fluid-filled) case should be detectable even for weak anisotropy (1-2%). This scenario is consistent with the observation that the studied area is located in the liquid

  9. Detecting inflammation in the unprepared pediatric colon - how reliable is magnetic resonance enterography?

    Energy Technology Data Exchange (ETDEWEB)

    Barber, Joy L.; Watson, Tom A. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); Lozinsky, Adriana Chebar; Kiparissi, Fevronia; Shah, Neil [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Gastroenterology, London (United Kingdom)

    2016-05-15

    Pediatric inflammatory bowel disease frequently affects the colon. MR enterography is used to assess the small bowel but it also depicts the colon. To compare the accuracy of MR enterography and direct visualization at endoscopy in assessing the colon in pediatric inflammatory bowel disease. We included children with inflammatory bowel disease who had undergone both MR enterography and endoscopy, and we retrospectively assessed the imaging and endoscopic findings. We scored the colonic appearance at MR using a total colon score. We then compared scores for the whole colon and for its individual segments with endoscopy and histology. We included 15 children. An elevated MR colonic segmental score predicted the presence of active inflammation on biopsy with a specificity of 90% (95% confidence interval [CI] 79.5-96.2%) and sensitivity of 60% (CI 40.6-77.3%); this compares reasonably with the predictive values for findings at colonoscopy - specificity 85% (CI 73.4-92.9%) and sensitivity 53.3% (CI 34.3-71.6%). Accuracy did not change significantly with increasing bowel distension. MR-derived scores had comparable accuracy to those derived during visualization at colonoscopy for detecting biopsy-proven inflammation in our patient group. MR enterography might prove useful in guiding biopsy or monitoring treatment response. Collapse of a colonic segment did not impair assessment of inflammation. (orig.)

  10. Can the comet assay be used reliably to detect nanoparticle-induced genotoxicity?

    Science.gov (United States)

    Karlsson, Hanna L; Di Bucchianico, Sebastiano; Collins, Andrew R; Dusinska, Maria

    2015-03-01

    The comet assay is a sensitive method to detect DNA strand breaks as well as oxidatively damaged DNA at the level of single cells. Today the assay is commonly used in nano-genotoxicology. In this review we critically discuss possible interactions between nanoparticles (NPs) and the comet assay. Concerns about such interactions have arisen from the occasional observation of NPs in the "comet head", which implies that NPs may be present while the assay is being performed. This could give rise to false positive or false negative results, depending on the type of comet assay endpoint and NP. For most NPs, an interaction that substantially impacts the comet assay results is unlikely. For photocatalytically active NPs such as TiO2, on the other hand, exposure to light containing UV can lead to increased DNA damage. Samples should therefore not be exposed to such light. By comparing studies in which both the comet assay and the micronucleus assay have been used, a good consistency between the assays was found in general (69%); consistency was even higher when excluding studies on TiO2 NPs (81%). The strong consistency between the comet and micronucleus assays for a range of different NPs, even though the two tests measure different endpoints, implies that both can be trusted in assessing the genotoxicity of NPs, and that both could be useful in a standard battery of test methods. © 2014 Wiley Periodicals, Inc.

  11. Robust and reliable banknote authentification and print flaw detection with opto-acoustical sensor fusion methods

    Science.gov (United States)

    Lohweg, Volker; Schaede, Johannes; Türke, Thomas

    2006-02-01

    The authenticity checking and inspection of bank notes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of increasingly sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As more print techniques and new security features are established, the total quality, security and authenticity of bank note printing must be assured, which calls for a broader sensorial concept. We propose a concept covering both authenticity checking and inspection methods for pattern recognition and classification for securities and banknotes, based on sensor fusion and fuzzy interpretation of data measures. In this approach, different methods of authenticity analysis and print flaw detection are combined, which can be used in vending or sorting machines as well as in printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines visible camera images with IR-spectral sensitive sensors and with acoustical and other measurements, such as temperature and pressure in printing machines.

  12. Detecting inflammation in the unprepared pediatric colon - how reliable is magnetic resonance enterography?

    International Nuclear Information System (INIS)

    Barber, Joy L.; Watson, Tom A.; Lozinsky, Adriana Chebar; Kiparissi, Fevronia; Shah, Neil

    2016-01-01

    Pediatric inflammatory bowel disease frequently affects the colon. MR enterography is used to assess the small bowel but it also depicts the colon. To compare the accuracy of MR enterography and direct visualization at endoscopy in assessing the colon in pediatric inflammatory bowel disease. We included children with inflammatory bowel disease who had undergone both MR enterography and endoscopy, and we retrospectively assessed the imaging and endoscopic findings. We scored the colonic appearance at MR using a total colon score. We then compared scores for the whole colon and for its individual segments with endoscopy and histology. We included 15 children. An elevated MR colonic segmental score predicted the presence of active inflammation on biopsy with a specificity of 90% (95% confidence interval [CI] 79.5-96.2%) and sensitivity of 60% (CI 40.6-77.3%); this compares reasonably with the predictive values for findings at colonoscopy - specificity 85% (CI 73.4-92.9%) and sensitivity 53.3% (CI 34.3-71.6%). Accuracy did not change significantly with increasing bowel distension. MR-derived scores had comparable accuracy to those derived during visualization at colonoscopy for detecting biopsy-proven inflammation in our patient group. MR enterography might prove useful in guiding biopsy or monitoring treatment response. Collapse of a colonic segment did not impair assessment of inflammation. (orig.)

  13. Simultaneous amplification of two bacterial genes: more reliable method of Helicobacter pylori detection in microbial rich dental plaque samples.

    Science.gov (United States)

    Chaudhry, Saima; Idrees, Muhammad; Izhar, Mateen; Butt, Arshad Kamal; Khan, Ayyaz Ali

    2011-01-01

    Polymerase chain reaction (PCR) assay is considered superior to other methods for the detection of Helicobacter pylori (H. pylori) in the oral cavity; however, it also has limitations when the sample under study is microbe-rich dental plaque. The type of gene targeted and the number of primers used for bacterial detection in dental plaque samples can have a significant effect on the results obtained, as there are a number of closely related bacterial species residing in the plaque biofilm. Also, due to the high recombination rate of H. pylori, some genes might be downregulated or absent. The present study was conducted to determine the frequency of H. pylori colonization of dental plaque by simultaneously amplifying two genes of the bacterium. One hundred dental plaque specimens were collected from dyspeptic patients before their upper gastrointestinal endoscopy, and the presence of H. pylori was determined through a PCR assay using primers targeting two different genes of the bacterium. Eighty-nine of the 100 samples were included in the final analysis. With simultaneous amplification of two bacterial genes, 51.6% of the dental plaque samples were positive for H. pylori, while this prevalence increased to 73% when only one gene amplification was used for bacterial identification. Detection of H. pylori in dental plaque samples is more reliable when two genes of the bacterium are simultaneously amplified as compared to one gene amplification only.

  14. The sensitivity of computed tomography (CT) scans in detecting trauma: are CT scans reliable enough for courtroom testimony?

    Science.gov (United States)

    Molina, D Kimberley; Nichols, Joanna J; Dimaio, Vincent J M

    2007-09-01

    Rapid and accurate recognition of traumatic injuries is extremely important in emergency room and surgical settings. Emergency departments depend on computed tomography (CT) scans to provide rapid, accurate injury assessment. We conducted an analysis of all traumatic deaths autopsied at the Bexar County Medical Examiner's Office in which perimortem medical imaging (CT scan) was performed to assess the reliability of the CT scan in detecting trauma with sufficient accuracy for courtroom testimony. Cases were included in the study if an autopsy was conducted, a CT scan was performed within 24 hours before death, and there was no surgical intervention. Analysis was performed to assess the correlation between the autopsy and CT scan results. Sensitivity, specificity, positive predictive value, and negative predictive value were defined for the CT scan based on the autopsy results. The sensitivity of the CT scan ranged from 0% for cerebral lacerations, cervical vertebral body fractures, cardiac injury, and hollow viscus injury to 75% for liver injury. This study reveals that CT scans are an inadequate detection tool for forensic pathologists, where a definitive diagnosis is required, because they have a low level of accuracy in detecting traumatic injuries. CT scans may be adequate for clinicians in the emergency room setting, but are inadequate for courtroom testimony. If the evidence of trauma is based solely on CT scan reports, there is a high possibility of erroneous accusations, indictments, and convictions.
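The four metrics in the abstract derive from a 2x2 table of CT findings against autopsy findings. A minimal sketch of that computation, with hypothetical counts (not the study's data):

```python
# Diagnostic accuracy metrics from a 2x2 confusion matrix.
# tp/fp/fn/tn counts below are hypothetical illustrations.

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV."""
    return {
        "sensitivity": tp / (tp + fn),  # autopsy-positive cases the CT found
        "specificity": tn / (tn + fp),  # autopsy-negative cases the CT cleared
        "ppv": tp / (tp + fp),          # CT-positive calls that were real
        "npv": tn / (tn + fn),          # CT-negative calls that were real
    }

# Example: 20 autopsy-confirmed injuries, of which CT found 15 (75% sensitivity,
# the study's best figure, for liver injury), with 2 false positives.
m = diagnostic_metrics(tp=15, fp=2, fn=5, tn=78)
```

The study's core point follows from the sensitivity column: a modality can be specific enough for clinical triage yet miss so many true injuries (sensitivity as low as 0% for some injury types) that it cannot support a definitive forensic diagnosis.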

  15. Water-soluble upper GI based on clinical findings is reliable to detect anastomotic leaks after laparoscopic gastric bypass.

    Science.gov (United States)

    Katasani, V G; Leeth, R R; Tishler, D S; Leath, T D; Roy, B P; Canon, C L; Vickers, S M; Clements, R H

    2005-11-01

    Anastomotic leak after laparoscopic Roux-en-Y gastric bypass (LGB) is a major complication that must be recognized and treated early for best results. There is controversy in the literature regarding the reliability of upper GI series (UGI) in diagnosing leaks. LGB was performed in patients meeting NIH criteria for the surgical treatment of morbid obesity. All leaks identified at the time of surgery were repaired with suture and retested. Drains were placed at the surgeon's discretion. Postoperatively, UGI was performed by an experienced radiologist if there was a clinical suspicion of leak. From September 2001 until October 2004, a total of 553 patients (age 40.4 +/- 9.2 years, BMI 48.6 +/- 7.2) underwent LGB at UAB. Seventy-eight per cent (431 of 553) of patients had no clinical evidence suggesting anastomotic leak and were managed expectantly. Twenty-two per cent (122 of 553) of patients met at least one inclusion criterion for leak and underwent UGI. Four of the 122 patients (3.2%) had a leak, two from the anastomosis and two from perforation of the stapled end of the Roux limb. No patient returned to the operating room without a positive UGI. High clinical suspicion and selectively performed UGI based on clinical evidence is reliable in detecting leaks.

  16. Detection and forecasting of oyster norovirus outbreaks: recent advances and future perspectives.

    Science.gov (United States)

    Wang, Jiao; Deng, Zhiqiang

    2012-09-01

    Norovirus is a highly infectious pathogen that is commonly found in oysters growing in fecally contaminated waters. Norovirus outbreaks can cause the closure of oyster harvesting waters and acute gastroenteritis in humans associated with consumption of contaminated raw oysters. Extensive efforts have been made, and considerable progress achieved, in the detection and forecasting of oyster norovirus outbreaks over the past decades. The main objective of this paper is to provide a literature review of methods and techniques for detecting and forecasting oyster norovirus outbreaks and thereby to identify future directions for improving their detection and forecasting. It is found that (1) norovirus outbreaks display strong seasonality, with the outbreak peak occurring commonly in December-March in the U.S. and April-May in Europe; (2) norovirus outbreaks are affected by multiple environmental factors, including but not limited to precipitation, temperature, solar radiation, wind, and salinity; (3) various modeling approaches may be employed to forecast norovirus outbreaks, including Bayesian models, regression models, artificial neural networks, and process-based models; and (4) diverse techniques are available for near-real-time detection of norovirus outbreaks, including multiplex PCR, seminested PCR, real-time PCR, quantitative PCR, and satellite remote sensing. The findings are important to the management of oyster growing waters and to future investigations into norovirus outbreaks. It is recommended that a combined approach of sensor-assisted real-time monitoring and modeling-based forecasting be utilized for efficient and effective detection and forecasting of norovirus outbreaks caused by consumption of contaminated oysters. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Reliability of the MicroScan WalkAway PC21 panel in identifying and detecting oxacillin resistance in clinical coagulase-negative staphylococci strains.

    Science.gov (United States)

    Olendzki, A N; Barros, E M; Laport, M S; Dos Santos, K R N; Giambiagi-Demarval, M

    2014-01-01

    The purpose of this study was to determine the reliability of the MicroScan WalkAway PosCombo21 (PC21) system for the identification of coagulase-negative staphylococci (CNS) strains and the detection of oxacillin resistance. Using molecular and phenotypic methods, 196 clinical strains were evaluated. The automated system demonstrated 100 % reliability for the identification of the clinical strains Staphylococcus haemolyticus, Staphylococcus hominis and Staphylococcus cohnii; 98.03 % reliability for the identification of Staphylococcus epidermidis; 70 % reliability for the identification of Staphylococcus lugdunensis; 40 % reliability for the identification of Staphylococcus warneri; and 28.57 % reliability for the identification of Staphylococcus capitis, but no reliability for the identification of Staphylococcus auricularis, Staphylococcus simulans and Staphylococcus xylosus. We concluded that the automated system provides accurate results for the more common CNS species but often fails to accurately identify less prevalent species. For the detection of oxacillin resistance, the automated system showed 100 % specificity and 90.22 % sensitivity. Thus, the PC21 panel detects oxacillin-resistant strains, but is limited by the heteroresistance that is observed when using most phenotypic methods.

  18. Field Effect Sensors for Nucleic Acid Detection: Recent Advances and Future Perspectives

    Directory of Open Access Journals (Sweden)

    Bruno Veigas

    2015-05-01

    Full Text Available In the last decade the use of field-effect-based devices has become a basic structural element in a new generation of biosensors that allow label-free DNA analysis. In particular, ion-sensitive field effect transistors (FETs) are the basis for the development of radically new approaches to the specific detection and characterization of DNA, owing to FETs' greater signal-to-noise ratio, fast measurement capabilities, and possibility of being included in portable instrumentation. Reliable molecular characterization of DNA and/or RNA is vital for disease diagnostics and for following up alterations in gene expression profiles. FET biosensors may become a relevant tool for molecular diagnostics and at the point of care. The development of these devices and strategies should be carefully designed, as biomolecular recognition and detection events must occur within the Debye length. This limitation is sometimes considered to be fundamental for FET devices, and considerable efforts have been made to develop better architectures. Herein we review the use of field effect sensors for nucleic acid detection strategies, from production and functionalization to integration in molecular diagnostics platforms, with special focus on those that have made their way into the diagnostics lab.

  19. Ambient Pressure Laser Desorption—Chemical Ionization Mass Spectrometry for Fast and Reliable Detection of Explosives, Drugs, and Their Precursors

    Directory of Open Access Journals (Sweden)

    René Reiss

    2018-06-01

    Full Text Available Fast and reliable information is crucial for first responders to draw correct conclusions at crime scenes. An ambient pressure laser desorption (APLD) mass spectrometer is introduced for this scenario, which enables detecting substances on surfaces without sample pretreatment. It is especially useful for substances with low vapor pressure and for thermolabile ones. The APLD allows the separation of desorption and ionization into two steps, and therefore both can be optimized separately. Within this work, an improved version of the developed system is shown that achieves limits of detection (LOD) down to 500 pg while remaining fast and flexible. Furthermore, realistic scenarios are used to demonstrate the usability of this system on real-world problems. For this purpose, post-blast residues of a bomb from the Second World War were analyzed, and the presence of PETN was proven without sample pretreatment. In addition, the analyzable substance range could be expanded to various drugs and drug precursors. Thus, the presented instrumentation can be utilized for an increased number of forensically important compound classes without changing the setup. Drug precursors revealed a LOD ranging from 6 to 100 ng. Drugs such as cocaine hydrochloride, heroin, 3,4-methylenedioxymethamphetamine hydrochloride (MDMA hydrochloride), and others exhibit a LOD between 10 and 200 ng.

  20. Computer-aided detection system for lung cancer in computed tomography scans: Review and future prospects

    Science.gov (United States)

    2014-01-01

    Introduction The goal of this paper is to present a critical review of major Computer-Aided Detection (CADe) systems for lung cancer in order to identify challenges for future research. CADe systems must meet the following requirements: improve the performance of radiologists by providing high sensitivity in the diagnosis, a low number of false positives (FP), high processing speed, a high level of automation, low cost (of implementation, training, support and maintenance), the ability to detect different types and shapes of nodules, and software security assurance. Methods The relevant literature related to "CADe for lung cancer" was obtained from the PubMed, IEEE Xplore and ScienceDirect databases. Articles published from 2009 to 2013, and some articles previously published, were used. A systematic analysis was made of these articles and the results were summarized. Discussion Based on the literature search, it was observed that many if not all of the systems described in this survey have the potential to be important in clinical practice. However, no significant improvement was observed in sensitivity, number of false positives, level of automation, or ability to detect different types and shapes of nodules over the studied period. Challenges were presented for future research. Conclusions Further research is needed to improve existing systems and to propose new solutions. To this end, we believe that collaborative efforts through the creation of open-source software communities are necessary to develop a CADe system with all the requirements mentioned and with a short development cycle. In addition, future CADe systems should improve their level of automation through integration with picture archiving and communication systems (PACS) and the electronic record of the patient, decrease the number of false positives, measure the evolution of tumors, and evaluate the evolution of the oncological treatment and its possible prognosis. PMID:24713067

  1. Detecting acute distress and risk of future psychological morbidity in critically ill patients: validation of the intensive care psychological assessment tool.

    Science.gov (United States)

    Wade, Dorothy M; Hankins, Matthew; Smyth, Deborah A; Rhone, Elijah E; Mythen, Michael G; Howell, David C J; Weinman, John A

    2014-09-24

    The psychological impact of critical illness on a patient can be severe, and frequently results in acute distress as well as psychological morbidity after leaving hospital. A UK guideline states that patients should be assessed in critical care units, both for acute distress and for risk of future psychological morbidity, but no suitable method for carrying out this assessment exists. The intensive care psychological assessment tool (IPAT) was developed as a simple, quick screening tool to be used routinely to detect acute distress, and the risk of future psychological morbidity, in critical care units. A validation study of the IPAT was conducted in the critical care unit of a London hospital. Once un-sedated, orientated and alert, critical care patients were assessed with the IPAT and with validated tools for distress, to determine the IPAT's concurrent validity. Fifty-six patients took the IPAT again to establish test-retest reliability. Finally, patients completed posttraumatic stress disorder (PTSD), depression and anxiety questionnaires at three months, to determine the predictive validity of the IPAT. One hundred and sixty-six patients completed the IPAT, and 106 completed follow-up questionnaires at 3 months. Scale analysis showed the IPAT was a reliable 10-item measure of critical care-related psychological distress. Test-retest reliability was good (r = 0.8). There was good concurrent validity with measures of anxiety and depression (r = 0.7), predictive validity for later psychological morbidity was good (r = 0.4), and the IPAT discriminated risk of future psychological morbidity (AUC = 0.7). The IPAT was found to have good reliability and validity. Sensitivity and specificity analysis suggest the IPAT could provide a way of allowing staff to assess psychological distress among critical care patients, after further replication and validation. Further work is also needed to determine its utility in predicting future psychological morbidity.

  2. Is air-displacement plethysmography a reliable method of detecting ongoing changes in percent body fat within obese children involved in a weight management program?

    DEFF Research Database (Denmark)

    Ewane, Cecile; McConkey, Stacy A; Kreiter, Clarence D

    2010-01-01

    (percent body fat) over time. The gold standard method, hydrodensitometry, has severe limitations for the pediatric population. OBJECTIVE: This study examines the reliability of air-displacement plethysmography (ADP) in detecting percent body fat changes within obese children over time. METHODS: Percent...... body fat by ADP, weight, and body mass index (BMI) were measured for eight obese children aged 5-12 years enrolled in a weight management program over a 12-month period. These measurements were taken at initial evaluation, 1.5 months, 3 months, 6 months, and 12 months to monitor the progress...... of the subjects and detect any changes in these measures over time. Statistical analysis was used to determine the reliability of the data collected. RESULTS: The reliability estimate for percent body fat by ADP was 0.78. This was much lower than the reliability of BMI, 0.98, and weight measurements, 0...

  3. 3-lead electrocardiogram is more reliable than pulse oximetry to detect bradycardia during stabilisation at birth of very preterm infants.

    Science.gov (United States)

    Iglesias, Beatriz; Rodríguez, María José; Aleo, Esther; Criado, Enrique; Martínez-Orgado, Jose; Arruza, Luis

    2018-05-01

    Current neonatal resuscitation guidelines suggest the use of ECG in the delivery room (DR) to assess heart rate (HR). However, the reliability of ECG compared with pulse oximetry (PO) in a situation of bradycardia has not been specifically investigated. The objective of the present study was to compare HR monitoring using ECG or PO in a situation of bradycardia (HR <100 beats per minute (bpm)) during preterm stabilisation in the DR. Video recordings of resuscitations of infants <32 weeks of gestation were reviewed. HR readings in a situation of bradycardia (<100 bpm) at any moment during stabilisation were registered with both devices every 5 s from birth. A total of 29 episodes of bradycardia registered by the ECG in 39 video recordings were included in the analysis (n=29). PO did not detect the start of these events in 20 cases (69%). PO detected the start and the end of bradycardia later than the ECG (median (IQR): 5 s (0-10) and 5 s (0-7.5), respectively). A decline in PO accuracy was observed as bradycardia progressed, so that by the end of the episode PO offered significantly lower HR readings than ECG. PO detects the start and recovery of bradycardia events more slowly and less accurately than ECG during stabilisation at birth of very preterm infants. ECG use in this scenario may contribute to earlier initiation of resuscitation manoeuvres and help avoid unnecessary prolongation of resuscitation efforts after recovery. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Futures

    DEFF Research Database (Denmark)

    Pedersen, Michael Haldrup

    2017-01-01

    Currently both design thinking and critical social science experience an increased interest in speculating in alternative future scenarios. This interest is not least related to the challenges issues of global sustainability present for politics, ethics and design. This paper explores the potentials of speculative thinking in relation to design and social and cultural studies, arguing that both offer valuable insights for creating a speculative space for new emergent criticalities challenging current assumptions of the relations between power and design. It does so by tracing out discussions of ‘futurity’ and ‘futuring’ in design as well as social and cultural studies. Firstly, by discussing futurist and speculative approaches in design thinking; secondly by engaging with ideas of scenario thinking and utopianism in current social and cultural studies; and thirdly by showing how the articulation...

  5. Fueling our future: four steps to a new, reliable, cleaner, decentralized energy supply based on hydrogen and fuel cells

    International Nuclear Information System (INIS)

    Evers, A.A.

    2004-01-01

    'Full text:' This manuscript demonstrates the possible driving factors and necessary elements needed to move Hydrogen and Fuel Cells (H2/FC) to worldwide commercialisation. Focusing not only on the technology itself, we look at the 'bigger picture', explaining how certain trends have shaped the development of new technologies in the past. In this process, the consumer has played, and will continue to play, the key and leading role. We also examine different Distributed Generation scenarios, including distributed generation via fuel cells for automotive applications, which may be the catalyst to the Hydrogen Economy. One possible step could be the use of Personal Power Cars equipped with Fuel Cells which not only drive on Hydrogen, but also supply (while standing) electricity/heat to residential and commercial buildings. With 1.3 billion potential customers, P.R. China is one country where such a scenario may fit. The up-and-coming Chinese H2/FC industry deals with applied fundamental research, such as advances in Hydrogen production from Natural Gas, Methanol and Gasoline. The current activities in P.R. China, certain to further accelerate the trend towards the coming Hydrogen Economy, are examined together with the steps necessary to achieve a new, reliable, cleaner and decentralized Energy Supply based on H2/FC. (author)

  6. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins by asking what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions of reliability, estimation of MTBF, downtime, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
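    The quantities this book covers are connected by a few standard formulas. The sketch below illustrates two of them under the common constant-failure-rate (exponential lifetime) assumption; the MTBF and MTTR values are made up for the example:

    ```python
    import math

    def reliability(t, mtbf):
        """Survival probability R(t) = exp(-t / MTBF), assuming a constant
        failure rate lambda = 1 / MTBF (exponential lifetime model)."""
        return math.exp(-t / mtbf)

    def availability(mtbf, mttr):
        """Steady-state availability: the long-run fraction of time the
        system is up, MTBF / (MTBF + MTTR)."""
        return mtbf / (mtbf + mttr)

    # Illustrative numbers: MTBF = 1000 h, MTTR = 10 h
    print(round(reliability(100, 1000), 4))   # → 0.9048
    print(round(availability(1000, 10), 4))   # → 0.9901
    ```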

  7. Reliable allele detection using SNP-based PCR primers containing Locked Nucleic Acid: application in genetic mapping

    Directory of Open Access Journals (Sweden)

    Trognitz Friederike

    2007-02-01

    Full Text Available Abstract Background The diploid, Solanum caripense, a wild relative of potato and tomato, possesses valuable resistance to potato late blight and we are interested in the genetic base of this resistance. Due to extremely low levels of genetic variation within the S. caripense genome it proved impossible to generate a dense genetic map and to assign individual Solanum chromosomes through the use of conventional chromosome-specific SSR, RFLP, AFLP, as well as gene- or locus-specific markers. The ease of detection of DNA polymorphisms depends on both frequency and form of sequence variation. The narrow genetic background of close relatives and inbreds complicates the detection of persisting, reduced polymorphism and is a challenge to the development of reliable molecular markers. Nonetheless, monomorphic DNA fragments representing not directly usable conventional markers can contain considerable variation at the level of single nucleotide polymorphisms (SNPs). This can be used for the design of allele-specific molecular markers. The reproducible detection of allele-specific markers based on SNPs has been a technical challenge. Results We present a fast and cost-effective protocol for the detection of allele-specific SNPs by applying Sequence Polymorphism-Derived (SPD) markers. These markers proved highly efficient for fingerprinting of individuals possessing a homogeneous genetic background. SPD markers are obtained from within non-informative, conventional molecular marker fragments that are screened for SNPs to design allele-specific PCR primers. The method makes use of primers containing a single, 3'-terminal Locked Nucleic Acid (LNA) base. We demonstrate the applicability of the technique by successful genetic mapping of allele-specific SNP markers derived from monomorphic Conserved Ortholog Set II (COSII) markers mapped to Solanum chromosomes, in S. caripense. By using SPD markers it was possible for the first time to map the S. caripense alleles

  8. Towards improving the reliability of future regional climate projections: A bias-correction method applied to precipitation over the west coast of Norway

    Science.gov (United States)

    Valved, A.; Barstad, I.; Sobolowski, S.

    2012-04-01

    The early winter of 2011/2012 in the city of Bergen, located on the west coast of Norway, was dominated by warm, wet and extreme weather. This might be a glimpse of future average climate conditions under continued atmospheric warming and an enhanced hydrological cycle. The extreme weather events have resulted in drainage/sewage problems, landslides, flooding, property damage and even death. As the Municipality plans for the future it must contend with a growing population in a geographically complex area, in addition to any effects attributable to climate change. While the scientific community is increasingly confident in the projections of large-scale changes over the mid to high latitudes, this confidence does not extend to the local to regional scale, where the magnitude and even direction of change may be highly uncertain. Meanwhile, it is precisely at these scales that municipalities such as Bergen require information if they are to plan effectively. Thus, there is a need for reliable, local climate projections which can aid policy makers and planners in decision-making. Current state-of-the-art regional climate models are capable of providing detailed simulations on the order of 1 or 10 km. However, due to the increased computational demands of these simulations, large ensembles, such as those used for GCM experiments, are often not possible. Thus, greater detail, under these circumstances, does not necessarily correspond to greater reliability. One way to deal with this issue is to apply a statistical bias-correction method in which model results are fitted to observationally derived probability density functions (pdfs). In this way, a full distribution of potential changes may be generated, constrained by known, observed data. This will result in a shifted model distribution with mean and spread that more closely follow observations. In short, the method temporarily removes the climate signals from the model run working on the different percentiles, fits the
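    The percentile-fitting step described in this abstract is often implemented as empirical quantile mapping. The sketch below is a minimal illustration of that idea, not the authors' exact method; the synthetic gamma "precipitation" data and the additive +2 bias are assumptions for the example:

    ```python
    import numpy as np

    def quantile_map(model_hist, obs, model_future):
        """Empirical quantile mapping: locate each future model value's
        percentile in the historical model run, then read the observed
        distribution off at that same percentile."""
        ranks = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
        ranks = np.clip(ranks, 0.0, 1.0)
        return np.quantile(obs, ranks)

    rng = np.random.default_rng(0)
    obs = rng.gamma(2.0, 3.0, size=1000)   # stand-in for observed precipitation
    model_hist = obs + 2.0                 # model run with a simple +2 wet bias
    corrected = quantile_map(model_hist, obs, model_hist)

    # The additive bias is removed: the corrected mean tracks the observed mean
    print(abs(corrected.mean() - obs.mean()) < 0.2)
    ```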

  9. Detectability of rotation-powered pulsars in future hard X-ray surveys

    International Nuclear Information System (INIS)

    Wang Wei

    2009-01-01

    Recent INTEGRAL/IBIS hard X-ray surveys have detected about 10 young pulsars. We show hard X-ray properties of these 10 young pulsars, which have a luminosity of 10^33-10^37 erg s^-1 and a photon index of 1.6-2.1 in the energy range of 20-100 keV. The correlation between X-ray luminosity and spin-down power, L_X ∝ L_sd^1.31, suggests that the hard X-ray emission in rotation-powered pulsars is dominated by the pulsar wind nebula (PWN) component. Assuming spectral properties are similar in 20-100 keV and 2-10 keV for both the pulsar and PWN components, the hard X-ray luminosity and flux of 39 known young X-ray pulsars and 8 millisecond pulsars are obtained, and a correlation of L_X ∝ L_sd^1.5 is derived. About 20 known young X-ray pulsars and 1 millisecond pulsar could be detected with future INTEGRAL and HXMT surveys. We also carry out Monte Carlo simulations of hard X-ray pulsars in the Galaxy and the Gould Belt, assuming values for the pulsar birth rate, initial position, proper motion velocity, period, and magnetic field distribution and evolution based on observational statistics, and the L_X-L_sd relations L_X ∝ L_sd^1.31 and L_X ∝ L_sd^1.5. More than 40 young pulsars (mostly in the Galactic plane) could be detected after ten years of INTEGRAL surveys and the launch of HXMT. Thus, young pulsars would be a significant part of the hard X-ray source population in the sky, and will contribute to unidentified hard X-ray sources in present and future hard X-ray surveys by INTEGRAL and HXMT.

  10. Ebolavirus diagnosis made simple, comparable and faster than molecular detection methods: preparing for the future.

    Science.gov (United States)

    James, Ameh S; Todd, Shawn; Pollak, Nina M; Marsh, Glenn A; Macdonald, Joanne

    2018-04-23

    The 2014/2015 Ebolavirus outbreak resulted in more than 28,000 cases and 11,323 reported deaths, as of March 2016. Domestic transmission of the Guinea strain associated with the outbreak occurred mainly in six African countries, and international transmission was reported in four countries. Outbreak management was limited by the inability to rapidly diagnose infected cases. A further fifteen countries in Africa are predicted to be at risk of Ebolavirus outbreaks in the future as a consequence of climate change and urbanization. Early detection of cases and reduction of transmission rates is critical to prevent and manage future severe outbreaks. We designed a rapid assay for detection of Ebolavirus using recombinase polymerase amplification, a rapid isothermal amplification technology that can be combined with portable lateral flow detection technology. The developed rapid assay operates in 30 min and was comparable with real-time TaqMan™ PCR. We designed, screened, selected and optimized oligonucleotides using the NP coding region from Ebola Zaire virus (Guinea strain). We determined the analytical sensitivity of our Ebola rapid molecular test by testing the selected primers and probe with tenfold serial dilutions (1.34 × 10^10 to 1.34 × 10^1 copies/μL) of the cloned NP gene from the Mayinga strain of Zaire ebolavirus in the pCAGGS vector, and with serially diluted cultured Ebolavirus as established by real-time TaqMan™ PCR performed using an ABI7500 in Fast Mode. We tested extracted and reverse transcribed RNA from cultured Zaire ebolavirus strains - Mayinga, Gueckedou C05, Gueckedou C07, Makona, Kissidougou and Kiwit. We determined the analytical specificity of our assay with related viruses: Marburg, Ebola Reston and Ebola Sudan. We further tested for Dengue virus 1-4, Plasmodium falciparum and West Nile Virus (Kunjin strain). The assay had a detection limit of 134 copies per μL of plasmid containing the NP gene of Ebolavirus Mayinga, and cultured Ebolavirus

  11. A Path Tracking Algorithm Using Future Prediction Control with Spike Detection for an Autonomous Vehicle Robot

    Directory of Open Access Journals (Sweden)

    Muhammad Aizzat Zakaria

    2013-08-01

    Full Text Available Trajectory tracking is an important aspect of autonomous vehicles. The idea behind trajectory tracking is the ability of the vehicle to follow a predefined path with zero steady state error. The difficulty arises due to the nonlinearity of vehicle dynamics. Therefore, this paper proposes a stable tracking control for an autonomous vehicle. An approach that consists of steering wheel control and lateral control is introduced. This control algorithm is used for a non-holonomic navigation problem, namely tracking a reference trajectory in a closed loop form. A proposed future prediction point control algorithm is used to calculate the vehicle's lateral error in order to improve the performance of the trajectory tracking. A feedback sensor signal from the steering wheel angle and yaw rate sensor is used as feedback information for the controller. The controller consists of a relationship between the future point lateral error, the linear velocity, the heading error and the reference yaw rate. This paper also introduces a spike detection algorithm to track the spike error that occurs during GPS reading. The proposed idea is to take the advantage of the derivative of the steering rate. This paper aims to tackle the lateral error problem by applying the steering control law to the vehicle, and proposes a new path tracking control method by considering the future coordinate of the vehicle and the future estimated lateral error. The effectiveness of the proposed controller is demonstrated by a simulation and a GPS experiment with noisy data. The approach used in this paper is not limited to autonomous vehicles alone since the concept of autonomous vehicle tracking can be used in mobile robot platforms, as the kinematic model of these two platforms is similar.
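    A spike detector built on the derivative of a signal, of the kind this abstract describes for noisy GPS readings, can be sketched as a simple rate-limit check: a sample is flagged when the signal changes faster than the physical system could. The function name, sampling interval and threshold below are illustrative assumptions, not values from the paper:

    ```python
    def detect_spikes(readings, dt=0.1, rate_limit=5.0):
        """Flag sample indices whose rate of change exceeds a physical
        limit (e.g. a steering angle cannot change faster than the
        actuator allows), marking them as likely sensor spikes."""
        spikes = []
        for i in range(1, len(readings)):
            rate = abs(readings[i] - readings[i - 1]) / dt
            if rate > rate_limit:
                spikes.append(i)
        return spikes

    # A GPS-like lateral-error trace with one outlier at index 3
    trace = [0.00, 0.05, 0.10, 9.00, 0.20, 0.25]
    print(detect_spikes(trace))  # → [3, 4]  (the jump into and out of the spike)
    ```

    Flagged samples can then be dropped or replaced by an extrapolated value before being fed to the tracking controller.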

  12. Mammographic casting-type calcification associated with small screen-detected invasive breast cancers: is this a reliable prognostic indicator?

    International Nuclear Information System (INIS)

    Peacock, C.; Given-Wilson, R.M.; Duffy, S.W.

    2004-01-01

    AIM: The aim of the present study was to establish whether mammographic casting-type calcification associated with small screen-detected invasive breast cancers is a reliable prognostic indicator. METHODS AND MATERIALS: We retrospectively identified 50 consecutive women diagnosed with an invasive cancer less than 15 mm who showed associated casting calcification on their screening mammograms. Controls were identified that showed no microcalcification and were matched for tumour size, histological type and lymph node status. A minimum of 5 years follow-up was obtained, noting recurrence and outcome. Conditional and unconditional logistic regression, depending on the outcome variable, were used to analyse the data, taking the matched design into account in both cases. Where small numbers prohibited the use of logistic regression, Fisher's exact test was used. RESULTS: Five deaths from breast cancer occurred out of the 50 cases, of which three were lymph node positive, two were lymph node negative and none were grade 3. None of the 78 control cases died from breast cancer. The difference in breast cancer death rates was significant by Fisher's exact test (p=0.02). Risk of recurrence was also significantly increased in the casting cases (OR=3.55, 95% CI 1.02-12.33, p=0.046). CONCLUSION: Although the overall outcome for small screen-detected breast cancers is good, our study suggests that casting calcification is a poorer prognostic factor. The advantage of a mammographic feature as an independent prognostic indicator lies in early identification of high-risk patients, allowing optimization of management

  13. Strongly lensed neutral hydrogen emission: detection predictions with current and future radio interferometers

    Science.gov (United States)

    Deane, R. P.; Obreschkow, D.; Heywood, I.

    2015-09-01

    Strong gravitational lensing provides some of the deepest views of the Universe, enabling studies of high-redshift galaxies that, without the lensing phenomenon, would only be possible with next-generation facilities. To date, 21-cm radio emission from neutral hydrogen has only been detected directly out to z ˜ 0.2, limited by the sensitivity and instantaneous bandwidth of current radio telescopes. We discuss how current and future radio interferometers such as the Square Kilometre Array (SKA) will detect lensed H I emission in individual galaxies at high redshift. Our calculations rely on a semi-analytic galaxy simulation with realistic H I discs (by size, density profile and rotation), in a cosmological context, combined with general relativistic ray tracing. Wide-field, blind H I surveys with the SKA are predicted to be efficient at discovering lensed H I systems, increasingly so at z ≳ 2. This will be enabled by the combination of the magnification boosts, the steepness of the H I luminosity function at the high-mass end, and the fact that the H I spectral line is relatively isolated in frequency. These surveys will simultaneously provide a new technique for foreground lens selection and yield the highest redshift H I emission detections. More near-term (and existing) cm-wave facilities will push the high-redshift H I envelope through targeted surveys of known lenses.

  14. Sampling inspection for the evaluation of time-dependent reliability of deteriorating systems under imperfect defect detection

    International Nuclear Information System (INIS)

    Kuniewski, Sebastian P.; Weide, Johannes A.M. van der; Noortwijk, Jan M. van

    2009-01-01

    The paper presents a sampling-inspection strategy for the evaluation of time-dependent reliability of deteriorating systems, where the deterioration is assumed to initiate at random times and at random locations. After initiation, defects weaken the system's resistance. The system becomes unacceptable when at least one defect reaches a critical depth. The defects are assumed to initiate at random times modeled as event times of a non-homogeneous Poisson process (NHPP) and to develop according to a non-decreasing time-dependent gamma process. The intensity rate of the NHPP is assumed to be a combination of a known time-dependent shape function and an unknown proportionality constant. When sampling inspection (i.e. inspection of a selected subregion of the system) results in a number of defect initiations, Bayes' theorem can be used to update prior beliefs about the proportionality constant of the NHPP intensity rate to the posterior distribution. On the basis of a time- and space-dependent Poisson process for the defect initiation, an adaptive Bayesian model for sampling inspection is developed to determine the predictive probability distribution of the time to failure. A potential application is, for instance, the inspection of a large vessel or pipeline suffering pitting/localized corrosion in the oil industry. The possibility of imperfect defect detection is also incorporated in the model.
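    When the unknown proportionality constant of the NHPP intensity rate is given a conjugate gamma prior, the Bayesian update described in this abstract has a closed form: with intensity λ(t) = c·w(t) and N initiations observed where the shape function integrates to W over the inspected region and time window, a Gamma(a, b) prior on c yields a Gamma(a + N, b + W) posterior. The sketch below illustrates this; the prior parameters, defect count and integrated-shape value are invented for the example and are not from the paper:

    ```python
    def update_intensity_constant(a, b, n_defects, w_integral):
        """Posterior Gamma(shape, rate) for the proportionality constant c
        of an NHPP with intensity lambda(t) = c * w(t), after observing
        n_defects initiations where the known shape function w integrates
        to w_integral over the inspected region and time window."""
        return a + n_defects, b + w_integral

    a0, b0 = 2.0, 1.0   # Gamma prior on c (illustrative)
    n, W = 7, 3.5       # inspection outcome: 7 initiations, integrated shape 3.5
    a1, b1 = update_intensity_constant(a0, b0, n, W)
    print(a1, b1, a1 / b1)  # → 9.0 4.5 2.0  (posterior mean of c)
    ```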

  15. Fueling our future: Four steps to a new, reliable, cleaner, decentralized energy supply based on hydrogen and fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Evers, A. A. [Arno A. Evers FAIR-PR, Starnberg (Germany)

    2004-07-01

    The necessary preconditions and the driving forces operating to move hydrogen and fuel cells to world-wide commercialization are examined, focusing on trends that impacted the progress of new technologies in the past. The consensus is that consumers have played a vital role in the past, and will continue to play an even more vital role in the future as drivers in the mass market evolution of technological progress. The automobile, aircraft and cell phone industries are examined as examples of consumer influence on technology development. One such scenario, specific to the hydrogen economy, is the potential dual role played by fuel cell-powered personal automobiles, which may not only provide transportation but also supply electricity and heat to residential and commercial buildings while in a stationary mode. It is suggested that given the size of the population and the current level of economic development in the People's Republic of China, conditions there are most favourable to accelerate the development of a hydrogen and fuel cell-based economy. Details of developments in China, and how the hydrogen-fuel cells scenario may develop there, are discussed. 11 figs.

  16. The ALMA high speed optical communication link is here: an essential component for reliable present and future operations

    Science.gov (United States)

    Filippi, G.; Ibsen, J.; Jaque, S.; Liello, F.; Ovando, N.; Astudillo, A.; Parra, J.; Saldias, Christian

    2016-07-01

    Announced in 2012, started in 2013 and completed in 2015, the ALMA high bandwidth communication system has become a key factor in achieving the operational and scientific goals of ALMA. This paper summarizes the technical, organizational, and operational goals of the ALMA Optical Link Project, focused on the creation and operation of an effective and sustainable communication infrastructure to connect the ALMA Operations Support Facility and Array Operations Site, both located in the Atacama Desert in the northern region of Chile, with the point of presence of REUNA in Antofagasta, about 400 km away, and from there to the Santiago Central Office in the Chilean capital through the optical infrastructure created by the EC-funded EVALSO project, now an integral part of the REUNA backbone. This new infrastructure, completed in 2014 and now operated on behalf of ALMA by REUNA, the Chilean National Research and Education Network, uses state-of-the-art technologies, like dark fiber from newly built cables and DWDM transmission, allowing the reach of high-capacity communication to be extended to the remote region where the Observatory is located. The paper also reports on the results obtained during the first year and a half of testing and operation, in which different operational set-ups have been tried for data transfer, remote collaboration, etc. Finally, the authors present a forward look at its impact on both the future scientific development of the Chajnantor Plateau, where many installations are (and will be) located, and the potential long-term development of the Chilean scientific backbone.

  17. Reliability, standard error, and minimum detectable change of clinical pressure pain threshold testing in people with and without acute neck pain.

    Science.gov (United States)

    Walton, David M; Macdermid, Joy C; Nielson, Warren; Teasell, Robert W; Chiasson, Marco; Brown, Lauren

    2011-09-01

    Clinical measurement. To evaluate the intrarater, interrater, and test-retest reliability of an accessible digital algometer, and to determine the minimum detectable change in normal healthy individuals and a clinical population with neck pain. Pressure pain threshold testing may be a valuable assessment and prognostic indicator for people with neck pain. To date, most of this research has been completed using algometers that are too resource intensive for routine clinical use. Novice raters (physiotherapy students or clinical physiotherapists) were trained to perform algometry testing over 2 clinically relevant sites: the angle of the upper trapezius and the belly of the tibialis anterior. A convenience sample of normal healthy individuals and a clinical sample of people with neck pain were tested by 2 different raters (all participants) and on 2 different days (healthy participants only). Intraclass correlation coefficient (ICC), standard error of measurement, and minimum detectable change were calculated. A total of 60 healthy volunteers and 40 people with neck pain were recruited. Intrarater reliability was almost perfect (ICC = 0.94-0.97), interrater reliability was substantial to near perfect (ICC = 0.79-0.90), and test-retest reliability was substantial (ICC = 0.76-0.79). Smaller change was detectable in the trapezius compared to the tibialis anterior. This study provides evidence that novice raters can perform digital algometry with adequate reliability for research and clinical use in people with and without neck pain.
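    The standard error of measurement and minimum detectable change reported in studies like this follow from two standard formulas based on the ICC. A small sketch with illustrative numbers only (the SD and ICC values below are not the study's data):

    ```python
    import math

    def sem(sd, icc):
        """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
        return sd * math.sqrt(1.0 - icc)

    def mdc95(sd, icc):
        """Minimum detectable change at 95% confidence:
        MDC95 = 1.96 * sqrt(2) * SEM."""
        return 1.96 * math.sqrt(2.0) * sem(sd, icc)

    # Illustrative values only: between-subject SD = 100 kPa, ICC = 0.94
    print(round(sem(100, 0.94), 1))    # → 24.5  (kPa)
    print(round(mdc95(100, 0.94), 1))  # → 67.9  (kPa)
    ```

    This is why, for a fixed SD, a higher ICC directly shrinks the smallest change that can be distinguished from measurement noise.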

  18. Audio gunshot detection and localization systems: History, basic design, and future possibilities

    Science.gov (United States)

    Graves, Jordan R.

    For decades, law enforcement organizations have increasingly utilized audio detection and localization systems to identify potential gunshot incidents and to respond accordingly. These systems have grown from simple microphone configurations used to estimate location into complex arrays that seem to pinpoint gunfire to within mere feet of its actual occurrence. Such technology comes from a long and dynamic history of equipment development dating back to the First World War. Additionally, though basic designs require little in terms of programming or engineering experience, the mere presence of this tool invokes a firestorm of debate amongst economists, law enforcement groups, and the general public, which leads to questions about future possibilities for its use. The following pages will retell the history of these systems from theoretical conception to current capabilities. This work will also dissect these systems to reveal fundamental elements of their inner workings, in order to build a basic demonstrative system. Finally, this work will discuss some legal and moral points of dissension, and will explore these systems’ roles in society now and in the future, as well as in additional applications.

  19. The reliability and accuracy of two methods for proximal caries detection and depth on directly visible proximal surfaces: an in vitro study

    DEFF Research Database (Denmark)

    Ekstrand, K R; Alloza, Alvaro Luna; Promisiero, L

    2011-01-01

    This study aimed to determine the reliability and accuracy of the ICDAS and radiographs in detecting and estimating the depth of proximal lesions on extracted teeth. The lesions were visible to the naked eye. Three trained examiners scored a total of 132 sound/carious proximal surfaces from 106 p...

  20. Smart Sensing System for Early Detection of Bone Loss: Current Status and Future Possibilities

    Directory of Open Access Journals (Sweden)

    Nasrin Afsarimanesh

    2018-02-01

    Full Text Available Bone loss and osteoporosis are a serious health problem worldwide. The impact of osteoporosis is far greater than that of many other serious health problems, such as breast and prostate cancers. Statistically, one in three women and one in five men over 50 years of age will experience osteoporotic fractures in their life. In this paper, the design and development of a portable IoT-based sensing system for early detection of bone loss is presented. The CTx-I biomarker was measured in serum samples as a marker of bone resorption. A planar interdigital sensor was used to evaluate the changes in impedance caused by any variation in the level of CTx-I. Artificial antibodies were used to make the sensor selective for the CTx-I molecule. The artificial antibodies for CTx-I molecules were created using the molecularly imprinted polymer (MIP) technique in order to increase the stability of the system and reduce the production cost and complexity of the assay procedure. Real serum samples collected from sheep blood were tested, and the results were validated using an ELISA kit. The PoC device was able to detect CTx-I concentrations as low as 0.09 ng/mL. It exhibited an excellent linear behavior in the range of 0.1–2.5 ng/mL, which covers the normal reference ranges required for bone loss detection. Future possibilities to develop a smart toilet for simultaneous measurement of different bone turnover biomarkers were also discussed.
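    A sensor with a linear response over its working range, as reported in this abstract, is typically used through a linear calibration curve fitted to known standards and then inverted for unknown samples. A generic sketch of that workflow; the impedance values below are invented for illustration and are not the device's data:

    ```python
    import numpy as np

    # Calibration standards (ng/mL) and invented impedance readings (a.u.)
    conc = np.array([0.1, 0.5, 1.0, 1.5, 2.5])
    impedance = np.array([10.2, 11.0, 12.1, 13.0, 15.1])

    # Fit impedance = slope * concentration + intercept
    slope, intercept = np.polyfit(conc, impedance, 1)

    def to_concentration(z):
        """Invert the fitted line to estimate CTx-I concentration
        from a measured impedance value."""
        return (z - intercept) / slope

    estimate = to_concentration(12.1)  # should land near the 1.0 ng/mL standard
    ```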

  1. Improvement of Matrix Converter Drive Reliability by Online Fault Detection and a Fault-Tolerant Switching Strategy

    DEFF Research Database (Denmark)

    Nguyen-Duy, Khiem; Liu, Tian-Hua; Chen, Der-Fa

    2011-01-01

    The matrix converter system is becoming a very promising candidate to replace the conventional two-stage ac/dc/ac converter, but system reliability remains an open issue. The most common reliability problem is that a bidirectional switch has an open-switch fault during operation. In this paper, a...

  2. Reliability and minimal detectable change of a modified passive neck flexion test in patients with chronic nonspecific neck pain and asymptomatic subjects.

    Science.gov (United States)

    López-de-Uralde-Villanueva, Ibai; Acuyo-Osorio, Mario; Prieto-Aldana, María; La Touche, Roy

    2017-04-01

    The Passive Neck Flexion Test (PNFT) can diagnose meningitis and potential spinal disorders. Little evidence is available concerning the use of a modified version of the PNFT (mPNFT) in patients with chronic nonspecific neck pain (CNSNP). To assess the reliability of the mPNFT in subjects with and without CNSNP. The secondary objective was to assess the differences in the symptoms provoked by the mPNFT between these two populations. We used a repeated measures concordance design for the main objective and a cross-sectional design for the secondary objective. A total of 30 asymptomatic subjects and 34 patients with CNSNP were recruited. The following measures were recorded: the range of motion at the onset of symptoms (OS-mPNFT), the range of motion at the submaximal pain (SP-mPNFT), and evoked pain intensity on the mPNFT (VAS-mPNFT). Good to excellent reliability was observed for OS-mPNFT and SP-mPNFT in the asymptomatic group (intra-examiner reliability: 0.95-0.97; inter-examiner reliability: 0.86-0.90; intra-examiner test-retest reliability: 0.84-0.87). In the CNSNP group, a good to excellent reliability was obtained for the OS-mPNFT (intra-examiner reliability: 0.89-0.96; inter-examiner reliability: 0.83-0.86; intra-examiner test-retest reliability: 0.83-0.85) and the SP-PNFT (intra-examiner reliability: 0.94-0.98; inter-examiner reliability: 0.80-0.82; intra-examiner test-retest reliability: 0.88-0.91). The CNSNP group showed statistically significant differences in OS-mPNFT (t = 4.92). The mPNFT is a reliable tool regardless of the examiner and the time factor. Patients with CNSNP have a decreased range of motion and more pain than asymptomatic subjects on the mPNFT. This exceeds the minimal detectable changes for OS-mPNFT and VAS-mPNFT. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Nuclear power - a reliable future

    International Nuclear Information System (INIS)

    Valeca, Serban

    2002-01-01

    The Ministry of Education and Research - Department of Research has implemented a national Research and Development program taking into consideration the following: - the requirements of the European Union on research as a factor of development of the knowledge-based society; - the commitments to the assimilation and enforcement of the recommendations of the European Union on nuclear power prompted by the negotiations of the sections 'Science and Research' and 'Energy' of the acquis communautaire; - the major lines of interest in Romania in the nuclear power field established by the National Framework Program of Cooperation with IAEA, signed in April 2001; - the short and medium term nuclear options of the Romanian Government; - the objectives of the National Nuclear Plan. The major elements of the nuclear research and development program MENER (Environment, Energy, Resources) supported by the Department of Research of the Ministry of Education and Research are the following: - reactor physics and nuclear fuel management; - operation safety of the Power Unit 1 of Cernavoda Nuclear Electric Power Station; - improved nuclear technological solutions at the Cernavoda NPP; - development of technologies for nuclear fuel cycle; - operation safety of the other nuclear plants in Romania; - assessment of nuclear risks and estimation of the radiological impact on the environment; - behavior of materials under the reactor service conditions and environmental conditions; - design of nuclear systems and equipment for the nuclear power stations and nuclear facilities; - radiological safety; - application of nuclear techniques and technologies in industry, agriculture, medicine and other fields of social life. Research to develop high-performance methods and equipment for monitoring the nuclear impact on the environment is conducted to support the measures for radiation protection.
Also mentioned are research on implementing a new type of nuclear fuel cycle in CANDU reactors, the development of advanced fuels based on slightly enriched uranium recovered from enriched fuel treatment, and fuel cycles using the spent fuel from PWR reactors in CANDU reactors. The paper also addresses legal aspects of nuclear power, international conventions and agreements, and international cooperation in the nuclear field

  4. Psychometric properties of the Need for Recovery after work scale: test-retest reliability and sensitivity to detect change

    NARCIS (Netherlands)

    de Croon, E. M.; Sluiter, J. K.; Frings-Dresen, M. H. W.

    2006-01-01

    BACKGROUND: Monitoring worker health and evaluating occupational healthcare interventions requires sensitive instruments that are reliable over time. The Need for Recovery scale (NFR), which quantifies workers' difficulties in recovering from work related exertions, may be a relevant instrument in

  5. Ionospheric detection of tsunami earthquakes: observation, modeling and ideas for future early warning

    Science.gov (United States)

    Occhipinti, G.; Manta, F.; Rolland, L.; Watada, S.; Makela, J. J.; Hill, E.; Astafieva, E.; Lognonne, P. H.

    2017-12-01

    Detection of ionospheric anomalies following the Sumatra and Tohoku earthquakes (e.g., Occhipinti 2015) demonstrated that the ionosphere is sensitive to earthquake and tsunami propagation: ground and oceanic vertical displacement induces acoustic-gravity waves propagating within the neutral atmosphere and detectable in the ionosphere. Observations supported by modelling proved that ionospheric anomalies related to tsunamis are deterministic and reproducible by numerical modeling via the ocean/neutral-atmosphere/ionosphere coupling mechanism (Occhipinti et al., 2008). To prove that the tsunami signature in the ionosphere is routinely detected, we show here perturbations of total electron content (TEC) measured by GPS following tsunamigenic earthquakes from 2004 to 2011 (Rolland et al. 2010, Occhipinti et al., 2013), nominally, Sumatra (26 December 2004 and 12 September 2007), Chile (14 November 2007), Samoa (29 September 2009) and the recent Tohoku-Oki (11 March 2011). Based on the observations close to the epicenter, mainly performed by GPS networks located in Sumatra, Chile and Japan, we highlight the TEC perturbation observed within the first 8 min after the seismic rupture. This perturbation contains information about the ground displacement, as well as the consequent sea surface displacement resulting in the tsunami. In addition to GNSS-TEC observations close to the epicenter, exciting new far-field measurements performed by airglow imaging in Hawaii show the propagation of the internal gravity waves induced by the Tohoku tsunami (Occhipinti et al., 2011). This revolutionary imaging technique is today supported by two new observations of moderate tsunamis: Queen Charlotte (M: 7.7, 27 October, 2013) and Chile (M: 8.2, 16 September 2015). We finally detail here our recent work (Manta et al., 2017) on the case of tsunami alert failure following the Mw7.8 Mentawai event (25 October, 2010), and its twin tsunami alert response following the Mw7

  6. Fluorescence detection, enumeration and characterization of single circulating cells in vivo: technology, applications and future prospects

    Science.gov (United States)

    Hartmann, Carolin; Patil, Roshani; Lin, Charles P.; Niedre, Mark

    2018-01-01

    There are many diseases and biological processes that involve circulating cells in the bloodstream, such as cancer metastasis, immunology, reproductive medicine, and stem cell therapies. This has driven significant interest in new technologies for the study of circulating cells in small animal research models and clinically. Most currently used methods require drawing and enriching blood samples from the body, but these suffer from a number of limitations. In contrast, ‘in vivo flow cytometry’ (IVFC) refers to a set of technologies that allow the study of cells directly in the bloodstream of the organism in vivo. In recent years the IVFC field has grown significantly and new techniques have been developed, including fluorescence microscopy, multi-photon, photo-acoustic, and diffuse fluorescence IVFC. In this paper we review recent technical advances in IVFC, with emphasis on instrumentation, contrast mechanisms, and detection sensitivity. We also describe key applications in biomedical research, including cancer research and immunology. Last, we discuss future directions for IVFC, as well as prospects for broader adoption by the biomedical research community and translation to humans clinically.

  7. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, O.C. curves, Bayesian analysis. An introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. LEWIS NERI, CHIEF... includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future
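
    The handbook topics listed above (exponential and Weibull distributions, estimating reliability) reduce to one standard formula, the Weibull reliability function; a minimal sketch with illustrative parameter values, not figures from the handbook:

```python
import math

def weibull_reliability(t, beta, eta):
    """Standard Weibull reliability (survival) function R(t) = exp(-(t/eta)**beta).
    beta = 1 reduces to the exponential (constant failure rate) case with MTBF eta."""
    return math.exp(-((t / eta) ** beta))

# Illustrative values: constant failure rate, MTBF = 1000 h, mission time 100 h
print(round(weibull_reliability(100.0, 1.0, 1000.0), 4))  # exp(-0.1) ~ 0.9048
```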

  8. Reliable Maintenance of Wireless Sensor Networks for Event-detection Applications

    Institute of Scientific and Technical Information of China (English)

    胡四泉; 杨金阳; 王俊峰

    2011-01-01

    The reliability maintenance of a wireless sensor network is key to keeping alarm messages delivered reliably and on time to the monitoring center in an event-detection application. Based on the unreliable links in wireless sensor networks and the network characteristics of an event-detection application, MPRRM, a multiple-path redundant reliability maintenance algorithm, is proposed in this paper. Both analytical and simulation results show that the MPRRM algorithm is superior to previously published solutions in the metrics of reliability, false positive rate, latency and message overhead.
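
    The gain from multipath redundancy described above can be quantified with the standard parallel-reliability identity; a minimal sketch (the record does not detail the MPRRM algorithm itself, so the link probabilities are illustrative):

```python
from math import prod

def multipath_delivery_prob(path_probs):
    """Probability that at least one of several independent redundant
    paths delivers the alarm message: 1 - prod(1 - p_i)."""
    return 1.0 - prod(1.0 - p for p in path_probs)

# Three independent paths over unreliable links (70% each) lift the
# end-to-end delivery probability to 1 - 0.3**3 = 0.973
print(multipath_delivery_prob([0.7, 0.7, 0.7]))
```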

  9. Stakeholders' opinions on a future in-vehicle alcohol detection system for prevention of drunk driving.

    Science.gov (United States)

    Anund, Anna; Antonson, Hans; Ihlström, Jonas

    2015-01-01

    There is a common understanding that driving under the influence of alcohol is associated with higher risk of being involved in crashes with injuries and possible fatalities as the outcome. Various countermeasures have therefore from time to time been taken by the authorities to prevent drunk driving. One of them has been the alcohol interlock. Up to now, interlocks have mainly been used by previously convicted drunk drivers and in the commercial road transport sector, but not in private cars. New technology has today reached a level where broader implementation might be possible. To our knowledge, however, little is known about different stakeholders' opinions of a broader implementation of such systems. In order to increase that knowledge, we conducted a focus group study to collect in-depth thoughts from different stakeholders on this topic. Eight focus groups representing a broad societal span were recruited and conducted for the purpose. The results show that most stakeholders thought that an integrated system for alcohol detection in vehicles might be beneficial in lowering the number of drunk driving crashes. They said that the system would probably mainly prevent driving by people who unintentionally and unknowingly drive under the influence of alcohol. The groups did, however, not regard the system as a final solution to the drunk driving problem, and believed that certain groups, such as criminals and alcoholics, would most likely find a way around the system. Concerns were raised about the risk of increased sleepy driving and driving just under the legal blood alcohol concentration (BAC) limit. The results also indicate that stakeholders preferred a system that provides information on the BAC up to the legal limit, but not for levels above the limit; for those, the system should simply prevent the car from starting. 
Acceptance of the system depended on the reliability of the system, on its ability to perform fast sampling, and on the analytical process

  10. Pharyngeal pH alone is not reliable for the detection of pharyngeal reflux events: A study with oesophageal and pharyngeal pH-impedance monitoring

    Science.gov (United States)

    Desjardin, Marie; Roman, Sabine; des Varannes, Stanislas Bruley; Gourcerol, Guillaume; Coffin, Benoit; Ropert, Alain; Mion, François

    2013-01-01

    Background Pharyngeal pH probes and pH-impedance catheters have been developed for the diagnosis of laryngo-pharyngeal reflux. Objective To determine the reliability of pharyngeal pH alone for the detection of pharyngeal reflux events. Methods 24-h pH-impedance recordings performed in 45 healthy subjects with a bifurcated probe for detection of pharyngeal and oesophageal reflux events were reviewed. Pharyngeal pH drops to below 4 and 5 were analysed for the simultaneous occurrence of pharyngeal reflux, gastro-oesophageal reflux, and swallows, according to impedance patterns. Results Only 7.0% of pharyngeal pH drops to below 5 identified with impedance corresponded to pharyngeal reflux, while 92.6% were related to swallows and 10.2 and 13.3% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Of pharyngeal pH drops to below 4, 13.2% were related to pharyngeal reflux, 87.5% were related to swallows, and 18.1 and 21.5% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Conclusions This study demonstrates that pharyngeal pH alone is not reliable for the detection of pharyngeal reflux and that adding distal oesophageal pH analysis is not helpful. The only reliable analysis should take into account impedance patterns demonstrating the presence of a pharyngeal reflux event preceded by a distal and proximal reflux event within the oesophagus. PMID:24917995

  11. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It describes the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; reliability design; and functional failure analysis by FTA.

  12. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last chapters look at 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  13. Myth of the Master Detective: Reliability of Interpretations for Kaufman's "Intelligent Testing" Approach to the WISC-III.

    Science.gov (United States)

    Macmann, Gregg M.; Barnett, David W.

    1997-01-01

    Used computer simulation to examine the reliability of interpretations for Kaufman's "intelligent testing" approach to the Wechsler Intelligence Scale for Children (3rd ed.) (WISC-III). Findings indicate that factor index-score differences and other measures could not be interpreted with confidence. Argues that limitations of IQ testing…

  14. A System of Deception and Fraud Detection Using Reliable Linguistic Cues Including Hedging, Disfluencies, and Repeated Phrases

    Science.gov (United States)

    Humpherys, Sean LaMarc

    2010-01-01

    Given the increasing problem of fraud, crime, and national security threats, assessing credibility is a recurring research topic in Information Systems and in other disciplines. Decision support systems can help. But the success of the system depends on reliable cues that can distinguish deceptive/truthful behavior and on a proven classification…

  15. Web-based tools can be used reliably to detect patients with major depressive disorder and subsyndromal depressive symptoms

    Directory of Open Access Journals (Sweden)

    Tsai Shih-Jen

    2007-04-01

    Full Text Available Abstract Background Although depression has been regarded as a major public health problem, many individuals with depression still remain undetected or untreated. Despite the potential for Internet-based tools to greatly improve the success rate of screening for depression, their reliability and validity has not been well studied. Therefore the aim of this study was to evaluate the test-retest reliability and criterion validity of a Web-based system, the Internet-based Self-assessment Program for Depression (ISP-D). Methods The ISP-D to screen for major depressive disorder (MDD), minor depressive disorder (MinD), and subsyndromal depressive symptoms (SSD) was developed in traditional Chinese. Volunteers, 18 years and older, were recruited via the Internet and then assessed twice on the online ISP-D system to investigate the test-retest reliability of the test. They were subsequently prompted to schedule face-to-face interviews. The interviews were performed by the research psychiatrists using the Mini-International Neuropsychiatric Interview and the diagnoses made according to DSM-IV diagnostic criteria were used for the statistics of criterion validity. Kappa (κ) values were calculated to assess test-retest reliability. Results A total of 579 volunteer subjects were administered the test. Most of the subjects were young (mean age: 26.2 ± 6.6 years), female (77.7%), single (81.6%), and well educated (61.9% college or higher). The distributions of MDD, MinD, SSD and no depression specified were 30.9%, 7.4%, 15.2%, and 46.5%, respectively. The mean time to complete the ISP-D was 8.89 ± 6.77 min. One hundred and eighty-four of the respondents completed the retest (response rate: 31.8%). Our analysis revealed that the 2-week test-retest reliability for ISP-D was excellent (weighted κ = 0.801). Fifty-five participants completed the face-to-face interview for the validity study. The sensitivity, specificity, positive, and negative predictive values for major
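
    The criterion-validity statistics reported here (sensitivity, specificity, and predictive values) all derive from a 2x2 table of screening result against the gold-standard diagnosis; a sketch with made-up counts, since the study's table is not reproduced in the abstract:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard 2x2 screening statistics: screening result vs. gold standard."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among the diseased
        "specificity": tn / (tn + fp),  # true negatives among the healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Made-up counts for illustration only
print(screening_metrics(tp=20, fp=5, fn=3, tn=27))
```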

  16. The reliability and internal consistency of one-shot and flicker change detection for measuring individual differences in visual working memory capacity.

    Science.gov (United States)

    Pailian, Hrag; Halberda, Justin

    2015-04-01

    We investigated the psychometric properties of the one-shot change detection task for estimating visual working memory (VWM) storage capacity-and also introduced and tested an alternative flicker change detection task for estimating these limits. In three experiments, we found that the one-shot whole-display task returns estimates of VWM storage capacity (K) that are unreliable across set sizes-suggesting that the whole-display task is measuring different things at different set sizes. In two additional experiments, we found that the one-shot single-probe variant shows improvements in the reliability and consistency of K estimates. In another additional experiment, we found that a one-shot whole-display-with-click task (requiring target localization) also showed improvements in reliability and consistency. The latter results suggest that the one-shot task can return reliable and consistent estimates of VWM storage capacity (K), and they highlight the possibility that the requirement to localize the changed target is what engenders this enhancement. Through a final series of four experiments, we introduced and tested an alternative flicker change detection method that also requires the observer to localize the changing target and that generates, from response times, an estimate of VWM storage capacity (K). We found that estimates of K from the flicker task correlated with estimates from the traditional one-shot task and also had high reliability and consistency. We highlight the flicker method's ability to estimate executive functions as well as VWM storage capacity, and discuss the potential for measuring multiple abilities with the one-shot and flicker tasks.
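
    Estimates of VWM storage capacity (K) in single-probe change detection are conventionally computed with Cowan's formula, K = N x (hit rate - false alarm rate); a minimal sketch with illustrative rates, not data from the study:

```python
def cowan_k(hit_rate, false_alarm_rate, set_size):
    """Cowan's K for single-probe change detection: K = N * (H - FA).
    Whole-display variants conventionally use Pashler's formula instead."""
    return set_size * (hit_rate - false_alarm_rate)

# Illustrative rates: H = 0.85, FA = 0.15 at set size 4 gives K of about 2.8 items
print(round(cowan_k(0.85, 0.15, 4), 2))
```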

  17. Numerical and structural genomic aberrations are reliably detectable in tissue microarrays of formalin-fixed paraffin-embedded tumor samples by fluorescence in-situ hybridization.

    Directory of Open Access Journals (Sweden)

    Heike Horn

    Full Text Available Few data are available regarding the reliability of fluorescence in-situ hybridization (FISH), especially for chromosomal deletions, in high-throughput settings using tissue microarrays (TMAs). We performed a comprehensive FISH study for the detection of chromosomal translocations and deletions in formalin-fixed and paraffin-embedded (FFPE) tumor specimens arranged in TMA format. We analyzed 46 B-cell lymphoma (B-NHL) specimens with known karyotypes for translocations of IGH-, BCL2-, BCL6- and MYC-genes. Locus-specific DNA probes were used for the detection of deletions in chromosome bands 6q21 and 9p21 in 62 follicular lymphomas (FL) and six malignant mesothelioma (MM) samples, respectively. To test for aberrant signals generated by truncation of nuclei following sectioning of FFPE tissue samples, cell line dilutions with 9p21-deletions were embedded into paraffin blocks. The overall TMA hybridization efficiency was 94%. FISH results regarding translocations matched karyotyping data in 93%. As for chromosomal deletions, sectioning artefacts occurred in 17% to 25% of cells, suggesting that the proportion of cells showing deletions should exceed 25% to be reliably detectable. In conclusion, FISH represents a robust tool for the detection of structural as well as numerical aberrations in FFPE tissue samples in a TMA-based high-throughput setting, when rigorous cut-off values and appropriate controls are maintained, and, of note, was superior to quantitative PCR approaches.

  18. Breast-i Is an Effective and Reliable Adjunct Screening Tool for Detecting Early Tumour Related Angiogenesis of Breast Cancers in Low Resource Sub-Saharan Countries

    Directory of Open Access Journals (Sweden)

    Frank Naku Ghartey

    2018-01-01

    Full Text Available Background. What cheaper alternative breast screening procedures are available to younger women in addition to clinical breast examination (CBE) in Sub-Saharan countries? In 2009, we first described BreastLight for screening and reported high sensitivity at detecting breast cancer. Due to limitations of BreastLight, we have since 2014 been using the more technologically advanced Breast-i to screen 2204 women to find cheaper screening alternatives. Methodology. First, the participant lies down for CBE and then, in a darkened room, Breast-i was placed underneath each breast and trained personnel confirm vein pattern and look out for dark spot(s) to ascertain the presence of suspicious angiogenic lesion(s). Results. CBE detected 153 palpable breast masses and Breast-i, which detects angiogenesis, confirmed 136. However, Breast-i detected 22 more cases, of which 7 had angiogenesis but were not palpable and 15 were missed by CBE due to large breast size. Overall confirmed cases were 26, with Breast-i detecting 7 cases missed by CBE. Breast-i and CBE gave sensitivities of 92.3% and 73%, respectively. Conclusion. Breast-i, with its high sensitivity to angiogenesis, reliability, and affordability, will be an effective adjunct detection device that can be used to increase early detection in younger women, thereby increasing treatment success.

  19. Test-retest reliability and agreement of the SPI-Questionnaire to detect symptoms of digital ischemia in elite volleyball players.

    Science.gov (United States)

    van de Pol, Daan; Zacharian, Tigran; Maas, Mario; Kuijer, P Paul F M

    2017-06-01

    The Shoulder posterior circumflex humeral artery Pathology and digital Ischemia - questionnaire (SPI-Q) has been developed to enable periodic surveillance of elite volleyball players, who are at risk for digital ischemia. Prior to implementation, assessing reliability is mandatory. Therefore, the test-retest reliability and agreement of the SPI-Q were evaluated among the population at risk. A questionnaire survey was performed with a 2-week interval among 65 elite male volleyball players assessing symptoms of cold, pale and blue digits in the dominant hand during or after practice or competition using a 4-point Likert scale (never, sometimes, often and always). Kappa (κ) and percentage of agreement (POA) were calculated for individual symptoms, and to distinguish symptomatic and asymptomatic players. For the individual symptoms, κ ranged from "poor" (0.25) to "good" (0.63), and POA ranged from "moderate" (78%) to "good" (97%). To classify symptomatic players, the SPI-Q showed "good" reliability (κ = 0.83; 95%CI 0.69-0.97) and "good" agreement (POA = 92%). The current study has proven the SPI-Q to be reliable for detecting elite male indoor volleyball players with symptoms of digital ischemia.
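
    Kappa (κ) and percentage of agreement (POA) as used above are standard agreement statistics; a minimal sketch of unweighted Cohen's kappa for the dichotomous symptomatic/asymptomatic classification, with made-up test-retest ratings rather than the study's data:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Unweighted Cohen's kappa: (po - pe) / (1 - pe), where po is the observed
    agreement (the POA) and pe is chance agreement from the marginal totals."""
    n = len(ratings_a)
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    pe = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (po - pe) / (1 - pe)

# Made-up test/retest classifications (1 = symptomatic, 0 = asymptomatic)
test_visit   = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
retest_visit = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(round(cohens_kappa(test_visit, retest_visit), 2))  # 0.74 (POA = 90%)
```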

  20. Current status and future advancements in analytical post-factum detection of food irradiation within the EU and in global trade

    International Nuclear Information System (INIS)

    Boegl, K.W.

    1999-01-01

    The paper summarizes and explains officially approved methods for reliable analytical detection of illegal or non-labelled treatment of foods with ionizing radiation, referring inter alia to a German study published in 1997. (orig./CB)

  1. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
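
    The fault-tree approach described above composes system reliability from component reliabilities using series blocks (every component must work) and parallel blocks (redundancy; the block fails only if all components fail); a minimal sketch with hypothetical component values, not figures from the report:

```python
from math import prod

def series(reliabilities):
    """Series structure: the system works only if every component works."""
    return prod(reliabilities)

def parallel(reliabilities):
    """Redundant structure: the system fails only if all components fail."""
    return 1.0 - prod(1.0 - r for r in reliabilities)

# Hypothetical converter: two redundant 0.95 switch legs feeding a 0.99 controller
print(round(series([parallel([0.95, 0.95]), 0.99]), 4))  # 0.9875
```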

  2. The reliability, minimal detectable change and concurrent validity of a gravity-based bubble inclinometer and iphone application for measuring standing lumbar lordosis.

    Science.gov (United States)

    Salamh, Paul A; Kolber, Morey

    2014-01-01

    To investigate the reliability, minimal detectable change (MDC90) and concurrent validity of a gravity-based bubble inclinometer (inclinometer) and iPhone® application for measuring standing lumbar lordosis. Two investigators used both an inclinometer and an iPhone® with an inclinometer application to measure lumbar lordosis of 30 asymptomatic participants. ICC models 3,k and 2,k were used for the intrarater and interrater analysis, respectively. Good interrater and intrarater reliability was present for the inclinometer with Intraclass Correlation Coefficients (ICC) of 0.90 and 0.85, respectively and the iPhone® application with ICC values of 0.96 and 0.81. The minimal detectable change (MDC90) indicates that a change greater than or equal to 7° and 6° is needed to exceed the threshold of error using the iPhone® and inclinometer, respectively. The concurrent validity between the two instruments was good with a Pearson product-moment coefficient of correlation (r) of 0.86 for both raters. Ninety-five percent limits of agreement identified differences ranging from 9° greater in regards to the iPhone® to 8° less regarding the inclinometer. Both the inclinometer and iPhone® application possess good interrater reliability, intrarater reliability and concurrent validity for measuring standing lumbar lordosis. This investigation provides preliminary evidence to suggest that smart phone applications may offer clinical utility comparable to inclinometry for quantifying standing lumbar lordosis. Clinicians should recognize potential individual differences when using these devices interchangeably.
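
    The MDC90 thresholds reported above follow from the standard error of measurement; the usual formulas are MDC90 = 1.65 x SEM x sqrt(2), with SEM = SD x sqrt(1 - ICC). A sketch with illustrative numbers, since the study's standard deviations are not given in the abstract:

```python
import math

def mdc90(sd, icc):
    """Minimal detectable change at 90% confidence:
    MDC90 = 1.65 * SEM * sqrt(2), where SEM = SD * sqrt(1 - ICC)."""
    sem = sd * math.sqrt(1.0 - icc)
    return 1.65 * sem * math.sqrt(2.0)

# Illustrative: ICC = 0.96 with a 10-degree SD gives an MDC90 near 4.7 degrees
print(round(mdc90(10.0, 0.96), 1))
```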

  3. The Validity and Reliability of the Mini-Mental State Examination-2 for Detecting Mild Cognitive Impairment and Alzheimer's Disease in a Korean Population.

    Directory of Open Access Journals (Sweden)

    Min Jae Baek

    Full Text Available To examine the validity and reliability of the MMSE-2 for assessing patients with mild cognitive impairment (MCI) and Alzheimer's disease (AD) in a Korean population. Specifically, the usefulness of the MMSE-2 as a screening measure for detecting early cognitive change, which has not been detectable through the MMSE, was examined. Two-hundred and twenty-six patients with MCI, 97 patients with AD, and 91 healthy older adults were recruited. All participants consented to examination with the MMSE-2, the MMSE, and other detailed neuropsychological assessments. The MMSE-2 performed well in discriminating participants across Clinical Dementia Rating (CDR) stages and CDR-Sum of Boxes (CDR-SOB), and it showed excellent internal consistency, high test-retest reliability, high interrater reliability, and good concurrent validity with the MMSE and other detailed neuropsychological assessments. The MMSE-2 was divided into two factors (tests that are sensitive to decline in cognitive functions vs. tests that are not sensitive to decline in cognitive functions) in normal cognitive aging. Moreover, the MMSE-2 was divided into two factors (tests related to overall cognitive functioning other than memory vs. tests related to episodic memory) in patients with AD. Finally, the MMSE-2 was divided into three factors (tests related to working memory and frontal lobe functioning vs. tests related to verbal memory vs. tests related to orientation and immediate recall) in patients with MCI. The sensitivity and specificity of the three versions of the MMSE-2 were relatively high in discriminating participants with normal cognitive aging from patients with MCI and AD. The MMSE-2 is a valid and reliable cognitive screening instrument for assessing cognitive impairment in a Korean population, but its ability to distinguish patients with MCI from those with normal cognitive aging may not be as highly sensitive as expected.

  4. The Validity and Reliability of the Mini-Mental State Examination-2 for Detecting Mild Cognitive Impairment and Alzheimer's Disease in a Korean Population.

    Science.gov (United States)

    Baek, Min Jae; Kim, Karyeong; Park, Young Ho; Kim, SangYun

    To examine the validity and reliability of the MMSE-2 for assessing patients with mild cognitive impairment (MCI) and Alzheimer's disease (AD) in a Korean population. Specifically, the usefulness of the MMSE-2 as a screening measure for detecting early cognitive change, which has not been detectable through the MMSE, was examined. Two-hundred and twenty-six patients with MCI, 97 patients with AD, and 91 healthy older adults were recruited. All participants consented to examination with the MMSE-2, the MMSE, and other detailed neuropsychological assessments. The MMSE-2 performed well in discriminating participants across Clinical Dementia Rating (CDR) stages and CDR-Sum of Boxes (CDR-SOB), and it showed excellent internal consistency, high test-retest reliability, high interrater reliability, and good concurrent validity with the MMSE and other detailed neuropsychological assessments. The MMSE-2 was divided into two factors (tests that are sensitive to decline in cognitive functions vs. tests that are not sensitive to decline in cognitive functions) in normal cognitive aging. Moreover, the MMSE-2 was divided into two factors (tests related to overall cognitive functioning other than memory vs. tests related to episodic memory) in patients with AD. Finally, the MMSE-2 was divided into three factors (tests related to working memory and frontal lobe functioning vs. tests related to verbal memory vs. tests related to orientation and immediate recall) in patients with MCI. The sensitivity and specificity of the three versions of the MMSE-2 were relatively high in discriminating participants with normal cognitive aging from patients with MCI and AD. The MMSE-2 is a valid and reliable cognitive screening instrument for assessing cognitive impairment in a Korean population, but its ability to distinguish patients with MCI from those with normal cognitive aging may not be as highly sensitive as expected.

  5. Chemical warfare agent detection: a review of current trends and future perspective.

    Science.gov (United States)

    Pacsial-Ong, Eden Joy; Aguilar, Zoraida P

    2013-01-01

    The World Health Organization recommends that countries create a public health system that can respond to the deliberate release of chemical warfare agents (CWAs). Procedures for preparedness, response, decontamination protocols and medical countermeasures against CWA attacks are described. Known CWAs, including their properties and pharmacological consequences upon exposure, are tabulated and discussed. Requirements imposed on detection systems by various applications and environmental needs are presented in order to assess the devices for detection and identification of specific CWAs. The review surveys current and near-term detection technologies and equipment, as well as devices that are currently available to the military and civilian first responders. Brief technical discussions of several detection technologies are presented, with emphasis placed on the principles of detection. Finally, enabling technologies that form the basis for advanced sensing systems and devices are described.

  6. Brain GABA Detection in vivo with the J-editing 1H MRS Technique: A Comprehensive Methodological Evaluation of Sensitivity Enhancement, Macromolecule Contamination and Test-Retest Reliability

    Science.gov (United States)

    Shungu, Dikoma C.; Mao, Xiangling; Gonzales, Robyn; Soones, Tacara N.; Dyke, Jonathan P.; van der Veen, Jan Willem; Kegeles, Lawrence S.

    2016-01-01

    Abnormalities in brain γ-aminobutyric acid (GABA) have been implicated in various neuropsychiatric and neurological disorders. However, in vivo GABA detection by proton magnetic resonance spectroscopy (1H MRS) presents significant challenges arising from low brain concentration, overlap by much stronger resonances, and contamination by mobile macromolecule (MM) signals. This study addresses these impediments to reliable brain GABA detection with the J-editing difference technique on a 3T MR system in healthy human subjects by (a) assessing the sensitivity gains attainable with an 8-channel phased-array head coil, (b) determining the magnitude and anatomic variation of the contamination of GABA by MM, and (c) estimating the test-retest reliability of measuring GABA with this method. Sensitivity gains and test-retest reliability were examined in the dorsolateral prefrontal cortex (DLPFC), while MM levels were compared across three cortical regions: the DLPFC, the medial prefrontal cortex (MPFC) and the occipital cortex (OCC). A 3-fold higher GABA detection sensitivity was attained with the 8-channel head coil compared to the standard single-channel head coil in the DLPFC. Despite significant anatomic variation in GABA+MM and MM across the three brain regions, the MM fraction of GABA+MM was relatively stable across the three voxels, ranging from 41% to 49%, a non-significant regional variation (p = 0.58). The test-retest reliability of GABA measurement, expressed either as ratios to voxel tissue water (W) or total creatine, was found to be very high for both the single-channel coil and the 8-channel phased-array coil. For the 8-channel coil, for example, Pearson's correlation coefficient of test vs. retest for GABA/W was 0.98 (R2 = 0.96, p = 0.0007), the percent coefficient of variation (CV) was 1.25%, and the intraclass correlation coefficient (ICC) was 0.98. Similar reliability was also found for the co-edited resonance of combined glutamate and glutamine (Glx) for both coils.

  7. Reliability considerations of NDT by probability of detection (POD). Determination using ultrasound phased array. Results from a project in frame of the German nuclear safety research program

    International Nuclear Information System (INIS)

    Kurz, Jochen H.; Dugan, Sandra; Juengert, Anne

    2013-01-01

    Reliable assessment procedures are an important aspect of maintenance concepts. Non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters. NDT methods are able to acquire information on all of these areas. However, it has to be considered that the level of detail of this information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts, e.g. the damage tolerance approach in aerospace, already include reliability criteria for the NDT methods applied in maintenance plans. NDT is also an essential part during construction and maintenance of nuclear power plants. In Germany, type and extent of inspection are specified in Safety Standards of the Nuclear Safety Standards Commission (KTA). Only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in the form of performance demonstrations of the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. The knowledge of the probability of detection (POD) curves of specific flaws under specific testing conditions is often not available. This paper shows the results of a research project designed for POD determination of ultrasound phased array inspections of real and artificial cracks. A further objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning is discussed, and results of the ultrasound inspections are presented.

  8. Test-Retest Reliability and Minimal Detectable Change of Randomized Dichotic Digits in Learning-Disabled Children: Implications for Dichotic Listening Training.

    Science.gov (United States)

    Mahdavi, Mohammad Ebrahim; Pourbakht, Akram; Parand, Akram; Jalaie, Shohreh

    2018-03-01

    Evaluation of dichotic listening to digits is a common part of many studies for diagnosing and managing auditory processing disorders in children. Previous researchers have verified the test-retest relative reliability of dichotic digits results in normal children and adults. However, detecting intervention-related changes in the ear scores after dichotic listening training requires information regarding the trial-to-trial typical variation of individual ear scores, which is estimated using indices of absolute reliability. Previous studies have not addressed the absolute reliability of dichotic listening results. To compare the results of the Persian randomized dichotic digits test (PRDDT) and its relative and absolute indices of reliability between typical achieving (TA) and learning-disabled (LD) children. A repeated measures observational study. Fifteen LD children with an age range of 7-12 yr were recruited from a previously performed study. The control group consisted of 15 TA schoolchildren with an age range of 8-11 yr. The PRDDT was administered to the children under a free recall condition in two test sessions 7-12 days apart. We compared the average of the ear scores and ear advantage between TA and LD children. Relative indices of reliability included Pearson's correlation and intraclass correlation (ICC(2,1)) coefficients, and absolute reliability was evaluated by calculation of the standard error of measurement (SEM) and minimal detectable change (MDC) using the raw ear scores. The Pearson correlation coefficient indicated that in both groups of children the ear scores of test and retest sessions were strongly and positively (greater than +0.8) correlated. The ear scores showed an excellent ICC coefficient of consistency (0.78-0.82) and fair to excellent ICC coefficient of absolute agreement (0.62-0.74) in TA children, and excellent ICC coefficients of consistency and absolute agreement in LD children (0.76-0.87).
SEM and SEM% of the ear scores in TA
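The SEM and MDC used in this record follow the standard distribution-based formulas SEM = SD·√(1 − ICC) and MDC = z·√2·SEM. As a minimal sketch (the ear scores and ICC value below are hypothetical illustrations, not data from the study):

```python
import math
import statistics

def sem_mdc(scores, icc, z=1.96):
    """Standard error of measurement and minimal detectable change.

    SEM = SD * sqrt(1 - ICC); MDC = z * sqrt(2) * SEM.
    `scores` are the raw test/retest scores; `icc` is the
    test-retest intraclass correlation coefficient.
    """
    sd = statistics.stdev(scores)          # sample standard deviation
    sem = sd * math.sqrt(1.0 - icc)
    mdc = z * math.sqrt(2.0) * sem         # 95% confidence by default
    return sem, mdc

# Hypothetical right-ear scores (percent correct) pooled over two sessions
scores = [88, 92, 85, 90, 94, 87, 91, 89, 93, 86]
sem, mdc = sem_mdc(scores, icc=0.85)       # SEM ≈ 1.17, MDC ≈ 3.25
```

A change in an individual ear score smaller than the MDC cannot be distinguished from trial-to-trial measurement noise, which is why absolute-reliability indices matter for evaluating dichotic listening training.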

  9. Current Status and Future Prospects for Aptamer-Based Mycotoxin Detection.

    Science.gov (United States)

    Ruscito, Annamaria; Smith, McKenzie; Goudreau, Daniel N; DeRosa, Maria C

    2016-07-01

    Aptamers are single-stranded oligonucleotides with the ability to bind tightly and selectively to a target analyte. High-affinity and specific aptamers for a variety of mycotoxins have been reported over the past decade. Increasingly, these molecular recognition elements are finding applications in biosensors and assays for the detection of mycotoxins in a variety of complex matrixes. This review article highlights the mycotoxin aptamers that are available for mycotoxin detection and the array of biosensing platforms into which they have been incorporated. Key advantages that aptamers have over analogous technology, and areas in which these advantages may be applied for the benefit of practical mycotoxin detection, are also discussed.

  10. Probabilistic reliability analyses to detect weak points in secondary-side residual heat removal systems of KWU PWR plants

    International Nuclear Information System (INIS)

    Schilling, R.

    1984-01-01

    Requirements made by Federal German licensing authorities called for the analysis of the secondary-side residual heat removal systems of new PWR plants with regard to availability, possible weak points and the balanced nature of the overall system for different incident sequences. Following a description of the generic concept and the process and safety-related systems for steam generator feed and main steam discharge, the reliability of the latter is analyzed for the small break LOCA and emergency power mode incidents, weak points in the process systems are identified, remedial measures of a system-specific and test-strategic nature are presented and their contribution to improving system availability is quantified. A comparison with the results of the German Risk Study on Nuclear Power Plants (GRS) shows a distinct reduction in core meltdown frequency. (orig.)

  11. Reliability analyses to detect weak points in secondary-side residual heat removal systems of KWU PWR plants

    International Nuclear Information System (INIS)

    Schilling, R.

    1983-01-01

    Requirements made by Federal German licensing authorities called for the analysis of the secondary-side residual heat removal systems of new PWR plants with regard to availability, possible weak points and the balanced nature of the overall system for different incident sequences. Following a description of the generic concept and the process and safety-related systems for steam generator feed and main steam discharge, the reliability of the latter is analyzed for the small break LOCA and emergency power mode incidents, weak points in the process systems are identified, remedial measures of a system-specific and test-strategic nature are presented, and their contribution to improving system availability is quantified. A comparison with the results of the German Risk Study on Nuclear Power Plants (GRS) shows a distinct reduction in core meltdown frequency. (orig.)

  12. Automatic Trip Detection with the Dutch Mobile Mobility Panel: Towards Reliable Multiple-Week Trip Registration for Large Samples

    NARCIS (Netherlands)

    Thomas, Tom; Geurs, Karst T.; Koolwaaij, Johan; Bijlsma, Marcel E.

    2018-01-01

    This paper examines the accuracy of trip and mode choice detection of the last wave of the Dutch Mobile Mobility Panel, a large-scale three-year, smartphone-based travel survey. Departure and arrival times, origins, destinations, modes, and travel purposes were recorded during a four-week period in

  13. Methods and Reliability of Radiographic Vertebral Fracture Detection in Older Men: The Osteoporotic Fractures in Men Study

    Science.gov (United States)

    Cawthon, Peggy M.; Haslam, Jane; Fullman, Robin; Peters, Katherine W.; Black, Dennis; Ensrud, Kristine E.; Cummings, Steven R.; Orwoll, Eric S.; Barrett-Connor, Elizabeth; Marshall, Lynn; Steiger, Peter; Schousboe, John T.

    2014-01-01

    We describe the methods and reliability of radiographic vertebral fracture assessment in MrOS, a cohort of community-dwelling men aged ≥65 yrs. Lateral spine radiographs were obtained at Visit 1 (2000-2) and 4.6 years later (Visit 2). Using a workflow tool (SpineAnalyzer™, Optasia Medical), a physician reader completed semi-quantitative (SQ) scoring. Prior to SQ scoring, technicians performed “triage” to reduce physician reader workload, whereby clearly normal spine images were eliminated from SQ scoring with all levels assumed to be SQ=0 (no fracture, “triage negative”); spine images with any possible fracture or abnormality were passed to the physician reader as “triage positive” images. Using a quality assurance sample of images (n=20 participants; 8 with baseline only and 12 with baseline and follow-up images) read multiple times, we calculated intra-reader kappa statistics and percent agreement for SQ scores. A subset of 494 participants' images were read regardless of triage classification to calculate the specificity and sensitivity of triage. Technically adequate images were available for 5958 of 5994 participants at Visit 1, and 4399 of 4423 participants at Visit 2. Triage identified 3215 (53.9%) participants with radiographs that required further evaluation by the physician reader. For prevalent fractures at Visit 1 (SQ≥1), intra-reader kappa statistics ranged from 0.79-0.92; percent agreement ranged from 96.9%-98.9%; sensitivity of the triage was 96.8% and specificity of triage was 46.3%. In conclusion, SQ scoring had excellent intra-rater reliability in our study. The triage process reduces expert reader workload without hindering the ability to identify vertebral fractures. PMID:25003811

  14. Interrater and Test-Retest Reliability and Minimal Detectable Change of the Balance Evaluation Systems Test (BESTest) and Subsystems With Community-Dwelling Older Adults.

    Science.gov (United States)

    Wang-Hsu, Elizabeth; Smith, Susan S

    2017-01-10

    Falls are a common cause of injuries and hospital admissions in older adults. Balance limitation is a potentially modifiable factor contributing to falls. The Balance Evaluation Systems Test (BESTest), a clinical balance measure, categorizes balance into 6 underlying subsystems. Each of the subsystems is scored individually and summed to obtain a total score. The reliability of the BESTest and its individual subsystems has been reported in patients with various neurological disorders and cancer survivors. However, the reliability and minimal detectable change (MDC) of the BESTest with community-dwelling older adults have not been reported. The purposes of our study were to (1) determine the interrater and test-retest reliability of the BESTest total and subsystem scores; and (2) estimate the MDC of the BESTest and its individual subsystem scores with community-dwelling older adults. We used a prospective cohort methodological design. Community-dwelling older adults (N = 70; aged 70-94 years; mean = 85.0 [5.5] years) were recruited from a senior independent living community. Trained testers (N = 3) administered the BESTest. All participants were tested with the BESTest by the same tester initially and then retested 7 to 14 days later. With 32 of the participants, a second tester concurrently scored the retest for interrater reliability. Testers were blinded to each other's scores. Intraclass correlation coefficients [ICC(2,1)] were used to determine the interrater and test-retest reliability. Test-retest reliability was also analyzed using method error and the associated coefficients of variation (CVME). MDC was calculated using standard error of measurement. Interrater reliability (N = 32) of the BESTest total score was ICC(2, 1) = 0.97 (95% confidence interval [CI], 0.94-0.99). The ICCs for the individual subsystem scores ranged from 0.85 to 0.94. Test-retest reliability (N = 70) of the BESTest total score was ICC(2,1) = 0.93 (95% CI, 0.89-0.96). ICCs for the
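The ICC(2,1) model used in this record (two-way random effects, absolute agreement, single measurement) can be computed from a subjects-by-raters score matrix via the Shrout-Fleiss mean squares. A self-contained sketch; the example matrix is the classic Shrout-Fleiss illustration, not BESTest data:

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater
    (Shrout & Fleiss). `ratings` has one row per subject, one column per
    rater (or session)."""
    n = len(ratings)          # subjects
    k = len(ratings[0])       # raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    # Two-way ANOVA sums of squares
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)               # between-subjects mean square
    msc = ss_cols / (k - 1)               # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))    # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Classic Shrout & Fleiss example: 6 subjects rated by 4 judges
ratings = [[9, 2, 5, 8], [6, 1, 3, 2], [8, 4, 6, 8],
           [7, 1, 2, 6], [10, 5, 6, 9], [6, 2, 4, 7]]
icc = icc_2_1(ratings)   # ≈ 0.29
```

Because ICC(2,1) penalizes systematic rater differences through the MSC term, it is the appropriate choice when, as here, absolute agreement between testers matters, not just rank-order consistency.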

  15. Computer simulation of charged fusion-product trajectories and detection efficiency expected for future experiments within the COMPASS tokamak

    International Nuclear Information System (INIS)

    Kwiatkowski, Roch; Malinowski, Karol; Sadowski, Marek J

    2014-01-01

    This paper presents results of computer simulations of charged particle motions and detection efficiencies for an ion-pinhole camera of a new diagnostic system to be used in future COMPASS tokamak experiments. A probe equipped with a nuclear track detector can deliver information about charged products of fusion reactions. The calculations were performed with the so-called Gourdon code, based on a single-particle model and toroidal symmetry. Trajectories of fast ions (> 500 keV) in medium-dense plasma (n_e < 10^14 cm^-3) were computed, along with the expected detection efficiency (the ratio of the number of detected particles to the number of particles emitted from the plasma). The simulations showed that charged fusion products can reach the new diagnostic probe, and the expected detection efficiency can reach 2 × 10^-8. Based on such calculations, one can determine the optimal position and orientation of the probe. The obtained results are of importance for the interpretation of fusion-product images to be recorded in future COMPASS experiments. (paper)

  16. Detecting Market Transitions and Energy Futures Risk Management Using Principal Components

    NARCIS (Netherlands)

    Borovkova, S.A.

    2006-01-01

    An empirical approach to analysing the forward curve dynamics of energy futures is presented. For non-seasonal commodities-such as crude oil-the forward curve is well described by the first three principal components: the level, slope and curvature. A principal component indicator is described that
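The level/slope/curvature decomposition mentioned in this record comes from principal component analysis of forward-curve returns. A minimal sketch of extracting the first component (the "level" factor) via power iteration on the sample covariance matrix; the synthetic return series is purely illustrative:

```python
def first_principal_component(data, iters=500):
    """Leading eigenvector of the sample covariance of `data`
    (rows = days, columns = futures maturities), via power iteration."""
    n, m = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(m)]
    centered = [[row[j] - means[j] for j in range(m)] for row in data]
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(m)] for a in range(m)]
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(m)) for a in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]    # normalize each iteration
    return v

# Synthetic daily returns of 4 maturities: a dominant parallel "level"
# shock plus a small maturity-dependent "slope" component.
levels = [0.8, -0.5, 1.2, -1.0, 0.3, -0.7, 0.9, -0.4]
slopes = [0.1, -0.05, 0.02, 0.08, -0.1, 0.03, -0.06, 0.04]
returns = [[l + s * (j - 1.5) for j in range(4)] for l, s in zip(levels, slopes)]
pc1 = first_principal_component(returns)
# pc1 loads roughly equally on all maturities: the "level" factor.
```

With real curves, the second and third eigenvectors (obtained after deflating the first) typically show the monotone "slope" and hump-shaped "curvature" patterns the abstract describes.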

  17. Reliability, validity, and minimal detectable change of the push-off test scores in assessing upper extremity weight-bearing ability.

    Science.gov (United States)

    Mehta, Saurabh P; George, Hannah R; Goering, Christian A; Shafer, Danielle R; Koester, Alan; Novotny, Steven

    2017-11-01

    Clinical measurement study. The push-off test (POT) was recently conceived and found to be reliable and valid for assessing weight bearing through the injured wrist or elbow. However, further research with a larger sample can lend credence to the preliminary findings supporting the use of the POT. This study examined the interrater reliability, construct validity, and measurement error of the POT in patients with wrist conditions. Participants with musculoskeletal (MSK) wrist conditions were recruited. Performance on the POT, grip strength, and isometric strength of the wrist extensors were assessed. The shortened version of the Disabilities of the Arm, Shoulder and Hand questionnaire and the numeric pain rating scale were completed. The intraclass correlation coefficient assessed the interrater reliability of the POT. Pearson correlation coefficients (r) examined the concurrent relationships between the POT and other measures. The standard error of measurement and the minimal detectable change at the 90% confidence interval were assessed as the measurement error and index of true change for the POT. A total of 50 participants with different elbow or wrist conditions (age: 48.1 ± 16.6 years) were included in this study. The results of this study strongly supported the interrater reliability (intraclass correlation coefficient: 0.96 and 0.93 for the affected and unaffected sides, respectively) of the POT in patients with wrist MSK conditions. The POT showed convergent relationships with grip strength on the injured side (r = 0.89) and wrist extensor strength (r = 0.7). The POT showed a small standard error of measurement (1.9 kg). The minimal detectable change at the 90% confidence interval for the POT was 4.4 kg for the sample. This study provides additional evidence to support the reliability and validity of the POT. This is the first study that provides values for the measurement error and true change on POT scores in patients with wrist MSK conditions. Further research should examine the

  18. How reliable are the {sup 14}C-urea breath test and specific serology for the detection of gastric campylobacter

    Energy Technology Data Exchange (ETDEWEB)

    Husebye, E; O'Leary, D; Skar, V; Melby, K [Ullevaal Sykehus, Oslo (Norway)

    1990-01-01

    Detection of gastric campylobacter by the {sup 14}C-urea breath test and serology were correlated to biopsy culture in 25 unselected outpatients referred for gastroscopy. All the 17 culture-positive patients had positive {sup 14}C-urea breath test, and 16 had positive serology. Of eight culture-negative patients, six patients had negative breath test and seven negative serology. A high degree of reproducibility was found when two subsequent breath tests were performed in 11 healthy volunteers. The breath test values obtained at 10 min showed a strong correlation to the accumulated values within 30 min. Breath sampling once, 10 min after intake of 2.5 {mu}Ci {sup 14}C-urea, seems sufficient for the detection of gastric campylobacter. The {sup 14}C-urea breath test correlates well with biopsy culture and provides a sensitive tool for the detection of gastric campylobacter. Serology also corresponds well with biopsy culture and should provide a useful tool for epidemiologic studies. 22 refs., 4 figs., 1 tab.

  19. Adaptation of the ToxRTool to Assess the Reliability of Toxicology Studies Conducted with Genetically Modified Crops and Implications for Future Safety Testing.

    Science.gov (United States)

    Koch, Michael S; DeSesso, John M; Williams, Amy Lavin; Michalek, Suzanne; Hammond, Bruce

    2016-01-01

    To determine the reliability of food safety studies carried out in rodents with genetically modified (GM) crops, a Food Safety Study Reliability Tool (FSSRTool) was adapted from the European Centre for the Validation of Alternative Methods' (ECVAM) ToxRTool. Reliability was defined as the inherent quality of the study with regard to use of standardized testing methodology, full documentation of experimental procedures and results, and the plausibility of the findings. Codex guidelines for GM crop safety evaluations indicate toxicology studies are not needed when comparability of the GM crop to its conventional counterpart has been demonstrated. This guidance notwithstanding, animal feeding studies have routinely been conducted with GM crops, but their conclusions on safety are not always consistent. To accurately evaluate potential risks from GM crops, risk assessors need clearly interpretable results from reliable studies. The development of the FSSRTool, which provides the user with a means of assessing the reliability of a toxicology study to inform risk assessment, is discussed. Its application to the body of literature on GM crop food safety studies demonstrates that reliable studies report no toxicologically relevant differences between rodents fed GM crops or their non-GM comparators.

  20. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination or reproducibility of results is very poor. On the other hand, a defect can be detected on each subsequent examination, giving higher reliability, and still have poor reproducibility of results.

  1. Validity and reliability of methods for the detection of secondary caries around amalgam restorations in primary teeth

    Directory of Open Access Journals (Sweden)

    Mariana Minatel Braga

    2010-03-01

    Full Text Available Secondary caries has been reported as the main reason for restoration replacement. The aim of this in vitro study was to evaluate the performance of different methods - visual inspection, laser fluorescence (DIAGNOdent), radiography and tactile examination - for secondary caries detection in primary molars restored with amalgam. Fifty-four primary molars were photographed and 73 suspect sites adjacent to amalgam restorations were selected. Two examiners evaluated these sites independently using all methods. Agreement between examiners was assessed by the Kappa test. To validate the methods, a caries-detector dye was used after restoration removal. The best cut-off points for the sample were found by a Receiver Operator Characteristic (ROC) analysis, and the area under the ROC curve (Az) and the sensitivity, specificity and accuracy of the methods were calculated for enamel (D2) and dentine (D3) thresholds. These parameters were found for each method and then compared by the McNemar test. The tactile examination and visual inspection presented the highest inter-examiner agreement for the D2 and D3 thresholds, respectively. The visual inspection also showed better performance than the other methods for both thresholds (Az = 0.861 and Az = 0.841, respectively). In conclusion, the visual inspection presented the best performance for detecting enamel and dentin secondary caries in primary teeth restored with amalgam.
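The performance metrics in this record (sensitivity, specificity, accuracy at a cut-off, and the area Az under the ROC curve) follow standard definitions. A minimal sketch with hypothetical detector readings and dye-validated ground truth (not data from the study):

```python
def diagnostic_performance(scores, truth, cutoff):
    """Sensitivity, specificity and accuracy at a given cut-off.
    `scores` are detector readings; `truth` marks dye-validated caries."""
    tp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and t)
    fn = sum(1 for s, t in zip(scores, truth) if s < cutoff and t)
    tn = sum(1 for s, t in zip(scores, truth) if s < cutoff and not t)
    fp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and not t)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(truth)
    return sensitivity, specificity, accuracy

def auc(scores, truth):
    """Az via the Mann-Whitney equivalence: the probability that a random
    carious site scores higher than a random sound site (ties count 0.5)."""
    pos = [s for s, t in zip(scores, truth) if t]
    neg = [s for s, t in zip(scores, truth) if not t]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical readings at 8 sites, with dye validation as ground truth
scores = [5, 7, 9, 3, 8, 2, 6, 4]
truth = [True, True, True, False, True, False, False, False]
sens, spec, acc = diagnostic_performance(scores, truth, cutoff=5)
az = auc(scores, truth)
```

Sweeping the cut-off and picking the point that best balances sensitivity and specificity is exactly the ROC-based cut-off selection the abstract describes.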

  2. Method to improve reliability of a fuel cell system using low performance cell detection at low power operation

    Science.gov (United States)

    Choi, Tayoung; Ganapathy, Sriram; Jung, Jaehak; Savage, David R.; Lakshmanan, Balasubramanian; Vecasey, Pamela M.

    2013-04-16

    A system and method for detecting a low performing cell in a fuel cell stack using measured cell voltages. The method includes determining that the fuel cell stack is running, the stack coolant temperature is above a certain temperature and the stack current density is within a relatively low power range. The method further includes calculating the average cell voltage, and determining whether the difference between the average cell voltage and the minimum cell voltage is greater than a predetermined threshold. If the difference between the average cell voltage and the minimum cell voltage is greater than the predetermined threshold and the minimum cell voltage is less than another predetermined threshold, then the method increments a low performing cell timer. A ratio of the low performing cell timer and a system run timer is calculated to identify a low performing cell.
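The monitoring loop described in this record can be sketched as a single update step: at low power, compare the minimum cell voltage against the stack average and a floor threshold, accumulate a low-performing-cell timer, and track its ratio to total run time. All threshold values below are illustrative assumptions, not values from the patent:

```python
def update_low_cell_timer(cell_voltages, coolant_temp, current_density,
                          low_cell_timer, run_timer, dt,
                          min_temp=40.0, low_power=(0.05, 0.2),
                          delta_threshold=0.1, min_voltage_threshold=0.55):
    """One monitoring step of the low-performing-cell check.

    Thresholds (temperature in deg C, current density in A/cm^2, voltages
    in V) are illustrative placeholders. Returns the updated timers and
    the low-cell-to-run-time ratio used to flag a low-performing cell.
    """
    run_timer += dt
    # Only evaluate when the stack is warm and in the low-power range
    if coolant_temp > min_temp and low_power[0] <= current_density <= low_power[1]:
        avg_v = sum(cell_voltages) / len(cell_voltages)
        min_v = min(cell_voltages)
        # Flag when the weakest cell lags the average AND is absolutely low
        if (avg_v - min_v) > delta_threshold and min_v < min_voltage_threshold:
            low_cell_timer += dt
    ratio = low_cell_timer / run_timer
    return low_cell_timer, run_timer, ratio

# One step with a weak cell: 19 healthy cells at 0.70 V, one at 0.50 V
timer, run, ratio = update_low_cell_timer(
    [0.70] * 19 + [0.50], coolant_temp=60.0, current_density=0.1,
    low_cell_timer=0.0, run_timer=0.0, dt=1.0)
```

Requiring both conditions (relative gap and absolute floor) avoids false alarms from uniformly low stack voltage, while the timer ratio filters out transient dips.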

  3. In Situ Biological Contamination Studies of the Moon: Implications for Future Planetary Protection and Life Detection Missions

    Science.gov (United States)

    Glavin, Daniel P.; Dworkin, Jason P.; Lupisella, Mark; Kminek, Gerhard; Rummel, John D.

    2010-01-01

    NASA and ESA have outlined visions for solar system exploration that will include a series of lunar robotic precursor missions to prepare for, and support a human return to the Moon, and future human exploration of Mars and other destinations. One of the guiding principles for exploration is to pursue compelling scientific questions about the origin and evolution of life. The search for life on objects such as Mars will require that all spacecraft and instrumentation be sufficiently cleaned and sterilized prior to launch to ensure that the scientific integrity of extraterrestrial samples is not jeopardized by terrestrial organic contamination. Under the Committee on Space Research's (COSPAR's) current planetary protection policy for the Moon, no sterilization procedures are required for outbound lunar spacecraft, nor is there yet a planetary protection category for human missions. Future in situ investigations of a variety of locations on the Moon by highly sensitive instruments designed to search for biologically derived organic compounds would help assess the contamination of the Moon by lunar spacecraft. These studies could also provide valuable "ground truth" data for Mars sample return missions and help define planetary protection requirements for future Mars bound spacecraft carrying life detection experiments. In addition, studies of the impact of terrestrial contamination of the lunar surface by the Apollo astronauts could provide valuable data to help refine future Mars surface exploration plans for a human mission to Mars.

  4. Children's success at detecting circular explanations and their interest in future learning.

    Science.gov (United States)

    Mills, Candice M; Danovitch, Judith H; Rowles, Sydney P; Campbell, Ian L

    2017-10-01

    These studies explore elementary-school-aged children's ability to evaluate circular explanations and whether they respond to receiving weak explanations by expressing interest in additional learning. In the first study, 6-, 8-, and 10-year-olds (n = 53) heard why questions about unfamiliar animals. For each question, they rated the quality of single explanations and later selected the best explanation between pairs of circular and noncircular explanations. When judging single explanations, 8- and 10-year-olds, and to some extent 6-year-olds, provided higher ratings for noncircular explanations compared to circular ones. When selecting between pairs of explanations, all age groups preferred noncircular explanations to circular ones, but older children did so more consistently than 6-year-olds. Children who recognized the weakness of the single circular explanations were more interested in receiving additional information about the question topics. In Study 2, all three age groups (n = 87) provided higher ratings for noncircular explanations compared to circular ones when listening to responses to how questions, but older children showed a greater distinction in their ratings than 6-year-olds. Moreover, the link between recognizing circular explanations as weak and interest in future learning could not be accounted for solely by individual differences in verbal intelligence. These findings illustrate the developmental trajectory of explanation evaluation and support that recognition of weak explanations is linked to interest in future learning across the elementary years. Implications for education are discussed.

  5. Reliability of cortical lesion detection on double inversion recovery MRI applying the MAGNIMS-Criteria in multiple sclerosis patients within a 16-months period.

    Directory of Open Access Journals (Sweden)

    Tobias Djamsched Faizy

    Full Text Available In patients with multiple sclerosis (MS), Double Inversion Recovery (DIR) magnetic resonance imaging (MRI) can be used to identify cortical lesions (CL). We sought to evaluate the reliability of CL detection on DIR longitudinally at multiple subsequent time-points, applying the MAGNIMS scoring criteria for CLs. 26 MS patients received a 3T-MRI (Siemens, Skyra) with DIR at 12 time-points (TP) within a 16-month period. Scans were assessed in random order by two different raters. Both raters separately marked all CLs on each scan, and total lesion numbers were obtained for each scan-TP and patient. After a retrospective re-evaluation, the number of consensus CLs (conL) was defined as the total number of CLs which both raters finally agreed on. CL volumes, relative signal intensities and CL localizations were determined. Both ratings (conL vs. non-consensus scoring) were compared for further analysis. A total number of n = 334 CLs was identified by both raters in 26 MS patients, with a first agreement of both raters on 160 out of 334 of the CLs found (κ = 0.48). After the retrospective re-evaluation, consensus agreement increased to 233 out of 334 CLs (κ = 0.69). 93.8% of conL were visible in at least 2 consecutive TP, and 74.7% of the conL were visible in all 12 consecutive TP. ConL had greater mean lesion volumes and higher mean signal intensities compared to lesions that were only detected by one of the raters (p<0.05). A higher number of CLs in the frontal, parietal, temporal and occipital lobes was identified by both raters than the number of those only identified by one of the raters (p<0.05). After a first assessment, slightly less than half of the CLs were considered reliably detectable on longitudinal DIR images. A retrospective re-evaluation notably increased the consensus agreement; however, this finding is qualified by the fact that retrospective evaluation steps might not be practicable in clinical routine. Lesions that were not reliably
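The inter-rater agreement statistic reported in this study, Cohen's κ, weighs observed agreement against the agreement expected by chance from each rater's marginal totals. A minimal sketch of the computation; the 2x2 table below uses purely illustrative counts, not the study's raw data:

```python
def cohen_kappa(table):
    """Cohen's kappa for a 2x2 agreement table.

    table[i][j] = number of candidate lesions that rater A scored i
    and rater B scored j (0 = not a cortical lesion, 1 = cortical lesion).
    """
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement from the two raters' marginal totals
    p_exp = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (p_obs - p_exp) / (1 - p_exp)

# Illustrative counts only: 160 lesions marked by both raters,
# 174 marked by exactly one rater, 400 candidate sites rejected by both.
print(round(cohen_kappa([[400, 90], [84, 160]]), 2))  # -> 0.47
```

With these invented counts the raw agreement is 76%, yet κ is only 0.47, which is why chance-corrected agreement is the appropriate metric here.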

  6. Detecting peptidic drugs, drug candidates and analogs in sports doping: current status and future directions.

    Science.gov (United States)

    Thevis, Mario; Thomas, Andreas; Schänzer, Wilhelm

    2014-12-01

    With the growing availability of mature systems and strategies in biotechnology and the continuously expanding knowledge of cellular processes and involved biomolecules, human sports drug testing has become a considerably complex field in the arena of analytical chemistry. Proving the exogenous origin of peptidic drugs and respective analogs at lowest concentration levels in biological specimens (commonly blood, serum and urine) of rather limited volume is required to pursue an action against cheating athletes. Therefore, approaches employing chromatographic-mass spectrometric, electrophoretic, immunological and combined test methods have been required and developed. These allow detecting the misuse of peptidic compounds of lower (such as growth hormone-releasing peptides, ARA-290, TB-500, AOD-9604, CJC-1295, desmopressin, luteinizing hormone-releasing hormones, synacthen, etc.), intermediate (e.g., insulins, IGF-1 and analogs, 'full-length' mechano growth factor, growth hormone, chorionic gonadotropin, erythropoietin, etc.) and higher (e.g., stamulumab) molecular mass with desired specificity and sensitivity. A gap between the technically possible detection and the day-to-day analytical practice, however, still needs to be closed.

  7. Reliability of sickness certificates in detecting potential sick leave reduction by modifying working conditions: a clinical epidemiology study

    Directory of Open Access Journals (Sweden)

    Johnsen Roar

    2004-03-01

    Full Text Available Abstract Background: Medical sickness certificates are generally the main source of information when scrutinizing the need for aimed intervention strategies to avoid or reduce the individual and community side effects of sick leave. This study explored the value of medical sickness certificates related to daily work in Norwegian National Insurance Offices to identify sick-listed persons where modified working conditions might reduce the ongoing sick leave. Methods: The potential for reducing the ongoing sick leave by modifying working conditions was individually assessed on routine sickness certificates in 999 consecutive sick leave episodes by four Norwegian National Insurance collaborators, two with and two without formal medical competence. The study took place in Northern Norway in 1997 and 1998. Agreement was analysed with differences against mean, kappa, and proportional-agreement analysis within and between groups of assessors. Agreements between the assessors and the self-assessment of sick-listed subjects were additionally analysed in 159 sick-leave episodes. Results: Both sick-listed subjects and National Insurance collaborators anticipated a potential reduction in sick leave in 20–30% of cases, and in another 20% the potential was assessed as possible. The chance-corrected agreements, however, were poor (κ). Conclusion: Information in medical sickness certificates proved ineffective in detecting cases where modified working conditions may reduce sick leave, and focusing on medical certificates may prevent identification of needed interventions. Strategies based on communicating directly with sick-listed subjects would enable social authorities to exploit more of the sick leave reduction potential through modified working conditions than strategies based on improving medical information.

  8. Fiber-based optic sensor for detecting human blood clot: present and future revival

    Science.gov (United States)

    Elshikeri, Nada; Bakhtiar, Hazri

    2018-05-01

    To help protect human life from the threat posed by blood clots, we propose a mobile fiber-based optical sensor (f-s) for detecting clots within blood vessels (intra-arterial/venous). Blood vessels are the part of the circulatory system that transports blood throughout the human body; protecting them is therefore the focus of this work. MRI (magnetic resonance imaging), X-rays and other medical instruments are immobile diagnostic techniques with long intervals between measurements. The core purpose of the fiber-based optical sensor is to detect a clot in the bloodstream by providing prompt, mobile diagnosis. The detector (f-s) was etched with ~10% diluted sulphuric acid in a defined zone to sensitize it. In vitro monitoring is most effective when the sensor is coupled to a Raman spectroscopy (RS) setup. RS quantifies the relative intensities of fibrinogen bonds; fibrinogen is the principal coagulation element of blood plasma. The investigation focuses on blood coagulation parameters such as total haemoglobin (tHb), clotting reaction time (t), clot progression time (t2), maximum clot amplitude (ma) and mean refractive index (r). A blood sample is drawn from the patient and centrifuged to separate the plasma, which is then immediately introduced into the (f-s) packet connected to the RS setup. From the quantitative analysis of sample concentration, RS determines the presence of coagulation in terms of intensity, informing subsequent medical treatment. The proposed approach thus provides an instrument for promptly investigating blood coagulation within arteries and veins.

  9. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  10. In vivo laser-based imaging of the human fallopian tube for future cancer detection

    Science.gov (United States)

    Seibel, Eric J.; Melville, C. David; Johnston, Richard S.; Gong, Yuanzheng; Agnew, Kathy; Chiang, Seine; Swisher, Elizabeth M.

    2015-03-01

    Inherited mutations in BRCA1 and BRCA2 lead to a 20-50% lifetime risk of ovarian, tubal, or peritoneal carcinoma. Clinical recommendations for women with these genetic mutations include the prophylactic removal of ovaries and fallopian tubes by age 40, after child-bearing. Recent findings suggest that many presumed ovarian or peritoneal carcinomas arise in fallopian tube epithelium, yet screening techniques have mistakenly focused on the ovary as the origin of ovarian carcinoma. Unlike ovaries, the fallopian tubes are amenable to direct visual imaging without invasive surgery, using access through the cervix. To develop future screening protocols, we investigated using our 1.2-mm diameter, forward-viewing, scanning fiber endoscope (SFE) to image luminal surfaces of the fallopian tube before laparoscopic surgical removal. Three anesthetized human subjects participated in our protocol development, which eventually led to 70-80% of the length of the fallopian tubes being imaged in scanning reflectance, using red (632 nm), green (532 nm), and blue (442 nm) laser light. A hysteroscope with saline uterine distention was used to locate the tubal ostia. To facilitate passage of the SFE through the interstitial portion of the fallopian tube, an introducer catheter was inserted 1 cm through each ostium. During insertion, saline was flushed to reduce friction and provide clearer viewing. This is likely the first high-resolution intraluminal visualization of fallopian tubes.

  11. Development and evaluation of automated systems for detection and classification of banded chromosomes: current status and future perspectives

    International Nuclear Information System (INIS)

    Wang Xingwei; Zheng Bin; Wood, Marc; Li Shibo; Chen Wei; Liu Hong

    2005-01-01

    Automated detection and classification of banded chromosomes may help clinicians diagnose cancers and other genetic disorders at an early stage more efficiently and accurately. However, developing such an automated system (including both a high-speed microscopic image scanning device and related computer-assisted schemes) is quite a challenging and difficult task. Since the 1980s, great research efforts have been made to develop fast and more reliable methods to assist clinical technicians in performing this important and time-consuming task. A number of computer-assisted methods including classical statistical methods, artificial neural networks and knowledge-based fuzzy logic systems, have been applied and tested. Based on the initial test using limited datasets, encouraging results in algorithm and system development have been demonstrated. Despite the significant research effort and progress made over the last two decades, computer-assisted chromosome detection and classification systems have not been routinely accepted and used in clinical laboratories. Further research and development is needed.

  12. Development and evaluation of automated systems for detection and classification of banded chromosomes: current status and future perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Wang Xingwei [Center for Bioengineering and School of Electrical and Computer Engineering, University of Oklahoma, OK (United States); Zheng Bin [Department of Radiology, University of Pittsburgh Medical Center, Pittsburgh, PA (United States); Wood, Marc [Center for Bioengineering and School of Electrical and Computer Engineering, University of Oklahoma, OK (United States); Li Shibo [Department of Pediatrics, University of Oklahoma Medical Center, Oklahoma City, OK (United States); Chen Wei [Department of Physics and Engineering, University of Central Oklahoma, Edmond, OK (United States); Liu Hong [Center for Bioengineering and School of Electrical and Computer Engineering, University of Oklahoma, OK (United States)

    2005-08-07

    Automated detection and classification of banded chromosomes may help clinicians diagnose cancers and other genetic disorders at an early stage more efficiently and accurately. However, developing such an automated system (including both a high-speed microscopic image scanning device and related computer-assisted schemes) is quite a challenging and difficult task. Since the 1980s, great research efforts have been made to develop fast and more reliable methods to assist clinical technicians in performing this important and time-consuming task. A number of computer-assisted methods including classical statistical methods, artificial neural networks and knowledge-based fuzzy logic systems, have been applied and tested. Based on the initial test using limited datasets, encouraging results in algorithm and system development have been demonstrated. Despite the significant research effort and progress made over the last two decades, computer-assisted chromosome detection and classification systems have not been routinely accepted and used in clinical laboratories. Further research and development is needed.

  13. Complete validation of a unique digestion assay to detect Trichinella larvae in horse meat demonstrates the reliability of this assay for meeting food safety and trade requirements.

    Science.gov (United States)

    Forbes, L B; Hill, D E; Parker, S; Tessaro, S V; Gamble, H R; Gajadhar, A A

    2008-03-01

    A tissue digestion assay using a double separatory funnel procedure for the detection of Trichinella larvae in horse meat was validated for application in food safety programs and trade. The assay consisted of a pepsin-HCl digestion step to release larvae from muscle tissue and two sequential sedimentation steps in separatory funnels to recover and concentrate larvae for detection with a stereomicroscope. With defined critical control points, the assay was conducted within a quality assurance system compliant with International Organization for Standardization-International Electrotechnical Commission (ISO/IEC) 17025 guidelines. Samples used in the validation were obtained from horses experimentally infected with Trichinella spiralis to obtain a range of muscle larvae densities. One-, 5-, and 10-g samples of infected tissue were combined with 99, 95, and 90 g, respectively, of known negative horse tissue to create a 100-g sample for testing. Samples of 5 and 10 g were more likely to be positive than were 1-g samples when larval densities were less than three larvae per gram (lpg). This difference is important because ingested meat with 1 lpg is considered the threshold for clinical disease in humans. Using a 5-g sample size, all samples containing 1.3 to 2 lpg were detected, and 60 to 100% of samples with infected horse meat containing 0.1 to 0.7 lpg were detected. In this study, the double separatory funnel digestion assay was efficient and reliable for its intended use in food safety and trade. This procedure is the only digestion assay for Trichinella in horse meat that has been validated as consistent and effective at critical levels of sensitivity.
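The advantage of 5- and 10-g samples at low larval densities follows from basic sampling statistics: if larvae were randomly (Poisson) distributed in tissue, the probability that a sample of m grams contains at least one larva at a density of λ larvae per gram is 1 - exp(-λ*m). A sketch of this idealized model, which is an assumption for illustration rather than the assay's empirical validation data:

```python
import math

def p_detect(density_lpg, sample_grams):
    """P(sample contains at least one larva), idealized Poisson model."""
    return 1.0 - math.exp(-density_lpg * sample_grams)

# At 0.5 larvae per gram, larger samples are far more likely to test positive
for grams in (1, 5, 10):
    print(f"{grams:2d} g sample: {p_detect(0.5, grams):.2f}")
# -> 1 g: 0.39, 5 g: 0.92, 10 g: 0.99
```

This mirrors the reported finding that 1-g samples miss infections below about 3 lpg while 5-g samples remain sensitive near the clinically relevant 1-lpg threshold.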

  14. AST Critical Propulsion and Noise Reduction Technologies for Future Commercial Subsonic Engines Area of Interest 1.0: Reliable and Affordable Control Systems

    Science.gov (United States)

    Myers, William; Winter, Steve

    2006-01-01

    The General Electric Reliable and Affordable Controls effort under the NASA Advanced Subsonic Technology (AST) Program has designed, fabricated, and tested advanced controls hardware and software to reduce emissions and improve engine safety and reliability. The original effort consisted of four elements: 1) a Hydraulic Multiplexer; 2) Active Combustor Control; 3) a Variable Displacement Vane Pump (VDVP); and 4) Intelligent Engine Control. The VDVP and Intelligent Engine Control elements were cancelled due to funding constraints and are reported here only to the extent that they progressed. The Hydraulic Multiplexing element developed and tested a prototype which improves reliability by combining the functionality of up to 16 solenoids and servo-valves into one component with a single electrically powered force motor. The Active Combustor Control element developed intelligent staging and control strategies for low-emission combustors. This included development and tests of a Controlled Pressure Fuel Nozzle for fuel sequencing, a Fuel Multiplexer for individual fuel cup metering, and model-based control logic. Both the Hydraulic Multiplexer and the Controlled Pressure Fuel Nozzle system were cleared for engine test. The Fuel Multiplexer was cleared for combustor rig test, which must be followed by an engine test to achieve full maturation.

  15. Force-detected nuclear magnetic resonance: recent advances and future challenges.

    Science.gov (United States)

    Poggio, M; Degen, C L

    2010-08-27

    We review recent efforts to detect small numbers of nuclear spins using magnetic resonance force microscopy. Magnetic resonance force microscopy (MRFM) is a scanning probe technique that relies on the mechanical measurement of the weak magnetic force between a microscopic magnet and the magnetic moments in a sample. Spurred by the recent progress in fabricating ultrasensitive force detectors, MRFM has rapidly improved its capability over the last decade. Today it boasts a spin sensitivity that surpasses conventional, inductive nuclear magnetic resonance detectors by about eight orders of magnitude. In this review we touch on the origins of this technique and focus on its recent application to nanoscale nuclear spin ensembles, in particular on the imaging of nanoscale objects with a three-dimensional (3D) spatial resolution better than 10 nm. We consider the experimental advances driving this work and highlight the underlying physical principles and limitations of the method. Finally, we discuss the challenges that must be met in order to advance the technique towards single nuclear spin sensitivity, and perhaps to 3D microscopy of molecules with atomic resolution.
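The forces being measured here are extraordinarily small. As an order-of-magnitude sketch, consider a single proton moment in a representative tip field gradient; the gradient value below is an assumption for illustration, not a figure from the review:

```python
MU_PROTON = 1.41e-26   # proton magnetic moment, J/T
GRADIENT = 1.0e6       # tip field gradient, T/m (assumed representative value)

# F = mu * dB/dz: force on a single proton spin in the tip gradient
force = MU_PROTON * GRADIENT
print(f"{force:.2e} N")  # -> 1.41e-20 N, i.e. about 14 zeptonewtons
```

Forces at this scale are far below the thermal force noise of conventional cantilevers, which is why the ultrasensitive force detectors mentioned above are the enabling technology.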

  16. Assessing the Accuracy and Reliability of Root Crack and Fracture Detection in Teeth Using Sweep Imaging with Fourier Transform (SWIFT) Magnetic Resonance Imaging (MRI)

    Science.gov (United States)

    Schuurmans, Tyler J.

    Introduction: Magnetic Resonance Imaging (MRI) has the potential to aid in determining the presence and extent of cracks/fractures in teeth due to more advantageous contrast, without ionizing radiation. An MRI technique called Sweep Imaging with Fourier Transform (SWIFT) has overcome many of the inherent difficulties of conventional MRI with detecting fast-relaxing signals from densely mineralized dental tissues. The objectives of this in vitro investigation were to develop MRI criteria for root crack/fracture identification in teeth and to establish intra- and inter-rater reliabilities and corresponding sensitivity and specificity values for the detection of tooth-root cracks/fractures in SWIFT MRI and limited field of view (FOV) CBCT. Materials and Methods: MRI-based criteria for crack/fracture appearance was developed by an MRI physicist and 6 dentists, including 3 endodontists and 1 Oral and Maxillofacial (OMF) radiologist. Twenty-nine human adult teeth previously extracted following clinical diagnosis by a board-certified endodontist of a root crack/fracture were frequency-matched to 29 non-cracked controls. Crack/fracture status confirmation was performed with magnified visual inspection, transillumination and vital staining. Samples were scanned with two 3D imaging modalities: 1) SWIFT MRI (10 teeth/scan) via a custom oral radiofrequency (RF) coil and a 90cm, 4-T magnet; 2) Limited FOV CBCT (1 tooth/scan) via a Carestream (CS) 9000 (Rochester, NY). Following a training period, a blinded 4-member panel (3 endodontists, 1 OMF radiologist) evaluated the images with a proportion randomly re-tested to establish intra-rater reliability. Overall observer agreement was measured using Cohen's kappa and levels of agreement judged using the criteria of Landis and Koch. Sensitivity and specificity were computed with 95% confidence interval (CI); statistical significance was set at alpha ≤ 0.05. Results: MRI-based crack/fracture criteria were defined as 1-2 sharply
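Sensitivity and specificity with 95% confidence intervals, as reported above, are derived from rater calls against the gold standard. A minimal sketch using the Wilson score interval; the true-positive and true-negative counts below are invented for illustration, not the study's results:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Illustrative counts (assumed): 24/29 cracked roots called positive,
# 26/29 intact controls called negative.
lo, hi = wilson_ci(24, 29)
print(f"sensitivity {24/29:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
lo, hi = wilson_ci(26, 29)
print(f"specificity {26/29:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

The Wilson interval is preferred over the simple normal approximation at sample sizes like n = 29, where proportions near 1 would otherwise yield intervals extending past 100%.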

  17. A construction of standardized near infrared hyper-spectral teeth database: a first step in the development of reliable diagnostic tool for quantification and early detection of caries

    Science.gov (United States)

    Bürmen, Miran; Usenik, Peter; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan

    2011-03-01

    Dental caries is a disease characterized by demineralization of enamel crystals leading to the penetration of bacteria into the dentin and pulp. If left untreated, the disease can lead to pain, infection and tooth loss. Early detection of enamel demineralization resulting in increased enamel porosity, commonly known as white spots, is a difficult diagnostic task. Several papers reported on near infrared (NIR) spectroscopy to be a potentially useful noninvasive spectroscopic technique for early detection of caries lesions. However, the conducted studies were mostly qualitative and did not include the critical assessment of the spectral variability of the sound and carious dental tissues and influence of the water content. Such assessment is essential for development and validation of reliable qualitative and especially quantitative diagnostic tools based on NIR spectroscopy. In order to characterize the described spectral variability, a standardized diffuse reflectance hyper-spectral database was constructed by imaging 12 extracted human teeth with natural lesions of various degrees in the spectral range from 900 to 1700 nm with spectral resolution of 10 nm. Additionally, all the teeth were imaged by digital color camera. The influence of water content on the acquired spectra was characterized by monitoring the teeth during the drying process. The images were assessed by an expert, thereby obtaining the gold standard. By analyzing the acquired spectra we were able to accurately model the spectral variability of the sound dental tissues and identify the advantages and limitations of NIR hyper-spectral imaging.

  18. A New Method to Detect and Correct the Critical Errors and Determine the Software-Reliability in Critical Software-System

    International Nuclear Information System (INIS)

    Krini, Ossmane; Börcsök, Josef

    2012-01-01

    In order to use electronic systems comprising software and hardware components in safety-related and highly safety-related applications, it is necessary to meet the marginal risk numbers required by standards and legislative provisions. Existing processes and mathematical models are used to verify the risk numbers. On the hardware side, various accepted mathematical models, processes, and methods exist to provide the required proof. To this day, however, there are no closed models or mathematical procedures known that allow for a dependable prediction of software reliability. This work presents a method that makes a prognosis on the number of residual critical errors in software. Conventional models lack this ability, and at present there are no methods that forecast critical errors. The new method will show that an estimate of the number of residual critical errors in software systems is possible by using a combination of prediction models, the ratio of critical errors, and the total error number. Subsequently, the critical expected-value function at any point in time can be derived from the new solution method, provided the detection rate has been calculated using an appropriate estimation method. The presented method also makes it possible to estimate the critical failure rate. The approach is modelled on a real process and therefore describes two essential processes: detection and correction.
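The core idea, combining a total-error prediction with the observed ratio of critical to total errors, can be sketched as follows. The Goel-Okumoto mean value function used here is an assumed stand-in for the paper's prediction models, and all parameter values are illustrative:

```python
import math

def residual_critical_errors(a, b, t, critical_ratio):
    """Estimated number of critical errors still latent at test time t.

    a: predicted total error content (from a fitted growth model)
    b: error detection rate of the growth model
    critical_ratio: observed fraction of detected errors that are critical
    Assumes a Goel-Okumoto mean value function m(t) = a*(1 - exp(-b*t)).
    """
    detected = a * (1.0 - math.exp(-b * t))
    return critical_ratio * (a - detected)

# Illustrative: 120 predicted errors, rate 0.05 per day, 10% critical
print(round(residual_critical_errors(120, 0.05, 40, 0.10), 2))  # -> 1.62
```

The estimate decays with test time as the growth model saturates, which matches the intuition that continued detection and correction drives the critical residual toward zero.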

  19. A knowledge-based operator advisor system for integration of fault detection, control, and diagnosis to enhance the safe and reliable operation of nuclear power plants

    International Nuclear Information System (INIS)

    Bhatnagar, R.

    1989-01-01

    A Knowledge-Based Operator Advisor System has been developed for enhancing the complex task of maintaining safe and reliable operation of nuclear power plants. The operator's activities have been organized into the four tasks of data interpretation for abstracting high level information from sensor data, plant state monitoring for identification of faults, plan execution for controlling the faults, and diagnosis for determination of root causes of faults. The Operator Advisor System is capable of identifying the abnormal functioning of the plant in terms of: (1) deviations from normality, (2) pre-enumerated abnormal events, and (3) safety threats. The classification of abnormal functioning into the three categories of deviations from normality, abnormal events, and safety threats allows the detection of faults at three levels: (1) developing faults, (2) developed faults, and (3) safety threatening faults. After the identification of abnormal functioning, the system will identify the procedures to be executed to mitigate the consequences of abnormal functioning and will help the operator by displaying the procedure steps and monitoring the success of actions taken. The system is also capable of diagnosing the root causes of abnormal functioning. The identification and diagnosis of root causes of abnormal functioning are done in parallel with the task of procedure execution, allowing the detection of more critical safety threats while executing procedures to control abnormal events.

  20. FMR1 CGG repeat expansion mutation detection and linked haplotype analysis for reliable and accurate preimplantation genetic diagnosis of fragile X syndrome.

    Science.gov (United States)

    Rajan-Babu, Indhu-Shree; Lian, Mulias; Cheah, Felicia S H; Chen, Min; Tan, Arnold S C; Prasath, Ethiraj B; Loh, Seong Feei; Chong, Samuel S

    2017-07-19

    Fragile X mental retardation 1 (FMR1) full-mutation expansion causes fragile X syndrome. Trans-generational fragile X syndrome transmission can be avoided by preimplantation genetic diagnosis (PGD). We describe a robust PGD strategy that can be applied to virtually any couple at risk of transmitting fragile X syndrome. This novel strategy utilises whole-genome amplification, followed by triplet-primed polymerase chain reaction (TP-PCR) for robust detection of expanded FMR1 alleles, in parallel with linked multi-marker haplotype analysis of 13 highly polymorphic microsatellite markers located within 1 Mb of the FMR1 CGG repeat, and the AMELX/Y dimorphism for gender identification. The assay was optimised and validated on single lymphoblasts isolated from fragile X reference cell lines, and applied to a simulated PGD case and a clinical in vitro fertilisation (IVF)-PGD case. In the simulated PGD case, definitive diagnosis of the expected results was achieved for all 'embryos'. In the clinical IVF-PGD case, delivery of a healthy baby girl was achieved after transfer of an expansion-negative blastocyst. FMR1 TP-PCR reliably detects presence of expansion mutations and obviates reliance on informative normal alleles for determining expansion status in female embryos. Together with multi-marker haplotyping and gender determination, misdiagnosis and diagnostic ambiguity due to allele dropout is minimised, and couple-specific assay customisation can be avoided.

  1. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement-based approaches, holistic techniques and decision-analytic approaches. (UK)

  2. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very complex systems.
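The Monte Carlo approach mentioned for systems reliability can be sketched for a toy series-parallel system; the structure (2-out-of-3 redundant pumps in series with a control unit) and the component reliabilities are invented for illustration:

```python
import random

def system_up(r):
    """One trial: 2-out-of-3 redundant pumps in series with a control unit."""
    pumps_ok = sum(random.random() < r["pump"] for _ in range(3))
    return pumps_ok >= 2 and random.random() < r["control"]

def monte_carlo_reliability(r, trials=100_000, seed=1):
    """Estimate system reliability as the fraction of successful trials."""
    random.seed(seed)
    return sum(system_up(r) for _ in range(trials)) / trials

rel = {"pump": 0.95, "control": 0.99}
# Analytical value: (3*0.95**2*0.05 + 0.95**3) * 0.99 = 0.983 (approx.)
print(f"{monte_carlo_reliability(rel):.3f}")
```

For a system this small the analytical value is easy to check; the value of simulation lies in the very complex systems the abstract refers to, where closed-form evaluation becomes intractable.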

  3. Highly reliable TOFD UT Technique

    International Nuclear Information System (INIS)

    Acharya, G.D.; Trivedi, S.A.R.; Pai, K.B.

    2003-01-01

    The high performance of the time of flight diffraction (TOFD) technique with regard to the detection of weld defects such as cracks, slag and lack of fusion has led to a rapidly increasing acceptance of the technique as a pre-service inspection tool. Since the early 1990s TOFD has been applied to several projects, where it replaced the commonly used radiographic testing. The use of TOFD led to major time savings during new-build and replacement projects. At the same time the TOFD technique was used as a baseline inspection, which enables future monitoring of critical welds, but also provides documented evidence for the lifetime. The TOFD technique has the ability to detect and simultaneously size flaws of nearly any orientation within the weld and heat affected zone. TOFD is recognized as a reliable, proven technique for detection and sizing of defects and has proven to be a time saver, resulting in shorter shutdown periods and construction project times. Thus even in cases where the inspection price of TOFD per weld is higher, in the end it will result in significantly lower overall costs and improved quality. This paper deals with reliability, economy, acceptance criteria and field experience. It also covers a comparative study between the radiographic technique and TOFD. (Author)
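Flaw sizing in TOFD rests on the transit time of the wave diffracted at a flaw tip. A minimal sketch of the standard symmetric-probe geometry; the wave speed and the example numbers are textbook assumptions, not values from this paper:

```python
import math

def tofd_depth(t_us, separation_mm, velocity=5.9):
    """Flaw-tip depth from TOFD transit time, symmetric probe layout.

    t_us: transit time of the tip-diffracted signal, microseconds
    separation_mm: full probe centre separation (2S)
    velocity: longitudinal wave speed, mm/us (about 5.9 in steel)
    Depth d = sqrt((v*t/2)**2 - S**2).
    """
    s = separation_mm / 2.0
    half_path = velocity * t_us / 2.0
    return math.sqrt(half_path**2 - s**2)

# Illustrative: 20 us transit time, 80 mm probe separation
print(round(tofd_depth(20.0, 80.0), 1))  # -> 43.4 (mm)
```

Because depth comes from timing rather than signal amplitude, sizing accuracy is largely independent of flaw orientation, which underpins the reliability claims above.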

  4. Fast Metabolite Identification in Nuclear Magnetic Resonance Metabolomic Studies: Statistical Peak Sorting and Peak Overlap Detection for More Reliable Database Queries.

    Science.gov (United States)

    Hoijemberg, Pablo A; Pelczer, István

    2018-01-05

    A lot of time is spent by researchers in the identification of metabolites in NMR-based metabolomic studies. The usual metabolite identification starts employing public or commercial databases to match chemical shifts thought to belong to a given compound. Statistical total correlation spectroscopy (STOCSY), in use for more than a decade, speeds the process by finding statistical correlations among peaks, being able to create a better peak list as input for the database query. However, the (normally not automated) analysis becomes challenging due to the intrinsic issue of peak overlap, where correlations of more than one compound appear in the STOCSY trace. Here we present a fully automated methodology that analyzes all STOCSY traces at once (every peak is chosen as driver peak) and overcomes the peak overlap obstacle. Peak overlap detection by clustering analysis and sorting of traces (POD-CAST) first creates an overlap matrix from the STOCSY traces, then clusters the overlap traces based on their similarity and finally calculates a cumulative overlap index (COI) to account for both strong and intermediate correlations. This information is gathered in one plot to help the user identify the groups of peaks that would belong to a single molecule and perform a more reliable database query. The simultaneous examination of all traces reduces the time of analysis, compared to viewing STOCSY traces by pairs or small groups, and condenses the redundant information in the 2D STOCSY matrix into bands containing similar traces. The COI helps in the detection of overlapping peaks, which can be added to the peak list from another cross-correlated band. POD-CAST overcomes the generally overlooked and underestimated presence of overlapping peaks and it detects them to include them in the search of all compounds contributing to the peak overlap, enabling the user to accelerate the metabolite identification process with more successful database queries and searching all tentative
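
The pipeline described above (a STOCSY trace for every driver peak, then a cumulative overlap index) can be sketched in a few lines. This is an illustrative sketch only: the function names, thresholds and the half-weighting of intermediate correlations are our assumptions, not the published implementation.

```python
import numpy as np

def stocsy_traces(X):
    """All STOCSY traces at once: correlation of every peak with every other.

    X is an (n_samples, n_peaks) intensity matrix; row i of the result is
    the trace obtained when peak i is chosen as the driver peak."""
    Xc = X - X.mean(axis=0)
    sd = Xc.std(axis=0)
    return (Xc.T @ Xc) / (len(X) * np.outer(sd, sd))

def cumulative_overlap_index(R, strong=0.85, intermediate=0.5):
    """Toy COI: count strong correlations fully and intermediate ones at
    half weight, so peaks shared by several compounds score higher."""
    A = np.abs(R)
    np.fill_diagonal(A, 0.0)
    return (A >= strong).sum(axis=1) + 0.5 * ((A >= intermediate) & (A < strong)).sum(axis=1)
```

An overlapped peak (one fed by two compounds) correlates at an intermediate level with both groups of peaks and therefore receives a higher index than a clean peak, which is the cue for including it in more than one database query.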

  5. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

    RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report is not only stating the facts of the Significant System Events (ESS), but it moreover underlines the main elements dealing with the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  6. Business of reliability

    Science.gov (United States)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease of reception equipment costs allows non-Remote Sensing organization to access a technology until recently reserved to scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment does conflict with traditional low-volume data use for most applications. Constant access to data sources supposes monitoring needs as well as technical proficiency. (3) Large volume use of data coupled with low- cost equipment costs is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications, that may lead to significant use of earth observation data, value added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.

  7. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in Information and Communication Technology context. In particular, in the first Section, the definition of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In ICT context, the failure rate for a given system can be

  8. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  9. [Reliability for detection of developmental problems using the semaphore from the Child Development Evaluation test: Is a yellow result different from a red result?

    Science.gov (United States)

    Rizzoli-Córdoba, Antonio; Ortega-Ríosvelasco, Fernando; Villasís-Keever, Miguel Ángel; Pizarro-Castellanos, Mariel; Buenrostro-Márquez, Guillermo; Aceves-Villagrán, Daniel; O'Shea-Cuevas, Gabriel; Muñoz-Hernández, Onofre

    The Child Development Evaluation (CDE) is a screening tool designed and validated in Mexico for detecting developmental problems. The result is expressed through a semaphore. In the CDE test, both yellow and red results are considered positive, although a different intervention is proposed for each. The aim of this work was to evaluate the reliability of the CDE test to discriminate between children with a yellow/red result based on the developmental domain quotient (DDQ) obtained through the Battelle Development Inventory, 2nd edition (in Spanish) (BDI-2). The data for this study were obtained from the validation study. Children with a normal (green) result in the CDE were excluded. Two different cut-off points of the DDQ were used (BDI-2): social: 20.1% vs. 28.9%; and adaptive: 6.9% vs. 20.4%. The yellow/red semaphore result makes it possible to identify different magnitudes of delay in developmental domains or subdomains, supporting the recommendation of a different intervention for each one. Copyright © 2014 Hospital Infantil de México Federico Gómez. Published by Masson Doyma México S.A. All rights reserved.

  10. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
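
The variance-reduction step mentioned at the end of the abstract can be illustrated with a minimal importance-sampling sketch. The 2-out-of-3 failure structure and the biased sampling density below are invented for the example and are not taken from the programs described.

```python
import numpy as np

def system_fails(states):
    # states: boolean array (n_trials, 3); True = component failed.
    # Example system: fails when at least 2 of 3 components fail.
    return states.sum(axis=1) >= 2

def plain_mc(p, n, rng):
    """Crude Monte Carlo estimate of the system failure probability."""
    states = rng.random((n, 3)) < p
    return system_fails(states).mean()

def importance_mc(p, q, n, rng):
    """Importance sampling: draw failures with inflated probability q,
    then reweight each trial by its likelihood ratio."""
    states = rng.random((n, 3)) < q
    w = np.where(states, p / q, (1 - p) / (1 - q)).prod(axis=1)
    return (system_fails(states) * w).mean()
```

With rare component failures (e.g. p = 0.01, true system failure probability 3p²(1-p) + p³ ≈ 2.98e-4), the biased sampler produces many more failing trials per run, so the reweighted estimate converges with far fewer samples than crude Monte Carlo.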

  11. RTE - 2015 Reliability Report. Summary

    International Nuclear Information System (INIS)

    2016-01-01

    Every year, RTE produces a reliability report for the past year. This report includes a number of results from previous years so that year-to-year comparisons can be drawn and long-term trends analysed. The 2015 report underlines the major factors that have impacted on the reliability of the electrical power system, without focusing exclusively on Significant System Events (ESS). It describes various factors which contribute to present and future reliability and the numerous actions implemented by RTE to ensure reliability today and in the future, as well as the ways in which the various parties involved in the electrical power system interact across the whole European interconnected network

  12. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  13. The sensitivity, specificity and reliability of the GALS (gait, arms, legs and spine) examination when used by physiotherapists and physiotherapy students to detect rheumatoid arthritis.

    Science.gov (United States)

    Beattie, Karen A; Macintyre, Norma J; Pierobon, Jessica; Coombs, Jennifer; Horobetz, Diana; Petric, Alexis; Pimm, Mara; Kean, Walter; Larché, Maggie J; Cividino, Alfred

    2011-09-01

    To evaluate the sensitivity, specificity and reliability of the gait, arms, legs and spine (GALS) examination to detect signs and symptoms of rheumatoid arthritis when used by physiotherapy students and physiotherapists. Two physiotherapy students and two physiotherapists were trained to perform the GALS examination by viewing an instructional DVD and attending a workshop. Two rheumatologists familiar with the GALS examination also participated in the workshop. All healthcare professionals performed the GALS examination on 25 participants with rheumatoid arthritis recruited through a rheumatology practice and 23 participants without any arthritides recruited from a primary care centre. Each participant was assessed by one rheumatologist, one physiotherapist and one physiotherapy student. Abnormalities of gait, arms, legs and spine, including their location and description, were recorded, along with whether or not a diagnosis of rheumatoid arthritis was suspected. Healthcare professionals understood the study's objective to be their agreement on GALS findings and were unaware that half of the participants had rheumatoid arthritis. Sensitivity, specificity and likelihood ratios were calculated to determine the ability of the GALS examination to screen for rheumatoid arthritis. Using rheumatologists' findings on the study day as the standard for comparison, sensitivity and specificity were 71 to 86% and 69 to 93%, respectively. Positive likelihood ratios ranged from 2.74 to 10.18, while negative likelihood ratios ranged from 0.21 to 0.38. The GALS examination may be a useful tool for physiotherapists to rule out rheumatoid arthritis in a direct access setting. Differences in duration and type of experience of each healthcare professional may contribute to the variation in results. The merits of introducing the GALS examination into physiotherapy curricula and practice should be explored. Copyright © 2010 Chartered Society of Physiotherapy. Published by Elsevier Ltd
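
The sensitivity, specificity and likelihood ratios reported above follow directly from a 2x2 screening table. A minimal sketch (the counts used below are hypothetical, not the study's data):

```python
def screening_stats(tp, fn, fp, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table.

    tp/fn: diseased participants flagged / missed by the screen;
    fp/tn: healthy participants flagged / cleared by the screen."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)   # how much a positive result raises the odds
    lr_neg = (1 - sens) / spec   # how much a negative result lowers them
    return sens, spec, lr_pos, lr_neg
```

A negative likelihood ratio well below 1 (the study reports 0.21 to 0.38) is what supports using the examination to rule out rheumatoid arthritis in a direct access setting.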

  14. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  15. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    ... inverters connected in a chain ... typical graph showing frequency versus square root of ... developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and ... or FIT of the device. In other words, an accurate estimate of the device lifetime was found and thus the reliability that can be conveniently ...

  16. Development of innovative DNA detection technologies for fast microbial community assessment

    International Nuclear Information System (INIS)

    Chen, Chung H.

    2004-01-01

    The objective of this program is to develop innovative DNA detection technologies to achieve fast microbial community assessment. The specific approaches are (1) to develop inexpensive and reliable sequence-proof hybridization DNA detection technology (2) to develop quantitative DNA hybridization technology for microbial community assessment and (3) to study the microbes which have demonstrated the potential to have nuclear waste bioremediation

  17. Mathematical reliability an expository perspective

    CERN Document Server

    Mazzuchi, Thomas; Singpurwalla, Nozer

    2004-01-01

    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  18. Leprosy New Case Detection Trends and the Future Effect of Preventive Interventions in Pará State, Brazil: A Modelling Study

    NARCIS (Netherlands)

    H.J. de Matos (Haroldo José); D.J. Blok (David); S.J. de Vlas (Sake); J.H. Richardus (Jan Hendrik)

    2016-01-01

    textabstractBackground: Leprosy remains a public health problem in Brazil. Although the overall number of new cases is declining, there are still areas with a high disease burden, such as Pará State in the north of the country. We aim to predict future trends in new case detection rate (NCDR) and

  19. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas

  20. Enhanced reliability and accuracy for field deployable bioforensic detection and discrimination of Xylella fastidiosa subsp. pauca, causal agent of citrus variegated chlorosis using razor ex technology and TaqMan quantitative PCR.

    Science.gov (United States)

    Ouyang, Ping; Arif, Mohammad; Fletcher, Jacqueline; Melcher, Ulrich; Ochoa Corona, Francisco Manuel

    2013-01-01

    A reliable, accurate and rapid multigene-based assay combining real time quantitative PCR (qPCR) and a Razor Ex BioDetection System (Razor Ex) was validated for detection of Xylella fastidiosa subsp. pauca (Xfp, a xylem-limited bacterium that causes citrus variegated chlorosis [CVC]). CVC, which is exotic to the United States, has spread through South and Central America and could significantly impact U.S. citrus if it arrives. A method for early, accurate and sensitive detection of Xfp in plant tissues is needed by plant health officials for inspection of products from quarantined locations, and by extension specialists for detection, identification and management of disease outbreaks and reservoir hosts. Two sets of specific PCR primers and probes, targeting Xfp genes for fimbrillin and the periplasmic iron-binding protein were designed. A third pair of primers targeting the conserved cobalamin synthesis protein gene was designed to detect all possible X. fastidiosa (Xf) strains. All three primer sets detected as little as 1 fg of plasmid DNA carrying X. fastidiosa target sequences and genomic DNA of Xfp at as little as 1 - 10 fg. The use of Razor Ex facilitates a rapid (about 30 min) in-field assay capability for detection of all Xf strains, and for specific detection of Xfp. Combined use of three primer sets targeting different genes increased the assay accuracy and broadened the range of detection. To our knowledge, this is the first report of a field-deployable rapid and reliable bioforensic detection and discrimination method for a bacterial phytopathogen based on multigene targets.

  1. Issues in cognitive reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Hitchler, M.J.; Rumancik, J.A.

    1984-01-01

    This chapter examines some problems in current methods to assess reactor operator reliability at cognitive tasks and discusses new approaches to solve these problems. The two types of human failures are errors in the execution of an intention and errors in the formation/selection of an intention. Topics considered include the types of description, error correction, cognitive performance and response time, the speed-accuracy tradeoff function, function based task analysis, and cognitive task analysis. One problem of human reliability analysis (HRA) techniques in general is the question of what are the units of behavior whose reliability are to be determined. A second problem for HRA is that people often detect and correct their errors. The use of function based analysis, which maps the problem space for plant control, is recommended

  2. Validity and reliability of 3D US for the detection of erosions in patients with rheumatoid arthritis using MRI as the gold standard

    DEFF Research Database (Denmark)

    Ellegaard, K; Bliddal, H; Møller Døhn, U

    2014-01-01

    PURPOSE: To test the reliability and validity of a 3D US erosion score in RA using MRI as the gold standard. MATERIALS AND METHODS: RA patients were examined with 3D US and 3 T MRI over the 2nd and 3rd metacarpophalangeal joints. 3D blocks were evaluated by two investigators. The erosions were estimated according to a semi-quantitative score (SQS) (0-3) and a quantitative score (QS) (mm²). MRI was evaluated according to the RAMRIS score. For the estimation of reliability, intra-class correlation coefficients (ICC) were used. Validity was tested using Spearman's rho (rs). The sensitivity and specificity were also calculated. RESULTS: 28 patients with RA were included. The ICC for the inter-observer reliability in the QS was 0.41 and 0.13 for the metacarpal bone and phalangeal bone, respectively, and 0.86 and 0.16, respectively, in the SQS. The ICC for the intra-observer reliability in the QS...

  3. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability; reliability and failure rate, including reliability characteristics, chance failures, failure rates that change over time, failure modes and replacement; reliability in engineering design; reliability testing under failure-rate assumptions and the plotting of reliability data; prediction of system reliability; conservation of systems; and failure analysis, including failure relays and analysis of system safety.

  4. The influence of different error estimates in the detection of postoperative cognitive dysfunction using reliable change indices with correction for practice effects.

    Science.gov (United States)

    Lewis, Matthew S; Maruff, Paul; Silbert, Brendan S; Evered, Lis A; Scott, David A

    2007-02-01

    The reliable change index (RCI) expresses change relative to its associated error, and is useful in the identification of postoperative cognitive dysfunction (POCD). This paper examines four common RCIs that each account for error in different ways. Three rules incorporate a constant correction for practice effects and are contrasted with the standard RCI that had no correction for practice. These rules are applied to 160 patients undergoing coronary artery bypass graft (CABG) surgery who completed neuropsychological assessments preoperatively and 1 week postoperatively using error and reliability data from a comparable healthy nonsurgical control group. The rules all identify POCD in a similar proportion of patients, but the use of the within-subject standard deviation (WSD), expressing the effects of random error, as an error estimate is a theoretically appropriate denominator when a constant error correction, removing the effects of systematic error, is deducted from the numerator in a RCI.
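
The two flavours of RCI discussed above can be written down in a few lines. This is a sketch of the general formulas; the exact error estimates and practice corrections in the paper's four rules differ in detail.

```python
from math import sqrt

def rci_standard(x1, x2, sd_baseline, reliability):
    """Jacobson-Truax style RCI with no practice correction: the change
    score divided by the standard error of the difference."""
    sem = sd_baseline * sqrt(1.0 - reliability)   # standard error of measurement
    s_diff = sqrt(2.0) * sem                      # SE of a test-retest difference
    return (x2 - x1) / s_diff

def rci_practice_corrected(x1, x2, control_mean_change, wsd):
    """RCI with a constant practice correction (mean change in a healthy
    control group) and the within-subject SD (WSD) as the error term."""
    return (x2 - x1 - control_mean_change) / wsd
```

The paper's point is the pairing in the second function: once the numerator removes systematic error (the constant practice effect), the denominator should express only random error, which is what the WSD does.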

  5. A novel molecular diagnostic tool for improved sensitivity and reliability in detection of "Candidatus Liberibacter asiaticus", the bacterium associated with huanglongbing (HLB).

    Science.gov (United States)

    Sensitive and accurate detection is a prerequisite for efficient management and regulatory responses to prevent the introduction and spread of HLB-associated “Candidatus Liberibacter species to unaffected areas. To improve the current detection limit of HLB-associated “Ca. Liberibacter” spp, we deve...

  6. 77 FR 26714 - Transmission Planning Reliability Standards

    Science.gov (United States)

    2012-05-07

    ... Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ¶ 31,242, order on reh'g ... Standards for the Bulk Power System, 130 FERC ¶ 61,200 (2010) (March 2010 Order) ... Mandatory Reliability ... excluded from future planning assessments and its potential impact to bulk electric system reliability ...

  7. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  8. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere
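
For complete (uncensored) data, the NPI survival bounds for the next observation follow from Hill's assumption A(n): each of the n+1 intervals delimited by the n ordered observations carries probability 1/(n+1). A minimal sketch (right-censoring, which the approach also handles, is omitted here, and the boundary convention at observed values is our choice):

```python
def npi_survival_bounds(data, t):
    """Lower and upper NPI survival probability P(X_{n+1} > t).

    With j observations at or before t, the (n - j) intervals entirely
    beyond t give the lower bound; adding the interval containing t
    gives the upper bound."""
    n = len(data)
    j = sum(1 for x in sorted(data) if x <= t)
    lower = (n - j) / (n + 1)
    upper = (n - j + 1) / (n + 1)
    return lower, upper
```

The gap of 1/(n+1) between the bounds is the imprecision inherent in making a prediction for one future observation from n data points; it shrinks as data accumulate.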

  9. Accelerator Availability and Reliability Issues

    Energy Technology Data Exchange (ETDEWEB)

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  10. Computed tomography for the detection of distal radioulnar joint instability: normal variation and reliability of four CT scoring systems in 46 patients

    Energy Technology Data Exchange (ETDEWEB)

    Wijffels, Mathieu; Krijnen, Pieta; Schipper, Inger [Leiden University Medical Center, Department of Surgery-Trauma Surgery, P.O. Box 9600, Leiden (Netherlands); Stomp, Wouter; Reijnierse, Monique [Leiden University Medical Center, Department of Radiology, P.O. Box 9600, Leiden (Netherlands)

    2016-11-15

    The diagnosis of distal radioulnar joint (DRUJ) instability is clinically challenging. Computed tomography (CT) may aid in the diagnosis, but the reliability and normal variation for DRUJ translation on CT have not been established in detail. The aim of this study was to evaluate inter- and intraobserver agreement and normal ranges of CT scoring methods for determination of DRUJ translation in both posttraumatic and uninjured wrists. Patients with a conservatively treated, unilateral distal radius fracture were included. CT scans of both wrists were evaluated independently, by two readers using the radioulnar line method, subluxation ratio method, epicenter method and radioulnar ratio method. The inter- and intraobserver agreement was assessed and normal values were determined based on the uninjured wrists. Ninety-two wrist CTs (mean age: 56.5 years, SD: 17.0, mean follow-up 4.2 years, SD: 0.5) were evaluated. Interobserver agreement was best for the epicenter method [ICC = 0.73, 95 % confidence interval (CI) 0.65-0.79]. Intraobserver agreement was almost perfect for the radioulnar line method (ICC = 0.82, 95 % CI 0.77-0.87). Each method showed a wide normal range for normal DRUJ translation. Normal range for the epicenter method is -0.35 to -0.06 in pronation and -0.11 to 0.19 in supination. DRUJ translation on CT in pro- and supination can be reliably evaluated in both normal and posttraumatic wrists, however with large normal variation. The epicenter method seems the most reliable. Scanning of both wrists might be helpful to prevent the radiological overdiagnosis of instability. (orig.)
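
The agreement statistics above are intra-class correlation coefficients, which come from an ANOVA decomposition of the ratings. Below is a one-way random-effects sketch, ICC(1,1); the study may well have used a two-way model (e.g. absolute agreement), so treat this as illustrative only.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_raters) array:
    between-subject variance as a share of total variance."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    subject_means = ratings.mean(axis=1)
    grand_mean = ratings.mean()
    # Mean squares between subjects and within subjects (residual)
    msb = k * ((subject_means - grand_mean) ** 2).sum() / (n - 1)
    msw = ((ratings - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Identical ratings give an ICC of 1; disagreement between raters inflates the within-subject mean square and pulls the coefficient down, which is how values like 0.73 (epicenter method) versus 0.13 arise from the same formula.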

  11. RTE - Reliability report 2016

    International Nuclear Information System (INIS)

    2017-06-01

    Every year, RTE produces a reliability report for the past year. This document lays out the main factors that affected the electrical power system's operational reliability in 2016 and the initiatives currently under way intended to ensure its reliability in the future. Within a context of the energy transition, changes to the European interconnected network mean that RTE has to adapt on an on-going basis. These changes include the increase in the share of renewables injecting an intermittent power supply into networks, resulting in a need for flexibility, and a diversification in the numbers of stakeholders operating in the energy sector and changes in the ways in which they behave. These changes are dramatically changing the structure of the power system of tomorrow and the way in which it will operate - particularly the way in which voltage and frequency are controlled, as well as the distribution of flows, the power system's stability, the level of reserves needed to ensure supply-demand balance, network studies, assets' operating and control rules, the tools used and the expertise of operators. The results obtained in 2016 are evidence of a globally satisfactory level of reliability for RTE's operations in somewhat demanding circumstances: more complex supply-demand balance management, cross-border schedules at interconnections indicating operation that is closer to its limits and - most noteworthy - having to manage a cold spell just as several nuclear power plants had been shut down. 
In a drive to keep pace with the changes expected to occur in these circumstances, RTE implemented numerous initiatives to ensure high levels of reliability: maintaining investment levels of 1.5 billion euros per year; increasing cross-zonal capacity at borders with neighbouring countries, thus bolstering the security of the electricity supply; implementing new mechanisms (demand response, capacity mechanism, interruptibility, etc.); involvement in tests or projects

  12. Non-invasive aneuploidy detection using free fetal DNA and RNA in maternal plasma: recent progress and future possibilities.

    NARCIS (Netherlands)

    Go, A.T.; Vugt, J.M.G. van; Oudejans, C.B.

    2011-01-01

    BACKGROUND: Cell-free fetal DNA (cff DNA) and RNA can be detected in maternal plasma and used for non-invasive prenatal diagnostics. Recent technical advances have led to a drastic change in the clinical applicability and potential uses of free fetal DNA and RNA. This review summarizes the latest

  13. Present and future potential of krypton-85 for the detection of clandestine reprocessing plants for treaty verification.

    Science.gov (United States)

    Schoeppner, Michael; Glaser, Alexander

    2016-10-01

    Burnup calculations are applied to determine the amount of krypton-85 that is produced during the irradiation of nuclear fuel. Since krypton-85 is most likely released into the atmosphere during reprocessing to separate plutonium, atmospheric transport modeling is used to calculate the worldwide distribution of krypton-85 concentrations stemming from emissions from declared reprocessing plants. The results form the basis for scenarios in which emissions from clandestine reprocessing facilities have to be detected against various background levels. It is concluded that today's background severely limits the ability to detect small and medium plutonium separation rates; only high separation rates of 1 SQ per week or more stand a chance of being detected with feasible outlay. A fixed network of monitoring stations seems too costly; the high number of samples required instead calls for mobile sampling procedures, in which air samples are collected at random locations around the world and analyzed in regional laboratories for their krypton-85 concentration. Further, it is argued that krypton-85 emissions from declared reprocessing activities would have to be significantly lowered to enable worldwide verification of the absence of even smaller-scale clandestine reprocessing. For each scenario, the number of samples that have to be taken for probable detection is calculated. Copyright © 2016 Elsevier Ltd. All rights reserved.
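
The abstract does not specify how the required sample counts are derived; a common back-of-envelope model (illustrative only, not the study's transport-model calculation) treats each randomly located air sample as an independent detection trial, so the required number of samples follows from the geometric distribution:

```python
import math

def samples_needed(p_single: float, p_overall: float) -> int:
    """Number of independent random air samples needed so that the
    probability of at least one detection reaches p_overall, when each
    sample detects the clandestine plume with probability p_single."""
    # P(at least one hit in n samples) = 1 - (1 - p_single)**n >= p_overall
    return math.ceil(math.log(1.0 - p_overall) / math.log(1.0 - p_single))

# Hypothetical per-sample probability, not a value from the study:
# a 10% chance per sample requires 29 samples for 95% overall confidence.
print(samples_needed(0.10, 0.95))  # -> 29
```

For small per-sample probabilities, n scales roughly as 1/p_single, which illustrates why a lower background (and hence a more distinguishable clandestine signal per sample) sharply reduces the sampling effort.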

  14. Towards Reliable Integrated Services for Dependable Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Ravn, Anders Peter; Izadi-Zamanabadi, Roozbeh

    Reliability issues for various technical systems are discussed and focus is directed towards distributed systems, where communication facilities are vital to maintain system functionality. Reliability in communication subsystems is considered as a resource to be shared among a number of logical connections, and a reliability management framework is suggested. We suggest a network-layer reliability management protocol, RRSVP (Reliability Resource Reservation Protocol), as a counterpart of the RSVP for bandwidth and time resource management. Active and passive standby redundancy is provided by background applications residing on alternative routes. Details are provided for the operation of RRSVP based on reliability slack calculus. Conclusions summarize the considerations and give directions for future research.

  15. Towards Reliable Integrated Services for Dependable Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Ravn, Anders Peter; Izadi-Zamanabadi, Roozbeh

    2003-01-01

    Reliability issues for various technical systems are discussed and focus is directed towards distributed systems, where communication facilities are vital to maintain system functionality. Reliability in communication subsystems is considered as a resource to be shared among a number of logical connections, and a reliability management framework is suggested. We suggest a network-layer reliability management protocol, RRSVP (Reliability Resource Reservation Protocol), as a counterpart of the RSVP for bandwidth and time resource management. Active and passive standby redundancy is provided by background applications residing on alternative routes. Details are provided for the operation of RRSVP based on reliability slack calculus. Conclusions summarize the considerations and give directions for future research.

  16. Screening, early detection, education, and trends for melanoma: current status (2007-2013) and future directions: Part I. Epidemiology, high-risk groups, clinical strategies, and diagnostic technology.

    Science.gov (United States)

    Mayer, Jonathan E; Swetter, Susan M; Fu, Teresa; Geller, Alan C

    2014-10-01

    While most cancers have shown both decreased incidence and mortality over the past several decades, the incidence of melanoma has continued to grow, and mortality has only recently stabilized in the United States and in many other countries. Certain populations, such as men >60 years of age and lower socioeconomic status groups, face a greater burden from disease. For any given stage and across all ages, men have shown worse melanoma survival than women, and low socioeconomic status groups have increased levels of mortality. Novel risk factors can help identify populations at greatest risk for melanoma and can aid in targeted early detection. Risk assessment tools have been created to identify high-risk patients based on various factors, and these tools can reduce the number of patients needed to screen for melanoma detection. Diagnostic techniques, such as dermatoscopy and total body photography, and new technologies, such as multispectral imaging, may increase the accuracy and reliability of early melanoma detection. Copyright © 2014 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  17. Specifically colorimetric recognition of calcium, strontium, and barium ions using 2-mercaptosuccinic acid-functionalized gold nanoparticles and its use in reliable detection of calcium ion in water.

    Science.gov (United States)

    Zhang, Jia; Wang, Yong; Xu, Xiaowen; Yang, Xiurong

    2011-10-07

    A colorimetric probe based on 2-mercaptosuccinic acid-functionalized gold nanoparticles has been developed to exhibit selectivity towards Ca(2+), Sr(2+), and Ba(2+) ions over other metallic cations under specified conditions and finds its practical application in detecting Ca(2+) levels in water.

  18. Theoretical and experimental work on steam generator integrity and reliability with particular reference to leak development and detection. United Kingdom status report. October 1983

    International Nuclear Information System (INIS)

    Smedley, J.A.; Edge, D.M.

    1984-01-01

    This paper reviews the experimental and theoretical work in the UK on the characteristics of sodium-water reactions and describes work on the development of leak detection systems. A review of the operating experience with the PFR steam generators and the protection philosophy used on PFR is also given and the design studies for the Commercial Demonstration Fast Reactor (CDFR) are described

  19. Reliability of nucleic acid amplification methods for detection of Chlamydia trachomatis in urine: results of the first international collaborative quality control study among 96 laboratories

    NARCIS (Netherlands)

    R.P.A.J. Verkooyen (Roel); G.T. Noordhoek; P.E. Klapper; J. Reid; J. Schirm; G.M. Cleator; M. Ieven; G. Hoddevik

    2003-01-01

    The first European Quality Control Concerted Action study was organized to assess the ability of laboratories to detect Chlamydia trachomatis in a panel of urine samples by nucleic acid amplification tests (NATs). The panel consisted of lyophilized urine samples,

  20. Can non-destructive inspection be reliable

    International Nuclear Information System (INIS)

    Silk, M.G.; Stoneham, A.M.; Temple, J.A.G.

    1988-01-01

    The paper on inspection is based on the book "The reliability of non-destructive inspection: assessing the assessment of structures under stress" by the present authors (published by Adam Hilger, 1987). Emphasis is placed on the reliability of inspection and whether cracks in welds or flaws in components can be detected. The need for non-destructive testing and the historical attitudes to non-destructive testing are outlined, along with the case of failure. Factors influencing reliable inspection are discussed, and defect detection trials involving round robin tests are described. The development of reliable inspection techniques and the costs of reliability and unreliability are also examined. (U.K.)

  1. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts worldwide. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  2. Detection and rapid recovery of the Sutter's Mill meteorite fall as a model for future recoveries worldwide

    Science.gov (United States)

    Fries, Marc; Le Corre, Lucille; Hankey, Mike; Fries, Jeff; Matson, Robert; Schaefer, Jake; Reddy, Vishnu

    2014-11-01

    The Sutter's Mill C-type meteorite fall occurred on 22 April 2012 in and around the town of Coloma, California. The exact location of the meteorite fall was determined within hours of the event using a combination of eyewitness reports, weather radar imagery, and seismometry data. Recovery of the first meteorites occurred within 2 days and continued for months afterward. The recovery effort included local citizens, scientists, and meteorite hunters, and featured coordination efforts by local scientific institutions. Scientific analysis of the collected meteorites revealed characteristics that were available for study only because the rapid collection of samples had minimized terrestrial contamination/alteration. This combination of factors—rapid and accurate location of the event, participation in the meteorite search by the public, and coordinated scientific investigation of recovered samples—is a model that was widely beneficial and should be emulated in future meteorite falls. The tools necessary to recreate the Sutter's Mill recovery are available, but are currently underutilized in much of the world. Weather radar networks, scientific institutions with interest in meteoritics, and the interested public are available globally. Therefore, it is possible to repeat the Sutter's Mill recovery model for future meteorite falls around the world, each for relatively little cost with a dedicated researcher. Doing so will significantly increase the number of fresh meteorite falls available for study, provide meteorite material that can serve as the nuclei of new meteorite collections, and will improve the public visibility of meteoritics research.

  3. Section on prospects for dark matter detection of the white paper on the status and future of ground-based TeV gamma-ray astronomy.

    Energy Technology Data Exchange (ETDEWEB)

    Byrum, K.; Horan, D.; Tait, T.; Wanger, R.; Zaharijas, G.; Buckley, J.; Baltz, E. A.; Bertone, G.; Dingus, B.; Fegan, S.; Ferrer, F.; Gondolo, P.; Hall, J.; Hooper, D.; Horan, D.; Koushiappas, S.; Krawczynksi, H.; LeBohec, S.; Pohl, M.; Profumo, S.; Silk, J.; Vassilev, V.; Wood, M.; Wakely, S.; High Energy Physics; FNAL; Univ. of St. Louis; Stanford Univ.; Inst. d'Astrophysique; LANL; Univ. of California; Washington Univ.; Univ. of Utah; Brown Univ.; Oxford Univ.; Iowa State Univ.; Univ. of Chicago

    2009-05-13

    This is a report on the findings of the dark matter science working group for the white paper on the status and future of TeV gamma-ray astronomy. The white paper was commissioned by the American Physical Society, and the full white paper can be found on astro-ph (arXiv:0810.0444). This detailed section discusses the prospects for dark matter detection with future gamma-ray experiments, and the complementarity of gamma-ray measurements with other indirect, direct or accelerator-based searches. We conclude that any comprehensive search for dark matter should include gamma-ray observations, both to identify the dark matter particle (through the characteristics of the gamma-ray spectrum) and to measure the distribution of dark matter in galactic halos.

  4. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  5. Neck Flexor and Extensor Muscle Endurance in Subclinical Neck Pain: Intrarater Reliability, Standard Error of Measurement, Minimal Detectable Change, and Comparison With Asymptomatic Participants in a University Student Population.

    Science.gov (United States)

    Lourenço, Ana S; Lameiras, Carina; Silva, Anabela G

    2016-01-01

    The aims of this study were to assess intrarater reliability and to calculate the standard error of measurement (SEM) and minimal detectable change (MDC) for deep neck flexor and neck extensor muscle endurance tests, and to compare the results between individuals with and without subclinical neck pain. Participants were students of the University of Aveiro reporting subclinical neck pain and asymptomatic participants matched for sex and age to the neck pain group. Data on endurance capacity of the deep neck flexors and neck extensors were collected by a blinded assessor using the deep neck flexor endurance test and the extensor endurance test, respectively. Intraclass correlation coefficients (ICCs), SEM, and MDC were calculated for measurements taken within a session by the same assessor. Differences between groups in endurance capacity were investigated using a Mann-Whitney U test. The deep neck flexor endurance test (ICC = 0.71; SEM = 6.91 seconds; MDC = 19.15 seconds) and neck extensor endurance test (ICC = 0.73; SEM = 0.84 minutes; MDC = 2.34 minutes) are reliable. No significant differences were found between participants with and without neck pain for either test of muscle endurance (P > .05). The endurance capacity of the deep neck flexors and neck extensors can be reliably measured in participants with subclinical neck pain. However, the large SEM and MDC might limit the sensitivity of these tests. Copyright © 2016. Published by Elsevier Inc.
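
The SEM and MDC figures quoted in this record follow the standard definitions (assumed here, since the abstract does not restate them): SEM = SD × √(1 − ICC) and MDC95 = 1.96 × √2 × SEM. A quick Python check shows the flexor-test figures are internally consistent:

```python
import math

def sem(sd: float, icc: float) -> float:
    # Standard error of measurement from the sample SD and reliability (ICC)
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem_value: float) -> float:
    # Minimal detectable change at the 95% confidence level
    return 1.96 * math.sqrt(2.0) * sem_value

# The flexor-test figures are internally consistent:
# an SEM of 6.91 s implies an MDC of about 19.15 s.
print(round(mdc95(6.91), 2))  # -> 19.15
```

The same relation (MDC ≈ 2.77 × SEM) lets a reader sanity-check any reported SEM/MDC pair.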

  6. MycoKey Round Table Discussions of Future Directions in Research on Chemical Detection Methods, Genetics and Biodiversity of Mycotoxins

    Directory of Open Access Journals (Sweden)

    John F. Leslie

    2018-03-01

    MycoKey, an EU-funded Horizon 2020 project, includes a series of “Roundtable Discussions” to gather information on trending research areas in the field of mycotoxicology. This paper includes summaries of the Roundtable Discussions on Chemical Detection and Monitoring of mycotoxins and on the role of genetics and biodiversity in mycotoxin production. Discussions were managed by using the nominal group discussion technique, which generates numerous ideas and provides a ranking for those identified as the most important. Four questions were posed for each research area, as well as two questions that were common to both discussions. Test kits, usually antibody based, were one major focus of the discussions at the Chemical Detection and Monitoring roundtable because of their many favorable features, e.g., cost, speed and ease of use. The second area of focus for this roundtable was multi-mycotoxin detection protocols and the challenges still to be met to enable these protocols to become methods of choice for regulated mycotoxins. For the genetic and biodiversity group, both the depth and the breadth of trending research areas were notable. For some areas, e.g., microbiome studies, the suggested research questions were primarily of a descriptive nature. In other areas, multiple experimental approaches, e.g., transcriptomics, proteomics, RNAi and gene deletions, are needed to understand the regulation of toxin production and mechanisms underlying successful biological controls. Answers to the research questions will provide starting points for developing acceptable prevention and remediation processes. Forging a partnership between scientists and appropriately-placed communications experts was recognized by both groups as an essential step to communicating risks, while retaining overall confidence in the safety of the food supply and the integrity of the food production chain.

  7. MycoKey Round Table Discussions of Future Directions in Research on Chemical Detection Methods, Genetics and Biodiversity of Mycotoxins

    Science.gov (United States)

    Lattanzio, Veronica; Cary, Jeffrey; Chulze, Sofia N.; Gerardino, Annamaria; Liao, Yu-Cai; Maragos, Chris M.; Meca, Giuseppe; Moretti, Antonio; Munkvold, Gary; Mulè, Giuseppina; Njobeh, Patrick; Pecorelli, Ivan; Pietri, Amedeo; Proctor, Robert H.; Rahayu, Endang S.; Ramírez, Maria L.; Samson, Robert; Stroka, Jörg; Sumarah, Mark; Zhang, Qi; Zhang, Hao; Logrieco, Antonio F.

    2018-01-01

    MycoKey, an EU-funded Horizon 2020 project, includes a series of “Roundtable Discussions” to gather information on trending research areas in the field of mycotoxicology. This paper includes summaries of the Roundtable Discussions on Chemical Detection and Monitoring of mycotoxins and on the role of genetics and biodiversity in mycotoxin production. Discussions were managed by using the nominal group discussion technique, which generates numerous ideas and provides a ranking for those identified as the most important. Four questions were posed for each research area, as well as two questions that were common to both discussions. Test kits, usually antibody based, were one major focus of the discussions at the Chemical Detection and Monitoring roundtable because of their many favorable features, e.g., cost, speed and ease of use. The second area of focus for this roundtable was multi-mycotoxin detection protocols and the challenges still to be met to enable these protocols to become methods of choice for regulated mycotoxins. For the genetic and biodiversity group, both the depth and the breadth of trending research areas were notable. For some areas, e.g., microbiome studies, the suggested research questions were primarily of a descriptive nature. In other areas, multiple experimental approaches, e.g., transcriptomics, proteomics, RNAi and gene deletions, are needed to understand the regulation of toxin production and mechanisms underlying successful biological controls. Answers to the research questions will provide starting points for developing acceptable prevention and remediation processes. Forging a partnership between scientists and appropriately-placed communications experts was recognized by both groups as an essential step to communicating risks, while retaining overall confidence in the safety of the food supply and the integrity of the food production chain. PMID:29494529

  8. Melting curve analysis after T allele enrichment (MelcaTle) as a highly sensitive and reliable method for detecting the JAK2V617F mutation.

    Directory of Open Access Journals (Sweden)

    Soji Morishita

    Detection of the JAK2V617F mutation is essential for diagnosing patients with classical myeloproliferative neoplasms (MPNs). However, detection of the low-frequency JAK2V617F mutation is a challenging task due to the necessity of discriminating between true-positive and false-positive results. Here, we have developed a highly sensitive and accurate assay for the detection of JAK2V617F and named it melting curve analysis after T allele enrichment (MelcaTle). MelcaTle comprises three steps: (1) two cycles of JAK2V617F allele enrichment by PCR amplification followed by BsaXI digestion, (2) selective amplification of the JAK2V617F allele in the presence of a bridged nucleic acid (BNA) probe, and (3) a melting curve assay using a BODIPY-FL-labeled oligonucleotide. Using this assay, we successfully detected nearly a single copy of the JAK2V617F allele, without false-positive signals, using 10 ng of genomic DNA standard. Furthermore, MelcaTle showed no positive signals in 90 assays screening healthy individuals for JAK2V617F. When applying MelcaTle to 27 patients who were initially classified as JAK2V617F-positive on the basis of allele-specific PCR analysis and were thus suspected as having MPNs, we found that two of the patients were actually JAK2V617F-negative. A more careful clinical data analysis revealed that these two patients had developed transient erythrocytosis of unknown etiology but not polycythemia vera, a subtype of MPNs. These findings indicate that the newly developed MelcaTle assay should markedly improve the diagnosis of JAK2V617F-positive MPNs.

  9. LIFETIME AND SPECTRAL EVOLUTION OF A MAGMA OCEAN WITH A STEAM ATMOSPHERE: ITS DETECTABILITY BY FUTURE DIRECT IMAGING

    Energy Technology Data Exchange (ETDEWEB)

    Hamano, Keiko; Kawahara, Hajime; Abe, Yutaka [Department of Earth and Planetary Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Onishi, Masanori [Department of Earth and Planetary Sciences, Kobe University, 1-1 Rokkodai-cho, Nada, Kobe 657-8501 (Japan); Hashimoto, George L., E-mail: keiko@eps.s.u-tokyo.ac.jp [Department of Earth Sciences, Okayama University, 3-1-1 Tsushima-Naka, Kita, Okayama, 700-8530 (Japan)

    2015-06-20

    We present the thermal evolution and emergent spectra of solidifying terrestrial planets along with the formation of steam atmospheres. The lifetime of a magma ocean and its spectra through a steam atmosphere depends on the orbital distance of the planet from the host star. For a Type I planet, which is formed beyond a certain critical distance from the host star, the thermal emission declines on a timescale shorter than approximately 10^6 years. Therefore, young stars should be targets when searching for molten planets in this orbital region. In contrast, a Type II planet, which is formed inside the critical distance, will emit significant thermal radiation from near-infrared atmospheric windows during the entire lifetime of the magma ocean. The Ks and L bands will be favorable for future direct imaging because the planet-to-star contrasts of these bands are higher than approximately 10^-7 to 10^-8. Our model predicts that, in the Type II orbital region, molten planets would be present over the main sequence of the G-type host star if the initial bulk content of water exceeds approximately 1 wt%. In visible atmospheric windows, the contrasts of the thermal emission drop below 10^-10 in less than 10^5 years, whereas those of the reflected light remain 10^-10 for both types of planets. Since the contrast level is comparable to those of reflected light from Earth-sized planets in the habitable zone, the visible reflected light from molten planets also provides a promising target for direct imaging with future ground- and space-based telescopes.
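
As a rough plausibility check on contrast levels of this kind (a plain blackbody flux ratio, not the authors' radiative-transfer model; the temperatures and radii below are hypothetical), the planet-to-star contrast at a single wavelength can be estimated as (Rp/Rs)² × Bλ(Tp)/Bλ(Ts):

```python
import math

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def planck(wavelength_m: float, temp_k: float) -> float:
    """Spectral radiance B_lambda of a blackbody (W sr^-1 m^-3)."""
    x = H * C / (wavelength_m * KB * temp_k)
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(x)

def thermal_contrast(r_planet: float, r_star: float,
                     t_planet: float, t_star: float,
                     wavelength_m: float) -> float:
    """Blackbody planet-to-star flux ratio at one wavelength."""
    return (r_planet / r_star) ** 2 * (
        planck(wavelength_m, t_planet) / planck(wavelength_m, t_star))

# Hypothetical case: an Earth-sized planet with a ~1500 K surface
# around a Sun-like star, observed near the Ks band (~2.2 um).
c = thermal_contrast(6.371e6, 6.957e8, 1500.0, 5800.0, 2.2e-6)
print(f"{c:.1e}")
```

Under these assumed parameters the contrast comes out of order 10^-6, comfortably above the ~10^-7 to 10^-8 threshold the abstract quotes for the Ks and L bands.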

  10. Using a thermoluminescent dosimeter to evaluate the location reliability of the highest–skin dose area detected by treatment planning in radiotherapy for breast cancer

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Li-Min, E-mail: limin.sun@yahoo.com [Department of Radiation Oncology, Zuoying Branch of Kaohsiung Armed Forces General Hospital, Kaohsiung City, Taiwan (China); Huang, Chih-Jen [Department of Radiation Oncology, Kaohsiung Medical University Hospital, Kaohsiung Medical University, Kaohsiung City, Taiwan (China); Faculty of Medicine, Kaohsiung Medical University Hospital, Kaohsiung Medical University, Kaohsiung City, Taiwan (China); College of Medicine, Kaohsiung Medical University Hospital, Kaohsiung Medical University, Kaohsiung City, Taiwan (China); Chen, Hsiao-Yun [Department of Radiation Oncology, Kaohsiung Medical University Hospital, Kaohsiung Medical University, Kaohsiung City, Taiwan (China); Meng, Fan-Yun [Department of General Surgery, Zuoying Branch of Kaohsiung Armed Forces General Hospital, Kaohsiung City, Taiwan (China); Lu, Tsung-Hsien [Department of Radiation Oncology, Zuoying Branch of Kaohsiung Armed Forces General Hospital, Kaohsiung City, Taiwan (China); Tsao, Min-Jen [Department of General Surgery, Zuoying Branch of Kaohsiung Armed Forces General Hospital, Kaohsiung City, Taiwan (China)

    2014-01-01

    Acute skin reaction during adjuvant radiotherapy for breast cancer is an inevitable process, and its severity is related to the skin dose. A high-skin-dose area can be anticipated from the isodose distribution shown on a treatment plan. To determine whether treatment planning can reflect the high-skin-dose location, 80 patients were enrolled and their skin doses in different areas were measured using a thermoluminescent dosimeter to locate the highest-skin-dose area in each patient. We determined whether the measured skin dose was consistent with the highest-dose area estimated by the treatment planning of the same patient. The χ² and Fisher exact tests revealed that these 2 methods yielded more consistent results when the highest-dose spots were located in the axillary and breast areas but not in the inframammary area. We suggest that skin doses shown on the treatment plan might be a reliable and simple alternative method for estimating the highest skin doses in some areas.

  11. Split-bolus single-phase cardiac multidetector computed tomography for reliable detection of left atrial thrombus. Comparison to transesophageal echocardiography

    Energy Technology Data Exchange (ETDEWEB)

    Staab, W.; Zwaka, P.A.; Sohns, J.M.; Schwarz, A.; Lotz, J. [University Medical Center Goettingen Univ. (Germany). Inst. for Diagnostic and Interventional Radiology; Sohns, C.; Vollmann, D.; Zabel, M.; Hasenfuss, G. [Goettingen Univ. (Germany). Dept. of Cardiology and Pneumology; Schneider, S. [Goettingen Univ. (Germany). Dept. of Medical Statistics

    2014-11-15

    Evaluation of a new cardiac MDCT protocol using a split-bolus contrast injection and a single MDCT scan for reliable diagnosis of LA/LAA thrombi in comparison to TEE, optimizing radiation exposure and the use of contrast agent. A total of 182 consecutive patients with drug-refractory AF scheduled for PVI (62.6% male; mean age: 64.1 ± 10.2 years) underwent routine diagnostic workup, including TEE and cardiac MDCT, for the evaluation of LA/LAA anatomy and thrombus formation between November 2010 and March 2012. Contrast media injection was split into a pre-bolus of 30 ml and a main bolus of 70 ml of iodinated contrast agent separated by a short time delay. In this study, split-bolus cardiac MDCT identified 14 of 182 patients with filling defects of the LA/LAA. In all of these 14 patients, abnormalities were also found on TEE. All 5 of the 14 patients with thrombus formation on cardiac MDCT were confirmed by TEE. MDCT was 100% accurate for thrombus detection, with strong but not perfect overall agreement with TEE for the SEC equivalent.

  12. Using a thermoluminescent dosimeter to evaluate the location reliability of the highest–skin dose area detected by treatment planning in radiotherapy for breast cancer

    International Nuclear Information System (INIS)

    Sun, Li-Min; Huang, Chih-Jen; Chen, Hsiao-Yun; Meng, Fan-Yun; Lu, Tsung-Hsien; Tsao, Min-Jen

    2014-01-01

    Acute skin reaction during adjuvant radiotherapy for breast cancer is an inevitable process, and its severity is related to the skin dose. A high-skin-dose area can be anticipated from the isodose distribution shown on a treatment plan. To determine whether treatment planning can reflect the high-skin-dose location, 80 patients were enrolled and their skin doses in different areas were measured using a thermoluminescent dosimeter to locate the highest-skin-dose area in each patient. We determined whether the measured skin dose was consistent with the highest-dose area estimated by the treatment planning of the same patient. The χ² and Fisher exact tests revealed that these 2 methods yielded more consistent results when the highest-dose spots were located in the axillary and breast areas but not in the inframammary area. We suggest that skin doses shown on the treatment plan might be a reliable and simple alternative method for estimating the highest skin doses in some areas.

  13. Reliability issues : a Canadian perspective

    International Nuclear Information System (INIS)

    Konow, H.

    2004-01-01

    A Canadian perspective on power reliability issues was presented. Reliability depends on the adequacy of supply and a framework for standards. The challenges facing the electric power industry include new demand, plant replacement, and exports. Demand is expected to reach 670 TWh by 2020, with 205 TWh coming from new plants. Canada will require an investment of $150 billion to meet this demand, and the need is comparable in the United States. As trade grows, the challenge becomes a continental issue, and investment in the bi-national transmission grid will be essential. The 5-point plan of the Canadian Electricity Association (CEA) is to: (1) establish an investment climate to ensure future electricity supply, (2) move government and industry towards smart and effective regulation, (3) work to ensure a sustainable future for the next generation, (4) foster innovation and accelerate skills development, and (5) build on the strengths of an integrated North American system to maximize opportunity for Canadians. The CEA's 7 measures that enhance North American reliability were listed, with emphasis on its support for a self-governing international organization for developing and enforcing mandatory reliability standards. The CEA also supports the creation of a bi-national Electric Reliability Organization (ERO) to identify and solve reliability issues in the context of a bi-national grid. tabs., figs

  14. Evaluation of an Immunochromatographic Test for Rapid and Reliable Serodiagnosis of Human Tularemia and Detection of Francisella tularensis-Specific Antibodies in Sera from Different Mammalian Species

    Science.gov (United States)

    Splettstoesser, W.; Guglielmo-Viret, V.; Seibold, E.; Thullier, P.

    2010-01-01

    Tularemia is a highly contagious infectious zoonosis caused by the bacterial agent Francisella tularensis. Serology is still considered a cornerstone of tularemia diagnosis due to the low sensitivity of bacterial culture and the lack of standardization in PCR methodology for direct identification of the pathogen. We developed a novel immunochromatographic test (ICT) to efficiently detect F. tularensis-specific antibodies in sera from humans and other mammalian species (nonhuman primate, pig, and rabbit). This new tool requires no or minimal laboratory equipment, and the results are obtained within 15 min. When compared to the method of microagglutination, which was shown to be more specific than the enzyme-linked immunosorbent assay, the ICT had a sensitivity of 98.3% (58 positive sera were tested) and a specificity of 96.5% (58 negative sera were tested) on human sera. On animal sera, the overall sensitivity was 100% (22 positive sera were tested) and specificity was also 100% (70 negative sera were tested). This rapid test preferentially detects IgG antibodies, which may appear early in the course of human tularemia, but further evaluation with human sera is important to prove that the ICT can be a valuable field test to support a presumptive diagnosis of tularemia. The ICT can also be a useful tool to monitor successful vaccination with subunit vaccines or live vaccine strains containing lipopolysaccharide (e.g., LVS) and to detect seropositive individuals or animals in outbreak situations or in the context of epidemiologic surveillance programs in areas of endemicity, as recently recommended by the World Health Organization. PMID:20220165
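
The quoted accuracy figures can be reproduced from raw counts. The 57/58 and 56/58 splits below are reverse-engineered from the percentages in the abstract rather than taken from the paper, so treat them as an assumption:

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    # Fraction of truly positive sera the test flags as positive
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    # Fraction of truly negative sera the test flags as negative
    return true_neg / (true_neg + false_pos)

# 57 of 58 positive human sera detected -> 98.3% sensitivity
print(round(100 * sensitivity(57, 1), 1))   # -> 98.3
# 56 of 58 negative human sera correctly negative -> 96.55%,
# which the abstract reports as 96.5%
print(round(100 * specificity(56, 2), 1))   # -> 96.6
```

Working back to counts like this is a quick way to sanity-check reported sensitivity/specificity pairs against the stated sample sizes.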

  15. A simple and reliable method to detect gamma irradiated lentil (Lens culinaris Medik.) seeds by germination efficiency and seedling growth test

    International Nuclear Information System (INIS)

    Chaudhuri, Sadhan K.

    2002-01-01

    Germination efficiency and the root/shoot length of germinated seedlings are proposed as markers to identify irradiated lentil seeds. The germination percentage was reduced above 0.2 kGy, and lentil seeds were unable to germinate above a dose of 1.0 kGy. The critical dose that prevented root elongation varied from 0.1 to 0.5 kGy. The sensitivity of lentil seeds to gamma irradiation was inversely proportional to the moisture content of the seeds. Radiation effects could be detected in seeds even after 12 months of storage following gamma irradiation.

  16. Revisiting the STEC Testing Approach: Using espK and espV to Make Enterohemorrhagic Escherichia coli (EHEC) Detection More Reliable in Beef

    OpenAIRE

    Delannoy, Sabine; Chaves, Byron D.; Ison, Sarah A.; Webb, Hattie E.; Beutin, Lothar; Delaval, José; Billet, Isabelle; Fach, Patrick

    2016-01-01

    Current methods for screening Enterohemorrhagic Escherichia coli (EHEC) O157 and non-O157 in beef enrichments typically rely on the molecular detection of stx, eae, and serogroup-specific wzx or wzy gene fragments. As these genetic markers can also be found in some non-EHEC strains, a number of ‘false positive’ results are obtained. Here, we explore the suitability of five novel molecular markers, espK, espV, ureD, Z2098, and CRISPRO26:H11 as candidates for a more accurate screening of EHEC s...

  17. Forecasting reliability of transformer populations

    NARCIS (Netherlands)

    Schijndel, van A.; Wetzer, J.; Wouters, P.A.A.F.

    2007-01-01

    The expected replacement wave in the current power grid confronts asset managers with challenging questions. Setting up a replacement strategy and planning calls for a forecast of long-term component reliability. For transformers, the future failure probability can be predicted based on the ongoing

  18. Methane and carbon dioxide hydrates on Mars: Potential origins, distribution, detection, and implications for future in situ resource utilization

    Science.gov (United States)

    Pellenbarg, Robert E.; Max, Michael D.; Clifford, Stephen M.

    2003-04-01

    There is high probability for the long-term crustal accumulation of methane and carbon dioxide on Mars. These gases can arise from a variety of processes, including deep biosphere activity and abiotic mechanisms, or, like water, they could exist as remnants of planetary formation and by-products of internal differentiation. CH4 and CO2 would tend to rise buoyantly toward the planet's surface, condensing with water under appropriate conditions of temperature and pressure to form gas hydrate. Gas hydrates are a class of materials created when gas molecules are trapped within a crystalline lattice of water-ice. The hydrate stability fields of both CH4 and CO2 encompass a portion of the Martian crust that extends from within the water-ice cryosphere, from a depth as shallow as ~10-20 m to as great as a kilometer or more below the base of the Martian cryosphere. The presence and distribution of methane and carbon dioxide hydrates may be of critical importance in understanding the geomorphic evolution of Mars and the geophysical identification of water and other volatiles stored in the hydrates. Of critical importance, Martian gas hydrates would ensure the availability of key in situ resources for sustaining future robotic and human exploration, and the eventual colonization, of Mars.

  19. Future prospect of remote Cat-CVD on the basis of the production, transportation and detection of H atoms

    International Nuclear Information System (INIS)

    Umemoto, Hironobu; Matsumura, Hideki

    2008-01-01

    The future prospect of remote Cat-CVD, in which the decomposition and deposition chambers are separated, is discussed on the basis of absolute density measurements of H atoms. It is now well recognized that Cat-CVD enables uniform deposition over a large area without plasma damage. However, the demerits of Cat-CVD cannot be overlooked. One of these is the poisoning of the catalyzer surfaces by the material gases, both temporary and permanent. One technique to overcome this problem is remote Cat-CVD. The question is how to separate the decomposition and deposition areas. If the separation is not sufficient, there will be back diffusion of the material gases, which will poison the catalyzers. If the separation is too tight, radicals may not effuse out of the decomposition chamber. These problems are discussed, and it is shown that SiO2 coating, which reduces the radical recombination rates on walls, is promising. The possibility of polytetrafluoroethene coating by Cat-CVD is also discussed.

  20. Does functional MRI detect activation in white matter? A review of emerging evidence, issues, and future directions

    Science.gov (United States)

    Gawryluk, Jodie R.; Mazerolle, Erin L.; D'Arcy, Ryan C. N.

    2014-01-01

    Functional magnetic resonance imaging (fMRI) is a non-invasive technique that allows for visualization of activated brain regions. Until recently, fMRI studies have focused on gray matter. There are two main reasons white matter fMRI remains controversial: (1) the blood oxygen level dependent (BOLD) fMRI signal depends on cerebral blood flow and volume, which are lower in white matter than gray matter and (2) fMRI signal has been associated with post-synaptic potentials (mainly localized in gray matter) as opposed to action potentials (the primary type of neural activity in white matter). Despite these observations, there is no direct evidence against measuring fMRI activation in white matter and reports of fMRI activation in white matter continue to increase. The questions underlying white matter fMRI activation are important. White matter fMRI activation has the potential to greatly expand the breadth of brain connectivity research, as well as improve the assessment and diagnosis of white matter and connectivity disorders. The current review provides an overview of the motivation to investigate white matter fMRI activation, as well as the published evidence of this phenomenon. We speculate on possible neurophysiologic bases of white matter fMRI signals, and discuss potential explanations for why reports of white matter fMRI activation are relatively scarce. We end with a discussion of future basic and clinical research directions in the study of white matter fMRI. PMID:25152709

  1. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability; reliability data; the importance of reliability engineering; reliability measures; the Poisson process (goodness-of-fit tests and the Poisson arrival model); reliability estimation (e.g., for the exponential distribution); reliability of systems; availability; preventive maintenance, such as replacement policies, minimal repair policies, shock models, spares, group maintenance and periodic inspection; analysis of common cause failures; and analysis models of repair effects.
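
    The textbook topics above (exponential lifetimes and system reliability) can be illustrated with a minimal sketch. The failure data below are hypothetical, chosen only to show the standard exponential-model calculations.

    ```python
    import math

    # Hypothetical failure times (hours) from a complete life test, exponential model assumed.
    failure_times = [120.0, 340.0, 95.0, 410.0, 230.0]

    # Maximum-likelihood estimate of the failure rate under the exponential model:
    # lambda_hat = number of failures / total time on test.
    lam = len(failure_times) / sum(failure_times)

    def reliability(t, lam):
        """Probability of failure-free operation up to time t (exponential model)."""
        return math.exp(-lam * t)

    def series_system(reliabilities):
        """A series system works only if every component works."""
        r = 1.0
        for ri in reliabilities:
            r *= ri
        return r

    mttf = 1.0 / lam                   # mean time to failure
    r100 = reliability(100.0, lam)     # component reliability at t = 100 h
    r_sys = series_system([r100] * 3)  # three identical components in series
    ```

    For a series system the reliabilities multiply, so the system is always less reliable than its weakest component, which is why component-level reliability data matter so much at the system level.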

  2. Making the most of RNA-seq: Pre-processing sequencing data with Opossum for reliable SNP variant detection [version 2; referees: 2 approved, 1 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Laura Oikkonen

    2017-03-01

    Identifying variants from RNA-seq (transcriptome sequencing) data is a cost-effective and versatile complement to whole-exome (WES) and whole-genome sequencing (WGS) analysis. RNA-seq is primarily considered a method of gene expression analysis, but it can also be used to detect DNA variants in expressed regions of the genome. However, current variant callers do not generally behave well with RNA-seq data due to reads encompassing intronic regions. We have developed a software programme called Opossum to address this problem. Opossum pre-processes RNA-seq reads prior to variant calling, and although it has been designed to work specifically with Platypus, it can be used equally well with other variant callers such as GATK HaplotypeCaller. In this work, we show that using Opossum in conjunction with either Platypus or GATK HaplotypeCaller maintains precision and improves the sensitivity for SNP detection compared to the GATK Best Practices pipeline. In addition, using it in combination with Platypus offers a substantial reduction in run times compared to the GATK pipeline, so it is ideal when only limited time or computational resources are available.

  3. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.

  4. Linear-after-the-exponential polymerase chain reaction and allied technologies. Real-time detection strategies for rapid, reliable diagnosis from single cells.

    Science.gov (United States)

    Pierce, Kenneth E; Wangh, Lawrence J

    2007-01-01

    Accurate detection of gene sequences in single cells is the ultimate challenge to polymerase chain reaction (PCR) sensitivity. Unfortunately, commonly used conventional and real-time PCR techniques are often too unreliable at that level to provide the accuracy needed for clinical diagnosis. Here we provide details of linear-after-the-exponential-PCR (LATE-PCR), a method similar to asymmetric PCR in the use of primers at different concentrations, but with novel design criteria to ensure high efficiency and specificity. Compared with conventional PCR, LATE-PCR increases the signal strength and allele discrimination capability of oligonucleotide probes such as molecular beacons and reduces variability among replicate samples. The analysis of real-time kinetics of LATE-PCR signals provides a means for improving the accuracy of single cell genetic diagnosis.

  5. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

    Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making is benefited from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.
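
    The core idea above, combining time-driven (sensor time-series) and event-driven (operation sequence) checks, can be sketched in a toy example. This is not the authors' implementation; the event names, thresholds, and readings below are hypothetical, chosen only to illustrate how the two layers complement each other.

    ```python
    # Hypothetical expected operation order for a monitored process.
    EXPECTED_SEQUENCE = ["load", "heat", "separate", "unload"]

    def time_driven_alarm(samples, low, high):
        """Time-series layer: return indices of samples outside the expected band."""
        return [i for i, x in enumerate(samples) if not (low <= x <= high)]

    def event_driven_alarm(events):
        """Event layer: return True if observed events deviate from the expected order."""
        expected = [e for e in EXPECTED_SEQUENCE if e in events]
        observed = [e for e in events if e in EXPECTED_SEQUENCE]
        return observed != expected

    samples = [4.9, 5.1, 5.0, 7.3, 5.0]              # hypothetical sensor readings
    events = ["load", "separate", "heat", "unload"]  # "separate" arrives out of order

    anomalous_samples = time_driven_alarm(samples, low=4.5, high=5.5)
    sequence_anomaly = event_driven_alarm(events)
    ```

    Here the time-series check alone flags only the single out-of-band reading, while the event check catches a procedural deviation invisible to thresholds; fusing both verdicts is the kind of system-centric assessment the abstract describes.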

  6. Field-based detection of biological samples for forensic analysis: Established techniques, novel tools, and future innovations.

    Science.gov (United States)

    Morrison, Jack; Watts, Giles; Hobbs, Glyn; Dawnay, Nick

    2018-04-01

    Field-based forensic tests commonly provide information on the presence and identity of biological stains and can also support the identification of species. Such information can support downstream processing of forensic samples and generate rapid intelligence. These approaches have traditionally used chemical and immunological techniques to elicit the result, but some are known to suffer from a lack of specificity and sensitivity. The last 10 years have seen the development of field-based genetic profiling systems, with specific focus on moving the mainstay of forensic genetic analysis, namely STR profiling, out of the laboratory and into the hands of the non-laboratory user. In doing so it is now possible for enforcement officers to generate a crime scene DNA profile which can then be matched to a reference or database profile. The introduction of these novel genetic platforms also allows for further development of new molecular assays aimed at answering the more traditional questions relating to body fluid identity and species detection. The current drive for field-based molecular tools is in response to the needs of the criminal justice system and enforcement agencies, and promises a step-change in how forensic evidence is processed. However, the adoption of such systems by the law enforcement community does not represent a new strategy in the way forensic science has integrated previous novel approaches, nor do they automatically represent a threat to the quality control and assurance practices that are central to the field. This review examines the historical need and subsequent research and developmental breakthroughs in field-based forensic analysis over the past two decades, with particular focus on genetic methods. Emerging technologies from a range of scientific fields that have potential applications in forensic analysis at the crime scene are identified, and associated issues that arise from the shift from laboratory into operational field use are discussed.

  7. Making the most of RNA-seq: Pre-processing sequencing data with Opossum for reliable SNP variant detection [version 1; referees: 2 approved, 1 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Laura Oikkonen

    2017-01-01

    Identifying variants from RNA-seq (transcriptome sequencing) data is a cost-effective and versatile alternative to whole-genome sequencing. However, current variant callers do not generally behave well with RNA-seq data due to reads encompassing intronic regions. We have developed a software programme called Opossum to address this problem. Opossum pre-processes RNA-seq reads prior to variant calling, and although it has been designed to work specifically with Platypus, it can be used equally well with other variant callers such as GATK HaplotypeCaller. In this work, we show that using Opossum in conjunction with either Platypus or GATK HaplotypeCaller maintains precision and improves the sensitivity for SNP detection compared to the GATK Best Practices pipeline. In addition, using it in combination with Platypus offers a substantial reduction in run times compared to the GATK pipeline, so it is ideal when only limited time or computational resources are available.

  8. Fast and reliable detection of toxic Crotalaria spectabilis Roth. in Thunbergia laurifolia Lindl. herbal products using DNA barcoding coupled with HRM analysis.

    Science.gov (United States)

    Singtonat, Sahachat; Osathanunkul, Maslin

    2015-05-30

    useful in the identification and authentication of herbal species in processed samples. In the future, species authentication through Bar-HRM could be used to promote consumer trust, as well as raising the quality of herbal products.

  9. Solid State Lighting Reliability Components to Systems

    CERN Document Server

    Fan, XJ

    2013-01-01

    Solid State Lighting Reliability: Components to Systems begins with an explanation of the major benefits of solid state lighting (SSL) when compared to conventional lighting systems, including but not limited to long useful lifetimes of 50,000 (or more) hours and high efficacy. When designing effective devices that take advantage of SSL capabilities, the reliability of internal components (optics, drive electronics, controls, thermal design) takes on critical importance. As such, a detailed discussion of reliability is included, from performance at the device level to sub-components, as well as the integrated systems of SSL modules, lamps and luminaires, including various failure modes, reliability testing and reliability performance. This book also: Covers the essential reliability theories and practices for current and future development of Solid State Lighting components and systems; Provides a systematic overview of not only the state-of-the-art, but also the future roadmap and perspectives of Solid State Lighting r...

  10. Microprocessor hardware reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wright, R I

    1982-01-01

    Microprocessor-based technology has had an impact in nearly every area of industrial electronics, and many applications have important safety implications. Microprocessors are being used for the monitoring and control of hazardous processes in the chemical, oil and power generation industries, for the control and instrumentation of aircraft and other transport systems, and for the control of industrial machinery. Even in the field of nuclear reactor protection, where designers are particularly conservative, microprocessors are used to implement certain safety functions and may play increasingly important roles in protection systems in the future. Where microprocessors simply replace conventional hard-wired control and instrumentation systems, no new hazards are created by their use. In the field of robotics, however, the microprocessor has opened up a totally new technology and with it has created possible new and as yet unknown hazards. The paper discusses some of the design and manufacturing techniques which may be used to enhance the reliability of microprocessor-based systems and examines the available reliability data on LSI/VLSI microcircuits. 12 references.

  11. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.

  12. Rapid, reliable, and sensitive detection of adenosine deaminase activity by UHPLC-Q-Orbitrap HRMS and its application to inhibitory activity evaluation of traditional Chinese medicines.

    Science.gov (United States)

    Qi, Shenglan; Guan, Huida; Deng, Gang; Yang, Tao; Cheng, Xuemei; Liu, Wei; Liu, Ping; Wang, Changhong

    2018-05-10

    analytes were stable under the investigated conditions. The developed method was successfully applied to the detection of the inhibitory activity of ADA from traditional Chinese medicines. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Meeting the future metro network challenges and requirements by adopting programmable S-BVT with direct-detection and PDM functionality

    Science.gov (United States)

    Nadal, Laia; Svaluto Moreolo, Michela; Fàbrega, Josep M.; Vílchez, F. Javier

    2017-07-01

    In this paper, we propose an advanced programmable sliceable-bandwidth variable transceiver (S-BVT) with polarization division multiplexing (PDM) capability as a key enabler to fulfill the requirements for future 5G networks. Thanks to its cost-effective optoelectronic front-end based on orthogonal frequency division multiplexing (OFDM) technology and direct-detection (DD), the proposed S-BVT becomes suitable for next generation highly flexible and scalable metro networks. Polarization beam splitters (PBSs) and controllers (PCs), available on-demand, are included at the transceivers and at the network nodes, further enhancing the system flexibility and promoting an efficient use of the spectrum. 40G-100G PDM transmission has been experimentally demonstrated, within a 4-node photonic mesh network (ADRENALINE testbed), implementing a simplified equalization process.

  14. New generation of monolithic active pixel sensors for charged particle detection; Developpement d'un capteur de nouvelle generation et son electronique integree pour les collisionneurs futurs

    Energy Technology Data Exchange (ETDEWEB)

    Deptuch, G

    2002-09-01

    Vertex detectors are of great importance in particle physics experiments, as the knowledge of the event flavour is becoming an issue for the physics programme at Future Linear Colliders. Monolithic Active Pixel Sensors (MAPS) based on a novel detector structure have been proposed. Their fabrication is compatible with a standard CMOS process. The sensor is inseparable from the readout electronics, since both of them are integrated on the same, low-resistivity silicon wafer. The basic pixel configuration comprises only three MOS transistors and a diode collecting the charge through thermal diffusion. The charge is generated in the thin non-depleted epitaxial layer underneath the readout electronics. This approach provides, at low cost, a high-resolution and thin device with the whole area sensitive to radiation. Device simulations using the ISE-TCAD package have been carried out to study the charge collection mechanism. In order to demonstrate the viability of the technique, four prototype chips have been fabricated using different submicrometer CMOS processes. The pixel gain has been calibrated using a 55Fe source and the Poisson sequence method. The prototypes have been exposed to high-energy particle beams at CERN. The tests proved excellent detection performance, expressed in a single-track spatial resolution of 1.5 μm and a detection efficiency close to 100%, resulting from a signal-to-noise ratio (SNR) of more than 30. Irradiation tests showed immunity of MAPS to a level of a few times 10^12 n/cm^2 and a few hundred kRad of ionising radiation. Ideas for future work, including on-pixel signal amplification, double sampling operation and current-mode pixel design, are presented as well. (author)

  16. Fuel reliability experience in Finland

    International Nuclear Information System (INIS)

    Kekkonen, L.

    2015-01-01

    Four nuclear reactors have now operated in Finland for 35-38 years. The two VVER-440 units at the Loviisa Nuclear Power Plant are operated by Fortum, and the two BWRs in Olkiluoto are operated by Teollisuuden Voima Oyj (TVO). The fuel reliability experience of the four reactors currently operating in Finland has been very good, and fuel failure rates have been very low. Systematic inspection of spent fuel assemblies, and especially of all failed assemblies, is a good practice employed in Finland to improve fuel reliability and operational safety. Investigation of the root cause of fuel failures is important in developing ways to prevent similar failures in the future. The operational and fuel reliability experience at the Loviisa Nuclear Power Plant has also been reported earlier in the international seminars on WWER Fuel Performance, Modelling and Experimental Support. In this paper the information on fuel reliability experience at Loviisa NPP is updated, and a short summary of the fuel reliability experience at Olkiluoto NPP is also given. Keywords: VVER-440, fuel reliability, operational experience, poolside inspections, fuel failure identification. (author)

  17. Test-retest reliability and minimal detectable change scores for sit-to-stand-to-sit tests, the six-minute walk test, the one-leg heel-rise test, and handgrip strength in people undergoing hemodialysis.

    Science.gov (United States)

    Segura-Ortí, Eva; Martínez-Olmos, Francisco José

    2011-08-01

    Determining the relative and absolute reliability of outcomes of physical performance tests for people undergoing hemodialysis is necessary to discriminate between the true effects of exercise interventions and the inherent variability of this cohort. The aims of this study were to assess the relative reliability of sit-to-stand-to-sit tests (the STS-10, which measures the time [in seconds] required to complete 10 full stands from a sitting position, and the STS-60, which measures the number of repetitions achieved in 60 seconds), the Six-Minute Walk Test (6MWT), the one-leg heel-rise test, and the handgrip strength test, and to calculate minimal detectable change (MDC) scores in people undergoing hemodialysis. This study was a prospective, nonexperimental investigation. Thirty-nine people undergoing hemodialysis at 2 clinics in Spain were contacted. Study participants performed the STS-10 (n=37), the STS-60 (n=37), and the 6MWT (n=36). At one of the settings, the participants also performed the one-leg heel-rise test (n=21) and the handgrip strength test (n=12) on both the right and the left sides. Participants attended 2 testing sessions 1 to 2 weeks apart. High intraclass correlation coefficients (≥.88) were found for all tests, suggesting good relative reliability. The MDC scores at 90% confidence intervals were as follows: 8.4 seconds for the STS-10, 4 repetitions for the STS-60, 66.3 m for the 6MWT, 3.4 kg for handgrip strength (force-generating capacity), 3.7 repetitions for the one-leg heel-rise test with the right leg, and 5.2 repetitions for the one-leg heel-rise test with the left leg. Limitations: A limited sample of patients was used in this study. The STS-10, STS-60, 6MWT, one-leg heel-rise test, and handgrip strength test are reliable outcome measures. The MDC scores at 90% confidence intervals for these tests will help to determine whether a change is due to error or to an intervention.
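
    The MDC values reported above follow from the standard error of measurement (SEM) and the test-retest ICC. A minimal sketch of the usual calculation is below; the SD and ICC values are hypothetical, not taken from the study.

    ```python
    import math

    def mdc90(sd, icc, z=1.645):
        """Minimal detectable change at the 90% confidence level.

        SEM = SD * sqrt(1 - ICC); MDC90 = z * SEM * sqrt(2),
        where z = 1.645 for a two-sided 90% interval and sqrt(2)
        accounts for the two measurements being compared.
        """
        sem = sd * math.sqrt(1.0 - icc)
        return z * sem * math.sqrt(2.0)

    # Hypothetical example: a walk test with between-subject SD of 120 m and ICC = .90
    example = mdc90(sd=120.0, icc=0.90)
    ```

    A change smaller than the MDC cannot be distinguished from measurement error, which is exactly how the study proposes the reported thresholds be used.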

  18. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  19. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  1. Reliability Characteristics of Power Plants

    Directory of Open Access Journals (Sweden)

    Zbynek Martinek

    2017-01-01

This paper describes the phenomenon of reliability of power plants. It explains the terms connected with this topic, as their proper understanding is important for understanding the relations and equations which model possible real situations. The reliability phenomenon is analysed using both the exponential distribution and the Weibull distribution. The results of our analysis are specific equations giving information about the characteristics of the power plants, the mean time of operation and the probability of failure-free operation. The equations solved for the Weibull distribution take into account the failures as well as the actual operating hours. Thanks to our results, we are able to create a model of dynamic reliability for prediction of future states. It can be useful for improving the current situation of the unit as well as for creating an optimal maintenance plan, and thus have an impact on the overall economics of the operation of these power plants.
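
The two distributions the abstract analyses have simple closed forms for the probability of failure-free operation and the mean time of operation. A minimal sketch (function names are mine; parameter values illustrative):

```python
import math

def reliability_exponential(t, lam):
    """Probability of failure-free operation up to time t, constant failure rate lam."""
    return math.exp(-lam * t)

def reliability_weibull(t, beta, eta):
    """Weibull survival function with shape beta and scale eta (characteristic life)."""
    return math.exp(-((t / eta) ** beta))

def mttf_weibull(beta, eta):
    """Mean time to failure of a Weibull distribution: eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)

# With beta = 1 the Weibull model reduces to the exponential model
# with failure rate 1/eta; beta > 1 models wear-out, beta < 1 infant mortality.
```

A shape parameter fitted from actual operating hours and failure counts is what turns this into the kind of dynamic reliability model the paper describes.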

  2. Reliability of the Fermilab Antiproton Source

    International Nuclear Information System (INIS)

    Harms, E. Jr.

    1993-05-01

    This paper reports on the reliability of the Fermilab Antiproton source since it began operation in 1985. Reliability of the complex as a whole as well as subsystem performance is summarized. Also discussed is the trending done to determine causes of significant machine downtime and actions taken to reduce the incidence of failure. Finally, results of a study to detect previously unidentified reliability limitations are presented

  3. Reliability of Power Electronic Converter Systems

    DEFF Research Database (Denmark)

…for advancing the reliability, availability, system robustness, and maintainability of PECS at different levels of complexity. Drawing on the experience of an international team of experts, this book explores the reliability of PECS covering topics including an introduction to reliability engineering in power electronic converter systems; anomaly detection and remaining-life prediction for power electronics; reliability of DC-link capacitors in power electronic converters; reliability of power electronics packaging; modeling for life-time prediction of power semiconductor modules; minimization of DC-link capacitance in power electronic converter systems; wind turbine systems; smart control strategies for improved reliability of power electronics system; lifetime modelling; power module lifetime test and state monitoring; tools for performance and reliability analysis of power electronics systems; fault…

  4. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Lastly, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  5. Smallest detectable change and test-retest reliability of a self-reported outcome measure: Results of the Center for Epidemiologic Studies Depression Scale, General Self-Efficacy Scale, and 12-item General Health Questionnaire.

    Science.gov (United States)

    Ohno, Shotaro; Takahashi, Kana; Inoue, Aimi; Takada, Koki; Ishihara, Yoshiaki; Tanigawa, Masaru; Hirao, Kazuki

    2017-12-01

This study aims to examine the smallest detectable change (SDC) and test-retest reliability of the Center for Epidemiologic Studies Depression Scale (CES-D), General Self-Efficacy Scale (GSES), and 12-item General Health Questionnaire (GHQ-12). We tested 154 young adults at baseline and 2 weeks later. We calculated the intra-class correlation coefficients (ICCs) for test-retest reliability with a two-way random effects model for agreement. We then calculated the standard error of measurement (SEM) for agreement using the ICC formula. The SEM for agreement was used to calculate SDC values at the individual level (SDCind) and group level (SDCgroup). The study participants included 137 young adults. The ICCs for all self-reported outcome measurement scales exceeded 0.70. The SEM of the CES-D was 3.64, leading to an SDCind of 10.10 points and an SDCgroup of 0.86 points. The SEM of the GSES was 1.56, leading to an SDCind of 4.33 points and an SDCgroup of 0.37 points. The SEM of the GHQ-12 with bimodal scoring was 1.47, leading to an SDCind of 4.06 points and an SDCgroup of 0.35 points. The SEM of the GHQ-12 with Likert scoring was 2.44, leading to an SDCind of 6.76 points and an SDCgroup of 0.58 points. To confirm that a change is not a result of measurement error, a score on these self-reported outcome measurement scales would need to change by an amount greater than these SDC values. This has important implications for clinicians and epidemiologists when assessing outcomes. © 2017 John Wiley & Sons, Ltd.
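
The SDC arithmetic in the abstract follows the standard formulas SDCind = 1.96·√2·SEM and SDCgroup = SDCind/√n. A minimal sketch (function names are mine) reproduces the reported CES-D values:

```python
import math

def sdc_individual(sem):
    # Smallest detectable change at the individual level (95% confidence).
    return 1.96 * math.sqrt(2.0) * sem

def sdc_group(sem, n):
    # Smallest detectable change at the group level: SDCind shrinks with sqrt(n).
    return sdc_individual(sem) / math.sqrt(n)

# Reproducing the CES-D figures reported above (SEM = 3.64, n = 137):
print(round(sdc_individual(3.64), 1))  # ≈ 10.1
print(round(sdc_group(3.64, 137), 2))  # ≈ 0.86
```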

  6. Measurement Error, Reliability, and Minimum Detectable Change in the Mini-Mental State Examination, Montreal Cognitive Assessment, and Color Trails Test among Community Living Middle-Aged and Older Adults.

    Science.gov (United States)

    Feeney, Joanne; Savva, George M; O'Regan, Claire; King-Kallimanis, Bellinda; Cronin, Hilary; Kenny, Rose Anne

    2016-05-31

    Knowing the reliability of cognitive tests, particularly those commonly used in clinical practice, is important in order to interpret the clinical significance of a change in performance or a low score on a single test. To report the intra-class correlation (ICC), standard error of measurement (SEM) and minimum detectable change (MDC) for the Mini-Mental State Examination (MMSE), Montreal Cognitive Assessment (MoCA), and Color Trails Test (CTT) among community dwelling older adults. 130 participants aged 55 and older without severe cognitive impairment underwent two cognitive assessments between two and four months apart. Half the group changed rater between assessments and half changed time of day. Mean (standard deviation) MMSE was 28.1 (2.1) at baseline and 28.4 (2.1) at repeat. Mean (SD) MoCA increased from 24.8 (3.6) to 25.2 (3.6). There was a rater effect on CTT, but not on the MMSE or MoCA. The SEM of the MMSE was 1.0, leading to an MDC (based on a 95% confidence interval) of 3 points. The SEM of the MoCA was 1.5, implying an MDC95 of 4 points. MoCA (ICC = 0.81) was more reliable than MMSE (ICC = 0.75), but all tests examined showed substantial within-patient variation. An individual's score would have to change by greater than or equal to 3 points on the MMSE and 4 points on the MoCA for the rater to be confident that the change was not due to measurement error. This has important implications for epidemiologists and clinicians in dementia screening and diagnosis.
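
The reported SEM and MDC values follow from the standard relations SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM. A small sketch (function names are mine) checks the MMSE and MoCA figures against the baseline standard deviations and ICCs given above:

```python
import math

def sem_from_icc(sd, icc):
    # Standard error of measurement from the baseline SD and the test-retest ICC.
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem):
    # Minimum detectable change with 95% confidence.
    return 1.96 * math.sqrt(2.0) * sem

# MMSE: SD 2.1, ICC 0.75 -> SEM ≈ 1.0, MDC95 ≈ 2.9, i.e. 3 points
# MoCA: SD 3.6, ICC 0.81 -> SEM ≈ 1.6, MDC95 ≈ 4.3, i.e. 4 points
```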

  7. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.

  8. Reliability data bases: the current picture

    International Nuclear Information System (INIS)

    Fragola, J.R.

    1985-01-01

The paper addresses specific advances in nuclear power plant reliability data base development, gives a critical review of a select set of relevant data bases, and suggests future data development needs required for risk assessment techniques to reach full potential

  9. Reliability testing of failed fuel location system

    International Nuclear Information System (INIS)

    Vieru, G.

    1996-01-01

This paper presents the experimental reliability tests performed to prove the reliability parameters of the Failed Fuel Location System (FFLS), equipment used to detect in which channel of a particular heat transport loop a fuel failure is located, and to find which particular bundle pair in that channel has failed. To do so, D2O samples from each reactor channel are sequentially monitored to detect a comparatively high level of delayed neutron activity. 15 refs, 8 figs, 2 tabs

  10. Human- and computer-accessible 2D correlation data for a more reliable structure determination of organic compounds. Future roles of researchers, software developers, spectrometer managers, journal editors, reviewers, publisher and database managers toward artificial-intelligence analysis of NMR spectra.

    Science.gov (United States)

    Jeannerat, Damien

    2017-01-01

The introduction of a universal data format to report the correlation data of 2D NMR spectra such as COSY, HSQC and HMBC spectra will have a large impact on the reliability of structure determination of small organic molecules. These lists of assigned cross peaks will bridge the signals found in 1D and 2D NMR spectra and the assigned chemical structure. The record can be very compact, and human and computer readable, so that it can be included in the supplementary material of publications and easily transferred into databases of scientific literature and chemical compounds. The records will allow authors, reviewers and future users to test the consistency and, in favorable situations, the uniqueness of the assignment of the correlation data to the associated chemical structures. Ideally, the data format of the correlation data should include direct links to the NMR spectra to make it possible to validate their reliability and allow direct comparison of spectra. In order to reap their full benefits, the correlation data and the NMR spectra should therefore accompany any manuscript through the review process and be stored in an open-access database after publication. Keeping all NMR spectra, correlation data and assigned structures together at all times will allow the future development of validation tools, increasing the reliability of past and future NMR data. This will facilitate the development of artificial-intelligence analysis of NMR spectra by providing a source of data that can be used efficiently, because the data have been validated or can be validated by future users. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  12. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  13. Design and reliability analysis of a novel laser acupuncture device

    Science.gov (United States)

    Pan, Boan; Zhong, Fulin; Zhao, Ke; Li, Ting

    2018-02-01

Acupuncture has a long history of more than 2000 years in China. However, traditional acupuncture uses metallic needles, which may cause discomfort and pricking pain to patients. Laser acupuncture (LA) is a non-invasive and painless way to achieve some of the same therapeutic effects, and compared to traditional acupuncture, LA is free from infection risk. Taking these advantages of LA into consideration, we developed a novel portable laser acupuncture device combining a therapy part and a detection part. The therapy part delivers laser light at a wavelength of 650 nm onto specific acupoints of patients. The detection part comprises an integrated light-emitting diode (LED, 735/805/850 nm) and a photodiode (OPT101), and is used to collect data for the calculation of hemodynamic parameters based on near-infrared spectroscopy (NIRS). In this work, we carried out a current-power test of the sensitivity of the therapy part, and we also conducted a liquid-model optical experiment and an arm blocking test of the sensitivity and effectiveness of the detection part. The final results demonstrated the great potential and reliability of the novel laser acupuncture device. In the future, we will apply this device in clinical applications to verify its effectiveness and improve its reliability for the treatment of more diseases.

  14. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  15. Structural Reliability Analysis of Wind Turbines: A Review

    Directory of Open Access Journals (Sweden)

    Zhiyu Jiang

    2017-12-01

The paper presents a detailed review of the state-of-the-art research activities on structural reliability analysis of wind turbines between the 1990s and 2017. We describe the reliability methods, including the first- and second-order reliability methods and the simulation reliability methods, and show the procedure for and application areas of structural reliability analysis of wind turbines. Further, we critically review the various structural reliability studies on rotor blades, bottom-fixed support structures, floating systems and mechanical and electrical components. Finally, future applications of structural reliability methods to wind turbine designs are discussed.

  16. Developing Reliable Life Support for Mars

    Science.gov (United States)

    Jones, Harry W.

    2017-01-01

A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and
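
The link the abstract draws between failure rate and spare count can be sketched with a Poisson model: if failures arrive at rate λ and s spares are carried, the mission survives when no more than s replacements are demanded. This is an illustrative sketch of my own, not the paper's method, and all parameter values are made up:

```python
import math

def p_enough_spares(lam, t, spares):
    """Probability that a Poisson failure process with rate lam over mission
    time t demands no more replacements than the spares carried."""
    mean = lam * t
    return sum(math.exp(-mean) * mean**k / math.factorial(k)
               for k in range(spares + 1))

def spares_needed(lam, t, goal):
    """Smallest spare count meeting a reliability goal; it grows quickly
    if the failure rate lam was underestimated."""
    s = 0
    while p_enough_spares(lam, t, s) < goal:
        s += 1
    return s
```

For example, a component with an assumed 0.001 failures/hour over a 1000-hour mission needs 4 spares to reach 99% confidence; if the true rate is double the estimate, 4 spares no longer suffice, which is the underestimation risk the abstract warns about.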

  17. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown.

  18. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  19. Application of solvent-assisted dispersive solid phase extraction as a new, fast, simple and reliable preconcentration and trace detection of lead and cadmium ions in fruit and water samples.

    Science.gov (United States)

    Behbahani, Mohammad; Ghareh Hassanlou, Parmoon; Amini, Mostafa M; Omidi, Fariborz; Esrafili, Ali; Farzadkia, Mehdi; Bagheri, Akbar

    2015-11-15

In this research, a new sample treatment technique termed solvent-assisted dispersive solid phase extraction (SA-DSPE) was developed. The new method is based on the dispersion of the sorbent into the sample to maximize the contact surface. In this approach, dispersion of the sorbent at a very low milligram level was achieved by injecting a mixture of the sorbent and a disperser solvent into the aqueous sample, whereby a cloudy solution formed. The cloudy solution resulted from the dispersion of fine particles of the sorbent in the bulk aqueous sample. After extraction, the cloudy solution was centrifuged, and the analytes enriched in the sediment phase were dissolved in ethanol and determined by flame atomic absorption spectrophotometry. Under the optimized conditions, the detection limit for lead and cadmium ions was 1.2 μg L⁻¹ and 0.2 μg L⁻¹, respectively. Furthermore, the preconcentration factor was 299.3 and 137.1 for cadmium and lead ions, respectively. SA-DSPE was successfully applied for trace determination of lead and cadmium in fruit (Citrus limetta, kiwi and pomegranate) and water samples. The introduced sample preparation method can thus be used as a simple, rapid, reliable, selective and sensitive method for flame atomic absorption spectrophotometric determination of trace levels of lead and cadmium ions in fruit and water samples. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Reliability of chromogenic in situ hybridization for epidermal growth factor receptor gene copy number detection in non-small-cell lung carcinomas: a comparison with fluorescence in situ hybridization study.

    Science.gov (United States)

    Yoo, Seol Bong; Lee, Hyun Ju; Park, Jung Ok; Choe, Gheeyoung; Chung, Doo Hyun; Seo, Jeong-Wook; Chung, Jin-Haeng

    2010-03-01

Fluorescence in situ hybridization (FISH) is regarded as the most representative and standardized test for assessing gene amplification. However, FISH requires a fluorescence microscope, and the signals are labile and fade rapidly over time. Recently, chromogenic in situ hybridization (CISH) has emerged as a potential alternative to FISH. The aim of this study was to test the reliability of the CISH technique for the detection of epidermal growth factor receptor (EGFR) gene amplification in non-small-cell lung carcinomas (NSCLC), comparing CISH results with FISH. A total of 277 formalin-fixed and paraffin-embedded NSCLC tissue samples were retrieved from the surgical pathology archives at Seoul National University Bundang Hospital. CISH and FISH examinations were performed to test EGFR gene amplification status. There was high concordance in the assessment of EGFR gene copy number between the CISH and FISH tests (Kappa coefficient = 0.83), and excellent concordance between two observers on the interpretation of the CISH results (Kappa coefficient = 0.90). In conclusion, CISH is a highly reproducible, accurate and practical method to determine EGFR gene amplification in NSCLC. In addition, CISH allows concurrent analysis of the histological features of the tumors and gene copy numbers.
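
The kappa coefficients quoted above measure agreement beyond chance, κ = (p_o − p_e)/(1 − p_e). A minimal sketch (my own code; the 2×2 table below is hypothetical, not the study's actual data):

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table
    (rows: method A categories, columns: method B categories)."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n              # observed agreement
    pe = sum((sum(table[i]) / n) *                           # chance agreement
             (sum(table[j][i] for j in range(k)) / n)
             for i in range(k))
    return (po - pe) / (1.0 - pe)

# Hypothetical CISH-vs-FISH table over 277 cases (amplified / not amplified):
print(round(cohens_kappa([[40, 5], [4, 228]]), 2))  # ≈ 0.88, substantial agreement
```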

  1. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  2. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

The human factor reliability program was introduced at the Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance initiatives in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to 3 major areas of improvement: need for improvement of the results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program included: - tools to prevent human error; - managerial observation and coaching; - human factor analysis; - quick information about events involving the human factor; - a human reliability timeline and performance indicators; - basic, periodic and extraordinary training in human factor reliability (authors)

  3. Detection and Characterization of Engineered Nanomaterials in the Environment: Current State-of-the-art and Future Directions Report, Annotated Bibliography, and Image Library

    Science.gov (United States)

    The increasing manufacture and implementation of engineered nanomaterials (ENMs) will continue to lead to the release of these materials into the environment. Reliably assessing the environmental exposure risk of ENMs will depend highly on the ability to quantify and characterize...

  4. Reliability-based condition assessment of steel containment and liners

    International Nuclear Information System (INIS)

    Ellingwood, B.; Bhattacharya, B.; Zheng, R.

    1996-11-01

    Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs
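
The Markov model mentioned at the end of the abstract tracks how the probability mass over damage states evolves between inspections. A minimal sketch (states and transition probabilities are illustrative, not from the report):

```python
def step(state_probs, transition):
    # One inspection interval: propagate the damage-state distribution
    # through the transition matrix (rows: from-state, columns: to-state).
    n = len(state_probs)
    return [sum(state_probs[i] * transition[i][j] for i in range(n))
            for j in range(n)]

# Three hypothetical damage states for a liner: intact -> pitted -> through-wall.
P = [
    [0.95, 0.05, 0.00],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],  # through-wall damage is absorbing
]

probs = [1.0, 0.0, 0.0]  # start fully intact
for _ in range(10):      # ten inspection intervals
    probs = step(probs, P)
```

Under these made-up rates the through-wall probability grows to roughly 15% after ten intervals; in an actual condition assessment the transition rates would be calibrated from inspection and nondestructive-evaluation data.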

  5. Finite element reliability analysis of fatigue life

    International Nuclear Information System (INIS)

    Harkness, H.H.; Belytschko, T.; Liu, W.K.

    1992-01-01

    Fatigue reliability is addressed by the first-order reliability method combined with a finite element method. Two-dimensional finite element models of components with cracks in mode I are considered with crack growth treated by the Paris law. Probability density functions of the variables affecting fatigue are proposed to reflect a setting where nondestructive evaluation is used, and the Rosenblatt transformation is employed to treat non-Gaussian random variables. Comparisons of the first-order reliability results and Monte Carlo simulations suggest that the accuracy of the first-order reliability method is quite good in this setting. Results show that the upper portion of the initial crack length probability density function is crucial to reliability, which suggests that if nondestructive evaluation is used, the probability of detection curve plays a key role in reliability. (orig.)
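
The Paris-law growth model in mode I admits a closed-form cycle count between an initial and a critical crack size, which makes the Monte Carlo comparison the abstract mentions easy to sketch. This is an illustrative sketch, not the paper's first-order reliability implementation, and every parameter value below is made up:

```python
import math
import random

def cycles_to_failure(a0, ac, C, m, dsigma, Y=1.0):
    """Closed-form integration of the Paris law da/dN = C*(Y*dsigma*sqrt(pi*a))^m
    from initial crack size a0 to critical size ac (valid for m != 2)."""
    p = 1.0 - m / 2.0
    k = C * (Y * dsigma * math.sqrt(math.pi)) ** m
    return (ac**p - a0**p) / (k * p)

def failure_probability(n_design, samples=20000, seed=1):
    """Crude Monte Carlo: fraction of sampled initial crack sizes whose
    fatigue life falls short of the design cycle count n_design."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(samples):
        # Initial crack size ~ lognormal (metres); in the paper's setting this
        # density would reflect the probability-of-detection curve of NDE.
        a0 = rng.lognormvariate(math.log(1e-3), 0.3)
        if cycles_to_failure(a0, 0.02, 1e-11, 3.0, 100.0) < n_design:
            fails += 1
    return fails / samples
```

Sampling the upper tail of the initial-crack-length density dominates the estimate, which mirrors the paper's finding that the probability-of-detection curve plays a key role in reliability.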

  6. French power system reliability report 2008

    International Nuclear Information System (INIS)

    Tesseron, J.M.

    2009-06-01

The reliability of the French power system was fully under control in 2008, despite the power outage in the eastern part of the Provence-Alpes-Cote d'Azur region on November 3, which had been dreaded for several years, since it had not been possible to build a structurally adequate network. Following a consultation meeting, the reinforcement solution proposed by RTE was approved by the Minister of Energy, boding well for greater reliability in the future. Based on the observations presented in this 2008 report, RTE's Power System Reliability Audit Mission considers that no new recommendations are needed beyond those expressed in previous reliability reports and during reliability audits. The publication of this yearly report is in keeping with RTE's goal of tracking the evolution of reliability in its various aspects over time. RTE thus aims to contribute to the development of a reliability culture, by encouraging the different players (both RTE and network users) to better assess the role they play in building reliability, and by advocating that reliability and benchmarking be taken into account in the European organisations of Transmission System Operators. Contents: 1 - Brief overview of the evolution of the internal and external environment; 2 - Operating situations encountered: climatic conditions, supply/demand balance management, operation of interconnections, management of internal congestion, contingencies affecting the transmission facilities; 3 - Evolution of the reliability reference guide: external reference guide (directives, laws, decrees, etc.), ETSO, UCTE, ENTSO-E, contracting contributing to reliability, RTE internal reference guide; 4 - Evolution of measures contributing to reliability in the equipment field: intrinsic performances of components (generating sets, protection systems, operation PLC's, instrumentation and control, automatic frequency and voltage controls, transmission facilities, control systems, load

  7. The Future of Telecommuting

    OpenAIRE

    Handy, Susan; Mokhtarian, Patricia

    1996-01-01

    Interest in telecommuting is growing among workers, employers, transportation planners, communities, the telecommunications industry, and others. But actual levels of telecommuting appear to be increasing slowly, although there is little reliable data on trends. The future of telecommuting depends on whether employers provide the opportunity to telecommute and whether workers take advantage of this opportunity; government policies can encourage both. This article addresses that future by outl...

  8. Specialists' meeting on sodium boiling noise detection. Summary report

    International Nuclear Information System (INIS)

    1982-01-01

    The purpose of the meeting was to review and discuss methods available for detection of the initial stage of accidents in fast reactors, with most attention on reliable detection by acoustic techniques, which could provide a valuable addition to the safety protection. Results obtained from reactor experiments were also discussed and recommendations made for future developments. The meeting was divided into five technical sessions as follows: Signals from sodium boiling; Transmission of acoustic waves and background noise; Detection techniques; Reactor experiments; and Future requirements

  10. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  11. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject within the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach

  12. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques were developed in response to the needs of the various engineering disciplines, although many would argue that a great deal of reliability work was done before the word itself came into its current use. The military, space, and nuclear industries were the first to become involved in the topic, but this quiet revolution in improving the reliability of products has not remained confined to those environments; it has spread to industry as a whole. Four decades ago, the mass production characteristic of modern industry led to a fall in the reliability of its products, on the one hand because of mass production itself and, on the other, because of newly introduced and not yet stabilized industrial techniques. Industry had to adapt to these two new requirements, creating products of medium complexity while assuring a level of reliability appropriate to production costs and controls. Reliability thus became an integral part of the manufactured product. With this philosophy, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, giving a unifying scientific basis for the entire subject. It consists of eight chapters plus numerous statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience, as a textbook for courses ranging from academic to industrial. (author)

  13. Sample-to-SNP kit: a reliable, easy and fast tool for the detection of HFE p.H63D and p.C282Y variations associated to hereditary hemochromatosis.

    Science.gov (United States)

    Nielsen, Peter B; Petersen, Maja S; Ystaas, Viviana; Andersen, Rolf V; Hansen, Karin M; Blaabjerg, Vibeke; Refstrup, Mette

    2012-10-01

    Classical hereditary hemochromatosis involves the HFE gene and diagnostic analysis of the DNA variants HFE p.C282Y (c.845G>A; rs1800562) and HFE p.H63D (c.187C>G; rs1799945). The affected protein alters iron homeostasis, resulting in iron overload in various tissues. The aim of this study was to validate the TaqMan-based Sample-to-SNP protocol for the analysis of the HFE p.C282Y and p.H63D variants with regard to accuracy, usefulness and reproducibility compared to an existing SNP protocol. The Sample-to-SNP protocol uses an approach in which the DNA template is made accessible from a cell lysate, followed by TaqMan analysis. Besides the HFE SNPs, eight other SNPs were used as well: coagulation factor II gene F2 c.20210G>A; coagulation factor V gene F5 p.R506Q (c.1517G>A; rs121917732); mitochondrial SNP mt7028 G>A; mitochondrial SNP mt12308 A>G; proprotein convertase subtilisin/kexin type 9 gene PCSK9 p.R46L (c.137G>T); glutathione S-transferase pi 1 gene GSTP1 p.I105V (c.313A>G; rs1695); LXR g.-171 A>G; ZNF202 g.-118 G>T. In conclusion, the Sample-to-SNP kit proved to be an accurate, reliable, robust, easy-to-use and rapid TaqMan-based SNP detection protocol, which could be quickly implemented in a routine diagnostic or research facility. Copyright © 2012. Published by Elsevier B.V.

  14. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime at the plant

  15. Design for Reliability in Renewable Energy Systems

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Zhou, Dao; Sangwongwanich, Ariya

    2017-01-01

    Power electronics are widely used in renewable energy systems to achieve lower cost of energy, higher efficiency and higher power density. At the same time, high reliability of power electronics products is demanded in order to reduce failure rates and ensure cost-effective operation...... of the renewable energy systems. This paper thus describes the basic concepts used in reliability engineering, and presents the status and future trends of Design for Reliability (DfR) in power electronics, which is currently undergoing a paradigm shift to a physics-of-failure approach. Two case studies of a 2 MW...

  16. Dental hygiene faculty calibration in the evaluation of calculus detection.

    Science.gov (United States)

    Garland, Kandis V; Newell, Kathleen J

    2009-03-01

    The purpose of this pilot study was to explore the impact of faculty calibration training on intra- and interrater reliability regarding calculus detection. After IRB approval, twelve dental hygiene faculty members were recruited from a pool of twenty-two for voluntary participation and randomized into two groups. All subjects provided two pre- and two posttest scorings of calculus deposits on each of three typodonts by recording yes or no indicating if they detected calculus. Accuracy and consistency of calculus detection were evaluated using an answer key. The experimental group received three two-hour training sessions to practice a prescribed exploring sequence and technique for calculus detection. Participants immediately corrected their answers, received feedback from the trainer, and reconciled missed areas. Intra- and interrater reliability (pre- and posttest) was determined using Cohen's Kappa and compared between groups using repeated measures (split-plot) ANOVA. The groups did not differ from pre- to posttraining (intrarater reliability p=0.64; interrater reliability p=0.20). Training had no effect on reliability levels for simulated calculus detection in this study. Recommendations for future studies of faculty calibration when evaluating students include using patients for assessing rater reliability, employing larger samples at multiple sites, and assessing the impact on students' attitudes and learning outcomes.

  17. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  18. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  19. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  20. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  1. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  2. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  3. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  4. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
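    As a concrete illustration of the Bayesian approach to failure-rate estimation (a textbook gamma-Poisson conjugate update with made-up numbers, not an example taken from the proceedings): with a Gamma(a, b) prior on a failure rate λ and n failures observed over exposure time T, the posterior is Gamma(a + n, b + T).

```python
# Hypothetical numbers for illustration: prior Gamma(a=2, b=1000 h),
# then n=3 failures observed over T=5000 h of operation.
a, b = 2.0, 1000.0
n, T = 3, 5000.0

# Conjugate update: Gamma prior + Poisson likelihood -> Gamma posterior.
post_a, post_b = a + n, b + T
posterior_mean = post_a / post_b  # point estimate of the failure rate (per hour)
print(posterior_mean)
```

    The posterior mean, (a + n) / (b + T), shows how the prior is progressively swamped by operating data as T grows.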

  5. Design for reliability: NASA reliability preferred practices for design and test

    Science.gov (United States)

    Lalli, Vincent R.

    1994-01-01

    This tutorial summarizes reliability experience from both NASA and industry and reflects engineering practices that support current and future civil space programs. These practices were collected from various NASA field centers and were reviewed by a committee of senior technical representatives from the participating centers (members are listed at the end). The material for this tutorial was taken from the publication issued by the NASA Reliability and Maintainability Steering Committee (NASA Reliability Preferred Practices for Design and Test. NASA TM-4322, 1991). Reliability must be an integral part of the systems engineering process. Although both disciplines must be weighed equally with other technical and programmatic demands, the application of sound reliability principles will be the key to the effectiveness and affordability of America's space program. Our space programs have shown that reliability efforts must focus on the design characteristics that affect the frequency of failure. Herein, we emphasize that these identified design characteristics must be controlled by applying conservative engineering principles.

  6. A reliability program approach to operational safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques is being formulated for potential application in the nuclear power industry. Methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed and a review of current nuclear risk-dominant issues conducted. The need for a reliability approach to address dependent system failures, operating and emergency procedures and human performance, and develop a plant-specific performance data base for safety decision making is demonstrated. Current research has concentrated on developing a Reliability Program approach for the operating phase of a nuclear plant's lifecycle. The approach incorporates performance monitoring and evaluation activities with dedicated tasks that integrate these activities with operation, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the Reliability Program. (orig./HP)

  7. Space Vehicle Reliability Modeling in DIORAMA

    Energy Technology Data Exchange (ETDEWEB)

    Tornga, Shawn Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-12

    When modeling the system performance of space-based detection systems, it is important to consider spacecraft reliability. As space vehicles age, their components become prone to failure for a variety of reasons, such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust their fuel supplies. Typically, failure is divided into two categories: engineering mistakes and technology surprise. This document reports on a method of simulating space vehicle reliability in the DIORAMA framework.

  8. TFTR CAMAC power supplies reliability

    International Nuclear Information System (INIS)

    Camp, R.A.; Bergin, W.

    1989-01-01

    Since the expected life of the Tokamak Fusion Test Reactor (TFTR) has been extended into the early 1990's, the issues of equipment wear-out, when to refurbish/replace, and the costs associated with these decisions, must be faced. The management of the maintenance of the TFTR Central Instrumentation, Control and Data Acquisition System (CICADA) power supplies within the CAMAC network is a case study of a set of systems to monitor repairable systems reliability, costs, and results of action. The CAMAC network is composed of approximately 500 racks, each with its own power supply. By using a simple reliability estimator on a coarse time interval, in conjunction with determining the root cause of individual failures, a cost effective repair and maintenance program has been realized. This paper describes the estimator, some of the specific causes for recurring failures and their correction, and the subsequent effects on the reliability estimator. By extension of this program the authors can assess the continued viability of CAMAC power supplies into the future, predicting wear-out and developing cost effective refurbishment/replacement policies. 4 refs., 3 figs., 1 tab
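    The abstract does not spell out the "simple reliability estimator on a coarse time interval"; a common choice for a fleet of repairable units is the observed MTBF per interval, sketched here under that assumption (the fleet size and failure count below are illustrative, not TFTR data):

```python
def mtbf(operating_hours, failures):
    """Observed mean time between failures over a coarse observation interval."""
    if failures == 0:
        return float("inf")  # no failures observed in the interval
    return operating_hours / failures

# Hypothetical example: ~500 supplies running 720 h in a month,
# with 12 recorded failures across the fleet.
fleet_hours = 500 * 720
print(mtbf(fleet_hours, 12))  # 30000.0
```

    Tracking this estimator interval by interval, alongside root-cause analysis of individual failures, is enough to spot recurring failure modes and the onset of wear-out.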

  9. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

    One can also speak of reliability with respect to materials. While for reliability of components the MTBF (mean time between failures) is regarded as the main criterium, this is replaced with regard to materials by possible failure mechanisms like physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of system. The main tasks of the reliability analysis of materials therefore is the prediction of the various failure reasons, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  10. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...... of the uncertainties and their interplay is then developed, step-by-step. The concepts presented are illustrated by numerous examples throughout the text....

  11. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

    Many results in mechanical design are obtained from a model of physical reality and from a numerical solution leading to the evaluation of needs and resources. The goal of the reliability analysis is to evaluate the confidence that can be granted to the chosen design through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS)

  12. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Based on the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields the system reliability matrix, and the reliability of the network system can be deduced by integrating all the reliability indexes in the matrix. Using this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. With this tool, we analyzed several cases on typical architectures. The analytic results indicate that a redundant architecture has better reliability performance than a basic one; in practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
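    The series-versus-redundancy comparison in the abstract can be illustrated with a small sketch (a generic independent-links model with assumed link reliabilities, not the VC tool described in the paper):

```python
def series(reliabilities):
    """A task routed over links in series succeeds only if every link works."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel(reliabilities):
    """A redundant (parallel) path fails only if every branch fails."""
    q = 1.0
    for ri in reliabilities:
        q *= (1.0 - ri)
    return 1.0 - q

# Basic architecture: three links in series, each with assumed reliability 0.99.
basic = series([0.99, 0.99, 0.99])
# Redundant architecture: the same chain duplicated as a hot spare.
redundant = parallel([basic, basic])
print(round(basic, 6), round(redundant, 6))  # redundancy raises task reliability
```

    Repeating this per task and collecting the results is one way to read the paper's "reliability matrix": one reliability index per task, integrated into a system-level figure.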

  13. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

    Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches and techniques for such assessments, outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on an understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions

  14. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e., every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run, it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple Web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to obtain complete rating data. I would welcome other researchers revising and enhancing the program.
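    The Spearman-Brown prophecy formula mentioned in the abstract is simple enough to sketch (an illustrative stand-alone function, not the PHP utility itself): the reliability of the mean of k ratings follows from the single-rating reliability r.

```python
def spearman_brown(r, k):
    """Predicted reliability of the average of k ratings,
    given single-rating reliability r (Spearman-Brown prophecy formula)."""
    return k * r / (1.0 + (k - 1) * r)

# A single judge with reliability 0.60; averaging over 4 judges:
print(round(spearman_brown(0.60, 4), 3))  # 0.857
```

    The formula shows why adding judges raises reliability with diminishing returns: the denominator grows almost as fast as the numerator once r is moderate.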

  15. Structural systems reliability analysis

    International Nuclear Information System (INIS)

    Frangopol, D.

    1975-01-01

    For an exact evaluation of the reliability of a structure it is necessary to determine the distribution densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long period of observation. Where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP) [de

  16. Reliability and maintainability

    International Nuclear Information System (INIS)

    1994-01-01

    Several communications in this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by Diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by the means of functional analysis; operation characteristic analysis for a power industry plant park, as a function of influence parameters

  17. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective for the report is to improve failure data for reliability calculations as parts of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as information provided by the operation and maintenance staff of each plant. In the report are presented charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  18. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period
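    The adaptive Monte Carlo procedure itself is not given in the abstract; a crude sketch of a time-dependent reliability simulation under an assumed linear strength-degradation model might look like this (all distributions and parameters are illustrative assumptions, not the report's):

```python
import random

def failure_probability(n_years=40, n_samples=20_000, seed=1):
    """Crude Monte Carlo estimate of the probability that a degrading
    component fails at least once under annual extreme loads."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r0 = rng.gauss(100.0, 10.0)       # initial strength (assumed normal)
        rate = rng.uniform(0.0, 0.5)      # strength loss per year (assumed)
        for t in range(1, n_years + 1):
            load = rng.gauss(50.0, 15.0)  # annual extreme load (assumed)
            if load > r0 - rate * t:      # linear strength degradation
                failures += 1
                break                     # first failure ends this history
    return failures / n_samples

p = failure_probability()
print(p)
```

    Even this naive version exhibits the sensitivities the report describes: the answer moves with the load distribution and the degradation model, while correlation between component strengths matters far less.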

  19. From reliability problems to nuclear safety problems

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.

    2003-01-01

    The article is devoted to the 10th anniversary of the Kharkov Department (KhD) of SSTC NRS and reviews the history of its creation (earlier work by KhD scientists on the reliability of automated process control systems), the basic results of KhD activities, and its future trends

  20. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  1. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance, and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; the reliability of non-repairable and repairable systems; reliability sampling tests; failure analysis, such as analysis by FMEA and FTA, with cases; accelerated life testing, including basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies for replacement and inspection.

  2. The reliability of commonly used electrophysiology measures.

    Science.gov (United States)

    Brown, K E; Lohse, K R; Mayer, I M S; Strigaro, G; Desikan, M; Casula, E P; Meunier, S; Popa, T; Lamy, J-C; Odish, O; Leavitt, B R; Durr, A; Roos, R A C; Tabrizi, S J; Rothwell, J C; Boyd, L A; Orth, M

    Electrophysiological measures can help us understand brain function both in healthy individuals and in the context of a disease. Given the amount of information that can be extracted from these measures and their frequent use, it is essential to know more about their inherent reliability. Our objective was to understand the reliability of electrophysiology measures in healthy individuals; we hypothesized that measures of threshold and latency would be the most reliable and least susceptible to methodological differences between study sites. Somatosensory evoked potentials from 112 control participants, together with long-latency reflexes, transcranial magnetic stimulation with resting and active motor thresholds, motor evoked potential latencies, input/output curves, and short-latency sensory afferent inhibition and facilitation from 84 controls, were collected at 3 visits over 24 months at 4 Track-On HD study sites. Reliability was assessed using intra-class correlation coefficients for absolute agreement, and the effects of reliability on statistical power are demonstrated for different sample sizes and study designs. Measures quantifying latencies, thresholds, and evoked responses at high stimulator intensities had the highest reliability and required the smallest sample sizes to adequately power a study. Very few between-site differences were detected. Levels of reliability vary substantially across electrophysiological measures, though there are few between-site differences. To address this, reliability should be evaluated before electrophysiological measures are included in study designs, and used in conjunction with theoretical calculations to inform sample size and ensure studies are adequately powered to detect true change in the measures of interest.
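    The intra-class correlation coefficient for absolute agreement used in the record above can be computed from two-way ANOVA mean squares. A minimal sketch, assuming the single-measurement, two-way random-effects, absolute-agreement form ICC(A,1); the score matrix in the usage line is invented for illustration:

    ```python
    import numpy as np

    def icc_a1(x):
        """ICC(A,1): two-way random effects, absolute agreement, single rater.

        x is an (n_subjects, k_raters) score matrix.
        """
        n, k = x.shape
        grand = x.mean()
        row_means = x.mean(axis=1)   # per-subject means
        col_means = x.mean(axis=0)   # per-rater means
        ssr = k * ((row_means - grand) ** 2).sum()   # between-subjects
        ssc = n * ((col_means - grand) ** 2).sum()   # between-raters
        sst = ((x - grand) ** 2).sum()
        sse = sst - ssr - ssc                        # residual
        msr = ssr / (n - 1)
        msc = ssc / (k - 1)
        mse = sse / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Two raters in perfect agreement over three subjects -> ICC = 1.0
    perfect = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
    ```

    With noisy or disagreeing ratings the same formula drops below 1, which is what drives the sample-size calculations the abstract refers to.
    
    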

  3. Energy futures

    International Nuclear Information System (INIS)

    Treat, J.E.

    1990-01-01

    This updated text brings together fifteen of the futures industry's leading authorities to provide a broader background in both the theory and practice of energy futures trading. The authors review the history of the futures market and the fundamentals of trading, hedging, and technical analysis; then they cover the newest trends in energy futures trading: natural gas futures, options, regulations, and new information services. The appendices outline examples of possible contracts and their construction

  4. Exploration of reliability databases and comparison of former IFMIF's results

    International Nuclear Information System (INIS)

    Tapia, Carlos; Dies, Javier; Abal, Javier; Ibarra, Angel; Arroyo, Jose M.

    2011-01-01

    There is uncertainty about the applicability of industrial databases to new designs, such as the International Fusion Materials Irradiation Facility (IFMIF), as they usually contain elements for which no historical statistics exist. The exploration of reliability data for common components in the Accelerator Driven Systems (ADS) and Liquid Metal Technologies (LMT) frameworks is the milestone for analyzing the data used in the IFMIF reliability reports and for future studies. The comparison between the accelerator reliability results given in the former IFMIF reports and the databases explored has been made by means of a new accelerator Reliability, Availability, Maintainability (RAM) analysis. The reliability database used in this analysis is traceable.

  5. Futuring for Future Ready Librarians

    Science.gov (United States)

    Figueroa, Miguel A.

    2018-01-01

    Futurists and foresight professionals offer several guiding principles for thinking about the future. These principles can help people to think about the future and become more powerful players in shaping the preferred futures they want for themselves and their communities. The principles also fit in well as strategies to support the Future Ready…

  6. Safety and reliability criteria

    International Nuclear Information System (INIS)

    O'Neil, R.

    1978-01-01

    Nuclear power plants and, in particular, reactor pressure boundary components have unique reliability requirements, in that usually no significant redundancy is possible, and a single failure can give rise to possible widespread core damage and fission product release. Reliability may be required for availability or safety reasons, but in the case of the pressure boundary and certain other systems safety may dominate. Possible Safety and Reliability (S and R) criteria are proposed which would produce acceptable reactor design. Without some S and R requirement the designer has no way of knowing how far he must go in analysing his system or component, or whether his proposed solution is likely to gain acceptance. The paper shows how reliability targets for given components and systems can be individually considered against the derived S and R criteria at the design and construction stage. Since in the case of nuclear pressure boundary components there is often very little direct experience on which to base reliability studies, relevant non-nuclear experience is examined. (author)

  7. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations which were involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reenforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CER's and sometimes possible CTR's, in devising a suitable cost-effective policy.

  8. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  9. Improving the safety and reliability of Monju

    International Nuclear Information System (INIS)

    Itou, Kazumoto; Maeda, Hiroshi; Moriyama, Masatoshi

    1998-01-01

    A comprehensive safety review has been performed at Monju to determine why the Monju secondary sodium leakage accident occurred, and we investigated how to improve the situation based on the results of that review. The safety review focused on five aspects of whether the facilities for dealing with a sodium leakage accident were adequate: the reliability of the leak detection method, the reliability of the methods for preventing the spread of a sodium leak, whether the documented operating procedures were adequate, whether the quality assurance system, program, and actions were properly carried out, and so on. As a result, we established better methods for Monju of dealing with sodium leakage accidents: rapid detection of sodium leaks, improved sodium drain facilities, and ways to reduce damage to Monju systems after an accident. We also improved the operating procedures and quality assurance actions to increase the safety and reliability of Monju. (author)

  10. Reliability issues in PACS

    Science.gov (United States)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on the several classes of errors encountered during the pre-clinical release of the PACS during the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitor routines of critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  11. Reliability Parts Derating Guidelines

    Science.gov (United States)

    1982-06-01

    "Reliability of GaAs Injection Lasers," De Loach, B. C., Jr., 1973 IEEE/OSA Conference on Laser Engineering and Applications; IEEE Transactions on Reliability, Vol. R-23, No. 4, pp. 226-30, October 1974. Operation at … deg C, mounted on a 4-inch square, 0.250-inch-thick alloy aluminum panel; this mounting technique should be taken into consideration

  12. Future accelerators (?)

    Energy Technology Data Exchange (ETDEWEB)

    John Womersley

    2003-08-21

    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  13. Reliability studies in research reactors

    International Nuclear Information System (INIS)

    Albuquerque, Tob Rodrigues de

    2013-01-01

    Fault trees and event trees are widely used in industry to model and evaluate the reliability of safety systems. Detailed analyses of nuclear installations require the combination of these two techniques. This study uses the fault tree (FT) and event tree (ET) methods to perform a probabilistic safety assessment (PSA) of research reactors. According to the IAEA (International Atomic Energy Agency), PSA is divided into Levels 1, 2, and 3. At Level 1, conceptually, the safety systems act to prevent the occurrence of accidents; Level 2, once an accident has happened, seeks to minimize its consequences, known as the accident management stage; and at Level 3 the accident impacts are determined. This study focuses on the Level 1 analysis and seeks, through the acquisition of knowledge, to consolidate methodologies for future reliability studies. The Greek Research Reactor, GRR-1, is used as a case example. A loss of coolant accident (LOCA) was chosen as the initiating event, and from it, using ETs, possible accident sequences that could lead to damage of the core were developed. Moreover, for each affected system, the top-event probabilities of the FTs were developed and evaluated within the possible accident sequences. Estimates of importance measures for the basic events are also presented in this work. The studies were conducted using the commercial computational tool SAPHIRE. The results thus achieved were considered satisfactory with respect to the performance or failure of the analyzed systems. (author)
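    For independent basic events, the fault-tree arithmetic underlying a Level 1 PSA reduces to simple gate formulas. A minimal sketch; the event probabilities below are invented for illustration and are not taken from the GRR-1 study:

    ```python
    import math

    def p_or(*probs):
        """OR gate: the output event occurs if any independent input occurs."""
        q = 1.0
        for p in probs:
            q *= (1.0 - p)
        return 1.0 - q

    def p_and(*probs):
        """AND gate: the output event occurs only if all independent inputs occur."""
        return math.prod(probs)

    # Hypothetical top event: the pump fails, OR both redundant valves fail.
    p_top = p_or(0.01, p_and(0.05, 0.05))
    ```

    Event-tree sequence frequencies are then products of the initiating-event frequency with the success/failure probabilities of each mitigating system along a branch.
    
    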

  14. Columbus safety and reliability

    Science.gov (United States)

    Longhurst, F.; Wessels, H.

    1988-10-01

    Analyses carried out to ensure Columbus reliability, availability, and maintainability, and operational and design safety are summarized. Failure modes/effects/criticality is the main qualitative tool used. The main aspects studied are fault tolerance, hazard consequence control, risk minimization, human error effects, restorability, and safe-life design.

  15. Power transformer reliability modelling

    NARCIS (Netherlands)

    Schijndel, van A.

    2010-01-01

    Problem description Electrical power grids serve to transport and distribute electrical power with high reliability and availability at acceptable costs and risks. These grids play a crucial though preferably invisible role in supplying sufficient power in a convenient form. Today’s society has

  16. Designing reliability into accelerators

    International Nuclear Information System (INIS)

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the ''factories,'' reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept, design, motivation, management techniques, and fault diagnosis

  17. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

    In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability testing began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been internationally highly appreciated. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, considering the period of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome its earlier defects. The significance of the reliability tests is to secure functioning until the age limit is reached, to confirm the correct forecast of deterioration processes, to confirm the effectiveness of remedies to defects, and to confirm the accuracy of predicting the behavior of facilities. The reliability of nuclear valves, fuel assemblies, the heat-affected zones in welding, reactor cooling pumps, and electric instruments has been tested or is being tested. (Kako, I.)

  18. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  19. Reliability of Plastic Slabs

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    1989-01-01

    In the paper it is shown how upper and lower bounds for the reliability of plastic slabs can be determined. For the fundamental case it is shown that optimal bounds of a deterministic and a stochastic analysis are obtained on the basis of the same failure mechanisms and the same stress fields....

  20. Reliability based structural design

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.

    2014-01-01

    According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A

  1. Travel time reliability modeling.

    Science.gov (United States)

    2011-07-01

    This report includes three papers as follows: : 1. Guo F., Rakha H., and Park S. (2010), "A Multi-state Travel Time Reliability Model," : Transportation Research Record: Journal of the Transportation Research Board, n 2188, : pp. 46-54. : 2. Park S.,...

  2. Reliability and Model Fit

    Science.gov (United States)

    Stanley, Leanne M.; Edwards, Michael C.

    2016-01-01

    The purpose of this article is to highlight the distinction between the reliability of test scores and the fit of psychometric measurement models, reminding readers why it is important to consider both when evaluating whether test scores are valid for a proposed interpretation and/or use. It is often the case that an investigator judges both the…

  3. Parametric Mass Reliability Study

    Science.gov (United States)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass-such as computer housings, pump casings, and the silicon board of PCBs-typically are the most reliable. Meanwhile components that tend to fail the earliest-such as seals or gaskets-typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.

  4. More reliable financing of future nuclear waste costs

    International Nuclear Information System (INIS)

    1994-01-01

    This appendix contains seven reports written by consultants to the Commission. The report titles are: Basic document regarding the inquiry on fund management; Scenarios for growth and real interest rates in a long perspective; Stability of the Swedish financing system; Report concerning the financing of nuclear waste management in Sweden and Finland and the cost control system in Sweden; Evaluation of the cost estimates and calculation methods of SKB; A study of the costs for nuclear waste - The basis for cost estimation; A review of scope and costs for the Swedish system for management of nuclear waste. The last four reports are separately indexed

  5. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)


    2018-03-05

    This paper presents a reliability analysis of a compressor system using reliability block diagrams. Keywords: compressor system, reliability, reliability block diagram, RBD. The same structure has been kept, with three subsystems: air flow, oil flow, and …
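    A reliability block diagram of the kind this record describes evaluates system reliability from series and parallel (redundant) arrangements of blocks. A minimal sketch; the subsystem names and reliability values are invented for illustration:

    ```python
    import math

    def series(blocks):
        """Series RBD: the system works only if every block works."""
        return math.prod(blocks)

    def parallel(blocks):
        """Parallel RBD: the system fails only if every redundant block fails."""
        return 1.0 - math.prod(1.0 - r for r in blocks)

    # Hypothetical compressor: an air-flow subsystem in series with an
    # oil-flow subsystem built from two redundant pumps.
    r_system = series([0.98, parallel([0.90, 0.90])])
    ```

    Nesting these two operators is enough to evaluate any series-parallel block diagram; more general structures (bridges, k-out-of-n) need dedicated methods.
    
    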

  6. An idea for the future proton detection of (p,2p) reactions with the R3B set-up at FAIR

    International Nuclear Information System (INIS)

    Ribeiro, G; Perea, A; Mårtensson, M

    2015-01-01

    The R3B Collaboration has long experience in probing exotic nuclei via quasi-free scattering reactions. To continue these studies, a new array capable of detecting protons and gamma rays of high energy is currently being developed: CALIFA (R3B CALorimeter for In-Flight γ arrays and high-energy charged pArticles). This contribution reports on the current solution for the forward Endcap of the CALIFA detector and on the latest test results. (paper)

  7. Approach for an integral power transformer reliability model

    NARCIS (Netherlands)

    Schijndel, van A.; Wouters, P.A.A.F.; Steennis, E.F.; Wetzer, J.M.

    2012-01-01

    In electrical power transmission and distribution networks power transformers represent a crucial group of assets both in terms of reliability and investments. In order to safeguard the required quality at acceptable costs, decisions must be based on a reliable forecast of future behaviour. The aim

  8. Use of multi-sensor active fire detections to map fires in the United States: the future of monitoring trends in burn severity

    Science.gov (United States)

    Picotte, Joshua J.; Coan, Michael; Howard, Stephen M.

    2014-01-01

    The effort to utilize satellite-based MODIS, AVHRR, and GOES fire detections from the Hazard Monitoring System (HMS) to identify undocumented fires in Florida and improve the Monitoring Trends in Burn Severity (MTBS) mapping process has yielded promising results. This method was augmented using regression tree models to identify burned/not-burned pixels (BnB) in every Landsat scene (1984–2012) in Worldwide Referencing System 2 Path/Rows 16/40, 17/39, and 18/39. The burned area delineations were combined with the HMS detections to create burned area polygons attributed with their date of fire detection. Within our study area, we processed 88,000 HMS points (2003–2012) and 1,800 Landsat scenes to identify approximately 300,000 burned area polygons. Six percent of these burned area polygons were larger than the 500-acre MTBS minimum size threshold. From this study, we conclude that the process can significantly improve understanding of fire occurrence and improve the efficiency and timeliness of assessing its impacts upon the landscape.

  9. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, as a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of the grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.

  10. Integrated reliability condition monitoring and maintenance of equipment

    CERN Document Server

    Osarenren, John

    2015-01-01

    Consider a viable and cost-effective platform for the Industries of the Future (IOF). Benefit from improved safety, performance, and product deliveries to your customers. Achieve a higher rate of equipment availability, performance, product quality, and reliability. Integrated Reliability: Condition Monitoring and Maintenance of Equipment incorporates reliability engineering and mathematical modeling to help you move toward sustainable development in reliability condition monitoring and maintenance. This text introduces a cost-effective integrated reliability growth monitor, an integrated reliability degradation monitor, technological inheritance coefficient sensors, and a maintenance tool that supplies real-time information for predicting and preventing potential failures of manufacturing processes and equipment. The author highlights five key elements that are essential to any improvement program: improving overall equipment and part effectiveness, quality, and reliability; improving process performance with maint...

  11. UAS-Borne Photogrammetry for Surface Topographic Characterization: A Ground-Truth Baseline for Future Change Detection and Refinement of Scaled Remotely-Sensed Datasets

    Science.gov (United States)

    Coppersmith, R.; Schultz-Fellenz, E. S.; Sussman, A. J.; Vigil, S.; Dzur, R.; Norskog, K.; Kelley, R.; Miller, L.

    2015-12-01

    While long-term objectives of monitoring and verification regimes include remote characterization and discrimination of surficial geologic and topographic features at sites of interest, ground truth data are required to advance the development of remote sensing techniques. Increasingly, it is desirable for these ground-based or ground-proximal characterization methodologies to be as nimble, efficient, non-invasive, and non-destructive as their higher-altitude airborne counterparts, while ideally providing superior resolution. For this study, the area of interest is an alluvial site at the Nevada National Security Site intended for use in the Source Physics Experiment's (Snelson et al., 2013) second phase. Ground-truth surface topographic characterization was performed using a DJI Inspire 1 unmanned aerial system (UAS) at very low altitude, below the clouds. Within the area of interest, careful installation of surveyed ground control fiducial markers supplied the necessary targets for field collection and information for model georectification. The resulting model includes a Digital Elevation Model derived from 2D imagery. It is anticipated that this flexible and versatile characterization process will provide point cloud data resolution equivalent to a purely ground-based LiDAR scanning deployment (e.g., 1-2 cm horizontal and vertical resolution; e.g., Sussman et al., 2012; Schultz-Fellenz et al., 2013). In addition to drastically increasing time efficiency in the field, the UAS method also allows for more complete coverage of the study area when compared to ground-based LiDAR. Comparison and integration of these data with conventionally acquired airborne LiDAR data from a higher-altitude (~450 m) platform will aid significantly in the refinement of technologies and detection capabilities of remote optical systems to identify and detect surface geologic and topographic signatures of interest. This work includes a preliminary comparison of surface signatures detected from varying

  12. Accelerator reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, L; Duru, Ph; Koch, J M; Revol, J L; Van Vaerenbergh, P; Volpe, A M; Clugnet, K; Dely, A; Goodhew, D

    2002-07-01

    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  13. Human Reliability Program Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  14. Reliability and construction control

    Directory of Open Access Journals (Sweden)

    Sherif S. AbdelSalam

    2016-06-01

    The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulae. From the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.

  15. Scyllac equipment reliability analysis

    International Nuclear Information System (INIS)

    Gutscher, W.D.; Johnson, K.J.

    1975-01-01

    Most of the failures in Scyllac can be related to crowbar trigger cable faults. A new cable has been designed, procured, and is currently undergoing evaluation. When the new cable has been proven, it will be worked into the system as quickly as possible without causing too much additional down time. The cable-tip problem may not be easy or even desirable to solve. A tightly fastened permanent connection that maximizes contact area would be more reliable than the plug-in type of connection in use now, but it would make system changes and repairs much more difficult. The balance of the failures have such a low occurrence rate that they do not cause much down time and no major effort is underway to eliminate them. Even though Scyllac was built as an experimental system and has many thousands of components, its reliability is very good. Because of this the experiment has been able to progress at a reasonable pace

  16. Improving Power Converter Reliability

    DEFF Research Database (Denmark)

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon

    2014-01-01

    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high- and low-voltage sides of a half-bridge IGBT separately in every fundamental period; the voltage is measured in a wind power converter at a low fundamental frequency. The test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation...
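    The on-state collector-emitter voltage of an IGBT at a fixed small sense current falls roughly linearly with junction temperature, so a calibration curve can be inverted to estimate the temperature from the measured voltage. A minimal sketch; the calibration constants (v_ref, t_ref, slope) are invented for illustration and are not taken from the article:

    ```python
    def junction_temp(v_ce, v_ref=0.65, t_ref=25.0, slope=-0.002):
        """Estimate junction temperature (deg C) from the on-state V_ce (V).

        Assumes a linear calibration V_ce(T) = v_ref + slope * (T - t_ref),
        with slope ~ -2 mV/K, measured beforehand at a fixed sense current.
        """
        return t_ref + (v_ce - v_ref) / slope

    # A 50 mV drop below the 25 deg C reference voltage maps to ~50 deg C.
    tj = junction_temp(0.60)
    ```

    In practice the calibration is per-device, and the usable estimate is an average over the chip, which is why the article tracks the high- and low-voltage sides of the half-bridge separately.
    
    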

  17. Accelerator reliability workshop

    International Nuclear Information System (INIS)

    Hardy, L.; Duru, Ph.; Koch, J.M.; Revol, J.L.; Van Vaerenbergh, P.; Volpe, A.M.; Clugnet, K.; Dely, A.; Goodhew, D.

    2002-01-01

    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator-driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has become a number-one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to RF systems to data storage, is affected by reliability concerns. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  18. Safety and reliability assessment

    International Nuclear Information System (INIS)

    1979-01-01

    This report contains the papers delivered at the course on safety and reliability assessment held at the CSIR Conference Centre, Scientia, Pretoria. The following topics were discussed: safety standards; licensing; biological effects of radiation; what is a PWR; safety principles in the design of a nuclear reactor; radio-release analysis; quality assurance; the staffing, organisation and training for a nuclear power plant project; event trees, fault trees and probability; Automatic Protective Systems; sources of failure-rate data; interpretation of failure data; synthesis and reliability; quantification of human error in man-machine systems; dispersion of noxious substances through the atmosphere; criticality aspects of enrichment and recovery plants; and risk and hazard analysis. Extensive examples are given as well as case studies

  19. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    Full Text Available We present a confirmatory factor analysis (CFA procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL, Interpersonal Adjective Scales (revised; IAS-R, Inventory of Interpersonal Problems (IIP, Impact Messages Inventory (IMI, Circumplex Scales of Interpersonal Values (CSIV, Support Action Scale Circumplex (SAS-C, Interaction Problems With Animals (IPI-A, Team Role Circle (TRC, Competing Values Leadership Instrument (CV-LI, Love Styles, Organizational Culture Assessment Instrument (OCAI, Customer Orientation Circle (COC, and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments. The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%; and scale specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large-scale specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey’s RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.

  20. The cost of reliability

    International Nuclear Information System (INIS)

    Ilic, M.

    1998-01-01

    In this article the restructuring process under way in the US power industry is revisited from the point of view of transmission system provision and reliability. Whereas in the traditional industry the cost of reliability was rolled into the average cost of electricity to all users, it is not so obvious how this cost is managed in the new industry. A new MIT approach to transmission pricing is suggested here as a possible solution.

  1. Isokinetic strength assessment offers limited predictive validity for detecting risk of future hamstring strain in sport: a systematic review and meta-analysis.

    Science.gov (United States)

    Green, Brady; Bourne, Matthew N; Pizzari, Tania

    2018-03-01

    To examine the value of isokinetic strength assessment for predicting risk of hamstring strain injury, and to direct future research into hamstring strain injuries. Systematic review. Database searches for Medline, CINAHL, Embase, AMED, AUSPORT, SPORTDiscus, PEDro and Cochrane Library from inception to April 2017. Manual reference checks, ahead-of-press and citation tracking. Prospective studies evaluating isokinetic hamstrings, quadriceps and hip extensor strength testing as a risk factor for occurrence of hamstring muscle strain. Independent search result screening. Risk of bias assessment by independent reviewers using Quality in Prognosis Studies tool. Best evidence synthesis and meta-analyses of standardised mean difference (SMD). Twelve studies were included, capturing 508 hamstring strain injuries in 2912 athletes. Isokinetic knee flexor, knee extensor and hip extensor outputs were examined at angular velocities ranging 30-300°/s, concentric or eccentric, and relative (Nm/kg) or absolute (Nm) measures. Strength ratios ranged between 30°/s and 300°/s. Meta-analyses revealed a small, significant predictive effect for absolute (SMD=-0.16, P=0.04, 95% CI -0.31 to -0.01) and relative (SMD=-0.17, P=0.03, 95% CI -0.33 to -0.014) eccentric knee flexor strength (60°/s). No other testing speed or strength ratio showed statistical association. Best evidence synthesis found over half of all variables had moderate or strong evidence for no association with future hamstring injury. Despite an isolated finding for eccentric knee flexor strength at slow speeds, the role and application of isokinetic assessment for predicting hamstring strain risk should be reconsidered, particularly given costs and specialised training required. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
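
    To make the pooled-effect arithmetic concrete, here is a hedged sketch of fixed-effect inverse-variance pooling of standardised mean differences; the per-study SMDs and variances below are invented for illustration and are not the review's data.

```python
import numpy as np

# Hypothetical per-study standardised mean differences (SMDs) and variances
smd = np.array([-0.20, -0.10, -0.25, -0.12])
var = np.array([0.010, 0.020, 0.015, 0.012])

# Fixed-effect inverse-variance pooling: weight each study by 1/variance
w = 1.0 / var
pooled = np.sum(w * smd) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
print(round(pooled, 3), round(ci_low, 3), round(ci_high, 3))
```

    A confidence interval that excludes zero (as for the eccentric knee flexor result above) indicates a statistically significant pooled effect.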

  2. Investment in new product reliability

    International Nuclear Information System (INIS)

    Murthy, D.N.P.; Rausand, M.; Virtanen, S.

    2009-01-01

    Product reliability is of great importance to both manufacturers and customers. Building reliability into a new product is costly, but the consequences of inadequate product reliability can be costlier. This implies that manufacturers need to decide on the optimal investment in new product reliability by achieving a suitable trade-off between the two costs. This paper develops a framework and proposes an approach to help manufacturers decide on the investment in new product reliability.

  3. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle of a product or system, from birth to death. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation; these data may take the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis: it provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds alongside the various stages of the product life cycle and its reliability activities. Reliability data of systems, structures and components (SSCs) in nuclear power plants is a key input to probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times; it is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of SSCs in nuclear power plants, and an example is given to present the result of the new method. The Weibull distribution fits the reliability data of mechanical equipment in nuclear power plants very well and is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions. Comparison and analysis show that the three-parameter Weibull distribution fits the data better: it reflects the reliability characteristics of the equipment and is closer to the actual situation. (author)
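
    As a hedged illustration of the two- versus three-parameter comparison above, the sketch below fits both forms to synthetic failure times using SciPy; the data, the 200 h failure-free threshold and all parameter values are invented for the example, not taken from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical times-to-failure (hours) with a failure-free threshold near 200 h
data = stats.weibull_min.rvs(c=1.8, loc=200.0, scale=1000.0,
                             size=2000, random_state=rng)

# Three-parameter fit: shape c, threshold (loc) and scale are all estimated;
# starting guesses help the optimizer converge
c3, loc3, scale3 = stats.weibull_min.fit(data, 1.5, loc=100.0, scale=800.0)

# Two-parameter fit: threshold fixed at zero
c2, _, scale2 = stats.weibull_min.fit(data, floc=0)

print("3-parameter:", c3, loc3, scale3)
print("2-parameter:", c2, scale2)
```

    When the data really do contain a failure-free period, the three-parameter fit recovers a positive threshold that the two-parameter model must absorb into a distorted shape estimate.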

  4. Interformat reliability of digital psychiatric self-report questionnaires: a systematic review.

    Science.gov (United States)

    Alfonsson, Sven; Maathz, Pernilla; Hursti, Timo

    2014-12-03

    Research on Internet-based interventions typically uses digital versions of pen-and-paper self-report symptom scales. However, adaptation into the digital format could affect the psychometric properties of established self-report scales. Several studies have investigated differences between digital and pen-and-paper versions of instruments, but no systematic review of the results has yet been done. This review aims to assess the interformat reliability of self-report symptom scales used in digital or online psychotherapy research. Three databases (MEDLINE, Embase, and PsycINFO) were systematically reviewed for studies investigating the reliability between digital and pen-and-paper versions of psychiatric symptom scales. From a total of 1504 publications, 33 were included in the review, and the interformat reliability of 40 different symptom scales was assessed. Significant differences in mean total scores between formats were found in 10 of 62 analyses. These differences were confined to a few studies, which indicates that they were due to study and sample effects rather than unreliable instruments. The interformat reliability ranged from r=.35 to r=.99; however, the majority of instruments showed a strong correlation between format scores. The quality of the included studies varied, and several had insufficient power to detect small differences between formats. When digital versions of self-report symptom scales are compared to pen-and-paper versions, most scales show high interformat reliability. This supports the reliability of results obtained in psychotherapy research on the Internet and their comparability to traditional psychotherapy research. There are, however, some instruments that consistently show low interformat reliability, suggesting that these conclusions cannot be generalized to all questionnaires. Most studies had at least some methodological issues, with insufficient statistical power being the most common.
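
    A hedged sketch of the interformat comparison described above: a Pearson correlation between paired paper and digital total scores, plus a paired t-test for a mean total-score difference between formats. The simulated scores below are invented for illustration, not taken from any reviewed study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical paired total scores for one symptom scale in two formats
true_score = rng.normal(20.0, 5.0, 80)
paper = true_score + rng.normal(0.0, 2.0, 80)
digital = true_score + rng.normal(0.0, 2.0, 80)

r, _ = stats.pearsonr(paper, digital)   # interformat reliability
t, p = stats.ttest_rel(paper, digital)  # mean total-score difference
print(round(r, 2), round(p, 3))
```

    A high r with a non-significant paired t-test is the pattern the review reports for most instruments: the two formats rank respondents the same way and produce similar mean totals.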

  5. Review of current research, problems and future trends with regard to geochemical techniques for uranium exploration and recent developments in radon detection

    International Nuclear Information System (INIS)

    De Wet, W.J.

    1984-01-01

    The review deals with the need for knowledge of uranium geology and exploration techniques, and focuses mainly on radon techniques and closely related aspects. The use of radon as a prospecting tool is primarily based on the fact that it is an inert gas and therefore has the ability to migrate through cracks and porous media. The methods used in radon prospecting are based on the detection of the α- or γ-radiation produced during the radioactive decay of Rn and/or its daughter isotopes. The methods can be described as either active or passive. The active methods involve pumping soil gas from a narrow, suitably covered hole drilled in the ground into or through a detector instrument, whereas the passive methods register Rn concentrations in the ground under natural conditions. In uranium exploration the aim is to distinguish areas with enhanced radon concentrations in relation to background levels.

  6. Future Textiles

    DEFF Research Database (Denmark)

    Hansen, Anne-Louise Degn; Jensen, Hanne Troels Fusvad; Hansen, Martin

    2011-01-01

    The magazine Future Textiles gathers the results of the Future Textiles project, which promotes the field of smart textiles. In the magazine one can read about trends, driving forces and challenges, and find ideas for new products within smart textiles, in areas such as sustainability and customization...

  7. Evaluation of ECT reliability for axial ODSCC in steam generator tubes

    International Nuclear Information System (INIS)

    Lee, Jae Bong; Park, Jai Hak; Kim, Hong Deok; Chung, Han Sub

    2010-01-01

    The integrity of steam generator tubes is usually evaluated based on eddy current test (ECT) results. Because the detection capability of ECT is not perfect, not all of the physical flaws actually present in steam generator tubes can be detected by ECT inspection. It is therefore very important to analyze ECT reliability in the integrity assessment of steam generators. The reliability of an ECT inspection system divides into the reliability of the inspection technique and the reliability of the analyst, and the reliability of ECT results divides into sizing reliability and detection reliability. The reliability of ECT sizing is often characterized as a linear regression model relating true flaw size to measured flaw size. The reliability of detection is characterized in terms of the probability of detection (POD), expressed as a function of flaw size. In this paper the reliability of an ECT inspection system is analyzed quantitatively, and the POD of the system for axial outside-diameter stress corrosion cracks (ODSCC) in steam generator tubes is evaluated. Using a log-logistic regression model, POD is evaluated from hit (detection) and miss (no detection) binary data obtained from destructive and non-destructive inspections of cracked tubes. Crack length and crack depth are considered as variables in the multivariate log-logistic regression, and their effects on detection capability are assessed using a two-dimensional POD (2-D POD) surface. The reliability of detection is also analyzed using separate PODs for the inspection technique (POD_T) and for the analyst (POD_A).
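
    A minimal sketch of the log-logistic POD approach described above, fitted by maximum likelihood to synthetic hit/miss data; the crack sizes, coefficients and the derived "a90" detectable size are all hypothetical, and only the one-variable (crack length) case is shown rather than the paper's 2-D surface.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Hypothetical crack lengths (mm) and simulated hit/miss inspection outcomes
length = rng.uniform(0.5, 10.0, 300)
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 2.5 * np.log(length))))
hit = rng.random(300) < p_true

def neg_log_lik(beta):
    # Log-logistic POD model: POD(a) = 1 / (1 + exp(-(b0 + b1*ln a)))
    eta = np.clip(beta[0] + beta[1] * np.log(length), -30, 30)
    pod_vals = 1.0 / (1.0 + np.exp(-eta))
    return -np.sum(np.where(hit, np.log(pod_vals), np.log(1.0 - pod_vals)))

res = minimize(neg_log_lik, x0=np.array([0.0, 1.0]))
b0, b1 = res.x

def pod(a):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(a))))

# Flaw size detected with 90% probability (often quoted as a90)
a90 = np.exp((np.log(9.0) - b0) / b1)
```

    The fitted POD curve rises with crack length, and a90 summarizes the size at which detection becomes reliable.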

  8. Futures Brokerages Face uncertain Future

    Institute of Scientific and Technical Information of China (English)

    WANG PEI

    2006-01-01

    2005 was a quiet year for China's futures market. After four new trading products, including cotton, fuel oil and corn, were launched on the market in 2004, the development of the market seemed to stagnate. The trade value of the futures market totaled 13.4 trillion yuan (US$ 1.67 trillion) in 2005, down 8.5 percent year-on-year. Although the decrease is quite small and the trade value was still the second highest in the market's history, the majority of futures brokerage firms were running in the red. In some areas, up to 80 percent of futures companies made losses.

  9. RELIABILITY MODELING BASED ON INCOMPLETE DATA: OIL PUMP APPLICATION

    Directory of Open Access Journals (Sweden)

    Ahmed HAFAIFA

    2014-07-01

    Full Text Available Reliability analysis for industrial maintenance is increasingly demanded by industry worldwide. Modern manufacturing facilities are equipped with data acquisition and monitoring systems that generate large volumes of data, which can be used to inform future decisions affecting the state of the exploited equipment. However, in most practical cases the data used in reliability modelling are incomplete or unreliable. In this context, to analyze the reliability of an oil pump, this work examines and treats the incomplete, incorrect or aberrant data used in the reliability modelling of the pump. The objective of this paper is to propose a suitable methodology for replacing the incomplete data using a regression method.
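
    As a hedged sketch of the regression-based replacement step (the data, variable names and linear form are assumptions for illustration, not the paper's pump data), missing values are filled in from a line fitted to the complete records:

```python
import numpy as np

# Hypothetical pump records: operating hours vs a degradation index,
# with two missing measurements (NaN)
hours = np.array([500., 1000., 1500., 2000., 2500., 3000., 3500., 4000.])
degr = np.array([0.10, 0.22, np.nan, 0.41, 0.52, np.nan, 0.71, 0.83])

complete = ~np.isnan(degr)
slope, intercept = np.polyfit(hours[complete], degr[complete], 1)

# Replace each missing value with the regression prediction
degr_filled = np.where(np.isnan(degr), slope * hours + intercept, degr)
```

    The completed series can then feed a reliability model (e.g. a Weibull fit) that would otherwise have to discard the incomplete records.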

  10. Future developments in physical protection against the insider threat

    International Nuclear Information System (INIS)

    Winblad, A.E.

    1985-01-01

    This report discusses a number of hardware elements that are being developed for future protection systems against insider adversaries. These elements will assist in enforcing administrative and procedural security rules, provide additional detection and delay capability for critical components, and improve the processing and display of security information. The incorporation of this hardware will add useful layers of protection to the employee-screening and human-reliability programs currently in use. User-friendly evaluation models that can aid in the overall design of more effective protection systems are also described. 2 refs., 6 figs

  12. Nuclear performance and reliability

    International Nuclear Information System (INIS)

    Rothwell, G.

    1993-01-01

    There has been a significant improvement in nuclear power plant performance, due largely to a decline in the forced outage rate and a dramatic drop in the average number of forced outages per fuel cycle. If fewer forced outages are a sign of improved safety, nuclear power plants have become safer as well as more productive over time. To encourage further increases in performance, regulatory incentive schemes should reward reactor operators for improved reliability and safety, as well as for improved performance.

  13. [How Reliable is Neuronavigation?].

    Science.gov (United States)

    Stieglitz, Lennart Henning

    2016-02-17

    Neuronavigation plays a central role in modern neurosurgery. It visualizes instruments and three-dimensional image data intraoperatively and supports spatial orientation, thereby helping to reduce surgical risks and speed up complex surgical procedures. The growing availability and importance of neuronavigation make clear how relevant it is to know about its reliability and accuracy. Different factors may influence accuracy unnoticed during surgery, misleading the surgeon. Besides the best possible optimization of the systems themselves, good knowledge of their weaknesses is mandatory for every neurosurgeon.

  14. The value of reliability

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Karlström, Anders

    2010-01-01

    We derive the value of reliability in the scheduling of an activity of random duration, such as travel under congested conditions. Using a simple formulation of scheduling utility, we show that the maximal expected utility is linear in the mean and standard deviation of trip duration, regardless of the form of the standardised distribution of trip durations. This insight provides a unification of the scheduling model and models that include the standard deviation of trip duration directly as an argument in the cost or utility function. The results generalise approximately to the case where the mean...
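
    The linearity result can be checked numerically. The sketch below uses a simple alpha-beta-gamma scheduling cost (an assumed special case, not the paper's general utility) with invented parameter values: for a trip duration T = mu + sigma*X with a fixed standardised X, the minimised expected cost is linear in mu and sigma, so doubling sigma exactly doubles the scheduling-delay part.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal(4000)
X = (X - X.mean()) / X.std()          # standardised trip-duration shape

alpha, beta, gamma = 1.0, 0.5, 2.0    # hypothetical travel/early/late weights

def expected_cost(mu, sigma):
    """Min over head start s of E[beta*earliness + gamma*lateness] + alpha*mu."""
    s = np.linspace(-3 * sigma, 3 * sigma, 201)[:, None]
    dev = s + sigma * X[None, :]      # arrival deviation from preferred time
    sched = beta * np.maximum(0.0, -dev) + gamma * np.maximum(0.0, dev)
    return alpha * mu + sched.mean(axis=1).min()

c1 = expected_cost(30.0, 5.0)
c2 = expected_cost(30.0, 10.0)
# Scheduling part is proportional to sigma for a fixed standardised X:
print((c2 - alpha * 30.0) / (c1 - alpha * 30.0))
```

    Because the cost of schedule deviation is positively homogeneous, the optimal head start scales with sigma and the whole scheduling term factors as sigma times a constant that depends only on the standardised distribution.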

  15. Imperfection detection probability at ultrasonic testing of reactor vessels

    International Nuclear Information System (INIS)

    Kazinczy, F. de; Koernvik, L.Aa.

    1980-02-01

    The report is a lecture given at a symposium organized by the Swedish nuclear power inspectorate on February 1980. Equipments, calibration and testing procedures are reported. The estimation of defect detection probability for ultrasonic tests and the reliability of literature data are discussed. Practical testing of reactor vessels and welded joints are described. Swedish test procedures are compared with other countries. Series of test data for welded joints of the OKG-2 reactor are presented. Future recommendations for testing procedures are made. (GBn)

  16. Highly-reliable laser diodes and modules for spaceborne applications

    Science.gov (United States)

    Deichsel, E.

    2017-11-01

    Laser applications become more and more interesting in contemporary missions such as earth observation or optical communication in space. One of these applications is light detection and ranging (LIDAR), which holds huge scientific potential for future missions. The Nd:YAG solid-state laser of such a LIDAR system is optically pumped using 808 nm emitting pump sources based on semiconductor laser diodes in quasi-continuous-wave (qcw) operation. Reliable and efficient laser diodes with increased output powers are therefore an important requirement for a spaceborne LIDAR system. In the past, many tests were performed regarding the performance and lifetime of such laser diodes. There have also been studies for spaceborne applications, but a test with long operation times at high powers and statistical relevance is pending. Other applications, such as science packages (e.g. Raman spectroscopy) on planetary rovers, also require reliable high-power light sources. Typically, fiber-coupled laser diode modules are used for such applications. Besides high reliability and lifetime, designs compatible with the harsh environmental conditions must be taken into account. Mechanical loads, such as shock or strong vibration, are expected during take-off or landing procedures. Many temperature cycles with high change rates and large temperature differences must be taken into account due to sun-shadow effects in planetary orbits. Cosmic radiation has a strong impact on optical components and must also be considered. Lastly, hermetic sealing must be considered, since vacuum can have disadvantageous effects on optoelectronic components.

  17. Validity of ultrasonography and measures of adult shoulder function and reliability of ultrasonography in detecting shoulder synovitis in patients with rheumatoid arthritis using magnetic resonance imaging as a gold standard.

    LENUS (Irish Health Repository)

    Bruyn, G A W

    2010-08-01

    To assess the intra- and interobserver reproducibility of musculoskeletal ultrasonography (US) in detecting inflammatory shoulder changes in patients with rheumatoid arthritis, and to determine the agreement between US and the Shoulder Pain and Disability Index (SPADI) and the Disabilities of the Arm, Shoulder, and Hand (DASH) questionnaire, using magnetic resonance imaging (MRI) as a gold standard.

  18. Interactive reliability assessment using an integrated reliability data bank

    International Nuclear Information System (INIS)

    Allan, R.N.; Whitehead, A.M.

    1986-01-01

    The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc Random Access Files is described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)

  19. Load Control System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and, the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  20. Supply chain reliability modelling

    Directory of Open Access Journals (Sweden)

    Eugen Zaitsev

    2012-03-01

    Full Text Available Background: Today it is virtually impossible to operate alone on the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as to the optimization of supply planning. The aim of this paper was to develop and formulate the mathematical model and algorithms to find the optimum plan of supplies by using an economic criterion and a model for evaluating the probability of non-failure operation of the supply chain. Methods: The mathematical model and algorithms to find the optimum plan of supplies were developed and formulated by using an economic criterion and a model for evaluating the probability of non-failure operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of the goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations to be taken into account during supply planning with the supplier's functional reliability is presented.
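
    A hedged sketch of the "relatively simple linear programming problem" mentioned above: choosing supply volumes from several suppliers at minimum cost subject to a demand constraint and capacity bounds, with a simple series-reliability figure for the chosen plan. All costs, capacities, demand and non-failure probabilities are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

cost = [4.0, 5.0, 6.5]              # unit cost per supplier
caps = [(0, 60), (0, 50), (0, 40)]  # supplier capacity bounds
demand = 100.0

# Minimise total cost s.t. x1 + x2 + x3 >= demand
# (written as -x1 - x2 - x3 <= -demand for linprog's A_ub form)
res = linprog(c=cost, A_ub=[[-1.0, -1.0, -1.0]], b_ub=[-demand], bounds=caps)

# Series non-failure probability over the suppliers actually used
p_ok = np.array([0.98, 0.95, 0.90])
used = np.asarray(res.x) > 1e-9
plan_reliability = float(np.prod(p_ok[used]))
print(res.x, res.fun, plan_reliability)
```

    The cost-optimal plan fills demand from the cheapest suppliers first; the reliability figure can then be checked against the consumer's failure-free-performance requirement, and the constraint set extended if it falls short.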

  1. Data processing of qualitative results from an interlaboratory comparison for the detection of "Flavescence dorée" phytoplasma: How the use of statistics can improve the reliability of the method validation process in plant pathology.

    Science.gov (United States)

    Chabirand, Aude; Loiseau, Marianne; Renaudin, Isabelle; Poliakoff, Françoise

    2017-01-01

    A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of "Flavescence dorée" (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes' theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007), Pelletier (2009) and under patent oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper and their
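
    One way to see how prevalence drives the choice of detection scheme, as discussed above, is Bayes' theorem applied to a test's diagnostic sensitivity and specificity; the numbers below are hypothetical and not the study's estimates.

```python
def positive_predictive_value(sens: float, spec: float, prev: float) -> float:
    """P(infected | positive test) via Bayes' theorem."""
    return sens * prev / (sens * prev + (1.0 - spec) * (1.0 - prev))

# Same test, different prevalence: a positive result means much less
# when the pathogen is rare in the sampled population.
low = positive_predictive_value(0.95, 0.98, 0.01)   # ~0.32
high = positive_predictive_value(0.95, 0.98, 0.30)  # ~0.95
print(low, high)
```

    At low prevalence even a highly specific test yields many false positives among the positives, which is why the paper recommends selecting the detection scheme with the pathogen's prevalence in mind.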

  2. Data processing of qualitative results from an interlaboratory comparison for the detection of "Flavescence dorée" phytoplasma: How the use of statistics can improve the reliability of the method validation process in plant pathology.

    Directory of Open Access Journals (Sweden)

    Aude Chabirand

    Full Text Available A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of "Flavescence dorée" (FD phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes' theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007, Pelletier (2009 and under patent oligonucleotides achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper

  3. Reliability of salivary testosterone measurements in diagnosis of Polycystic Ovarian Syndrome

    Directory of Open Access Journals (Sweden)

    Omnia Youssef

    2010-07-01

    Conclusion: Determination of salivary testosterone is a reliable method to detect changes in the concentration of available biologically active testosterone in the serum. Salivary testosterone provides a sensitive, simple, reliable, non-invasive and uncomplicated diagnostic approach for PCOS.

  4. Sustainable Futures

    Science.gov (United States)

    Sustainable Futures is a voluntary program that encourages industry to use predictive models to screen new chemicals early in the development process and offers incentives to companies subject to TSCA section 5.

  5. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.

  6. Transit ridership, reliability, and retention.

    Science.gov (United States)

    2008-10-01

    This project explores two major components that affect transit ridership: travel time reliability and rider retention. It has been recognized that transit travel time reliability may have a significant impact on attractiveness of transit to many ...

  7. Travel reliability inventory for Chicago.

    Science.gov (United States)

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from ...

  8. Smart grids: the future network

    International Nuclear Information System (INIS)

    Bassi, F.; Sabelli, C.

    2008-01-01

    The old structure of the Italian electric power system, shaped by changes that have been under way for years, will be less and less suited to future needs; it must therefore change to become more intelligent, more reliable, more sustainable and more economical [it

  9. Reliable and Efficient Communications in Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Abdelhakim, M.M.

    2014-01-01

    Wireless sensor network (WSN) is a key technology for a wide range of military and civilian applications. Limited by the energy resources and processing capabilities of the sensor nodes, reliable and efficient communications in wireless sensor networks are challenging, especially when the sensors are deployed in hostile environments. This research aims to improve the reliability and efficiency of time-critical communications in WSNs, under both benign and hostile environments. We start with wireless sensor network with mobile access points (SENMA), where the mobile access points traverse the network to collect information from individual sensors. Due to its routing simplicity and energy efficiency, SENMA has attracted lots of attention from the research community. Here, we study reliable distributed detection in SENMA under Byzantine attacks, where some authenticated sensors are compromised to report fictitious information. The q-out-of-m rule is considered. It is popular in distributed detection and can achieve a good trade-off between the miss detection probability and the false alarm rate. However, a major limitation with this rule is that the optimal scheme parameters can only be obtained through exhaustive search. By exploiting the linear relationship between the scheme parameters and the network size, we propose simple but effective sub-optimal linear approaches. Then, for better flexibility and scalability, we derive a near-optimal closed-form solution based on the central limit theorem. It is proved that the false alarm rate of the q-out-of-m scheme diminishes exponentially as the network size increases, even if the percentage of malicious nodes remains fixed. This implies that large-scale sensor networks are more reliable under malicious attacks. 
To further improve the performance under time varying attacks, we propose an effective malicious node detection scheme for adaptive data fusion; the proposed scheme is analyzed using the entropy-based trust model
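The q-out-of-m fusion rule described above declares an event when at least q of the m sensor reports agree. As a rough illustration (not the paper's scheme; the per-sensor false alarm rate and threshold below are made-up values), the network-level false alarm rate is a binomial tail probability, and the central-limit-theorem result alluded to in the abstract replaces that tail with a normal approximation:

```python
from math import comb, erf, sqrt

def q_out_of_m_false_alarm(m, q, pf):
    """Exact network-level false alarm rate: probability that at least
    q of m independent sensors raise a false alarm (binomial tail)."""
    return sum(comb(m, k) * pf**k * (1 - pf)**(m - k) for k in range(q, m + 1))

def q_out_of_m_false_alarm_clt(m, q, pf):
    """Central-limit-theorem (normal) approximation to the same tail,
    the kind of closed form the abstract alludes to."""
    mu, sigma = m * pf, sqrt(m * pf * (1 - pf))
    z = (q - 0.5 - mu) / sigma  # continuity correction
    return 0.5 * (1 - erf(z / sqrt(2)))

# Hypothetical parameters: 100 sensors, 5% per-sensor false alarm rate,
# fusion threshold q = 15.
exact = q_out_of_m_false_alarm(100, 15, 0.05)
approx = q_out_of_m_false_alarm_clt(100, 15, 0.05)
```

Raising the per-sensor false alarm rate raises the network-level rate, while growing m with the threshold fraction q/m fixed drives the tail toward zero, which is the scaling behaviour the abstract describes.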

  10. 2017 NREL Photovoltaic Reliability Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-15

    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  11. AECL's reliability and maintainability program

    International Nuclear Information System (INIS)

    Wolfe, W.A.; Nieuwhof, G.W.E.

    1976-05-01

    AECL's reliability and maintainability program for nuclear generating stations is described. How the various resources of the company are organized to design and construct stations that operate reliably and safely is shown. Reliability and maintainability includes not only special mathematically oriented techniques, but also the technical skills and organizational abilities of the company. (author)

  12. Reliability of Maximal Strength Testing in Novice Weightlifters

    Science.gov (United States)

    Loehr, James A.; Lee, Stuart M. C.; Feiveson, Alan H.; Ploutz-Snyder, Lori L.

    2009-01-01

    The one repetition maximum (1RM) is a criterion measure of muscle strength. However, the reliability of 1RM testing in novice subjects has received little attention. Understanding this information is crucial to accurately interpret changes in muscle strength. To evaluate the test-retest reliability of a squat (SQ), heel raise (HR), and deadlift (DL) 1RM in novice subjects. Twenty healthy males (31 plus or minus 5 y, 179.1 plus or minus 6.1 cm, 81.4 plus or minus 10.6 kg) with no weight training experience in the previous six months participated in four 1RM testing sessions, with each session separated by 5-7 days. SQ and HR 1RM were conducted using a smith machine; DL 1RM was assessed using free weights. Session 1 was considered a familiarization and was not included in the statistical analyses. Repeated measures analysis of variance with Tukey fs post-hoc tests were used to detect between-session differences in 1RM (p.0.05). Test-retest reliability was evaluated by intraclass correlation coefficients (ICC). During Session 2, the SQ and DL 1RM (SQ: 90.2 }4.3, DL: 75.9 }3.3 kg) were less than Session 3 (SQ: 95.3 }4.1, DL: 81.5 plus or minus 3.5 kg) and Session 4 (SQ: 96.6 }4.0, DL: 82.4 }3.9 kg), but there were no differences between Session 3 and Session 4. HR 1RM measured during Session 2 (150.1 }3.7 kg) and Session 3 (152.5 }3.9 kg) were not different from one another, but both were less than Session 4 (157.5 }3.8 kg). The reliability (ICC) of 1RM measures for Sessions 2-4 were 0.88, 0.83, and 0.87, for SQ, HR, and DL, respectively. When considering only Sessions 3 and 4, the reliability was 0.93, 0.91, and 0.86 for SQ, HR, and DL, respectively. One familiarization session and 2 test sessions (for SQ and DL) were required to obtain excellent reliability (ICC greater than or equal to 0.90) in 1RM values with novice subjects. 
We were unable to attain this level of reliability following 3 HR testing sessions therefore additional sessions may be required to obtain an

  13. Models on reliability of non-destructive testing

    International Nuclear Information System (INIS)

    Simola, K.; Pulkkinen, U.

    1998-01-01

    The reliability of ultrasonic inspections has been studied in, e.g., the international PISC (Programme for the Inspection of Steel Components) exercises. These exercises have produced a large amount of information on the effect of various factors on the reliability of inspections. The information obtained from reliability experiments is used to model the dependency of flaw detection probability on various factors and to evaluate the performance of inspection equipment, including sizing accuracy. The information from experiments is utilised most effectively when mathematical models are applied. Here, some statistical models for the reliability of non-destructive tests are introduced. In order to demonstrate the use of inspection reliability models, they have been applied to the inspection results of intergranular stress corrosion cracking (IGSCC) type flaws in the PISC III exercise (PISC 1995). The models are applied both to the flaw detection frequency data of all inspection teams and to the flaw sizing data of one participating team. (author)
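Flaw detection probability in NDT reliability work is commonly modelled as a log-logistic (log-odds) curve in flaw size. A minimal sketch of that standard form, with illustrative parameters b0 and b1 rather than values fitted to the PISC data:

```python
from math import exp, log

def pod(a, b0=-4.0, b1=2.5):
    """Log-logistic probability-of-detection curve
    POD(a) = 1 / (1 + exp(-(b0 + b1 * ln a))),
    a common form in NDT reliability modelling.
    b0, b1 are illustrative values, not fitted parameters."""
    return 1.0 / (1.0 + exp(-(b0 + b1 * log(a))))

def a_at_pod(p, b0=-4.0, b1=2.5):
    """Invert the curve: flaw size detected with probability p
    (e.g. the widely quoted a90 corresponds to p = 0.9)."""
    return exp((log(p / (1.0 - p)) - b0) / b1)
```

Fitting b0 and b1 to hit/miss inspection data is typically done by maximum likelihood; the closed-form inversion above then yields characteristic sizes such as a90.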

  14. Detection block

    International Nuclear Information System (INIS)

    Bezak, A.

    1987-01-01

    A diagram is given of a detection block used for monitoring burnup of nuclear reactor fuel. A shielding block is an important part of the detection block. It stabilizes the fuel assembly in the fixing hole in front of a collimator where a suitable gamma beam is defined for gamma spectrometry determination of fuel burnup. The detector case and a neutron source case are placed on opposite sides of the fixing hole. For neutron measurement for which the water in the tank is used as a moderator, the neutron detector-fuel assembly configuration is selected such that neutrons from spontaneous fission and neutrons induced with the neutron source can both be measured. The patented design of the detection block permits longitudinal travel and rotation of the fuel assembly to any position, and thus more reliable determination of nuclear fuel burnup. (E.S.). 1 fig

  15. Calculus detection calibration among dental hygiene faculty members utilizing dental endoscopy: a pilot study.

    Science.gov (United States)

    Partido, Brian B; Jones, Archie A; English, Dana L; Nguyen, Carol A; Jacks, Mary E

    2015-02-01

    Dental and dental hygiene faculty members often do not provide consistent instruction in the clinical environment, especially in tasks requiring clinical judgment. From previous efforts to calibrate faculty members in calculus detection using typodonts, researchers have suggested using human subjects and emerging technology to improve consistency in clinical instruction. The purpose of this pilot study was to determine if a dental endoscopy-assisted training program would improve intra- and interrater reliability of dental hygiene faculty members in calculus detection. Training included an ODU 11/12 explorer, typodonts, and dental endoscopy. A convenience sample of six participants was recruited from the dental hygiene faculty at a California community college, and a two-group randomized experimental design was utilized. Intra- and interrater reliability was measured before and after calibration training. Pretest and posttest Kappa averages of all participants were compared using repeated measures (split-plot) ANOVA to determine the effectiveness of the calibration training on intra- and interrater reliability. The results showed that both kinds of reliability significantly improved for all participants and the training group improved significantly in interrater reliability from pretest to posttest. Calibration training was beneficial to these dental hygiene faculty members, especially those beginning with less than full agreement. This study suggests that calculus detection calibration training utilizing dental endoscopy can effectively improve interrater reliability of dental and dental hygiene clinical educators. Future studies should include human subjects, involve more participants at multiple locations, and determine whether improved rater reliability can be sustained over time.

  16. Electronics reliability calculation and design

    CERN Document Server

    Dummer, Geoffrey W A; Hiller, N

    1966-01-01

    Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea

  17. MEASUREMENT: ACCOUNTING FOR RELIABILITY IN PERFORMANCE ESTIMATES.

    Science.gov (United States)

    Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne

    2014-01-01

    When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. 
A physician leader who is interested in catalyzing performance improvement
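A minimal sketch of the kind of reliability adjustment described above, using a simple shrinkage estimator reliability = n / (n + k) to pull small-sample rates toward the group average (the constant k and all rates below are illustrative, not values from the case study):

```python
def reliability_adjusted_rate(events, n, group_rate, k=50):
    """Shrink a physician's observed event rate toward the group average.
    reliability = n / (n + k): small samples are pulled strongly toward
    the group rate, large samples keep most of their own signal.
    k is an illustrative shrinkage constant."""
    observed = events / n
    reliability = n / (n + k)
    return reliability * observed + (1 - reliability) * group_rate

# Same observed rate (30%), very different case volumes.
low_volume = reliability_adjusted_rate(events=3, n=10, group_rate=0.10)
high_volume = reliability_adjusted_rate(events=60, n=200, group_rate=0.10)
```

The low-volume physician's apparent outlier rate is largely absorbed by the group average, while the high-volume physician's rate survives mostly intact, which is exactly how reliability adjustment suppresses random "signals" from small samples.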

  18. Constraining the interaction between dark sectors with future HI intensity mapping observations

    Science.gov (United States)

    Xu, Xiaodong; Ma, Yin-Zhe; Weltman, Amanda

    2018-04-01

    We study a model of interacting dark matter and dark energy, in which the two components are coupled. We calculate the predictions for the 21-cm intensity mapping power spectra, and forecast the detectability with future single-dish intensity mapping surveys (BINGO, FAST and SKA-I). Since dark energy is turned on at z ∼ 1, which falls into the sensitivity range of these radio surveys, the HI intensity mapping technique is an efficient tool to constrain the interaction. By comparing with current constraints on dark sector interactions, we find that future radio surveys will produce tight and reliable constraints on the coupling parameters.

  19. Future perspectives

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    International involvement in particle physics is what the International Committee for Future Accelerators (ICFA) is all about. At the latest Future Perspectives meeting at Brookhaven from 5-10 October (after a keynote speech by doyen Viktor Weisskopf, who regretted the emergence of 'a nationalistic trend'), ICFA reviewed progress and examined its commitments in the light of the evolving world particle physics scene. Particular aims were to review worldwide accelerator achievements and plans, to survey the work of the four panels, and to discuss ICFA's special role in future cooperation in accelerator construction and use, and in research and development work for both accelerators and for detectors

  20. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2015-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...... about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004)....

  1. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2011-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...... about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004)....

  2. Future Savvy

    DEFF Research Database (Denmark)

    Gordon, Adam

    There's no shortage of predictions available to organizations looking to anticipate and profit from future events or trends. Apparently helpful forecasts are ubiquitous in everyday communications such as newspapers and business magazines, and in specialized sources such as government and think......-tank forecasts, consultant reports, and stock-market guides. These resources are crucial, but they are also of very mixed quality. How can decision-makers know which predictions to take seriously, which to be wary of, and which to throw out entirely? Future Savvy provides analytical filters to judging predictive...... systematic "forecast filtering" to reveal strengths and weakness in the predictions they face. Future Savvy empowers both business and policy/government decision-makers to use forecasts wisely and so improve their judgment in anticipating opportunities, avoiding threats, and managing uncertainty....

  3. Energy Futures

    DEFF Research Database (Denmark)

    Davies, Sarah Rachael; Selin, Cynthia

    2012-01-01

    foresight and public and stakeholder engagement are used to reflect on, and direct, the impacts of new technology. In this essay we draw on our experience of anticipatory governance, in the shape of the 'NanoFutures' project on energy futures, to present a reflexive analysis of engagement and deliberation. We...... draw out five tensions of the practice of deliberation on energy technologies. Through tracing the lineages of these dilemmas, we discuss some of the implications of these tensions for the practice of civic engagement and deliberation in a set of questions for this community of practitioner-scholars....

  4. Of plants and reliability

    International Nuclear Information System (INIS)

    Schneider, Horst

    2009-01-01

    Behind the political statements made about the transformer event at the Kruemmel nuclear power station (KKK) in the summer of 2009 there are fundamental issues of atomic law. Pursuant to Articles 20 and 28 of its Basic Law, Germany is a state in which the rule of law applies. Consequently, the aspects of atomic law associated with the incident merit a closer look, all the more so as the items concerned have been known for many years. Important aspects in the debate about the Kruemmel nuclear power plant are the fact that the transformer is considered part of the nuclear power station under atomic law and thus a ''plant'' subject to surveillance by the nuclear regulatory agencies, on the one hand, and the reliability under atomic law of the operator and the executive personnel responsible, on the other hand. Both ''plant'' and ''reliability'' are terms focusing on nuclear safety. Hence the question to what extent safety was affected in the Kruemmel incident. The classification of the event as 0 = no or only a very slight safety impact on the INES scale (INES = International Nuclear Event Scale) should not be used to put aside the safety issue once and for all. Points of fact and their technical significance must be considered prior to any legal assessment. Legal assessments and regulations are associated with facts and circumstances. Any legal examination is based on the facts as determined and elucidated. Any other procedure would be tantamount to an inadmissible legal advance conviction. Now, what is the position of political statements, i.e. political assessments and political responsibility? If everything is done the correct way, they come at the end, after exploration of the facts and evaluation under applicable law. Sometimes things are handled differently, with consequences which are not very helpful. 
In the light of the provisions about the rule of law as laid down in the Basic Law, the new federal government should be made to observe the proper sequence of

  5. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study

    Science.gov (United States)

    Perraton, Luke G.; Bower, Kelly J.; Adair, Brooke; Pua, Yong-Hao; Williams, Gavin P.; McGaw, Rebekah

    2015-01-01

    Introduction Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment, however no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. Methods 30 healthy young adults (age: 23±5yrs, male: 15) were assessed on two sessions. Isometric muscle strength and power were measured using peak force and RFD respectively using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-reference KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Results Comparison of RFD methods revealed that a peak 200ms moving window algorithm provided optimal reliability results. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results shown for the ankle muscles (ICCs = 0.31–0.79). Conclusions Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device. Future research should examine the reliability
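The peak 200 ms moving-window algorithm that the study found optimally reliable for RFD can be sketched as follows (a generic implementation, not the authors' code):

```python
def peak_rfd(force, dt, window_s=0.2):
    """Peak rate of force development (N/s): the largest average slope of
    the force-time curve over any moving window of length window_s
    (200 ms, the window the study found most reliable).
    force: list of force samples in newtons; dt: sampling interval in s."""
    w = max(1, round(window_s / dt))
    if w >= len(force):
        raise ValueError("window longer than the recording")
    return max((force[i + w] - force[i]) / (w * dt)
               for i in range(len(force) - w))
```

For a clean linear ramp the moving-window estimate recovers the ramp slope exactly; on noisy real recordings the 200 ms window trades peak sensitivity for repeatability, which is why it outperformed shorter windows.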

  6. Evaluation of MHTGR fuel reliability

    International Nuclear Information System (INIS)

    Wichner, R.P.; Barthold, W.P.

    1992-07-01

    Modular High-Temperature Gas-Cooled Reactor (MHTGR) concepts that house the reactor vessel in a tight but unsealed reactor building place heightened importance on the reliability of the fuel particle coatings as fission product barriers. Though accident consequence analyses continue to show favorable results, the increased dependence on one type of barrier, in addition to a number of other factors, has caused the Nuclear Regulatory Commission (NRC) to consider conservative assumptions regarding fuel behavior. For this purpose, the concept termed ''weak fuel'' has been proposed on an interim basis. ''Weak fuel'' is a penalty imposed on consequence analyses whereby the fuel is assumed to respond less favorably to environmental conditions than predicted by behavioral models. The rationale for adopting this penalty, as well as conditions that would permit its reduction or elimination, are examined in this report. The evaluation includes an examination of possible fuel-manufacturing defects, quality-control procedures for defect detection, and the mechanisms by which fuel defects may lead to failure

  7. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF mathematical properties, so the two functions together makes up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process, a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Nevertheless it is foreseen model applications to inspection and maintenance of physical systems. The paper includes a complete numerical example of the model application to a software reliability analysis
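The MERF/EARF equations themselves are not reproduced in the abstract. As a generic illustration of the NHPP framework it builds on, here is a Goel-Okumoto mean value function and the conditional reliability of surviving a further interval (the parameters a and b are illustrative, and this is a standard NHPP form, not the paper's MERF):

```python
from math import exp

def mean_failures(t, a=100.0, b=0.05):
    """Goel-Okumoto NHPP mean value function m(t) = a(1 - e^{-bt}):
    expected cumulative failures detected by time t (a, b illustrative)."""
    return a * (1.0 - exp(-b * t))

def conditional_reliability(x, t, a=100.0, b=0.05):
    """Probability of no failure in (t, t+x], given an NHPP:
    R(x | t) = exp(-(m(t+x) - m(t)))."""
    return exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))
```

Because m(t) flattens as failures are found and corrected, the conditional reliability over a fixed horizon grows with testing time, the reliability-growth behaviour such models are built to capture.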

  8. Iraq's future

    International Nuclear Information System (INIS)

    Henderson, S.

    1998-01-01

    The large oil reserves of Iraq make it an important player in the long-term political energy world. This article briefly reviews the oil industry's development and current status in Iraq and discusses the planned oil and gas field development. Finally there is a political discussion regarding the future of Iraq in terms of religion, race and neighbouring countries. (UK)

  9. Bitcoin futures

    DEFF Research Database (Denmark)

    Brøgger, Søren Bundgaard

    2018-01-01

    With the introduction of a futures market, Bitcoin exposure has become available to a broader group of investors who until now have been unable or unwilling to access the underlying Bitcoin market. The article finds that, on the face of it, the contracts favour speculators at the expense of hedgers and

  10. Quality assurance and reliability

    International Nuclear Information System (INIS)

    Normand, J.; Charon, M.

    1975-01-01

    Concern for obtaining high-quality products which will function properly when required to do so is nothing new - it is one manifestation of a conscientious attitude to work. However, the complexity and cost of equipment and the consequences of even temporary immobilization are such that it has become necessary to make special arrangements for obtaining high-quality products and examining what one has obtained. Each unit within an enterprise must examine its own work or arrange for it to be examined; a unit whose specific task is quality assurance is responsible for overall checking, but does not relieve other units of their responsibility. Quality assurance is a form of mutual assistance within an enterprise, designed to remove the causes of faults as far as possible. It begins very early in a project and continues through the ordering stage, construction, start-up trials and operation. Quality and hence reliability are the direct result of what is done at all stages of a project. They depend on constant attention to detail, for even a minor piece of poor workmanship can, in the case of an essential item of equipment, give rise to serious operational difficulties

  11. Reliability of using circulating tumor cells for detecting epidermal growth factor receptor mutation status in advanced non-small-cell lung cancer patients: a meta-analysis and systematic review

    Directory of Open Access Journals (Sweden)

    Hu F

    2018-03-01

    Full Text Available Fang Hu,* Xiaowei Mao,* Yujun Zhang, Xiaoxuan Zheng, Ping Gu, Huimin Wang, Xueyan Zhang; Department of Pulmonary Medicine, Shanghai Chest Hospital, Shanghai Jiao Tong University, Shanghai, People's Republic of China. *These authors contributed equally to this work. Purpose: To evaluate the clinical value of circulating tumor cells as a surrogate to detect epidermal growth factor receptor mutation in advanced non-small-cell lung cancer (NSCLC) patients. Methods: We searched the electronic databases, and all articles meeting predetermined selection criteria were included in this study. The pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio were calculated. The evaluation indexes of the diagnostic performance were the summary receiver operating characteristic (SROC) curve and the area under the SROC curve. Results: Eight eligible publications with 255 advanced NSCLC patients were included in this meta-analysis. Taking tumor tissues as reference, the pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio of circulating tumor cells for detecting the epidermal growth factor receptor mutation status were found to be 0.82 (95% confidence interval [CI]: 0.50–0.95), 0.95 (95% CI: 0.24–1.00), 16.81 (95% CI: 0.33–848.62), 0.19 (95% CI: 0.06–0.64), and 86.81 (95% CI: 1.22–6,154.15), respectively. The area under the SROC curve was 0.92 (95% CI: 0.89–0.94). The subgroup analysis showed that the factors of blood volume, histological type, EGFR-tyrosine kinase inhibitor therapy, and circulating tumor cell and tissue test methods for EGFR accounted for the significant difference in the pooled specificity. No significant difference was found between the pooled sensitivities of the subgroups. Conclusion: Our meta-analysis confirmed that circulating tumor cells are a good surrogate for
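The pooled quantities reported above can be illustrated with naive fixed pooling of per-study 2x2 tables. The counts below are hypothetical, and a formal meta-analysis such as this one would use a bivariate random-effects model rather than simple summation:

```python
def pooled_accuracy(tables):
    """Naively pool per-study 2x2 tables (tp, fp, fn, tn) and return
    sensitivity, specificity, and the diagnostic odds ratio.
    A fixed-pooling illustration only, not a random-effects model."""
    tp = sum(t[0] for t in tables); fp = sum(t[1] for t in tables)
    fn = sum(t[2] for t in tables); tn = sum(t[3] for t in tables)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    dor = (tp * tn) / (fp * fn) if fp and fn else float("inf")
    return sens, spec, dor

# Hypothetical studies, each as (tp, fp, fn, tn) against the tissue reference.
sens, spec, dor = pooled_accuracy([(18, 1, 4, 20), (25, 2, 5, 30)])
```

The diagnostic odds ratio combines both error rates into one figure of merit; values far above 1, like the 86.81 reported above, indicate strong discrimination.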

  12. Reliability of Oronasal Fistula Classification.

    Science.gov (United States)

    Sitzman, Thomas J; Allori, Alexander C; Matic, Damir B; Beals, Stephen P; Fisher, David M; Samson, Thomas D; Marcus, Jeffrey R; Tse, Raymond W

    2018-01-01

    Objective Oronasal fistula is an important complication of cleft palate repair that is frequently used to evaluate surgical quality, yet reliability of fistula classification has never been examined. The objective of this study was to determine the reliability of oronasal fistula classification both within individual surgeons and between multiple surgeons. Design Using intraoral photographs of children with repaired cleft palate, surgeons rated the location of palatal fistulae using the Pittsburgh Fistula Classification System. Intrarater and interrater reliability scores were calculated for each region of the palate. Participants Eight cleft surgeons rated photographs obtained from 29 children. Results Within individual surgeons, reliability for each region of the Pittsburgh classification ranged from moderate to almost perfect (κ = .60-.96). By contrast, reliability between surgeons was lower, ranging from fair to substantial (κ = .23-.70). Between-surgeon reliability was lowest for the junction of the soft and hard palates (κ = .23). Within-surgeon and between-surgeon reliability were almost perfect for the more general classification of fistula in the secondary palate (κ = .95 and κ = .83, respectively). Conclusions This is the first reliability study of fistula classification. We show that the Pittsburgh Fistula Classification System is reliable when used by an individual surgeon, but less reliable when used among multiple surgeons. Comparisons of fistula occurrence among surgeons may be subject to less bias if they use the more general classification of "presence or absence of fistula of the secondary palate" rather than the Pittsburgh Fistula Classification System.
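
    The κ statistics quoted above are Cohen's kappa values; a minimal two-rater sketch (toy labels, not the study's ratings):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / n ** 2  # chance agreement
    return (observed - expected) / (1 - expected)

# Toy example: two raters classifying four fistulae by palate region.
print(cohens_kappa(["hard", "soft", "hard", "soft"],
                   ["hard", "soft", "soft", "soft"]))  # → 0.5
```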

  13. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  14. Innovation and future in Westinghouse

    International Nuclear Information System (INIS)

    Congedo, T.; Dulloo, A.; Goosen, J.; Llovet, R.

    2007-01-01

    For the past six years, Westinghouse has used a Road Map process to direct technology development in a way that integrates the efforts of our businesses to address the needs of our customers and respond to significant drivers in the evolving business environment. As the nuclear industry experiences a resurgence, it is ever more necessary that we increase our planning horizon to 10-15 years in the future so as to meet the expectations of our customers. In the Future Point process, driven by the methods of Design for Six Sigma (DFSS), Westinghouse considers multiple possible future scenarios to plan long-term evolutionary and revolutionary development that can reliably create the major products and services of the future market. The products and services of the future stretch the imagination beyond what we provide today. However, the journey to these stretch targets prompts key development milestones that will help deliver ideas useful for nearer-term products. (Author) 1 refs

  15. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  16. Modeling high-Power Accelerators Reliability-SNS LINAC (SNS-ORNL); MAX LINAC (MYRRHA)

    International Nuclear Information System (INIS)

    Pitigoi, A. E.; Fernandez Ramos, P.

    2013-01-01

    Improving reliability has recently become a very important objective in the field of particle accelerators. The particle accelerators in operation are constantly undergoing modifications, and improvements are implemented using new technologies, more reliable components, or redundant schemes (to obtain more reliability, strength, more power, etc.). A reliability model of the SNS (Spallation Neutron Source) LINAC has been developed within the MAX project, and an analysis of the accelerator systems' reliability has been performed using the Risk Spectrum reliability analysis software. The analysis results have been evaluated by comparison with the SNS operational data. Results and conclusions are presented in this paper, oriented to identify design weaknesses and provide recommendations for improving the reliability of the MYRRHA linear accelerator. The SNS reliability model developed for the MAX preliminary design phase indicates possible avenues for further investigation that could be needed to improve the reliability of high-power accelerators, in view of the future reliability targets of ADS accelerators.

  17. Research on Connection and Function Reliability of the Oil&Gas Pipeline System

    Directory of Open Access Journals (Sweden)

    Xu Bo

    2017-01-01

    Full Text Available Pipeline transportation is the optimal way to deliver energy in terms of safety, efficiency, and environmental protection. Because of the complexity of the pipeline's external environment, including geological hazards and social and cultural influences, it is a great challenge to operate a pipeline safely and reliably; pipeline reliability therefore becomes an important issue. Based on classical reliability theory, an analysis of the pipeline system is carried out, a reliability model of the pipeline system is built, and the calculation is addressed thereafter. The connection and function reliability model is then applied to an active pipeline system; using the proposed methodology, the connection reliability and function reliability are obtained. This paper is the first to consider connection and function reliability separately, a significant contribution to establishing a mathematical reliability model of the pipeline system, and it provides fundamental groundwork for future pipeline reliability research.
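
    In the classical reliability theory the abstract builds on, segments in series multiply their reliabilities, while redundant parallel branches fail only if every branch fails. A sketch (values illustrative, not from the paper's case study):

```python
from math import prod

def series_reliability(segments):
    """Series chain (e.g. pipeline segments): any failure fails the system."""
    return prod(segments)

def parallel_reliability(branches):
    """Redundant branches: the system fails only if all branches fail."""
    return 1 - prod(1 - r for r in branches)

# Three segments in series, then a looped (redundant) section.
print(round(series_reliability([0.99, 0.98, 0.97]), 4))  # → 0.9411
print(round(parallel_reliability([0.9, 0.9]), 2))        # → 0.99
```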

  18. Accountability for enhanced reliability for ALWRs

    International Nuclear Information System (INIS)

    Sanford, M.O.

    1993-01-01

    Advanced nuclear units must compete with alternative generating technologies for inclusion in a utility's future planning. Operating reliability, or capacity factor, and operating cost are important factors in this competition. The ALWR Utility Requirements Document has established targets for both ALWR capacity factors (87%) and O&M costs (13–16 mils/kWh), but recognizes that these O&M cost targets need to be lower. The recently published USCEA cost study estimates O&M costs on the order of 6 mils/kWh. Meeting, or better yet improving on, these targets is essential for nuclear to be competitive. A number of specific items have been included in the Utility Requirements Document to improve reliability based on experience with operating units. For the most part, these involve either avoiding known problems or simplifying the design. Vendor design organizations are generally complying with the URD, and this is expected to produce an improved design both in terms of reliability and ease of operation and maintenance. As one embarks on First-of-a-Kind Engineering (FOAKE), however, it is useful to revisit the issue of responsibility for assuring that reliability improvements are embodied in the new plant designs. The vendor design team must ultimately assume this responsibility. While utility operating and maintenance experience is an essential input to an effective design, decisions on how best to design for this experience remain a designer function. This poses a dilemma for the designer in that such decisions seem invariably to require a trade-off between the capital cost of the product and the ultimate O&M cost. Since traditionally only capital cost has had a direct, measurable impact on the vendor, it has received the greater weight. A more appropriate balance must be struck in the future, however, for nuclear to be competitive. Utilities must demand a balance in the design of ALWRs which truly reflects the total cost of power.

  19. Reliability of reactor materials

    International Nuclear Information System (INIS)

    Toerroenen, K.; Aho-Mantila, I.

    1986-05-01

    This report is the final technical report of the fracture mechanics part of the Reliability of Reactor Materials Programme, which was carried out at the Technical Research Centre of Finland (VTT) through the years 1981 to 1983. Research and development work was carried out in five major areas, viz. statistical treatment and modelling of cleavage fracture, crack arrest, ductile fracture, instrumented impact testing, as well as comparison of numerical and experimental elastic-plastic fracture mechanics. In the area of cleavage fracture, the critical variables affecting the fracture of steels are considered within the framework of a statistical model, the so-called WST model. Comparison of fracture toughness values predicted by the model and corresponding experimental values shows excellent agreement for a variety of microstructures. Different possibilities for using the model are discussed. The development work in the area of crack arrest testing concentrated on crack starter properties, the test arrangement, and computer control. A computerized elastic-plastic fracture testing method with a variety of test specimen geometries over a large temperature range was developed to the routine stage. Ductile fracture characteristics of reactor pressure vessel steel A533B and comparable weld material are given. The features of a new, patented instrumented impact tester are described. Experimental and theoretical comparisons between the new and conventional testers indicated clearly the improvements achieved with the new tester. A comparison of numerical and experimental elastic-plastic fracture mechanics capabilities at VTT was carried out. The comparison consisted of two-dimensional linear elastic as well as elastic-plastic finite element analysis of four specimen geometries and equivalent experimental tests. (author)

  20. Reliability of Broadcast Communications Under Sparse Random Linear Network Coding

    OpenAIRE

    Brown, Suzie; Johnson, Oliver; Tassi, Andrea

    2018-01-01

    Ultra-reliable Point-to-Multipoint (PtM) communications are expected to become pivotal in networks offering future dependable services for smart cities. In this regard, sparse Random Linear Network Coding (RLNC) techniques have been widely employed to provide an efficient way to improve the reliability of broadcast and multicast data streams. This paper addresses the pressing concern of providing a tight approximation to the probability of a user recovering a data stream protected by this kin...

  1. Field reliability of electronic systems

    International Nuclear Information System (INIS)

    Elm, T.

    1984-02-01

    This report investigates, through several examples from the field, the reliability of electronic units in a broader sense. That is, it treats not just random parts failure, but also inadequate reliability design and (externally and internally) induced failures. The report is not meant to be merely an indication of the state of the art of the reliability prediction methods we know, but also a contribution to the investigation of man-machine interplay in the operation and repair of electronic equipment. The report firmly links electronics reliability to safety and risk analysis approaches with a broader, system-oriented view of reliability prediction and with post-failure stress analysis. It is intended to reveal, in a qualitative manner, the existence of symptom and cause patterns. It provides a background for further investigations to identify the detailed mechanisms of the faults and the remedial actions and precautions for achieving cost-effective reliability. (author)

  2. Reliability Assessment Of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2014-01-01

    Reduction of cost of energy for wind turbines are very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability but also not be too costly (and safe). This paper presents models...... for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades, substructure and foundation. But since the function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system also reliability aspects...... of these components are discussed and it is described how their reliability influences the reliability of the structural components. Two illustrative examples are presented considering uncertainty modeling, reliability assessment and calibration of partial safety factors for structural wind turbine components exposed...

  3. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2014-01-01

    This book shows how to build in, evaluate, and demonstrate reliability and availability of components, equipment, and systems. It presents the state-of-the-art of reliability engineering, both in theory and practice, and is based on the author's more than 30 years' experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New are, in particular, a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, and a refinement of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of problems for homework has been extended. Methods & tools are given in a way that they can be tailored to cover different reliability requirement levels and be used for safety analysis. Because of the Appendice...

  4. Reliability of Wireless Sensor Networks

    Science.gov (United States)

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs considering the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of the power consumption on the reliability of WSNs. PMID:25157553

  5. Reliability analysis of reactor protection systems

    International Nuclear Information System (INIS)

    Alsan, S.

    1976-07-01

    A theoretical mathematical study of reliability is presented and the concepts subsequently defined applied to the study of nuclear reactor safety systems. The theory is applied to investigations of the operational reliability of the Siloe reactor from the point of view of rod drop. A statistical study conducted between 1964 and 1971 demonstrated that most rod drop incidents arose from circumstances associated with experimental equipment (new set-ups). The reliability of the most suitable safety system for some recently developed experimental equipment is discussed. Calculations indicate that if all experimental equipment were equipped with these new systems, only 1.75 rod drop accidents would be expected to occur per year on average. It is suggested that all experimental equipment should be equipped with these new safety systems and tested every 21 days. The reliability of the new safety system currently being studied for the Siloe reactor was also investigated. The following results were obtained: definite failures must be detected immediately as a result of the disturbances produced; the repair time must not exceed a few hours; the equipment must be tested every week. Under such conditions, the rate of accidental rod drops is about 0.013 on average per year. The level of nondefinite failures is less than 10⁻⁶ per hour and the level of nonprotection 1 hour per year. (author)
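
    The trade-off between failure rate and test interval in the abstract follows the standard approximation for a periodically tested standby system, mean unavailability ≈ λτ/2. A sketch using figures of the same order as the abstract's (illustrative, not the Siloe calculations):

```python
def mean_unavailability(failure_rate_per_hour, test_interval_hours):
    """First-order mean fractional dead time of a periodically tested
    standby safety system; valid while rate * interval << 1."""
    return failure_rate_per_hour * test_interval_hours / 2

WEEK = 24 * 7  # weekly testing, as recommended for the new system

u = mean_unavailability(1e-6, WEEK)
print(round(u * 24 * 365, 2))  # hours of unprotection per year → 0.74
```

    With a 10⁻⁶/h failure rate and weekly testing, the dead time comes out below one hour per year, the same order as the nonprotection level quoted above.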

  6. Reliability Estimation for Digital Instrument/Control System

    International Nuclear Information System (INIS)

    Yang, Yaguang; Sydnor, Russell

    2011-01-01

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that the DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest for reliability and risk analyses for safety critical DI and C systems in regulatory organizations, such as The United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, computer system, signal processing software, control and protection software, power supply system, and actuators. Some of these components are hardware, such as sensors and actuators, their failure mechanisms are well understood, and the traditional reliability model and estimation methods can be directly applied. But many of these components are firmware which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we will propose a reliability estimation method for the entire DI and C system reliability using a recently developed software reliability estimation method and a traditional hardware reliability estimation method

  7. Reliability Estimation for Digital Instrument/Control System

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yaguang; Sydnor, Russell [U.S. Nuclear Regulatory Commission, Washington, D.C. (United States)

    2011-08-15

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that the DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest for reliability and risk analyses for safety critical DI and C systems in regulatory organizations, such as The United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, computer system, signal processing software, control and protection software, power supply system, and actuators. Some of these components are hardware, such as sensors and actuators, their failure mechanisms are well understood, and the traditional reliability model and estimation methods can be directly applied. But many of these components are firmware which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we will propose a reliability estimation method for the entire DI and C system reliability using a recently developed software reliability estimation method and a traditional hardware reliability estimation method.

  8. ECLSS Reliability for Long Duration Missions Beyond Lower Earth Orbit

    Science.gov (United States)

    Sargusingh, Miriam J.; Nelson, Jason

    2014-01-01

    Reliability has been highlighted by NASA as critical to future human space exploration particularly in the area of environmental controls and life support systems. The Advanced Exploration Systems (AES) projects have been encouraged to pursue higher reliability components and systems as part of technology development plans. However, there is no consensus on what is meant by improving on reliability; nor on how to assess reliability within the AES projects. This became apparent when trying to assess reliability as one of several figures of merit for a regenerable water architecture trade study. In the Spring of 2013, the AES Water Recovery Project (WRP) hosted a series of events at the NASA Johnson Space Center (JSC) with the intended goal of establishing a common language and understanding of our reliability goals and equipping the projects with acceptable means of assessing our respective systems. This campaign included an educational series in which experts from across the agency and academia provided information on terminology, tools and techniques associated with evaluating and designing for system reliability. The campaign culminated in a workshop at JSC with members of the ECLSS and AES communities with the goal of developing a consensus on what reliability means to AES and identifying methods for assessing our low to mid-technology readiness level (TRL) technologies for reliability. This paper details the results of the workshop.

  9. Robot Futures

    DEFF Research Database (Denmark)

    Christoffersen, Anja; Grindsted Nielsen, Sally; Jochum, Elizabeth Ann

    Robots are increasingly used in health care settings, e.g., as homecare assistants and personal companions. One challenge for personal robots in the home is acceptance. We describe an innovative approach to influencing the acceptance of care robots using theatrical performance. Live performance...... is a useful testbed for developing and evaluating what makes robots expressive; it is also a useful platform for designing robot behaviors and dialogue that result in believable characters. Therefore theatre is a valuable testbed for studying human-robot interaction (HRI). We investigate how audiences...... perceive social robots interacting with humans in a future care scenario through a scripted performance. We discuss our methods and initial findings, and outline future work....

  10. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
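
    An a priori power calculation of the kind the authors' tool automates can be sketched with the usual normal approximation for a two-sided, two-sample comparison (standard formula; this is not the published FreeSurfer tool):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, power=0.8, alpha=0.05):
    """Sample size per group for a two-sided two-sample z-comparison,
    where effect_size is the standardized difference (Cohen's d)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

print(n_per_group(0.5))  # medium effect → 63
print(n_per_group(0.9))  # large effect  → 20
```

    Region-specific numbers like those reported above additionally depend on each measure's reliability and variance, which shrink or inflate the achievable effect size.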

  11. Inter- and intra-rater reliability of 3D kinematics during maximum mouth opening of asymptomatic subjects.

    Science.gov (United States)

    Calixtre, Leticia Bojikian; Nakagawa, Theresa Helissa; Alburquerque-Sendín, Francisco; da Silva Grüninger, Bruno Leonardo; de Sena Rosa, Lianna Ramalho; Oliveira, Ana Beatriz

    2017-11-07

    Previous studies evaluated 3D human jaw movements using kinematic analysis systems during mouth opening, but information on the reliability of such measurements is still scarce. The purpose of this study was to analyze within- and between-session reliabilities, inter-rater reliability, standard error of measurement (SEM), minimum detectable change (MDC) and consistency of agreement across raters and sessions of 3D kinematic variables during maximum mouth opening (MMO). Thirty-six asymptomatic subjects from both genders were evaluated on two different days, five to seven days apart. Subjects performed three MMO movements while kinematic data were collected. Intraclass correlation coefficient (ICC), SEM and MDC were calculated for all variables, and Bland-Altman plots were constructed. Jaw radius and width were the most reproducible variables (ICC>0.81) and demonstrated minor error. Incisor displacement during MMO and angular movements in the sagittal plane presented good reliability (ICC from 0.61 to 0.8) and small errors and, consequently, could be used in future studies with the same methodology and population. The variables with smaller amplitudes (condylar translations during mouth opening and closing and mandibular movements on the frontal and transversal planes) were less reliable (ICC<0.61); caution is therefore recommended when interpreting mandibular movements in the frontal and transversal planes. Copyright © 2017 Elsevier Ltd. All rights reserved.
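
    The SEM and MDC reported above follow the standard formulas SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM; a sketch with illustrative numbers (not the study's values):

```python
from math import sqrt

def sem_mdc(sample_sd, icc, z=1.96):
    """Standard error of measurement and 95% minimal detectable change."""
    sem = sample_sd * sqrt(1 - icc)
    mdc = z * sqrt(2) * sem
    return sem, mdc

# E.g. a displacement variable with SD = 5.0 mm and ICC = 0.75:
sem, mdc = sem_mdc(5.0, 0.75)
print(round(sem, 2), round(mdc, 2))  # → 2.5 6.93
```

    A higher ICC shrinks both SEM and MDC, which is why the low-amplitude variables above, with their lower ICCs, carry the largest measurement error.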

  12. The value of service reliability

    OpenAIRE

    Benezech , Vincent; Coulombel , Nicolas

    2013-01-01

    International audience; This paper studies the impact of service frequency and reliability on the choice of departure time and the travel cost of transit users. When the user has (α, β, γ) scheduling preferences, we show that the optimal head start decreases with service reliability, as expected. It does not necessarily decrease with service frequency, however. We derive the value of service headway (VoSH) and the value of service reliability (VoSR), which measure the marginal effect on the e...

  13. Distribution-Independent Reliable Learning

    OpenAIRE

    Kanade, Varun; Thaler, Justin

    2014-01-01

    We study several questions in the reliable agnostic learning framework of Kalai et al. (2009), which captures learning tasks in which one type of error is costlier than others. A positive reliable classifier is one that makes no false positive errors. The goal in the positive reliable agnostic framework is to output a hypothesis with the following properties: (i) its false positive error rate is at most $\epsilon$, (ii) its false negative error rate is at most $\epsilon$ more than that of the...

  14. Future directions

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    Topics presented concerning future developments in risk analysis include: safety goals, US severe accident policy, code developments, research programs, analyses and operator actions, and linking with deterministic analyses. The principal consideration in risk is defined as protection of both the general population and nearby residents. The principal goal should be consistent with the risk of other man-caused activities, reflect cost-benefit considerations after minimum safety levels are achieved, and be proportional to the benefits to be gained

  15. A quantitative calculation for software reliability evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young-Jun; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    To meet these regulatory requirements, the software used in the nuclear safety field has been assured through development, validation, safety analysis, and quality assurance activities throughout the entire process life cycle, from the planning phase to the installation phase. A variety of activities, such as quality assurance activities, are also required to improve the quality of the software. However, there are limits to how far such activities can ensure that quality is improved enough. Therefore, efforts continue to calculate software reliability for a quantitative, rather than qualitative, evaluation. In this paper, we propose a quantitative calculation method for software used for a specific operation of a digital controller in an NPP. After injecting random faults into the internal space of a developed controller and calculating the ability of diagnostic software to detect the injected faults, we can evaluate the software reliability of a digital controller in an NPP. We calculate the software reliability of the controller using a method that differs from the traditional one: it calculates the fault detection coverage after injecting faults into the software memory space, rather than assessing activities through the life cycle process. Our approach is differentiated by creating a new definition of faults, imitating software faults using the hardware, and assigning consideration and weights to the injected faults.
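
    The core of the proposed measure reduces to a fault-detection coverage ratio over injected faults; a minimal sketch (the counts and the `detection_coverage` helper are illustrative, and the paper's fault weighting is omitted):

```python
def detection_coverage(injected, detected):
    """Fraction of injected faults flagged by the diagnostic software."""
    if injected <= 0:
        raise ValueError("at least one fault must be injected")
    if detected > injected:
        raise ValueError("cannot detect more faults than were injected")
    return detected / injected

# E.g. 1000 random faults injected into controller memory, 987 detected:
print(detection_coverage(injected=1000, detected=987))  # → 0.987
```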

  16. Characteristics and application study of AP1000 NPPs equipment reliability classification method

    International Nuclear Information System (INIS)

    Guan Gao

    2013-01-01

    The AP1000 nuclear power plant applies an integrated approach to establish equipment reliability classification, which combines probabilistic risk assessment techniques, the maintenance rule administrative program, power production reliability classification, and the functional equipment group bounding method, eventually classifying equipment reliability into four levels. This classification process and its result differ considerably from classical RCM and streamlined RCM. This paper studies the characteristics of the AP1000 equipment reliability classification approach, argues that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommends using a combined RCM method to establish the future equipment reliability programs of AP1000 nuclear power plants. (authors)

  17. System Reliability for Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Marquez-Dominguez, Sergio; Sørensen, John Dalsgaard

    2013-01-01

    E). In consequence, a rational treatment of uncertainties is carried out in order to assess the reliability of critical details in OWTs. Limit state equations are formulated for fatigue critical details which are not influenced by wake effects generated in offshore wind farms. Furthermore, typical bi-linear S-N curves are considered for reliability verification according to international design standards of OWTs. System effects become important for each substructure with many potential fatigue hot spots; therefore, in this paper a framework for system effects is presented. This information can be, e.g., no detection of cracks in inspections, or measurements from condition monitoring systems. Finally, an example is established to illustrate the practical application of this framework for a jacket-type wind turbine substructure considering system effects.
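A fatigue limit state of the kind formulated in this abstract can be sketched with Miner's rule and a bi-linear S-N curve. The curve constants below are illustrative values in the style of a DNV-type detail category, and the two-bin stress-range histogram is invented, so this is a sketch of the technique rather than the paper's actual model.

```python
def miner_damage(stress_ranges, cycles, log_a1, m1, log_a2, m2, s_switch):
    """Accumulate Miner's-rule fatigue damage with a bi-linear S-N curve:
    N = a1 * S^(-m1) above the switch stress range, N = a2 * S^(-m2) below."""
    damage = 0.0
    for s, n in zip(stress_ranges, cycles):
        if s >= s_switch:
            n_allow = 10.0**log_a1 * s**(-m1)
        else:
            n_allow = 10.0**log_a2 * s**(-m2)
        damage += n / n_allow
    return damage

# Illustrative numbers only (slopes m1=3, m2=5 as in typical standards;
# stress ranges in MPa, cycle counts invented for a 1-year histogram).
d = miner_damage([80.0, 30.0], [1e5, 5e6], 12.164, 3.0, 15.606, 5.0, 52.63)
g = 1.0 - d  # limit state g = Delta - D with Delta = 1; failure when g < 0
print(f"Miner damage D = {d:.3f}, limit state g = {g:.3f}")
```

In a reliability analysis, Delta, the S-N parameters, and the stress ranges would be random variables, and the probability of g < 0 over the service life would be the quantity verified against the target reliability.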

  18. Model-based fault detection algorithm for photovoltaic system monitoring

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Saidi, Ahmed

    2018-01-01

    Reliable detection of faults in PV systems plays an important role in improving their reliability, productivity, and safety. This paper addresses the detection of faults in the direct current (DC) side of photovoltaic (PV) systems using a
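A generic model-based scheme of the kind this abstract refers to compares measured DC-side output against a model prediction and tests the residuals. The simple k-sigma threshold below is an illustrative stand-in for whatever statistical test the paper actually uses, and all data are invented.

```python
import statistics

def fit_threshold(train_residuals, k=3.0):
    """Estimate the fault-free residual mean and a k-sigma band
    from a training window assumed to contain no faults."""
    mu = statistics.fmean(train_residuals)
    band = k * statistics.stdev(train_residuals)
    return mu, band

def detect(residuals, mu, band):
    """Flag samples whose residual leaves the fault-free band."""
    return [abs(r - mu) > band for r in residuals]

# Invented data: residuals (measured - predicted DC power, in W) hover
# near zero when healthy; a DC-side fault then drops output by ~50 W.
train = [0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2, 0.0]
mu, band = fit_threshold(train)
test = [0.2, -0.1, -50.1, -49.8]
flags = detect(test, mu, band)
print(flags)  # healthy samples pass, faulty samples are flagged
```

The key design choice is that the threshold is calibrated only on fault-free data; more sensitive tests (e.g., charts that accumulate small shifts over time) follow the same residual-generation step.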

  19. Future climate

    International Nuclear Information System (INIS)

    La Croce, A.

    1991-01-01

    According to George Woodwell, founder of the Woods Hole Research Center, the combustion of fossil fuels, deforestation, and accelerated respiration add a net 3 to 5 billion tonnes of carbon annually, in the form of carbon dioxide, to the 750 billion tonnes already present in the earth's atmosphere. Around the world, scientists investigating the probable effects of this increase on the earth's future climate are now formulating coupled atmosphere and ocean current models that account for the carbon dioxide exchange mechanisms, dependent on water temperature and salinity, acting between the atmosphere and the deep layers of the ocean

  20. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context: Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10⁶) that will require fully automated data-processing pipelines to analyze the data, extract crucial information, and ensure that all requirements are met. A fundamental element in these pipelines is to associate with each galaxy redshift measurement a quality, or reliability, estimate. Aims: In this work, we introduce a new approach to automating the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation in a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features of the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey are exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but because of the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for
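The two-stage scheme in this abstract, extracting descriptive features from each redshift posterior PDF and then partitioning the feature space with unsupervised clustering, can be sketched as follows. The single dispersion feature and the plain two-means clustering are illustrative stand-ins for the descriptors and algorithms actually used in the paper.

```python
import math

def pdf_dispersion(z_grid, pdf):
    """Dispersion (standard deviation) of a redshift posterior PDF,
    one descriptive feature that could be fed to a clusterer."""
    norm = sum(pdf)
    p = [v / norm for v in pdf]
    mean = sum(z * w for z, w in zip(z_grid, p))
    return math.sqrt(sum(w * (z - mean) ** 2 for z, w in zip(z_grid, p)))

def two_means(values, iters=20):
    """Minimal 1-D k-means (k=2), deterministically initialized at the extremes."""
    c0, c1 = min(values), max(values)
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0 = sum(g0) / len(g0) if g0 else c0
        c1 = sum(g1) / len(g1) if g1 else c1
    return c0, c1

# Invented example: narrow (reliable) vs. broad (unreliable) Gaussian posteriors.
z_grid = [i * 0.01 for i in range(100)]
gauss = lambda mu, s: [math.exp(-0.5 * ((z - mu) / s) ** 2) for z in z_grid]
features = [pdf_dispersion(z_grid, gauss(0.5, s))
            for s in (0.01, 0.02, 0.03, 0.15, 0.20, 0.25)]
low, high = sorted(two_means(features))
print(f"cluster centers (PDF dispersion): {low:.3f} vs {high:.3f}")
```

Unlabeled PDFs would then be assigned the reliability label of the nearest cluster center, which is how the mock Euclid data are projected into the mapping in the abstract.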