WorldWideScience

Sample records for reliable detection methods

  1. Reliably detectable flaw size for NDE methods that use calibration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-04-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have artificial flaws of known size, such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for the detection of real flaws. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process in order to determine the reliably detectable flaw size.
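
    The record names POD analysis but not a specific model. As an illustration only, a minimal hit/miss POD fit in the spirit of MIL-HDBK-1823 can be written as a logistic regression of detection outcome against log flaw size; the flaw sizes, outcomes and the 90% detection level below are assumed values, not data from the paper.

    import numpy as np
    import statsmodels.api as sm

    # Illustrative hit/miss data: flaw sizes (mm) and detection outcomes (1 = hit).
    a = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5, 2.0])
    hit = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

    X = sm.add_constant(np.log(a))          # POD(a) = logistic(b0 + b1*ln a)
    fit = sm.Logit(hit, X).fit(disp=False)
    b0, b1 = fit.params

    def pod(size):
        return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(size))))

    # Flaw size detected with 90% probability (a90); an a90/95 value would
    # additionally require a confidence bound on the fitted curve.
    a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
    print(f"a50 = {np.exp(-b0 / b1):.2f} mm, a90 = {a90:.2f} mm")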

  2. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of the authors' damage detection method is given, with a focus on damage localization. It consists of comparing a histogram derived from measurement results with a large series of histograms, namely the damage location indexes for all locations along the beam, obtained by calculation. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found the Minkowski distance to be the method providing the best results. It was tested for numerous locations, using real measurement results as well as results artificially degraded by noise, proving its reliability.
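
    The three estimators named in the record are standard and can be sketched directly; the histograms below are invented, and selecting a damage location by minimum Minkowski distance is only meant to mirror the comparison step described in the abstract.

    import numpy as np

    def minkowski(h1, h2, p=2):
        """Minkowski-form distance between two histograms."""
        return np.sum(np.abs(h1 - h2) ** p) ** (1.0 / p)

    def kl_divergence(h1, h2, eps=1e-12):
        """Kullback-Leibler divergence D(h1 || h2) for normalized histograms."""
        p1 = h1 / h1.sum()
        p2 = h2 / h2.sum()
        return np.sum(p1 * np.log((p1 + eps) / (p2 + eps)))

    def intersection(h1, h2):
        """Histogram intersection similarity (1 = identical normalized histograms)."""
        return np.sum(np.minimum(h1, h2)) / min(h1.sum(), h2.sum())

    # Illustrative use: compare a "measured" histogram against candidate
    # damage-location histograms and pick the closest one.
    measured = np.array([0.1, 0.3, 0.4, 0.2])
    candidates = [np.array([0.1, 0.25, 0.45, 0.2]), np.array([0.4, 0.3, 0.2, 0.1])]
    best = min(range(len(candidates)), key=lambda i: minkowski(measured, candidates[i]))
    print("closest candidate location:", best)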

  3. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models.

    Science.gov (United States)

    Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E

    2018-05-10

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. The design of a self-tuning methodology and its implementation are presented in this paper, to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments was selected to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving-assistance user scenario connecting a network of five LiDAR sensors is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrate that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds.
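
    The abstract names Q-learning for choosing the number of LiDAR sensors but gives no state, action or reward definitions; the tabular sketch below invents all of them (scene-complexity states and a reliability-minus-cost reward) purely to illustrate the update rule, and is not the authors' framework.

    import numpy as np

    rng = np.random.default_rng(0)

    n_sensors_max = 5          # actions: use 1..5 LiDAR sensors
    n_states = 3               # assumed scene complexity: low / medium / high
    Q = np.zeros((n_states, n_sensors_max))
    alpha, gamma, eps = 0.1, 0.9, 0.1

    def reward(state, n_sensors):
        # Assumed reward: detection reliability grows with sensor count,
        # penalized by a per-sensor cost.
        reliability = 1.0 - (0.4 + 0.2 * state) ** n_sensors
        return reliability - 0.05 * n_sensors

    for episode in range(5000):
        s = rng.integers(n_states)
        a = rng.integers(n_sensors_max) if rng.random() < eps else int(np.argmax(Q[s]))
        r = reward(s, a + 1)
        s_next = rng.integers(n_states)        # scenes assumed independent here
        Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])

    print("sensors chosen per scene complexity:", np.argmax(Q, axis=1) + 1)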

  4. A Method for Improving Reliability of Radiation Detection using Deep Learning Framework

    International Nuclear Information System (INIS)

    Chang, Hojong; Kim, Tae-Ho; Han, Byunghun; Kim, Hyunduk; Kim, Ki-duk

    2017-01-01

    Radiation detection is an essential technology for the overall field of radiation and nuclear engineering. Previously, radiation detection technology relied on preparing, in advance, a table mapping input spectra to output spectra, which requires simulating numerous predicted output spectra using parameters that model the spectrum. In this paper, we propose a new technique to improve the performance of radiation detectors. The software in radiation detectors has stagnated for a while, with possible intrinsic simulation errors. In the proposed method, the input source is predicted from the output spectrum measured by the radiation detector using a deep neural network. With a highly complex model, we expect that the complex pattern between the data and the label can be captured well. Furthermore, a radiation detector should be calibrated regularly and beforehand; we propose a method to calibrate radiation detectors using a GAN. We hope that the power of deep learning may also reach radiation detectors and bring substantial improvement to the field. With an improved radiation detector, detection would be more reliable, and many tasks remain to be solved using deep learning in the nuclear engineering community.
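
    The record does not specify the network or the detector model; the sketch below assumes a simple linear detector response matrix and uses a small scikit-learn multilayer perceptron to learn the inverse mapping from measured output spectra back to source fractions, only to illustrate the idea of data-driven spectrum unfolding.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)

    n_channels, n_sources = 64, 4
    R = np.abs(rng.normal(size=(n_channels, n_sources)))   # assumed detector response matrix

    # Training set: random source mixtures -> measured output spectra (+ noise).
    sources = rng.dirichlet(np.ones(n_sources), size=2000)
    spectra = sources @ R.T + rng.normal(scale=0.01, size=(2000, n_channels))

    model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
    model.fit(spectra, sources)                            # learn output spectrum -> input source

    test_source = rng.dirichlet(np.ones(n_sources))
    test_spectrum = (test_source @ R.T + rng.normal(scale=0.01, size=n_channels)).reshape(1, -1)
    print("true:", np.round(test_source, 2),
          "predicted:", np.round(model.predict(test_spectrum)[0], 2))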

  5. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments and then, independently, we assess the likely variability in team-to-team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)

  6. Is air-displacement plethysmography a reliable method of detecting ongoing changes in percent body fat within obese children involved in a weight management program?

    DEFF Research Database (Denmark)

    Ewane, Cecile; McConkey, Stacy A; Kreiter, Clarence D

    2010-01-01

    (percent body fat) over time. The gold standard method, hydrodensitometry, has severe limitations for the pediatric population. OBJECTIVE: This study examines the reliability of air-displacement plethysmography (ADP) in detecting percent body fat changes within obese children over time. METHODS: Percent...... body fat by ADP, weight, and body mass index (BMI) were measured for eight obese children aged 5-12 years enrolled in a weight management program over a 12-month period. These measurements were taken at initial evaluation, 1.5 months, 3 months, 6 months, and 12 months to monitor the progress...... of the subjects and detect any changes in these measures over time. Statistical analysis was used to determine the reliability of the data collected. RESULTS: The reliability estimate for percent body fat by ADP was 0.78. This was much lower than the reliability of BMI, 0.98, and weight measurements, 0...

  7. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  8. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...... of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text....

  9. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
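
    The proceedings mention, among other topics, a Bayesian estimate of the leak rate of a gas pipeline. A minimal conjugate gamma-Poisson sketch of that kind of calculation follows; the prior parameters and observed counts are assumed for illustration and are not taken from the course material.

    from scipy import stats

    # Gamma prior on the leak/failure rate lambda (events per year): assumed values.
    alpha_prior, beta_prior = 2.0, 4.0          # prior mean = 0.5 events/year

    # Observed data (assumed): 3 events over 10 years of operation.
    events, exposure_years = 3, 10.0

    # Conjugate update for a Poisson likelihood.
    alpha_post = alpha_prior + events
    beta_post = beta_prior + exposure_years

    posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
    print(f"posterior mean rate = {posterior.mean():.3f} /yr, "
          f"95% credible interval = {posterior.interval(0.95)}")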

  10. Reliability testing of tendon disease using two different scanning methods in patients with rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bruyn, George A W; Möller, Ingrid; Garrido, Jesus

    2012-01-01

    To assess the intra- and interobserver reliability of musculoskeletal ultrasonography (US) in detecting inflammatory and destructive tendon abnormalities in patients with RA using two different scanning methods.

  11. Spices, irradiation and detection methods

    International Nuclear Information System (INIS)

    Sjoeberg, A.M.; Manninen, M.

    1991-01-01

    This paper is about microbiological aspects of spices and microbiological methods to detect irradiated food. The proposed method is a combination of the Direct Epifluorescence Filter Technique (DEFT) and the Aerobic Plate Count (APC). The evidence for irradiation of spices is based on the demonstration of a higher DEFT count than APC. The principle was first tested in our earlier investigation on the detection of irradiation in whole spices. The combined DEFT+APC procedure was found to give a fairly reliable indication of whether or not a whole spice sample had been irradiated. The results are given (8 figs, 22 refs)

  12. Scenario based approach to structural damage detection and its value in a risk and reliability perspective

    DEFF Research Database (Denmark)

    Hovgaard, Mads Knude; Hansen, Jannick Balleby; Brincker, Rune

    2013-01-01

    A scenario- and vibration-based structural damage detection method is demonstrated through simulation. The method is Finite Element (FE) based. The value of the monitoring is calculated using structural reliability theory. A high cycle fatigue crack propagation model is assumed as the damage mecha......- and without monitoring. Monte Carlo Sampling (MCS) is used to estimate the probabilities, and the tower of an onshore NREL 5MW wind turbine is given as a calculation case.

  13. Soybean allergen detection methods--a comparison study

    DEFF Research Database (Denmark)

    Pedersen, M. Højgaard; Holzhauser, T.; Bisson, C.

    2008-01-01

    Soybean-containing products are widely consumed; thus, reliable methods for detection of soy in foods are needed in order to make appropriate risk assessment studies to adequately protect soy-allergic patients. Six methods were compared using eight food products with a declared content of soy...

  14. The reliability of magnetic resonance imaging in traumatic brain injury lesion detection

    NARCIS (Netherlands)

    Geurts, B.H.J.; Andriessen, T.M.J.C.; Goraj, B.M.; Vos, P.E.

    2012-01-01

    Objective: This study compares inter-rater reliability, lesion detection and clinical relevance of T2-weighted imaging (T2WI), Fluid Attenuated Inversion Recovery (FLAIR), T2*-gradient recalled echo (T2*-GRE) and Susceptibility Weighted Imaging (SWI) in Traumatic Brain Injury (TBI). Methods: Three

  15. Detecting long-term growth trends using tree rings: a critical evaluation of methods.

    Science.gov (United States)

    Peters, Richard L; Groenendijk, Peter; Vlam, Mart; Zuidema, Pieter A

    2015-05-01

    Tree-ring analysis is often used to assess long-term trends in tree growth. A variety of growth-trend detection methods (GDMs) exist to disentangle age/size trends in growth from long-term growth changes. However, these detrending methods strongly differ in approach, with possible implications for their output. Here, we critically evaluate the consistency, sensitivity, reliability and accuracy of the four most widely used GDMs: conservative detrending (CD) applies mathematical functions to correct for decreasing ring widths with age; basal area correction (BAC) transforms diameter into basal area growth; regional curve standardization (RCS) detrends individual tree-ring series using average age/size trends; and size class isolation (SCI) calculates growth trends within separate size classes. First, we evaluated whether these GDMs produce consistent results applied to an empirical tree-ring data set of Melia azedarach, a tropical tree species from Thailand. Three GDMs yielded similar results - a growth decline over time - but the widely used CD method did not detect any change. Second, we assessed the sensitivity (probability of correct growth-trend detection), reliability (100% minus probability of detecting false trends) and accuracy (whether the strength of imposed trends is correctly detected) of these GDMs, by applying them to simulated growth trajectories with different imposed trends: no trend, strong trends (-6% and +6% change per decade) and weak trends (-2%, +2%). All methods except CD showed high sensitivity, reliability and accuracy to detect strong imposed trends. However, these were considerably lower in the weak or no-trend scenarios. BAC showed good sensitivity and accuracy, but low reliability, indicating uncertainty of trend detection using this method. Our study reveals that the choice of GDM influences results of growth-trend studies. We recommend applying multiple methods when analysing trends and encourage performing sensitivity and reliability
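
    Of the four GDMs named in the record, basal area correction (BAC) is the simplest to make concrete; the sketch below only illustrates that one step (converting radial ring widths into annual basal area increments) on an invented ring-width series, not the study's data or full workflow.

    import numpy as np

    def basal_area_increments(ring_widths_cm):
        """Basal area correction (BAC): convert a series of radial ring widths
        into annual basal area increments, assuming a circular stem."""
        radius = np.cumsum(ring_widths_cm)            # cumulative radius (cm)
        basal_area = np.pi * radius ** 2              # stem cross-section (cm^2)
        return np.diff(basal_area, prepend=0.0)

    # Illustrative ring-width series with a weak negative trend (not real data).
    rw = 0.4 - 0.002 * np.arange(50) + np.random.default_rng(2).normal(0, 0.02, 50)
    bai = basal_area_increments(rw)
    print("mean BAI, first vs last decade:", bai[:10].mean(), bai[-10:].mean())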

  16. A New Method to Detect and Correct the Critical Errors and Determine the Software-Reliability in Critical Software-System

    International Nuclear Information System (INIS)

    Krini, Ossmane; Börcsök, Josef

    2012-01-01

    In order to use electronic systems comprising software and hardware components in safety-related and high-safety-related applications, it is necessary to meet the marginal risk numbers required by standards and legislative provisions. Existing processes and mathematical models are used to verify the risk numbers. On the hardware side, various accepted mathematical models, processes, and methods exist to provide the required proof. To this day, however, there are no closed models or mathematical procedures known that allow for a dependable prediction of software reliability. This work presents a method that provides a prognosis of the number of residual critical errors in software. Conventional models lack this ability, and at present there are no methods that forecast critical errors. The new method shows that an estimate of the number of residual critical errors in software systems is possible by using a combination of prediction models, the ratio of critical errors, and the total error number. Subsequently, the critical expected-value function at any point in time can be derived from the new solution method, provided the detection rate has been calculated using an appropriate estimation method. The presented method also makes it possible to estimate the critical failure rate. The approach is modelled on a real process and therefore describes two essential processes: detection and correction.

  17. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for thickness measurements of pipe walls using probability of detection (POD). The thicknesses of pipes are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by human factors of the inspectors and include some errors, because the inspectors differ in experience and frequency of inspection. In order to ensure the reliability of inspection results, POD is first used to evaluate experimental results of pipe-wall thickness inspection. We verify that the results differ between inspectors, including qualified inspectors. Second, two human factors that affect POD are identified. Finally, it is confirmed that POD can identify the human factors and ensure reliability for pipe-wall thickness inspections. (author)

  18. An Evaluation Method of Equipment Reliability Configuration Management

    Science.gov (United States)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. The evaluation method for equipment reliability configuration management determines the reliability management capabilities of an equipment development company. Reliability is not only designed in, but also achieved through management. This paper evaluates reliability management capabilities using a reliability configuration capability maturity model (RCM-CMM) evaluation method.

  19. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
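
    The article's improved stability transformation method and symmetric rank-one Hessian update are not reproduced in the record; for orientation only, the sketch below implements the standard first-order (HL-RF) search for the most probable point on an assumed linear limit state, which is the baseline the paper builds on.

    import numpy as np
    from scipy.stats import norm

    def form_hlrf(g, grad_g, n_dim, tol=1e-6, max_iter=100):
        """Standard HL-RF iteration for the most probable point in standard
        normal space; returns the reliability index beta and a Pf estimate."""
        u = np.zeros(n_dim)
        for _ in range(max_iter):
            gv, gr = g(u), grad_g(u)
            u_new = (gr @ u - gv) / (gr @ gr) * gr
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        beta = np.linalg.norm(u)
        return beta, norm.cdf(-beta)

    # Illustrative linear limit state g(u) = 3 - u1 - u2 (exact beta = 3/sqrt(2)).
    g = lambda u: 3.0 - u[0] - u[1]
    grad_g = lambda u: np.array([-1.0, -1.0])
    print(form_hlrf(g, grad_g, 2))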

  20. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and a fluorescence camera (FC) in detecting plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with the FC and a digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare the different conditions of the samples and to assess inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of the area covered by disclosed plaque in the FC images presented the highest discriminatory powers. CONCLUSION: The Turesky index and FC images with disclosing present good reliability and discriminatory power in quantifying dental plaque.
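
    The record reports inter-examiner reproducibility assessed with Kappa tests; a minimal illustration of that statistic on invented examiner scores (not the study's data):

    from sklearn.metrics import cohen_kappa_score

    # Illustrative plaque scores (Turesky index 0-5) from two independent examiners.
    examiner_1 = [0, 1, 2, 3, 3, 4, 5, 2, 1, 0, 4, 3]
    examiner_2 = [0, 1, 2, 3, 2, 4, 5, 2, 2, 0, 4, 3]

    print("Cohen's kappa:", round(cohen_kappa_score(examiner_1, examiner_2), 2))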

  1. Reliability of the MicroScan WalkAway PC21 panel in identifying and detecting oxacillin resistance in clinical coagulase-negative staphylococci strains.

    Science.gov (United States)

    Olendzki, A N; Barros, E M; Laport, M S; Dos Santos, K R N; Giambiagi-Demarval, M

    2014-01-01

    The purpose of this study was to determine the reliability of the MicroScan WalkAway PosCombo21 (PC21) system for the identification of coagulase-negative staphylococci (CNS) strains and the detection of oxacillin resistance. Using molecular and phenotypic methods, 196 clinical strains were evaluated. The automated system demonstrated 100 % reliability for the identification of the clinical strains Staphylococcus haemolyticus, Staphylococcus hominis and Staphylococcus cohnii; 98.03 % reliability for the identification of Staphylococcus epidermidis; 70 % reliability for the identification of Staphylococcus lugdunensis; 40 % reliability for the identification of Staphylococcus warneri; and 28.57 % reliability for the identification of Staphylococcus capitis, but no reliability for the identification of Staphylococcus auricularis, Staphylococcus simulans and Staphylococcus xylosus. We concluded that the automated system provides accurate results for the more common CNS species but often fails to accurately identify less prevalent species. For the detection of oxacillin resistance, the automated system showed 100 % specificity and 90.22 % sensitivity. Thus, the PC21 panel detects oxacillin-resistant strains, but is limited by the heteroresistance that is observed when using most phenotypic methods.

  2. HUMAN RELIABILITY ANALYSIS USING THE COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM)

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Workplace accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents are caused by human error, which occurs due to the influence of both the physical and non-physical working environment. This study uses scenarios to predict and reduce the likelihood of human error using the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method used to obtain the Cognitive Failure Probability (CFP), which can be determined in two ways: the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of error in grinding and welding work are the adequacy of the organization, the adequacy of the Man-Machine Interface (MMI) and operational support, the availability of procedures/planning, and the adequacy of training and experience. The cognitive aspect of grinding work with the highest error value is planning, with a CFP of 0.3, while for welding work it is the cognitive aspect of execution, with a CFP of 0.18. To reduce the cognitive error values in grinding and welding work, the recommendations given are to provide regular training, more detailed work instructions, and familiarization with the tools. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error

  3. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating the software reliability of nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), in which the software failure behavior is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects based on a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is shown that this method is capable of accurately estimating the remaining number of software defects of the on-demand type that directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed.
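
    The abstract names an NHPP-based software reliability growth model without giving its form; a common concrete instance (the Goel-Okumoto model, which may or may not be the exact model used in the paper) is

    m(t) = a\left(1 - e^{-bt}\right), \qquad
    \lambda(t) = \frac{\mathrm{d}m}{\mathrm{d}t} = a b\, e^{-bt}, \qquad
    R(x \mid t) = \exp\!\left[-\bigl(m(t+x) - m(t)\bigr)\right],

    where a is the expected total number of defects, b the per-defect detection rate, m(t) the expected number of defects found by time t, and R(x|t) the probability of failure-free operation over (t, t+x].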

  4. NDE reliability and probability of detection (POD) evolution and paradigm shift

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surendra [NDE Engineering, Materials and Process Engineering, Honeywell Aerospace, Phoenix, AZ 85034 (United States)

    2014-02-18

    The subject of NDE Reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs including the important one nicknamed “Have Cracks – Will Travel”, or in short “Have Cracks”, by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, starting from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework for POD estimation by Berens and Hovey in 1981, to MIL-HDBK-1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective, i.e., improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. Therefore, it is essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offers no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantifying the human factors. Furthermore, reliability and POD have often been treated as synonymous, but POD is not NDE reliability. POD is a subset of reliability that consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME) including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones of the last several decades and discusses the rationale for using

  5. Advances in developing rapid, reliable and portable detection systems for alcohol.

    Science.gov (United States)

    Thungon, Phurpa Dema; Kakoti, Ankana; Ngashangva, Lightson; Goswami, Pranab

    2017-11-15

    The development of portable, reliable, sensitive, simple, and inexpensive detection systems for alcohol has been an instinctive demand not only in the traditional brewing, pharmaceutical, food and clinical industries but also in the rapidly growing alcohol-based fuel industries. Highly sensitive, selective, and reliable alcohol detection is currently achievable typically through sophisticated instrument-based analyses confined mostly to state-of-the-art analytical laboratory facilities. With the growing demand for rapid and reliable alcohol detection systems, an all-round attempt has been made over the past decade encompassing various disciplines from the basic and engineering sciences. Of late, research on developing small-scale portable alcohol detection systems has accelerated with the advent of emerging miniaturization techniques, advanced materials and sensing platforms such as lab-on-chip, lab-on-CD, lab-on-paper, etc. With these new interdisciplinary approaches, along with support from the parallel growth of knowledge on rapid detection systems being pursued for various targets, progress on translating proof-of-concepts to commercially viable and environmentally friendly portable alcohol detection systems is gaining pace. Here, we summarize the progress made over the years on alcohol detection systems, with a focus on recent advances towards developing portable, simple and efficient alcohol sensors. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    International Nuclear Information System (INIS)

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm² areas and ≥2% in ∼20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified.
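
    The PID idea described in the record (relative intensity deviations, median filtering to suppress noise, then thresholding) can be sketched as follows; the image sizes, noise level, inserted anomaly and 2% threshold are assumptions for illustration, not the paper's data or exact pipeline.

    import numpy as np
    from scipy.ndimage import median_filter

    def flag_fluence_anomalies(measured, reference, threshold_pct=2.0, filt=3):
        """Pixel intensity deviations (PID) between a measured EPID image and a
        reference image, median-filtered to suppress noise, then thresholded."""
        pid = 100.0 * (measured - reference) / np.maximum(reference, 1e-6)
        pid_filtered = median_filter(pid, size=filt)
        return np.abs(pid_filtered) >= threshold_pct

    rng = np.random.default_rng(3)
    reference = np.ones((100, 100))
    measured = reference + rng.normal(0, 0.005, reference.shape)
    measured[40:45, 40:45] += 0.05            # inserted 5% anomaly over a small patch
    print("flagged pixels:", int(flag_fluence_anomalies(measured, reference).sum()))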

  7. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxa endogenous reference genes, from GMO constructions, screening targets, construct-specific, and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  8. Fault detection and reliability, knowledge based and other approaches

    International Nuclear Information System (INIS)

    Singh, M.G.; Hindi, K.S.; Tzafestas, S.G.

    1987-01-01

    These proceedings are split up into four major parts in order to reflect the most significant aspects of reliability and fault detection as viewed at present. The first part deals with knowledge-based systems and comprises eleven contributions from leading experts in the field. The emphasis here is primarily on the use of artificial intelligence, expert systems and other knowledge-based systems for fault detection and reliability. The second part is devoted to fault detection of technological systems and comprises thirteen contributions dealing with applications of fault detection techniques to various technological systems such as gas networks, electric power systems, nuclear reactors and assembly cells. The third part of the proceedings, which consists of seven contributions, treats robust, fault tolerant and intelligent controllers and covers methodological issues as well as several applications ranging from nuclear power plants to industrial robots to steel grinding. The fourth part treats fault tolerant digital techniques and comprises five contributions. Two papers, one on reactor noise analysis, the other on reactor control system design, are indexed separately. (author)

  9. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare to explicitly use reliability concepts for the design of structures, but the problems of structural engineering are better understood through them. Some of the main methods for the estimation of the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo simulation is used in this paper because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer-aided simulations of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
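
    The relationship the record relies on can be made concrete with a small Monte Carlo sketch: sample resistance and load effect, estimate the probability of failure, and convert it to the index of reliability through the standard normal inverse. The distributions and parameter values below are invented for illustration and are not taken from the paper.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    n = 1_000_000

    # Illustrative limit state for a pier section: resistance R minus load effect S.
    R = rng.lognormal(mean=np.log(500.0), sigma=0.10, size=n)   # assumed resistance (kNm)
    S = rng.normal(loc=350.0, scale=50.0, size=n)               # assumed load effect (kNm)

    pf = np.mean(R - S <= 0.0)                 # probability of failure
    beta = -norm.ppf(pf)                       # index of reliability
    print(f"Pf ~ {pf:.2e}, beta ~ {beta:.2f}")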

  10. Reliability evaluation of the Savannah River reactor leak detection system

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Sindelar, R.L.; Wallace, I.T.

    1991-01-01

    The Savannah River Reactors have been in operation since the mid-1950s. The primary degradation mode for the primary coolant loop piping is intergranular stress corrosion cracking. The leak-before-break (LBB) capability of the primary system piping has been demonstrated as part of an overall structural integrity evaluation. One element of the LBB analyses is a reliability evaluation of the leak detection system. The most sensitive element of the leak detection system is the airborne tritium monitors. The presence of small amounts of tritium in the heavy water coolant provides the basis for a very sensitive system of leak detection. The reliability of the tritium monitors to properly identify a crack leaking at a rate of either 50 or 300 lb/day (0.004 or 0.023 gpm, respectively) has been characterized. These leak rates correspond to action points for which specific operator actions are required. High reliability has been demonstrated using standard fault tree techniques. The probability of not detecting a leak within an assumed mission time of 24 hours is estimated to be approximately 5 × 10⁻⁵ per demand. This result is obtained for both leak rates considered. The methodology and assumptions used to obtain this result are described in this paper. 3 refs., 1 fig., 1 tab

  11. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    Science.gov (United States)

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  12. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  13. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro has established a reliability program in support of its substantial nuclear program. Application of the reliability program to achieve both production and safety goals is described. The value of such a reliability program is evident in the record of Ontario Hydro's operating nuclear stations. The factors which have contributed to the success of the reliability program are identified as line management's commitment to reliability; selective and judicious application of reliability methods; establishing performance goals and monitoring in-service performance; and the collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  14. Reliability of recordings of subgingival calculus detected using an ultrasonic device.

    Science.gov (United States)

    Corraini, Priscila; López, Rodrigo

    2015-04-01

    To assess the intra-examiner reliability of recordings of subgingival calculus detected using an ultrasonic device, and to investigate the influence of subject-, tooth- and site-level factors on the reliability of these subgingival calculus recordings. On two occasions, within a 1-week interval, 147 adult periodontitis patients received a full-mouth clinical periodontal examination by a single trained examiner. Duplicate subgingival calculus recordings, in six sites per tooth, were obtained using an ultrasonic device for calculus detection and removal. Agreement was observed in 65% of the 22,584 duplicate subgingival calculus recordings, ranging from 45% to 83% according to subject. Using hierarchical modeling, disagreements in the duplicate subgingival calculus recordings were more likely in all sites other than the mid-buccal, and in sites harboring supragingival calculus. Disagreements were less likely in sites with PD ≥ 4 mm and with furcation involvement ≥ degree 2. Bleeding on probing or suppuration did not influence the reliability of subgingival calculus recordings. At the subject level, disagreements were less likely in patients presenting with the highest and lowest extent categories of the covariate subgingival calculus. The reliability of subgingival calculus recordings using the ultrasound technology is reasonable. The results of the present study suggest that the reliability of subgingival calculus recordings is not influenced by the presence of inflammation. Moreover, subgingival calculus can be more reliably detected using the ultrasound device at sites with a higher need for periodontal therapy, i.e., sites presenting with deep pockets, and premolars and molars with furcation involvement.

  15. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    Directory of Open Access Journals (Sweden)

    Sukumar Biswas

    2016-01-01

    Reliable methods are needed to detect the presence of tobacco components in tobacco products in order to control smuggling effectively and to support tariff and excise classification in the tobacco industry, with the aim of controlling illegal tobacco trade. In this study, two sensitive and specific DNA-based methods, a quantitative real-time PCR (qPCR) assay and a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of uridine 5′-monophosphate synthase (UMPS), and their specificities and sensitivities were determined with various plant materials. Both the qPCR and LAMP methods were reliable and accurate for the rapid detection of tobacco components in various practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for application in tobacco identification, particularly in the special cases where the morphology or chemical composition of the tobacco has been disrupted. Therefore, combining both methods would facilitate not only the control of tobacco smuggling but also tariff and excise classification.

  16. Quench detection method for 2G HTS wire

    International Nuclear Information System (INIS)

    Marchevsky, M; Xie, Y-Y; Selvamanickam, V

    2010-01-01

    2G HTS conductors are increasingly used in various commercial applications and their thermal and electrical stability is an important reliability factor. Detection and prevention of quenches in 2G wire-based cables and solenoids has proven to be a difficult engineering task. This is largely due to a very slow normal zone propagation in coated conductors that leads to formation of localized hotspots while the rest of the conductor remains in the superconducting state. We propose an original method of quench and hotspot detection for 2G wires and coils that is based upon local magnetic sensing and takes advantage of 2G wire planar geometry. We demonstrate our technique experimentally and show that its sensitivity is superior to the known voltage detection scheme. A unique feature of the method is its capability to remotely detect instant degradation of the wire critical current even before a normal zone is developed within the conductor. Various modifications of the method applicable to practical device configurations are discussed.

  17. Quench detection method for 2G HTS wire

    Energy Technology Data Exchange (ETDEWEB)

    Marchevsky, M; Xie, Y-Y; Selvamanickam, V, E-mail: maxmarche@gmail.co, E-mail: yxie@superpower-inc.co [SuperPower, Inc., 450 Duane Avenue, Schenectady, NY 12304 (United States)

    2010-03-15

    2G HTS conductors are increasingly used in various commercial applications and their thermal and electrical stability is an important reliability factor. Detection and prevention of quenches in 2G wire-based cables and solenoids has proven to be a difficult engineering task. This is largely due to a very slow normal zone propagation in coated conductors that leads to formation of localized hotspots while the rest of the conductor remains in the superconducting state. We propose an original method of quench and hotspot detection for 2G wires and coils that is based upon local magnetic sensing and takes advantage of 2G wire planar geometry. We demonstrate our technique experimentally and show that its sensitivity is superior to the known voltage detection scheme. A unique feature of the method is its capability to remotely detect instant degradation of the wire critical current even before a normal zone is developed within the conductor. Various modifications of the method applicable to practical device configurations are discussed.

  18. Simultaneous amplification of two bacterial genes: more reliable method of Helicobacter pylori detection in microbial rich dental plaque samples.

    Science.gov (United States)

    Chaudhry, Saima; Idrees, Muhammad; Izhar, Mateen; Butt, Arshad Kamal; Khan, Ayyaz Ali

    2011-01-01

    The polymerase chain reaction (PCR) assay is considered superior to other methods for detection of Helicobacter pylori (H. pylori) in the oral cavity; however, it also has limitations when the sample under study is microbial-rich dental plaque. The type of gene targeted and the number of primers used for bacterial detection in dental plaque samples can have a significant effect on the results obtained, as there are a number of closely related bacterial species residing in the plaque biofilm. Also, due to the high recombination rate of H. pylori, some of the genes might be downregulated or absent. The present study was conducted to determine the frequency of H. pylori colonization of dental plaque by simultaneously amplifying two genes of the bacterium. One hundred dental plaque specimens were collected from dyspeptic patients before their upper gastrointestinal endoscopy, and the presence of H. pylori was determined by PCR assay using primers targeting two different genes of the bacterium. Eighty-nine of the 100 samples were included in the final analysis. With simultaneous amplification of two bacterial genes, 51.6% of the dental plaque samples were positive for H. pylori, whereas this prevalence increased to 73% when only one gene amplification was used for bacterial identification. Detection of H. pylori in dental plaque samples is more reliable when two genes of the bacterium are simultaneously amplified, as compared with amplification of only one gene.

  19. Steam leak detection in advance reactors via acoustics method

    International Nuclear Information System (INIS)

    Singh, Raj Kumar; Rao, A. Rama

    2011-01-01

    Highlights: → A steam leak detection system is developed to detect any leak inside the reactor vault. → The technique uses the leak noise frequency spectrum for leak detection. → Testing of the system and a method to locate the leak are also developed and discussed in the present paper. - Abstract: Prediction of a LOCA (loss of coolant accident) plays a very important role in the safety of a nuclear reactor. The coolant is responsible for heat transfer from the fuel bundles. Loss of coolant is an accident situation which requires immediate shutdown of the reactor. A fall in system pressure during a LOCA is the trip parameter used for initiating automatic reactor shutdown. However, in primary heat transport systems operating in two-phase regimes, detection of a small break LOCA is not simple. Due to very slow leak rates, the time for the fall of pressure is significantly long. From a reactor safety point of view, it is extremely important to find a reliable and effective alternative for detecting the slow pressure drop in the case of a small break LOCA. One such technique is based on the acoustic signal caused by a LOCA in small breaks. In boiling water reactors whose primary heat transport is driven by natural circulation, small break LOCA detection is important. For prompt action following a small break LOCA, a steam leak detection system has been developed to detect any leak inside the reactor vault. The detection technique is reliable and plays a very important role in ensuring the safety of the reactor. The methodology developed for steam leak detection is discussed in the present paper. The method to locate the leak, which is based on analysis of the signal, is also developed and discussed.

  20. Method of detecting genetic deletions identified with chromosomal abnormalities

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Joe W; Pinkel, Daniel; Tkachuk, Douglas

    2013-11-26

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease-related loci, specifically cancer, such as chronic myelogenous leukemia (CML), and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  1. System principles, mathematical models and methods to ensure high reliability of safety systems

    Science.gov (United States)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of various components designed for detection, localization, tracking, collecting, and processing of information from systems of monitoring, telemetry, control, etc. They are required to be highly reliable in order to correctly perform data aggregation, processing and analysis for subsequent decision-making support. In the design and construction phases of manufacturing such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Different component types perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators, such as cost or power consumption. The systematic use of different component types increases the probability of task completion and mitigates common cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used for solving problems of optimal redundancy on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
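
    The two-level optimization itself is not reproduced in the record; as a loose illustration of the type-variety idea only, the brute-force sketch below chooses how many detectors of two diverse component types to install under a cost budget, with an assumed per-type common-cause failure probability. All reliabilities, costs and the budget are invented numbers, not the authors' models.

    from itertools import product

    # Two diverse component types performing the same detection function:
    # (unit reliability, unit cost) are assumed illustrative values.
    types = {"A": (0.90, 3.0), "B": (0.80, 1.5)}
    q_cc = 0.02          # assumed common-cause failure probability per component type
    budget = 9.0

    best = None
    for n_a, n_b in product(range(4), range(7)):
        counts = {"A": n_a, "B": n_b}
        cost = sum(n * types[t][1] for t, n in counts.items())
        if cost > budget or (n_a + n_b) == 0:
            continue
        # A type-group fails through its common cause or through all its units
        # failing independently; the system fails only if every group used fails,
        # which is why mixing diverse types can beat a single cheap type.
        p_fail = 1.0
        for t, n in counts.items():
            if n > 0:
                r = types[t][0]
                p_fail *= q_cc + (1 - q_cc) * (1 - r) ** n
        if best is None or (1 - p_fail) > best[0]:
            best = (1 - p_fail, n_a, n_b, cost)

    print("best (reliability, nA, nB, cost):", best)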

  2. Reliability and Minimum Detectable Change of Temporal-Spatial, Kinematic, and Dynamic Stability Measures during Perturbed Gait.

    Directory of Open Access Journals (Sweden)

    Christopher A Rábago

    Temporal-spatial, kinematic variability, and dynamic stability measures collected during perturbation-based assessment paradigms are often used to identify dysfunction associated with gait instability. However, it remains unclear which measures are most reliable for detecting and tracking responses to perturbations. This study systematically determined the between-session reliability and minimum detectable change values of temporal-spatial, kinematic variability, and dynamic stability measures during three types of perturbed gait. Twenty young healthy adults completed two identical testing sessions two weeks apart, comprised of an unperturbed and three perturbed (cognitive, physical, and visual) walking conditions in a virtual reality environment. Within each session, perturbation responses were compared to unperturbed walking using paired t-tests. Between-session reliability and minimum detectable change values were also calculated for each measure and condition. All temporal-spatial, kinematic variability and dynamic stability measures demonstrated fair to excellent between-session reliability. Minimum detectable change values, normalized to mean values, ranged from 1% to 50%. Step width mean and variability measures demonstrated the greatest response to perturbations, with excellent between-session reliability and low minimum detectable change values. Orbital stability measures demonstrated specificity to perturbation direction and sensitivity, with excellent between-session reliability and low minimum detectable change values. We observed substantially greater between-session reliability and lower minimum detectable change values for local stability measures than previously described, which may be the result of averaging across trials within a session and using velocity versus acceleration data for reconstruction of state spaces. Across all perturbation types, temporal-spatial, orbital and local measures were the most reliable measures with the
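
    The record reports minimum detectable change (MDC) values without stating the formula; the standard definition based on between-session reliability (ICC), which is presumably what is meant, is

    \mathrm{SEM} = SD_{\text{pooled}}\sqrt{1 - \mathrm{ICC}}, \qquad
    \mathrm{MDC}_{95} = 1.96\,\sqrt{2}\,\mathrm{SEM},

    so that, for example, an assumed ICC of 0.90 with a pooled standard deviation of 0.02 m in step width would give SEM of about 0.006 m and MDC95 of about 0.018 m (illustrative numbers only, not the study's results).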

  3. The Reliability and Effectiveness of a Radar-Based Animal Detection System

    Science.gov (United States)

    2017-09-22

    This document contains data on the reliability and effectiveness of an animal detection system along U.S. Hwy 95 near Bonners Ferry, Idaho. The system uses a Doppler radar to detect large mammals (e.g., deer and elk) when they approach the highway. T...

  4. The Reliability and Effectiveness of a Radar-Based Animal Detection System

    Science.gov (United States)

    2017-09-01

    This document contains data on the reliability and effectiveness of an animal detection system along U.S. Hwy 95 near Bonners Ferry, Idaho. The system uses a Doppler radar to detect large mammals (e.g., deer and elk) when they approach the highway. T...

  5. A novel approach for reliable detection of cathepsin S activities in mouse antigen presenting cells.

    Science.gov (United States)

    Steimle, Alex; Kalbacher, Hubert; Maurer, Andreas; Beifuss, Brigitte; Bender, Annika; Schäfer, Andrea; Müller, Ricarda; Autenrieth, Ingo B; Frick, Julia-Stefanie

    2016-05-01

    Cathepsin S (CTSS) is a eukaryotic protease mostly expressed in professional antigen presenting cells (APCs). Since CTSS activity regulation plays a role in the pathogenesis of various autoimmune diseases like multiple sclerosis, atherosclerosis, Sjögren's syndrome and psoriasis, as well as in cancer progression, there is an ongoing interest in the reliable detection of cathepsin S activity. Various applications have been invented for the specific detection of this enzyme. However, most of them have only been shown to be suitable for human samples, do not deliver quantitative results, or require technical equipment that is not commonly available in a standard laboratory. We have tested a fluorogenic substrate, Mca-GRWPPMGLPWE-Lys(Dnp)-DArg-NH2, that has been described to specifically detect CTSS activities in human APCs, for its potential use with mouse samples. We have modified the protocol and thereby offer a cheap, easy, reproducible and quick activity assay to detect CTSS activities in mouse APCs. Since most basic research on CTSS is performed in mice, this method closes a gap and offers a possibility for reliable and quantitative CTSS activity detection that can be performed in almost every laboratory. Copyright © 2016. Published by Elsevier B.V.

  6. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    Full Text Available A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. Available first- or second-order reliability methods (FORM/SORM) will find it challenging to estimate the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes, is proposed. The method is illustrated with the help of several numerical examples.

  7. Objective Methods for Reliable Detection of Concealed Depression

    Directory of Open Access Journals (Sweden)

    Cynthia eSolomon

    2015-04-01

    Full Text Available Recent research has shown that it is possible to automatically detect clinical depression from audio-visual recordings. Before considering integration in a clinical pathway, a key question that must be asked is whether such systems can be easily fooled. This work explores the potential of acoustic features to detect clinical depression in adults both when acting normally and when asked to conceal their depression. Nine adults diagnosed with mild to moderate depression as per the Beck Depression Inventory (BDI-II) and Patient Health Questionnaire (PHQ-9) were asked a series of questions and to read an excerpt from a novel aloud under two different experimental conditions. In one, participants were asked to act naturally and in the other, to suppress anything that they felt would be indicative of their depression. Acoustic features were then extracted from these data and analysed using paired t-tests to determine any statistically significant differences between healthy and depressed participants. Most features that were found to be significantly different during normal behaviour remained so during concealed behaviour. In leave-one-subject-out automatic classification studies of the 9 depressed subjects and 8 matched healthy controls, an 88% classification accuracy and 89% sensitivity was achieved. Results remained relatively robust during concealed behaviour, with classifiers trained on only non-concealed data achieving 81% detection accuracy and 75% sensitivity when tested on concealed data. These results indicate there is good potential to build deception-proof automatic depression monitoring systems.
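
    A leave-one-subject-out evaluation of the kind reported can be sketched as follows. The feature matrix, labels, and classifier below are placeholders (random features, a linear SVM), not the authors' acoustic pipeline; the point is only how subject identity is used to form the cross-validation folds.

```python
# Sketch of leave-one-subject-out (LOSO) classification on per-recording
# feature vectors. Features, labels, and the classifier are placeholders;
# the study's actual acoustic features and model may differ.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, recs_per_subject, n_features = 17, 4, 20
X = rng.normal(size=(n_subjects * recs_per_subject, n_features))   # stand-in acoustic features
y = np.repeat(rng.integers(0, 2, n_subjects), recs_per_subject)    # depressed / control labels
groups = np.repeat(np.arange(n_subjects), recs_per_subject)        # subject identity per recording

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
print(f"LOSO accuracy: {scores.mean():.2f}")
```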

  8. Structural reliability calculation method based on the dual neural network and direct integration method.

    Science.gov (United States)

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty has received wide attention from engineers and scholars because it reflects the structural characteristics and the actual bearing conditions. The direct integration method, which starts from the definition of reliability, is easy to understand, but mathematical difficulties remain in the calculation of the multiple integrals involved. Therefore, a dual neural network method is proposed in this paper for calculating these multiple integrals. The dual neural network consists of two neural networks: network A is used to learn the integrand function, and network B is used to simulate the original (antiderivative) function. Using the derivative relationship between the network output and the network input, network B is derived from network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean-value first-order second-moment method demonstrate that the proposed method is an efficient and accurate approach for structural reliability problems.

  9. Various imaging methods in the detection of small hepatomas

    International Nuclear Information System (INIS)

    Nakatsuka, Haruki; Kaminou, Toshio; Takemoto, Kazumasa; Takashima, Sumio; Kobayashi, Nobuyuki; Nakamura, Kenji; Onoyama, Yasuto; Kurioka, Naruto

    1985-01-01

    Fifty-one patients with small hepatomas under 5 cm in diameter were studied to compare the detectability of various imaging methods. Positive findings were obtained in 50% of the patients by scintigraphy, in 74% by ultrasonography and in 79% by CT during screening tests. The rates of detection in retrospective analysis, after the site of the tumor was known, were 73%, 93% and 87%, respectively. The rate of detection was 92% by celiac arteriography and 98% by selective hepatic arteriography. In the 21 patients who had tumors under 3 cm, the rate was 32% for scintigraphy, 74% for ultrasonography and 65% for CT during screening, whereas it was 58%, 84% and 75% retrospectively. By celiac arteriography it was 85%, and by hepatic arteriography, 95%. The rate of detection of small hepatomas in screening tests differed remarkably from that in retrospective analysis. No single imaging method can reliably disclose the presence of a small hepatoma; therefore more than one method should be used in screening. (author)

  10. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problems of a variety of uncertainty variables that coexist in the engineering structure reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  11. Reliability and applications of statistical methods based on oligonucleotide frequencies in bacterial and archaeal genomes

    DEFF Research Database (Denmark)

    Bohlin, J; Skjerve, E; Ussery, David

    2008-01-01

    with here are mainly used to examine similarities between archaeal and bacterial DNA from different genomes. These methods compare observed genomic frequencies of fixed-sized oligonucleotides with expected values, which can be determined by genomic nucleotide content, smaller oligonucleotide frequencies......, or be based on specific statistical distributions. Advantages of these statistical methods include measurement of phylogenetic relationships from relatively small pieces of DNA sampled from almost anywhere within genomes, detection of foreign/conserved DNA, and homology searches. Our aim was to explore...... the reliability and best suited applications for some popular methods, which include relative oligonucleotide frequencies (ROF), di- to hexanucleotide zeroth-order Markov methods (ZOM) and second-order Markov chain methods (MCM). Tests were performed on distant homology searches with large DNA sequences, detection
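
    As a rough illustration of what an observed-versus-expected oligonucleotide comparison involves, the sketch below computes k-mer frequencies of a sequence relative to a zero-order expectation from mononucleotide content. It is not the paper's exact ROF, ZOM or MCM formulation, and the sequence is synthetic.

```python
# Sketch of an observed-vs-expected oligonucleotide frequency comparison for a
# DNA sequence, with the expectation taken from mononucleotide content (a
# zero-order model). Not the paper's exact ROF/ZOM/MCM definitions.
from collections import Counter
from itertools import product

def relative_kmer_frequencies(seq, k=4):
    seq = seq.upper()
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    base_freq = {b: seq.count(b) / len(seq) for b in "ACGT"}
    ratios = {}
    for kmer in map("".join, product("ACGT", repeat=k)):
        observed = counts.get(kmer, 0) / total
        expected = 1.0
        for b in kmer:
            expected *= base_freq[b]
        ratios[kmer] = observed / expected if expected > 0 else float("nan")
    return ratios

genome_fragment = "ATGCGTACGTTAGCATGCGCATATCGCGCTAGCTAGGCTA" * 50  # synthetic stand-in
rof = relative_kmer_frequencies(genome_fragment)
print(sorted(rof.items(), key=lambda kv: kv[1], reverse=True)[:5])  # most over-represented 4-mers
```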

  12. Novel Fingertip Image-Based Heart Rate Detection Methods for a Smartphone

    Directory of Open Access Journals (Sweden)

    Rifat Zaman

    2017-02-01

    Full Text Available We hypothesize that our fingertip image-based heart rate detection methods using a smartphone reliably detect the heart rhythm and rate of subjects. We propose fingertip curve line movement-based and fingertip image intensity-based detection methods, which both use the movement of successive fingertip images obtained from smartphone cameras. To investigate the performance of the proposed methods, the heart rhythm and rate they produce are compared to those of the conventional method, which is based on average image pixel intensity. Using a smartphone, we collected 120 s of pulsatile time series data from each recruited subject. The results show that the proposed fingertip curve line movement-based method detects heart rate with a maximum deviation of 0.0832 Hz and 0.124 Hz using time- and frequency-domain based estimation, respectively, compared to the conventional method. Moreover, the proposed fingertip image intensity-based method detects heart rate with a maximum deviation of 0.125 Hz and 0.03 Hz using time- and frequency-domain based estimation, respectively.
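
    The conventional intensity-based approach mentioned above reduces each frame to its mean pixel intensity and reads the heart rate off the dominant spectral peak. A minimal sketch, using a synthetic pulsatile signal in place of real camera frames:

```python
# Sketch of frequency-domain heart-rate estimation from a fingertip video:
# the mean pixel intensity of each frame gives a pulsatile signal, and the
# dominant spectral peak in a plausible heart-rate band gives the rate.
# A synthetic 1.2 Hz (72 bpm) signal stands in for real camera frames.
import numpy as np

fps, duration_s, true_hr_hz = 30.0, 120.0, 1.2
t = np.arange(0, duration_s, 1.0 / fps)
intensity = 100 + 2 * np.sin(2 * np.pi * true_hr_hz * t) + 0.5 * np.random.randn(t.size)

signal = intensity - intensity.mean()
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)

band = (freqs > 0.7) & (freqs < 3.5)            # roughly 42-210 bpm
hr_hz = freqs[band][np.argmax(spectrum[band])]
print(f"Estimated heart rate: {hr_hz:.2f} Hz ({hr_hz * 60:.0f} bpm)")
```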

  13. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    Directory of Open Access Journals (Sweden)

    Xuyong Chen

    2017-01-01

    Full Text Available Due to the many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy and oscillating iteration process and makes it difficult to solve for the nonprobabilistic reliability index. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve for the upper and lower limits of the nonprobabilistic reliability index and to narrow its range. If the range of the reliability index reduces to an acceptable accuracy, the solution is considered convergent and the nonprobabilistic reliability index is obtained. The case study indicates that the proposed method avoids an oscillating iteration process, makes the iteration stable and convergent, reduces the number of iteration steps significantly, and improves computational efficiency and precision significantly compared with the traditional nonprobabilistic response surface method. Finally, a nonprobabilistic reliability evaluation process for bridges is established by evaluating the reliability of a three-span PC continuous rigid frame bridge with the proposed method, which appears to be simpler and more reliable when samples and parameters for the bridge nonprobabilistic reliability evaluation are lacking.

  14. Detection of HBsAg and Anti HBc on donors of a blood bank by IRMA and ELISA methods

    International Nuclear Information System (INIS)

    Freire Martinez, D.Y.

    1985-10-01

    A comparative evaluation of two methods, the immunoradiometric assay (IRMA) and the enzyme immunoassay (ELISA), for detecting HBsAg and anti-HBc was made to determine which is the more advantageous and reliable. The study was carried out on 300 donors of the Hospital San Juan de Dios Blood Bank. In comparison with the reference method (IRMA), ELISA showed a sensitivity of 91.67%. Anti-HBc detection by IRMA is more reliable than HBsAg detection by IRMA or ELISA for determining the carrier state

  15. Molecular methods for the detection of mutations.

    Science.gov (United States)

    Monteiro, C; Marcelino, L A; Conde, A R; Saraiva, C; Giphart-Gassler, M; De Nooij-van Dalen, A G; Van Buuren-van Seggelen, V; Van der Keur, M; May, C A; Cole, J; Lehmann, A R; Steinsgrimsdottir, H; Beare, D; Capulas, E; Armour, J A

    2000-01-01

    We report the results of a collaborative study aimed at developing reliable, direct assays for mutation in human cells. The project used common lymphoblastoid cell lines, both with and without mutagen treatment, as a shared resource to validate the development of new molecular methods for the detection of low-level mutations in the presence of a large excess of normal alleles. As the "gold standard," hprt mutation frequencies were also measured on the same samples. The methods under development included i) the restriction site mutation (RSM) assay, in which mutations lead to the destruction of a restriction site; ii) minisatellite length-change mutation, in which mutations lead to alleles containing new numbers of tandem repeat units; iii) loss of heterozygosity for HLA epitopes, in which antibodies can be used to direct selection for mutant cells; iv) multiple fluorescence-based long linker arm nucleotides assay (mf-LLA) technology, for the detection of substitutional mutations; v) detection of alterations in the TP53 locus using a (CA) array as the target for the screening; and vi) PCR analysis of lymphocytes for the presence of the BCL2 t(14:18) translocation. The relative merits of these molecular methods are discussed, and a comparison made with more "traditional" methods.

  16. Research Note The reliability of a field test kit for the detection and ...

    African Journals Online (AJOL)

    Research Note The reliability of a field test kit for the detection and the persistence of ... The objectives were to test a field kit for practicality and reliability, to assess the spread of the bacteria among ...

  17. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human system interface in the context of design safety certifications.

  18. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2010-01-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human system interface in the context of design safety certifications.

  19. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and reliability analyst. The report outlines a set of procedures which implements a philosophy that requires increased involvement by the designer in reliability analysis. Discussions of each method reviewed include examples of its application

  20. Emergency First Responders' Experience with Colorimetric Detection Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandra L. Fox; Keith A. Daum; Carla J. Miller; Marnie M. Cortez

    2007-10-01

    Nationwide, first responders from state and federal support teams respond to hazardous materials incidents, industrial chemical spills, and potential weapons of mass destruction (WMD) attacks. Although first responders have sophisticated chemical, biological, radiological, and explosive detectors available for assessment of the incident scene, simple colorimetric detectors have a role in response actions. The large number of colorimetric chemical detection methods available on the market can make the selection of the proper methods difficult. Although each detector has unique aspects to provide qualitative or quantitative data about the unknown chemicals present, not all detectors provide consistent, accurate, and reliable results. Included here, in a consumer-report-style format, we provide “boots on the ground” information directly from first responders about how well colorimetric chemical detection methods meet their needs in the field and how they procure these methods.

  1. Reliability of Estimation Pile Load Capacity Methods

    Directory of Open Access Journals (Sweden)

    Yudhi Lastiasih

    2014-04-01

    Full Text Available For none of the numerous previous methods for predicting pile capacity is it known how accurate they are when compared with the actual ultimate capacity of piles tested to failure. The authors of the present paper have conducted such an analysis, based on 130 data sets of field loading tests. Out of these 130 data sets, only 44 could be analysed, of which 15 were conducted until the piles actually reached failure. The pile prediction methods used were: Brinch Hansen's method (1963), Chin's method (1970), Decourt's Extrapolation Method (1999), Mazurkiewicz's method (1972), Van der Veen's method (1953), and the Quadratic Hyperbolic Method proposed by Lastiasih et al. (2012). It was found that all the above methods were sufficiently reliable when applied to data from pile loading tests loaded to failure. However, when applied to data from pile loading tests that did not reach failure, the methods that yield lower values of the correction factor N are recommended. Finally, the empirical method of Reese and O'Neill (1988) was found to be reliable enough to be used to estimate the Qult of a pile foundation based on soil data only.

  2. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating the software reliability of nuclear safety software is proposed in this paper. The method is based on the software reliability growth model (SRGM), in which the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability can be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
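
    The non-homogeneous Poisson process assumption is often instantiated with the Goel-Okumoto mean value function m(t) = a(1 - exp(-bt)). The sketch below uses that form with assumed parameter values to estimate remaining defects and short-horizon reliability; it does not reproduce the paper's Bayesian scheme with test-case covariates.

```python
# Sketch of a basic NHPP software reliability growth calculation using the
# Goel-Okumoto mean value function m(t) = a * (1 - exp(-b * t)). Parameter
# values are illustrative; the paper's Bayesian estimation is not reproduced.
import math

a, b = 25.0, 0.004        # total expected defects, detection rate (assumed)
t = 600.0                 # cumulative test effort (e.g., hours) so far

def mean_defects(t):
    return a * (1.0 - math.exp(-b * t))

detected_so_far = mean_defects(t)
remaining = a - detected_so_far
# Reliability over the next x units of test effort: probability of zero
# failures in (t, t + x] for an NHPP.
x = 50.0
reliability_next_x = math.exp(-(mean_defects(t + x) - mean_defects(t)))

print(f"expected detected: {detected_so_far:.1f}, remaining: {remaining:.1f}, "
      f"R(t, t+{x:g}) = {reliability_next_x:.3f}")
```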

  3. Recent and innovative methods for detection of bacteremia and fungemia

    International Nuclear Information System (INIS)

    Reller, L.B.

    1983-01-01

    Advances continue to be made in methods for more reliable and more rapid detection of bacteremia and fungemia. The importance of blood sample volume and broth dilution has been established in controlled studies. New technology includes the use of resins that remove antimicrobials from blood samples, detection of radioactivity from organisms given a radiolabeled substrate, the use of dyes that stain microbial DNA and RNA, the use of slides coated with growth media, and lysis-centrifugation for trapping microorganisms. Technology now being considered includes counterimmunoelectrophoresis, head-space gas chromatography, electrical impedance, microcalorimetry, and the use of lasers to detect pH changes and turbidity

  4. Method matters: Understanding diagnostic reliability in DSM-IV and DSM-5.

    Science.gov (United States)

    Chmielewski, Michael; Clark, Lee Anna; Bagby, R Michael; Watson, David

    2015-08-01

    Diagnostic reliability is essential for the science and practice of psychology, in part because reliability is necessary for validity. Recently, the DSM-5 field trials documented lower diagnostic reliability than past field trials and the general research literature, resulting in substantial criticism of the DSM-5 diagnostic criteria. Rather than indicating specific problems with DSM-5, however, the field trials may have revealed long-standing diagnostic issues that have been hidden due to a reliance on audio/video recordings for estimating reliability. We estimated the reliability of DSM-IV diagnoses using both the standard audio-recording method and the test-retest method used in the DSM-5 field trials, in which different clinicians conduct separate interviews. Psychiatric patients (N = 339) were diagnosed using the SCID-I/P; 218 were diagnosed a second time by an independent interviewer. Diagnostic reliability using the audio-recording method (N = 49) was "good" to "excellent" (M κ = .80) and comparable to the DSM-IV field trials estimates. Reliability using the test-retest method (N = 218) was "poor" to "fair" (M κ = .47) and similar to DSM-5 field-trials' estimates. Despite low test-retest diagnostic reliability, self-reported symptoms were highly stable. Moreover, there was no association between change in self-report and change in diagnostic status. These results demonstrate the influence of method on estimates of diagnostic reliability. (c) 2015 APA, all rights reserved.
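
    The reliability figures quoted (M κ values) are chance-corrected agreement statistics. A minimal sketch of computing Cohen's kappa for two interviewers' present/absent diagnoses, with made-up ratings:

```python
# Sketch of the chance-corrected agreement statistic (Cohen's kappa) behind
# the reported reliability values, for two interviewers' present/absent
# diagnoses of the same patients. The ratings below are made up.
from sklearn.metrics import cohen_kappa_score

rater_1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]   # 1 = diagnosis present
rater_2 = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0]

kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa = {kappa:.2f}")
```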

  5. Botulinum Neurotoxin Detection Methods for Public Health Response and Surveillance

    Directory of Open Access Journals (Sweden)

    Nagarajan Thirunavukkarasu

    2018-06-01

    Full Text Available A botulism outbreak due to consumption of food contaminated with botulinum neurotoxins (BoNTs) is a public health emergency. The threat of bioterrorism through deliberate distribution in food sources and/or aerosolization of BoNTs raises global public health and security concerns due to the potential for high mortality and morbidity. Rapid and reliable detection methods are necessary to support clinical diagnosis and surveillance, for identifying the source of contamination, performing epidemiological analysis of the outbreak, and preventing and responding to botulism outbreaks. This review considers the applicability of various BoNT detection methods and examines their fitness-for-purpose in safeguarding public health and security goals.

  6. Stepwise multiphoton activation fluorescence reveals a new method of melanin detection

    Science.gov (United States)

    Lai, Zhenhua; Kerimo, Josef; Mega, Yair; DiMarzio, Charles A.

    2013-06-01

    The stepwise multiphoton activated fluorescence (SMPAF) of melanin, activated by a continuous-wave mode near infrared (NIR) laser, reveals a broad spectrum extending from the visible spectra to the NIR and has potential application for a low-cost, reliable method of detecting melanin. SMPAF images of melanin in mouse hair and skin are compared with conventional multiphoton fluorescence microscopy and confocal reflectance microscopy (CRM). By combining CRM with SMPAF, we can locate melanin reliably. However, we have the added benefit of eliminating background interference from other components inside mouse hair and skin. The melanin SMPAF signal from the mouse hair is a mixture of a two-photon process and a third-order process. The melanin SMPAF emission spectrum is activated by a 1505.9-nm laser light, and the resulting spectrum has a peak at 960 nm. The discovery of the emission peak may lead to a more energy-efficient method of background-free melanin detection with less photo-bleaching.

  7. A novel visual saliency detection method for infrared video sequences

    Science.gov (United States)

    Wang, Xin; Zhang, Yuzhen; Ning, Chen

    2017-12-01

    Infrared video applications such as target detection and recognition, moving target tracking, and so forth can benefit a lot from visual saliency detection, which is essentially a method to automatically localize the "important" content in videos. In this paper, a novel visual saliency detection method for infrared video sequences is proposed. Specifically, for infrared video saliency detection, both the spatial saliency and temporal saliency are considered. For spatial saliency, we adopt a mutual consistency-guided spatial cues combination-based method to capture the regions with obvious luminance contrast and contour features. For temporal saliency, a multi-frame symmetric difference approach is proposed to discriminate salient moving regions of interest from background motions. Then, the spatial saliency and temporal saliency are combined to compute the spatiotemporal saliency using an adaptive fusion strategy. Besides, to highlight the spatiotemporal salient regions uniformly, a multi-scale fusion approach is embedded into the spatiotemporal saliency model. Finally, a Gestalt theory-inspired optimization algorithm is designed to further improve the reliability of the final saliency map. Experimental results demonstrate that our method outperforms many state-of-the-art saliency detection approaches for infrared videos under various backgrounds.
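
    The temporal cue described, a multi-frame symmetric difference, can be sketched as the pixel-wise minimum of a frame's differences with an earlier and a later frame, so that only regions differing from both neighbours score highly. This is a simplified stand-in, not the authors' full spatiotemporal fusion model; the frames are synthetic.

```python
# Sketch of a multi-frame symmetric-difference temporal saliency cue: for a
# middle frame, take the pixel-wise minimum of its absolute differences with
# an earlier and a later frame, so only regions that differ from both
# neighbours (i.e., moving objects) score highly.
import numpy as np

def symmetric_difference_saliency(prev_frame, curr_frame, next_frame):
    d_back = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    d_fwd = np.abs(curr_frame.astype(float) - next_frame.astype(float))
    sal = np.minimum(d_back, d_fwd)
    return sal / sal.max() if sal.max() > 0 else sal

# Synthetic infrared-like frames with a bright target moving to the right.
frames = np.zeros((3, 64, 64))
for i in range(3):
    frames[i, 30:34, 10 + 5 * i:14 + 5 * i] = 1.0

saliency = symmetric_difference_saliency(*frames)
print("most salient pixel:", np.unravel_index(np.argmax(saliency), saliency.shape))
```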

  8. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how in reliability and safety design and analysis techniques at VTT has been established over several years of analyzing reliability in the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is continuing in a number of research and development projects

  9. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and the uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate the response surface function without cross terms, to identify the highest order of each random variable, and to determine the significant variables in connection with the point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored for solving RBDO problems. Additionally, a sampling-based reliability sensitivity analysis method is employed to further reduce the computational effort when the design variables are distributional parameters of the input random variables. The proposed methodology is applied to two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.

  10. Reliability methods in nuclear power plant ageing management

    International Nuclear Information System (INIS)

    Simola, K.

    1999-01-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant- specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  11. Reliability methods in nuclear power plant ageing management

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K. [VTT Automation, Espoo (Finland). Industrial Automation

    1999-07-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant- specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  12. A Type-2 fuzzy data fusion approach for building reliable weighted protein interaction networks with application in protein complex detection.

    Science.gov (United States)

    Mehranfar, Adele; Ghadiri, Nasser; Kouhsar, Morteza; Golshani, Ashkan

    2017-09-01

    Detecting protein complexes is an important task in analyzing protein interaction networks. Although many algorithms predict protein complexes in different ways, surveys of interaction networks indicate that about 50% of detected interactions are false positives. Consequently, the accuracy of existing methods needs to be improved. In this paper we propose a novel algorithm to detect protein complexes in 'noisy' protein interaction data. First, we integrate several biological data sources to determine the reliability of each interaction and to assign more accurate weights to the interactions. A data fusion component is used for this step, based on an interval type-2 fuzzy voter that provides an efficient combination of the information sources. This fusion component detects errors and diminishes their effect on the detection of protein complexes. Thus, in the first step, reliability scores are assigned to every interaction in the network. In the second step, we propose a general protein complex detection algorithm by exploiting and adopting the strong points of other algorithms and existing hypotheses regarding real complexes. Finally, the proposed method is applied to the yeast interaction datasets for predicting the interactions. The results show that our framework has a better performance regarding precision and F-measure than the existing approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    DEFF Research Database (Denmark)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna

    2013-01-01

    machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods
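
    The simplest of the four compared approaches, a local maxima search, can be sketched with a standard peak finder on a one-dimensional signal. Real IMS data are two-dimensional (retention time by drift time); the synthetic spectrum below only illustrates the peak-calling idea.

```python
# Sketch of a local-maxima peak search applied to a synthetic one-dimensional
# spectrum. Height and prominence thresholds keep noise from being called as
# peaks; real IMS peak detection operates on 2D measurements.
import numpy as np
from scipy.signal import find_peaks

x = np.linspace(0, 10, 2000)
spectrum = (np.exp(-(x - 2.0) ** 2 / 0.01) +
            0.6 * np.exp(-(x - 5.5) ** 2 / 0.02) +
            0.05 * np.random.default_rng(1).normal(size=x.size))

peaks, props = find_peaks(spectrum, height=0.2, prominence=0.15)
print("peak positions:", np.round(x[peaks], 2),
      "heights:", np.round(props["peak_heights"], 2))
```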

  14. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    Directory of Open Access Journals (Sweden)

    Hai An

    2016-08-01

    Full Text Available Aiming to resolve the problem that a variety of uncertain variables coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article, and its convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index definition is presented based on the random–fuzzy–interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and the index is solved using a modified limit-step length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. In the end, a numerical example demonstrates that the hybrid reliability index is applicable for the wear reliability assessment of mechanisms, where truncated random variables, fuzzy random variables, and interval variables coexist. The demonstration also shows the good convergence of the iterative algorithm proposed in this article.

  15. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account animal models. The study data is extracted from...... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally bivariate blending method was......, on the other hand, lighter than the single-step method....

  16. The reliability and accuracy of two methods for proximal caries detection and depth on directly visible proximal surfaces: an in vitro study

    DEFF Research Database (Denmark)

    Ekstrand, K R; Alloza, Alvaro Luna; Promisiero, L

    2011-01-01

    This study aimed to determine the reliability and accuracy of the ICDAS and radiographs in detecting and estimating the depth of proximal lesions on extracted teeth. The lesions were visible to the naked eye. Three trained examiners scored a total of 132 sound/carious proximal surfaces from 106 p...

  17. A novel method for rapid and reliable detection of complex vertebral malformation and bovine leukocyte adhesion deficiency in Holstein cattle

    Directory of Open Access Journals (Sweden)

    Zhang Yi

    2012-07-01

    Full Text Available Abstract Background Complex vertebral malformation (CVM and bovine leukocyte adhesion deficiency (BLAD are two autosomal recessive lethal genetic defects frequently occurring in Holstein cattle, identifiable by single nucleotide polymorphisms. The objective of this study is to develop a rapid and reliable genotyping assay to screen the active Holstein sires and determine the carrier frequency of CVM and BLAD in Chinese dairy cattle population. Results We developed real-time PCR-based assays for discrimination of wild-type and defective alleles, so that carriers can be detected. Only one step was required after the DNA extraction from the sample and time consumption was about 2 hours. A total of 587 Chinese Holstein bulls were assayed, and fifty-six CVM-carriers and eight BLAD-carriers were identified, corresponding to heterozygote carrier frequencies of 9.54% and 1.36%, respectively. The pedigree analysis showed that most of the carriers could be traced back to the common ancestry, Osborndale Ivanhoe for BLAD and Pennstate Ivanhoe Star for CVM. Conclusions These results demonstrate that real-time PCR is a simple, rapid and reliable assay for BLAD and CVM defective allele detection. The high frequency of the CVM allele suggests that implementing a routine testing system is necessary to gradually eradicate the deleterious gene from the Chinese Holstein population.

  18. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  19. Crack detecting method

    International Nuclear Information System (INIS)

    Narita, Michiko; Aida, Shigekazu

    1998-01-01

    A penetration liquid, or a slow-drying penetration liquid prepared by mixing a penetration liquid with a slow-drying liquid, is filled into an artificial crack formed in a member to be inspected, such as members of boiler power generation facilities and nuclear power facilities. A developing liquid is applied to the periphery of the artificial crack on the surface of the member. As the slow-drying liquid, an oil having a viscosity of 56 is preferably used. Loads are applied repeatedly to the member, and when a crack initiates from the artificial crack, the penetration liquid penetrates into it. The penetration liquid that has penetrated into the crack is developed by the developing liquid previously applied to the periphery of the artificial crack on the surface of the member. When a crack occurs, it is developed clearly even if its opening is small, so the crack can be recognized visually and reliably. (I.N.)

  20. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  1. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each HRA method; History and prospects of the use of the methods). (P.A.)

  2. Suitability of the thermoluminescence method for detection of irradiated foods

    International Nuclear Information System (INIS)

    Pinnioja, S.

    1993-01-01

    Irradiated foods can be detected by the thermoluminescence (TL) of contaminating minerals. Altogether about 300 lots of herbs, spices, berries, mushrooms and seafood were studied by the TL method. Irradiated herbs and spices were easily differentiated from unirradiated ones two years after irradiation with a 10 kGy dose. The mineral composition of seafood was variable: while calcite was suitable for the TL analysis, aragonite and smectite gave unreliable results. Control analyses over two years confirmed the reliability of the TL method. (author)

  3. Reliability considerations of electronics components for the deep underwater muon and neutrino detection system

    International Nuclear Information System (INIS)

    Leskovar, B.

    1980-02-01

    The reliability of some electronics components for the Deep Underwater Muon and Neutrino Detection (DUMAND) System is discussed. An introductory overview of engineering concepts and techniques for reliability assessment is given. Component reliability is discussed in the context of major factors causing failures, particularly with respect to physical and chemical causes, process technology and testing, and screening procedures. Failure rates are presented for discrete devices and for integrated circuits as well as for basic electronics components. Furthermore, the military reliability specifications and standards for semiconductor devices are reviewed

  4. Method for assessing reliability of a network considering probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, M.

    2005-01-01

    A method for assessing the reliability of a network is developed, which uses the features of fault tree analysis. The method is developed in such a way that enlarging the network under consideration does not require a significant enlargement of the model. The method is applied to small example networks consisting of a small number of nodes and a small number of connections between them. The results give the network reliability. They identify equipment that is to be carefully maintained so that the network reliability is not reduced, and equipment that is a candidate for redundancy, as this would improve network reliability significantly. (author)
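
    The flavour of such an assessment can be illustrated by a tiny network whose loss of connectivity is expressed as a fault-tree-style Boolean combination of component failures and evaluated by exhaustive enumeration. The topology and failure probabilities below are hypothetical, and the enumeration stands in for the fault tree machinery the abstract refers to.

```python
# Sketch of a fault-tree-style evaluation of a tiny network: the connection
# from source to target fails only if both redundant paths fail, and each
# path fails if any of its components fails (an AND of ORs as the top event).
# Component failure probabilities are hypothetical.
from itertools import product

failure_prob = {"link_A1": 0.02, "link_A2": 0.01,   # path A components
                "link_B1": 0.03, "link_B2": 0.01}   # path B components

def network_fails(failed):
    path_a_fails = failed["link_A1"] or failed["link_A2"]
    path_b_fails = failed["link_B1"] or failed["link_B2"]
    return path_a_fails and path_b_fails            # top event

components = list(failure_prob)
p_fail = 0.0
for states in product([False, True], repeat=len(components)):
    failed = dict(zip(components, states))
    p_state = 1.0
    for c in components:
        p_state *= failure_prob[c] if failed[c] else 1.0 - failure_prob[c]
    if network_fails(failed):
        p_fail += p_state

print(f"network unreliability = {p_fail:.6f}, reliability = {1 - p_fail:.6f}")
```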

  5. Reliability improvement of multiversion software by exchanging modules

    International Nuclear Information System (INIS)

    Shima, Kazuyuki; Matsumoto, Ken-ichi; Torii, Koji

    1996-01-01

    In this paper, we propose a method to improve the reliability of multiversion software. In CER, proposed in earlier work, checkpoints are placed in the versions of a program, and errors in the versions are detected and recovered at the checkpoints. This prevents versions from failing and improves the reliability of the multiversion software. However, it has been pointed out that CER decreases the reliability of the multiversion software if the detection and recovery of errors can themselves fail. In the method proposed in this paper, the versions of a program are developed following the same module specifications. When failures of program versions are detected, faulty modules are identified and replaced with other modules. This creates versions without faulty modules and improves the reliability of the multiversion software. With the proposed method, the failure probability of the multiversion software is estimated to become about one hundredth of the original failure probability when the failure probability of each version is 0.000698, the number of versions is 5, and the number of modules is 20. (author)

  6. The reliability and internal consistency of one-shot and flicker change detection for measuring individual differences in visual working memory capacity.

    Science.gov (United States)

    Pailian, Hrag; Halberda, Justin

    2015-04-01

    We investigated the psychometric properties of the one-shot change detection task for estimating visual working memory (VWM) storage capacity, and also introduced and tested an alternative flicker change detection task for estimating these limits. In three experiments, we found that the one-shot whole-display task returns estimates of VWM storage capacity (K) that are unreliable across set sizes, suggesting that the whole-display task is measuring different things at different set sizes. In two additional experiments, we found that the one-shot single-probe variant shows improvements in the reliability and consistency of K estimates. In another additional experiment, we found that a one-shot whole-display-with-click task (requiring target localization) also showed improvements in reliability and consistency. The latter results suggest that the one-shot task can return reliable and consistent estimates of VWM storage capacity (K), and they highlight the possibility that the requirement to localize the changed target is what engenders this enhancement. Through a final series of four experiments, we introduced and tested an alternative flicker change detection method that also requires the observer to localize the changing target and that generates, from response times, an estimate of VWM storage capacity (K). We found that estimates of K from the flicker task correlated with estimates from the traditional one-shot task and also had high reliability and consistency. We highlight the flicker method's ability to estimate executive functions as well as VWM storage capacity, and discuss the potential for measuring multiple abilities with the one-shot and flicker tasks.
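
    The capacity estimates (K) discussed here are conventionally computed from hit and false-alarm rates at a given set size: Cowan's K for single-probe displays and Pashler's K for whole-display change detection. A minimal sketch with illustrative rates:

```python
# Sketch of the conventional capacity (K) formulas behind such estimates,
# computed from hit rate H and false-alarm rate F at set size N:
# Cowan's K = N * (H - F) for single-probe displays, and
# Pashler's K = N * (H - F) / (1 - F) for whole-display change detection.
# The rates below are illustrative.
def cowan_k(hit_rate, fa_rate, set_size):
    return set_size * (hit_rate - fa_rate)

def pashler_k(hit_rate, fa_rate, set_size):
    return set_size * (hit_rate - fa_rate) / (1.0 - fa_rate)

H, F, N = 0.80, 0.15, 6
print(f"Cowan K = {cowan_k(H, F, N):.2f}, Pashler K = {pashler_k(H, F, N):.2f}")
```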

  7. A reliable method for the counting and control of single ions for single-dopant controlled devices

    International Nuclear Information System (INIS)

    Shinada, T; Kurosawa, T; Nakayama, H; Zhu, Y; Hori, M; Ohdomari, I

    2008-01-01

    By 2016, transistor device size will be just 10 nm. However, a transistor that is doped at a typical concentration of 10^18 atoms cm^-3 has only one dopant atom in the active channel region. Therefore, it can be predicted that conventional doping methods such as ion implantation and thermal diffusion will not be available ten years from now. We have been developing a single-ion implantation (SII) method that enables us to implant dopant ions one-by-one into semiconductors until the desired number is reached. Here we report a simple but reliable method to control the number of single-dopant atoms by detecting the change in drain current induced by single-ion implantation. The drain current decreases in a stepwise fashion as a result of the clusters of displaced Si atoms created by every single-ion incidence. This result indicates that the single-ion detection method we have developed is capable of detecting single-ion incidence with 100% efficiency. Our method potentially could pave the way to future single-atom devices, including a solid-state quantum computer

  8. Survey of industry methods for producing highly reliable software

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.

    1994-11-01

    The Nuclear Reactor Regulation Office of the US Nuclear Regulatory Commission is charged with assessing the safety of new instrument and control designs for nuclear power plants which may use computer-based reactor protection systems. Lawrence Livermore National Laboratory has evaluated the latest techniques in software reliability for measurement, estimation, error detection, and prediction that can be used during the software life cycle as a means of risk assessment for reactor protection systems. One aspect of this task has been a survey of the software industry to collect information to help identify the design factors used to improve the reliability and safety of software. The intent was to discover what practices really work in industry and what design factors are used by industry to achieve highly reliable software. The results of the survey are documented in this report. Three companies participated in the survey: Computer Sciences Corporation, International Business Machines (Federal Systems Company), and TRW. Discussions were also held with NASA Software Engineering Lab/University of Maryland/CSC, and the AIAA Software Reliability Project

  9. Reliability of leak detection systems in LWRs

    International Nuclear Information System (INIS)

    Kupperman, D.S.

    1986-10-01

    In this paper, NRC guidelines for leak detection will be reviewed, current practices described, potential safety-related problems discussed, and potential improvements in leak detection technology (with emphasis on acoustic methods) evaluated

  10. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment at the adjacent unit and system levels is the most important part of multi-level reliability synthesis for complex systems. Introducing information theory into system reliability assessment and using the additive property of information quantity together with the principle of equivalence of information quantity, an entropy method of data information conversion is presented for systems consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived based on the principle of information quantity equivalence. General models for entropy-method synthesis assessment of approximate lower limits of system reliability are established according to the fundamental principles of unit reliability assessment. The applications of the entropy method are discussed by way of practical examples. Compared with the traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.

  11. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to the variations in geometry of the part, material properties or due to the lack of knowledge about the phenomena being modeled itself. Deterministic design optimization does not take uncertainty into account and worst case scenario assumptions lead to vastly over conservative design. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions under the presence of uncertainty in the design process. Probabilistic design optimization often involves double-loop procedure for optimization and iterative probabilistic assessment. This results in high computational demand. The high computational demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing Sequential Optimization and reliability assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment. The deterministic optimization and reliability assessment is decoupled in each cycle. This leads to quick improvement of design from one cycle to other and increase in computational efficiency. This paper demonstrates the effectiveness of Sequential Optimization and Reliability Assessment (SORA) method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive Finite Element simulations

  12. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples as well as a software analysis method... AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  13. Anomaly Detection in Gas Turbine Fuel Systems Using a Sequential Symbolic Method

    Directory of Open Access Journals (Sweden)

    Fei Li

    2017-05-01

    Full Text Available Anomaly detection plays a significant role in helping gas turbines run reliably and economically. Considering collective anomalous data and both the sensitivity and robustness of the anomaly detection model, a sequential symbolic anomaly detection method is proposed and applied to the gas turbine fuel system. A structural Finite State Machine is used to evaluate the posterior probabilities of observing symbolic sequences and the most probable state sequences they may occupy. Hence an estimation-based model and a decoding-based model are used to identify anomalies in two different ways. Experimental results indicate that both models perform well overall, but the estimation-based model is more robust whereas the decoding-based model is more accurate, particularly within a certain range of sequence lengths. Therefore, the proposed method can complement existing symbolic dynamic analysis-based anomaly detection methods, especially in the gas turbine domain.
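
    A minimal sketch of the two scoring strategies mentioned above, on a two-state probabilistic finite state machine treated as a hidden Markov model: the estimation-based route scores the likelihood of a symbol sequence with the forward algorithm, and the decoding-based route recovers the most probable state path with the Viterbi algorithm. The transition and emission matrices and the symbol sequence are illustrative assumptions, not the paper's gas-turbine model.

```python
import numpy as np

A = np.array([[0.9, 0.1],        # state transitions (state 0 = normal, 1 = anomalous)
              [0.2, 0.8]])
B = np.array([[0.7, 0.2, 0.1],   # emission probabilities over 3 observed symbols
              [0.1, 0.3, 0.6]])
pi = np.array([0.95, 0.05])      # initial state distribution

def forward_loglik(obs):
    """Estimation-based score: log P(symbol sequence) under the model."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

def viterbi_path(obs):
    """Decoding-based check: most probable hidden state sequence."""
    delta = np.log(pi) + np.log(B[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = delta[:, None] + np.log(A)     # scores[i, j]: best path ending in j via i
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + np.log(B[:, o])
    path = [int(delta.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]

seq = [0, 0, 1, 2, 2, 2]                        # hypothetical symbol sequence
print("log-likelihood of sequence:", forward_loglik(seq))
print("decoded state path        :", viterbi_path(seq))  # a visit to state 1 flags an anomaly
```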

  14. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed using the first-order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles... with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative...
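
    The FORM machinery referred to above can be sketched with the classic Hasofer-Lind-Rackwitz-Fiessler iteration. The limit state and distribution parameters below are placeholders standing in for the CDOCE and Tmax limit states of the paper, and independent normal inputs are assumed.

```python
import numpy as np
from scipy.stats import norm

mu = np.array([500.0, 350.0])     # assumed means of the two random inputs
sigma = np.array([25.0, 15.0])    # assumed standard deviations (independent normals)

def g(x):                         # assumed limit state: failure when g(x) < 0
    return x[0] - 1.2 * x[1]

def grad_g(x, eps=1e-6):
    return np.array([(g(x + eps * e) - g(x)) / eps for e in np.eye(len(x))])

u = np.zeros(2)                   # start at the mean in standard normal space
for _ in range(100):
    x = mu + sigma * u
    grad_u = grad_g(x) * sigma    # chain rule: dG/du = dg/dx * sigma
    u_new = (grad_u @ u - g(x)) / (grad_u @ grad_u) * grad_u   # HL-RF update
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)          # reliability index = distance to the most probable point
print(f"beta = {beta:.3f}, failure probability = {norm.cdf(-beta):.2e}")
```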

  15. A method to assign failure rates for piping reliability assessments

    International Nuclear Information System (INIS)

    Gamble, R.M.; Tagart, S.W. Jr.

    1991-01-01

    This paper reports on a simplified method that has been developed to assign failure rates for use in reliability and risk studies of piping. The method can be applied on a line-by-line basis by identifying line- and location-specific attributes that can lead to piping unreliability from in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability was also performed. The data from this survey provide a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies

  16. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been carried out. The influence of the physical dimensions of the materials and of the sample size on the reliability level of the results was investigated. The objective was to optimize the sample size so as to obtain reliable results while minimizing computation time. (author). 5 refs, 8 figs
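
    The sample-size question above comes down to the familiar 1/sqrt(N) scaling of Monte Carlo statistical error, which the toy sketch below illustrates; the single-collision attenuation model is an assumption for illustration only, not the simulations of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
mfp, thickness = 2.0, 6.0        # assumed mean free path and slab thickness (same units)

def transmitted(n):
    # toy model: a particle is "transmitted" if its sampled free path exceeds
    # the slab thickness (no scattering; purely illustrative)
    return rng.exponential(mfp, n) > thickness

for n in (10**3, 10**4, 10**5, 10**6):
    p = transmitted(n).mean()
    rel_err = np.sqrt((1 - p) / (p * n)) if p > 0 else float("inf")
    print(f"N = {n:>8}   p = {p:.5f}   relative statistical error ~ {rel_err:.3f}")
```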

  17. Reliability and minimal detectable difference in multisegment foot kinematics during shod walking and running.

    Science.gov (United States)

    Milner, Clare E; Brindle, Richard A

    2016-01-01

    There has been increased interest recently in measuring kinematics within the foot during gait. While several multisegment foot models have appeared in the literature, the Oxford foot model has been used frequently for both walking and running. Several studies have reported the reliability for the Oxford foot model, but most studies to date have reported reliability for barefoot walking. The purpose of this study was to determine between-day (intra-rater) and within-session (inter-trial) reliability of the modified Oxford foot model during shod walking and running and calculate minimum detectable difference for common variables of interest. Healthy adult male runners participated. Participants ran and walked in the gait laboratory for five trials of each. Three-dimensional gait analysis was conducted and foot and ankle joint angle time series data were calculated. Participants returned for a second gait analysis at least 5 days later. Intraclass correlation coefficients and minimum detectable difference were determined for walking and for running, to indicate both within-session and between-day reliability. Overall, relative variables were more reliable than absolute variables, and within-session reliability was greater than between-day reliability. Between-day intraclass correlation coefficients were comparable to those reported previously for adults walking barefoot. It is an extension in the use of the Oxford foot model to incorporate wearing a shoe while maintaining marker placement directly on the skin for each segment. These reliability data for walking and running will aid in the determination of meaningful differences in studies which use this model during shod gait. Copyright © 2015 Elsevier B.V. All rights reserved.
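
    The two quantities the study reports, the intraclass correlation coefficient and the minimum detectable difference, can be computed as in the sketch below. The ICC(2,1) form and the hypothetical two-day angle data are assumptions; the MDD uses the common 1.96 * sqrt(2) * SEM expression.

```python
import numpy as np

def icc_2_1(Y):
    """Two-way random effects, absolute agreement, single measures ICC(2,1).
    Y has shape (subjects, sessions)."""
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)
    col_means = Y.mean(axis=0)
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# hypothetical peak ankle angles (degrees) for 8 runners measured on two days
day1 = np.array([12.1, 10.4, 14.2, 11.8, 13.0, 9.7, 12.6, 11.1])
day2 = np.array([11.8, 10.9, 13.8, 12.3, 12.6, 10.1, 12.2, 11.5])
Y = np.column_stack([day1, day2])

icc = icc_2_1(Y)
sem = Y.std(ddof=1) * np.sqrt(1 - icc)     # standard error of measurement (overall SD used here)
mdd = 1.96 * np.sqrt(2) * sem              # minimum detectable difference at 95% confidence
print(f"ICC = {icc:.2f}   SEM = {sem:.2f} deg   MDD = {mdd:.2f} deg")
```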

  18. Between-day reliability of a method for non-invasive estimation of muscle composition.

    Science.gov (United States)

    Simunič, Boštjan

    2012-08-01

    Tensiomyography is a method for valid and non-invasive estimation of skeletal muscle fibre type composition. The validity of selected temporal tensiomyographic measures has been well established recently; there is, however, no evidence regarding the method's between-day reliability. Therefore it is the aim of this paper to establish the between-day repeatability of tensiomyographic measures in three skeletal muscles. For three consecutive days, 10 healthy male volunteers (mean±SD: age 24.6 ± 3.0 years; height 177.9 ± 3.9 cm; weight 72.4 ± 5.2 kg) were examined in a supine position. Four temporal measures (delay, contraction, sustain, and half-relaxation time) and maximal amplitude were extracted from the displacement-time tensiomyogram. A reliability analysis was performed with calculations of bias, random error, coefficient of variation (CV), standard error of measurement, and intra-class correlation coefficient (ICC) with a 95% confidence interval. The ICC analysis demonstrated excellent agreement (ICCs were over 0.94 in 14 out of 15 tested parameters). However, a lower CV was observed for half-relaxation time, presumably because of the specifics of the parameter definition itself. These data indicate that for the three muscles tested, tensiomyographic measurements were reproducible across consecutive test days. Furthermore, we indicate the most probable origin of the lowest reliability detected in half-relaxation time. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Finite element reliability analysis of fatigue life

    International Nuclear Information System (INIS)

    Harkness, H.H.; Belytschko, T.; Liu, W.K.

    1992-01-01

    Fatigue reliability is addressed by the first-order reliability method combined with a finite element method. Two-dimensional finite element models of components with cracks in mode I are considered with crack growth treated by the Paris law. Probability density functions of the variables affecting fatigue are proposed to reflect a setting where nondestructive evaluation is used, and the Rosenblatt transformation is employed to treat non-Gaussian random variables. Comparisons of the first-order reliability results and Monte Carlo simulations suggest that the accuracy of the first-order reliability method is quite good in this setting. Results show that the upper portion of the initial crack length probability density function is crucial to reliability, which suggests that if nondestructive evaluation is used, the probability of detection curve plays a key role in reliability. (orig.)
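
    The observation above, that the upper tail of the initial crack-size distribution drives reliability, can be made concrete with a Paris-law calculation of cycles to failure as a function of initial crack length. The material constants, geometry factor, stress range, and critical crack size below are illustrative assumptions, not values from the paper.

```python
import numpy as np

C, m = 1e-12, 3.0            # assumed Paris constants: da/dN = C * (dK)^m, dK in MPa*sqrt(m)
Y = 1.12                     # assumed geometry factor
d_sigma = 100.0              # assumed stress range, MPa
a_crit = 0.02                # assumed critical crack length, m

def cycles_to_failure(a0, steps=200_000):
    """Numerically integrate dN = da / (C * (Y*d_sigma*sqrt(pi*a))**m) from a0 to a_crit."""
    a = np.linspace(a0, a_crit, steps)
    dadn = C * (Y * d_sigma * np.sqrt(np.pi * a)) ** m
    return np.sum((a[1] - a[0]) / dadn)

for a0_mm in (0.5, 1.0, 2.0, 5.0):
    N = cycles_to_failure(a0_mm / 1000.0)
    print(f"initial crack {a0_mm:4.1f} mm -> roughly {N:,.0f} cycles to failure")
```

    Doubling the assumed initial crack length cuts the predicted life substantially, which is why the probability-of-detection curve for the NDE method enters the reliability calculation so directly.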

  20. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) for a probabilistic safety assessment (PSA) includes identifying human actions from the safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  1. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital I&C applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes an evaluation method based on the V&V defect density characteristics of each stage of the software development life cycle, by which the operational reliability level of the software can be predicted before delivery. This helps to improve the reliability of safety-important NPP software. (authors)

  2. Characteristics and application study of AP1000 NPPs equipment reliability classification method

    International Nuclear Information System (INIS)

    Guan Gao

    2013-01-01

    The AP1000 nuclear power plant applies an integrated approach to establishing equipment reliability classification, which includes probabilistic risk assessment techniques, maintenance rule administration, power production reliability classification and a functional equipment group bounding method, and eventually classifies equipment reliability into 4 levels. This classification process and its result are very different from classical RCM and streamlined RCM. The paper studies the characteristics of the AP1000 equipment reliability classification approach, argues that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommends using a combined RCM method to establish the future equipment reliability programs of AP1000 nuclear power plants. (authors)

  3. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  4. RELIABILITY OF THE DETECTION OF THE BARYON ACOUSTIC PEAK

    International Nuclear Information System (INIS)

    Martínez, Vicent J.; Arnalte-Mur, Pablo; De la Cruz, Pablo; Saar, Enn; Tempel, Elmo; Pons-Bordería, María Jesús; Paredes, Silvestre; Fernández-Soto, Alberto

    2009-01-01

    The correlation function of the distribution of matter in the universe shows, at large scales, baryon acoustic oscillations, which were imprinted prior to recombination. This feature was first detected in the correlation function of the luminous red galaxies of the Sloan Digital Sky Survey (SDSS). Recently, the final release (DR7) of the SDSS has been made available, and the useful volume is about two times bigger than in the old sample. We present here, for the first time, the redshift-space correlation function of this sample at large scales together with that for one shallower, but denser, volume-limited subsample drawn from the Two-Degree Field Redshift Survey. We test the reliability of the detection of the acoustic peak at about 100 h⁻¹ Mpc and the behavior of the correlation function at larger scales by means of careful estimation of errors. We confirm the presence of the peak in the latest data although broader than in previous detections.

  5. Reliability of leak detection systems in light water reactors

    International Nuclear Information System (INIS)

    Kupperman, D.S.

    1987-01-01

    US Nuclear Regulatory Commission Guide 1.45 recommends the use of at least three different detection methods in reactors to detect leakage. Monitoring of both sump-flow and airborne particulate radioactivity is recommended. A third method can involve either monitoring of condensate flow rate from air coolers or monitoring of airborne gaseous radioactivity. Although the methods currently used for leak detection reflect the state of the art, other techniques may be developed and used. Since the recommendations of Regulatory Guide 1.45 are not mandatory, the technical specifications for 74 operating plants have been reviewed to determine the types of leak detection methods employed. In addition, Licensee Event Report (LER) Compilations from June 1985 to June 1986 have been reviewed to help establish actual capabilities for detecting leaks and determining their source. Work at Argonne National Laboratory has demonstrated that improvements in leak detection, location, and sizing are possible with advanced acoustic leak detection technology

  6. Human reliability-based MC and A models for detecting insider theft

    International Nuclear Information System (INIS)

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-01-01

    Material control and accounting (MC and A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC and A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC and A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC and A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.
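
    The combination of MC and A detections with sensor detections in a path analysis can be illustrated, in its simplest form, by the probability that at least one protection element along an insider's path detects the theft. The elements and probabilities below are hypothetical, and the timeliness and response aspects of a full PPS effectiveness calculation are deliberately left out of this sketch.

```python
# Hypothetical protection elements along one insider path and their detection
# probabilities (the MC&A values would come from an HRA model in the approach above).
path_elements = {
    "portal monitor":         0.60,   # traditional sensor
    "daily item inventory":   0.30,   # MC&A activity
    "two-person rule check":  0.45,   # MC&A activity
}

p_miss = 1.0
for element, p_detect in path_elements.items():
    p_miss *= 1.0 - p_detect

print(f"probability of at least one detection on this path: {1.0 - p_miss:.3f}")
```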

  7. Reliability research to nuclear power plant operators based on several methods

    International Nuclear Information System (INIS)

    Fang Xiang; Li Fu; Zhao Bingquan

    2009-01-01

    The paper draws on several international reliability research methods and summarizes the reliability research on Chinese nuclear power plant operators over the past ten years, based on nuclear power plant simulator platforms. The paper shows the necessity and feasibility of research on nuclear power plant operators from many angles, including human cognitive reliability, fuzzy mathematical models and psychological research models. Using these various research methods for operator reliability research will benefit the safe operation of nuclear power plants. (authors)

  8. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    There have been many attempts to estimate a reliability function. At the ESReDA 20th seminar, a new nonparametric method was proposed; its major point is how to use censored data efficiently. Generally there are three kinds of approaches for estimating a reliability function nonparametrically, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. These three methods have some limitations, so we suggest an advanced method that reflects censored information more efficiently. In many instances there will be a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by differentiation. It is well known that the three methods generally used to estimate a reliability function nonparametrically have maximum likelihood estimators that exist uniquely. The MLE of the new method is therefore derived in this study. The procedure for calculating the MLE is similar to that of the PL estimator; the difference between the two is that in the new method the mass (or weight) of each observation influences the others, whereas in the PL estimator it does not
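
    For reference, the Product-Limit (Kaplan-Meier) estimator that the abstract compares against can be written in a few lines; the failure and censoring times below are hypothetical.

```python
import numpy as np

times  = np.array([ 5,  8, 12, 12, 15, 20, 22, 30, 30, 34])  # hours to failure or censoring
events = np.array([ 1,  1,  0,  1,  1,  0,  1,  1,  0,  0])  # 1 = failure, 0 = right-censored

def kaplan_meier(times, events):
    order = np.argsort(times)
    t, e = times[order], events[order]
    n_at_risk = len(t)
    survival, curve = 1.0, []
    for ti in np.unique(t):
        d = np.sum((t == ti) & (e == 1))      # failures at time ti
        if d > 0:
            survival *= 1.0 - d / n_at_risk
            curve.append((ti, survival))
        n_at_risk -= np.sum(t == ti)          # failures and censored units leave the risk set
    return curve

for ti, s in kaplan_meier(times, events):
    print(f"t = {ti:>2}   R(t) = {s:.3f}")
```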

  9. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover — caused by, for example, forest fire, flood, deforestation, and plant diseases — occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, the detection results are only labelled with “Change/No change” by most of the present methods, while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks for Additive Seasonal and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrate that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and an overall accuracy of up to 90%.

  10. A Reliable Method for the Evaluation of the Anaphylactoid Reaction Caused by Injectable Drugs

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2016-10-01

    Full Text Available Adverse reactions of injectable drugs usually occur at first administration and are closely associated with the dosage and speed of injection. This phenomenon is correlated with the anaphylactoid reaction. However, up to now, study methods based on antigen detection have still not gained wide acceptance and single physiological indicators cannot be utilized to differentiate anaphylactoid reactions from allergic reactions and inflammatory reactions. In this study, a reliable method for the evaluation of anaphylactoid reactions caused by injectable drugs was established by using multiple physiological indicators. We used compound 48/80, ovalbumin and endotoxin as the sensitization agents to induce anaphylactoid, allergic and inflammatory reactions. Different experimental animals (guinea pig and nude rat), different modes of administration (intramuscular, intravenous and intraperitoneal injection) and different times (15 min, 30 min and 60 min) were evaluated to optimize the study protocol. The results showed that the optimal way to achieve sensitization involved treating guinea pigs with the different agents by intravenous injection for 30 min. Further, seven related humoral factors including 5-HT, SC5b-9, Bb, C4d, IL-6, C3a and histamine were detected by HPLC analysis and ELISA assay to determine their expression level. The results showed that five of them, including 5-HT, SC5b-9, Bb, C4d and IL-6, displayed significant differences between anaphylactoid, allergic and inflammatory reactions, which indicated that their combination could be used to distinguish these three reactions. Then different injectable drugs were used to verify this method and the results showed that the chosen indicators exhibited good correlation with the anaphylactoid reaction, which indicated that the established method was both practical and reliable. Our research provides a feasible method for the diagnosis of the serious adverse reactions caused by injectable drugs which

  11. LEA Detection and Tracking Method for Color-Independent Visual-MIMO

    Directory of Open Access Journals (Sweden)

    Jai-Eun Kim

    2016-07-01

    Full Text Available Communication performance in the color-independent visual-multiple input multiple output (visual-MIMO technique is deteriorated by light emitting array (LEA detection and tracking errors in the received image because the image sensor included in the camera must be used as the receiver in the visual-MIMO system. In this paper, in order to improve detection reliability, we first set up the color-space-based region of interest (ROI in which an LEA is likely to be placed, and then use the Harris corner detection method. Next, we use Kalman filtering for robust tracking by predicting the most probable location of the LEA when the relative position between the camera and the LEA varies. In the last step of our proposed method, the perspective projection is used to correct the distorted image, which can improve the symbol decision accuracy. Finally, through numerical simulation, we show the possibility of robust detection and tracking of the LEA, which results in a symbol error rate (SER performance improvement.
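
    A minimal OpenCV sketch of the pipeline described above: a colour-space ROI, Harris corners inside it, and a constant-velocity Kalman filter that predicts the most probable LEA location between detections. The HSV thresholds, noise covariances, and video file name are assumptions, and the perspective-correction step of the paper is omitted.

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter: state = [x, y, vx, vy], measurement = [x, y].
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)

def detect_lea_centroid(frame):
    """Centroid of Harris corners inside a colour-based ROI, or None if nothing is found."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    roi = cv2.inRange(hsv, (0, 0, 200), (180, 60, 255))   # assumed: bright, low-saturation LEDs
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    harris = cv2.cornerHarris(gray, 2, 3, 0.04)
    ys, xs = np.where((harris > 0.01 * harris.max()) & (roi > 0))
    if len(xs) == 0:
        return None
    return np.array([[xs.mean()], [ys.mean()]], np.float32)

cap = cv2.VideoCapture("visual_mimo.avi")                 # hypothetical recording
while True:
    ok, frame = cap.read()
    if not ok:
        break
    prediction = kf.predict()                             # most probable LEA location
    measurement = detect_lea_centroid(frame)
    if measurement is not None:
        kf.correct(measurement)
    x, y = int(prediction[0, 0]), int(prediction[1, 0])
    cv2.circle(frame, (x, y), 8, (0, 255, 0), 2)          # mark the tracked LEA
cap.release()
```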

  12. New Multiplexing Tools for Reliable GMO Detection

    NARCIS (Netherlands)

    Pla, M.; Nadal, A.; Baeten, V.; Bahrdt, C.; Berben, G.; Bertheau, Y.; Coll, A.; Dijk, van J.P.; Dobnik, D.; Fernandez-Pierna, J.A.; Gruden, K.; Hamels, S.; Holck, A.; Holst-Jensen, A.; Janssen, E.; Kok, E.J.; Paz, La J.L.; Laval, V.; Leimanis, S.; Malcevschi, A.; Marmiroli, N.; Morisset, D.; Prins, T.W.; Remacle, J.; Ujhelyi, G.; Wulff, D.

    2012-01-01

    Among the available methods for GMO detection, enforcement and routine laboratories use in practice PCR, based on the detection of transgenic DNA. The cost required for GMO analysis is constantly increasing due to the progress of GMO commercialization, with inclusion of higher diversity of species,

  13. In vitro cost-effective methods to detect carbapenemases in Enterobacteriaceae

    Directory of Open Access Journals (Sweden)

    Varsha Gupta

    2018-01-01

    Full Text Available The rise in carbapenemase-producing organisms has challenged the scientific community. Infections caused by these bacteria have limited treatment options. There are various types such as Klebsiella pneumoniae carbapenemase (Ambler class A), metallo-beta-lactamases of VIM-type, IMP-type and NDM-type (Ambler class B), and OXA-48-types (Ambler class D). An efficient strategy for detection of carbapenemase producers is important to determine the appropriate therapeutic modalities. In this study, four methods - the Carba NP test, the modified Carba NP (MCNP) test, the carbapenem inactivation method (CIM) test, and the Rapidec Carba NP kit test - were evaluated. We evaluated an in-house MCNP test to detect carbapenemase production using a single protocol, which gave reliable results. Furthermore, the CIM test using routine antibiotic discs gives good results. Both these tests were found to be cost-effective.

  14. Improving the safety and reliability of Monju

    International Nuclear Information System (INIS)

    Itou, Kazumoto; Maeda, Hiroshi; Moriyama, Masatoshi

    1998-01-01

    A comprehensive safety review has been performed at Monju to determine why the Monju secondary sodium leakage accident occurred. We investigated how to improve the situation based on the results of the safety review. The safety review focused on five aspects of whether the facilities for dealing with a sodium leakage accident were adequate: the reliability of the detection method, the reliability of the method for preventing the spread of a sodium leak, whether the documented operating procedures are adequate, whether the quality assurance system, program, and actions were properly performed, and so on. As a result, we established for Monju a better method of dealing with sodium leakage accidents, rapid detection of sodium leaks, improved sodium drain facilities, and ways to reduce damage to Monju systems after an accident. We also improved the operating procedures and quality assurance actions to increase the safety and reliability of Monju. (author)

  15. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow that minimizes the cost of conventional power generation is solved. Thus a reliability assessment for the distribution grid is implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices. A simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast reliability assessment method calculates the reliability indices much faster than the Monte Carlo method while preserving accuracy.
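
    A crude Monte Carlo counterpart of the probabilistic models named above (Weibull wind speed, Beta irradiance, LOLP and EENS) is sketched below as a reference point; all parameters are illustrative assumptions, and the paper's faster production-cost/linearized-power-flow machinery is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000                                   # sampled hours

# Wind farm: Weibull wind speed fed into a piecewise-linear power curve.
k, c = 2.0, 8.0                               # assumed Weibull shape and scale (m/s)
v = c * rng.weibull(k, N)
P_rated, v_ci, v_r, v_co = 20.0, 3.0, 12.0, 25.0
P_wind = np.where(v < v_ci, 0.0,
          np.where(v < v_r, P_rated * (v - v_ci) / (v_r - v_ci),
           np.where(v < v_co, P_rated, 0.0)))

# Solar park: Beta-distributed irradiance fraction times an assumed rating.
P_solar = 10.0 * rng.beta(2.0, 2.5, N)        # MW

# One conventional unit with a forced outage rate, plus a fixed load.
P_conv = 30.0 * (rng.random(N) > 0.05)        # MW, 5% forced outage rate
load = 45.0                                   # MW

shortfall = np.maximum(load - (P_wind + P_solar + P_conv), 0.0)
LOLP = (shortfall > 0).mean()
EENS = shortfall.mean() * 8760                # MWh per year
print(f"LOLP = {LOLP:.4f}   EENS = {EENS:,.0f} MWh/yr")
```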

  16. A Novel Method for Detection and Classification of Covered Conductor Faults

    Directory of Open Access Journals (Sweden)

    Stanislav Misak

    2016-01-01

    Full Text Available Medium-Voltage (MV) overhead lines with Covered Conductors (CCs) are increasingly being used around the world, primarily in forested or dissected terrain or in urban areas where it is not possible to utilize MV cable lines. Compared with Aluminium-Conductor Steel-Reinforced (ACSR) overhead lines, the CC offers high operational reliability thanks to the insulation of the conductor core. Its main disadvantage is that faults are considerably harder to detect than on ACSR lines. In this work, we consider the following faults: the contact of a tree branch with a CC and the fall of a conductor onto the ground. Standard protection relays are unable to detect these faults, so they pose a risk to individuals in the vicinity of the conductor and compromise the overall safety and reliability of the MV distribution system. In this article, we continue our previous work on a method enabling detection of the faults and introduce a method enabling classification of the fault type. Such a classification is especially important for the operator of an MV distribution system in planning optimal maintenance or repair of the faulty conductors, since the fall of a tree branch can be dealt with later, whereas a broken conductor requires immediate action by the operator.

  17. A two-step method for fast and reliable EUV mask metrology

    Science.gov (United States)

    Helfenstein, Patrick; Mochi, Iacopo; Rajendran, Rajeev; Yoshitake, Shusuke; Ekinci, Yasin

    2017-03-01

    One of the major obstacles to the implementation of extreme ultraviolet lithography for upcoming technology nodes in the semiconductor industry remains the realization of a fast and reliable method for detecting patterned mask defects. We are developing a reflective EUV mask-scanning lensless imaging tool (RESCAN), installed at the Swiss Light Source synchrotron at the Paul Scherrer Institut. Our system is based on a two-step defect inspection method. In the first step, a low-resolution defect map is generated by die-to-die comparison of the diffraction patterns from areas with programmed defects to those from areas that are known to be defect-free on our test sample. In a later stage, a die-to-database comparison will be implemented in which the measured diffraction patterns will be compared to those calculated directly from the mask layout. This Scattering Scanning Contrast Microscopy technique operates purely in the Fourier domain without the need to obtain the aerial image and, given a sufficient signal-to-noise ratio, defects are found in a fast and reliable way, albeit with a location accuracy limited by the spot size of the incident illumination. Having thus identified rough locations for the defects, a fine scan is carried out in the vicinity of these locations. Since our source delivers coherent illumination, we can use an iterative phase-retrieval method to reconstruct the aerial image of the scanned area with - in principle - diffraction-limited resolution without the need for an objective lens. Here, we will focus on the aerial image reconstruction technique and give a few examples to illustrate the capability of the method.
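
    The iterative phase-retrieval idea mentioned above can be illustrated with a basic error-reduction (Gerchberg-Saxton-type) loop that alternates between the measured Fourier magnitudes and real-space constraints. The synthetic object, known support, and iteration count are assumptions; this is not the RESCAN reconstruction code, and harder objects typically need more sophisticated algorithms (e.g., hybrid input-output).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
obj = np.zeros((n, n))
obj[24:40, 20:44] = 1.0                        # synthetic "absorber pattern"
support = obj > 0                              # assumed known support
measured_mag = np.abs(np.fft.fft2(obj))        # intensity measurement -> Fourier magnitudes

estimate = rng.random((n, n)) * support        # random start inside the support
for _ in range(500):
    F = np.fft.fft2(estimate)
    F = measured_mag * np.exp(1j * np.angle(F))            # keep phase, impose magnitudes
    estimate = np.real(np.fft.ifft2(F))
    estimate = np.where(support & (estimate > 0), estimate, 0.0)  # support + positivity

err = np.linalg.norm(estimate - obj) / np.linalg.norm(obj)
print(f"relative reconstruction error: {err:.3f}")
```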

  18. Modifying nodal pricing method considering market participants optimality and reliability

    Directory of Open Access Journals (Sweden)

    A. R. Soofiabadi

    2015-06-01

    Full Text Available This paper develops a method for nodal pricing and a market clearing mechanism that consider the reliability of the system. The effects of component reliability on electricity price, market participants' profit and system social welfare are considered. Reliability is taken into account both for evaluating the optimality of market participants and for a fair pricing and market clearing mechanism. To achieve fair pricing, the nodal price is obtained through a two-stage optimization problem, and to achieve a fair market clearing mechanism, comprehensive criteria are introduced for evaluating the optimality of market participants. Social welfare of the system and system efficiency are increased under the proposed modified nodal pricing method.

  19. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the down time as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system to optimise the frequency of foreseen checks, and evaluates their influence on the performance of different detection topologies. Given the uncertainty in the failure rate of the components, combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  20. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine

    DEFF Research Database (Denmark)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon

    2013-01-01

    Reliability is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage...

  1. Level III Reliability methods feasible for complex structures

    NARCIS (Netherlands)

    Waarts, P.H.; Boer, A. de

    2001-01-01

    The paper describes the comparison between three types of reliability methods: the code-type level I method used by a designer, a full level I method and a level III method. Two cases that are typical of civil engineering practice, a cable-stayed bridge subjected to traffic load and the installation of a soil retaining sheet

  2. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  3. Larvas output and influence of human factor in reliability of meat inspection by the method of artificial digestion

    OpenAIRE

    Đorđević Vesna; Savić Marko; Vasilev Saša; Đorđević Milovan

    2013-01-01

    On the basis of analyses of the factors that allowed infected meat to reach the food chain, we found that the infection occurred after consumption of meat that had been inspected by the method of artificial digestion of collective samples using a magnetic stirrer (MM). This work presents assay results which show how modifications of the method, at the level of final sedimentation, influence the reliability of Trichinella larvae detect...

  4. Assessment and Improving Methods of Reliability Indices in Bakhtar Regional Electricity Company

    Directory of Open Access Journals (Sweden)

    Saeed Shahrezaei

    2013-04-01

    Full Text Available Reliability of a system is its ability to perform its expected duties in the future, i.e., the probability of desirable operation in carrying out predetermined duties. Failure data for power system elements are the main input for reliability assessment of the network. Determining such characteristic parameters from system history data is the goal of reliability assessment; these parameters help to identify the weak points of the system. In other words, the goal of reliability assessment is to improve operation and to reduce failures and power outages. This paper assesses the reliability indices of Bakhtar Regional Electricity Company up to 1393 and examines improvement methods and their effects on the reliability indices of this network. DIgSILENT Power Factory software is employed for simulation. Simulation results show the positive effect of the improvement methods on the reliability indices of Bakhtar Regional Electricity Company.

  5. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  6. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in a structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be expressed explicitly, the response surface method is usually used because it is conceptually clear and simple to program. However, the traditional response surface method fits a quadratic polynomial response surface, whose accuracy is limited because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used when the performance function cannot be expressed explicitly in structural reliability analysis. In this method, a fuzzy-neural-network response surface for the whole area is constructed first, and the structural reliability is then calculated by a genetic algorithm. Because all the sample points for training the network come from the whole area, the true limit state surface over the whole area can be fitted. Calculated examples and comparative analysis show that the proposed method is much better than the traditional response surface method with quadratic polynomials: the amount of finite element analysis is largely reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well over the whole area. The method proposed in this paper is therefore suitable for engineering application.

  7. Comparative study on 4 quantitative detection methods of apoptosis induced by radiation

    International Nuclear Information System (INIS)

    Yang Yepeng; Chen Guanying; Zhou Mei; Shen Qinjian; Shen Lei; Zhu Yingbao

    2004-01-01

    Objective: To reveal the capability of 4 apoptosis-detecting methods to discriminate between apoptosis and necrosis and to show their respective advantages and shortcomings through comparison of the detected results and analysis of the detection mechanisms. Methods: Four methods, PI staining-flow cytometric detection (P-F method), TUNEL labeling-flow cytometric detection (T-F method), annexin V-FITC/PI vital staining-flow cytometric detection (A-F method) and Hoechst/PI vital staining-fluorescence microscopic observation (H-O method), were used to determine apoptosis and necrosis in the human breast cancer MCF-7 cell line induced by γ-rays. Hydroxycamptothecine and sodium azide were used to induce positive controls of apoptosis and necrosis respectively. Results: All 4 methods showed good time-dependent and dose-dependent responses to apoptosis induced by γ-rays and hydroxycamptothecine. Apoptotic cell ratios and curve slopes obtained with the P-F method were the lowest, whereas those from the T-F method were the highest among the 4 methods. With the A-F method and the H-O method, two sets of data, apoptosis and necrosis, could be obtained respectively, and the data obtained from these two methods were nearly equal. The A-F and H-O methods could distinguish necrosis induced by sodium azide from apoptosis, while the P-F and T-F methods presented a false increase of apoptosis. Conclusions: The P-F and T-F methods cannot discriminate between apoptosis and necrosis. The P-F method is less sensitive but simpler, more convenient and more economical than the T-F method. The A-F and H-O methods can distinguish necrosis from apoptosis. The A-F method is more costly but quicker and more reliable than the H-O method. The H-O method is economical and practical, and morphological changes of cells and nuclei can be observed simultaneously with it. (authors)

  8. Reliability of non-destructive testing methods

    International Nuclear Information System (INIS)

    Broekhoven, M.J.G.

    1988-01-01

    This contribution concerns the results of an evaluation of the reliability of radiography (X-rays and gamma-rays) and of manual and mechanized/automated ultrasonic examination according to generally accepted codes/rules, with respect to detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates, 30 and 50 mm in thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical of steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. In addition, some results are presented obtained from research on advanced UT techniques, viz. the time-of-flight-diffraction and flaw-tip deflection techniques. (author)

  9. Reliability of non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Broekhoven, M J.G. [Ministry of Social Affairs, (Netherlands)

    1988-12-31

    This contribution concerns the results of an evaluation of the reliability of radiography (X-rays and gamma-rays) and of manual and mechanized/automated ultrasonic examination according to generally accepted codes/rules, with respect to detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates, 30 and 50 mm in thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical of steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. In addition, some results are presented obtained from research on advanced UT techniques, viz. the time-of-flight-diffraction and flaw-tip deflection techniques. (author). 4 refs.

  10. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    OpenAIRE

    Chen, Xuyong; Chen, Qian; Bian, Xiaoya; Fan, Jianping

    2017-01-01

    Due to the many uncertainties in nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy and oscillating iteration process and makes the nonprobabilistic reliability index difficult to solve. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic ...

  11. Survey of methods used to asses human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  12. Limitations in simulator time-based human reliability analysis methods

    International Nuclear Information System (INIS)

    Wreathall, J.

    1989-01-01

    Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failures to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical

  13. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    Science.gov (United States)

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. A systematic evaluation of method parameters, including sample extraction, concentration, and digestion, were optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  14. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    Science.gov (United States)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

    Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction to reliability estimates and different study designs and statistical analysis is given for future studies in Ayurveda. PMID:23930037

  15. Ultrasonic Guided Wave Method For Crack Detection In Buried Plastic Pipe

    Directory of Open Access Journals (Sweden)

    Wan Hamat Wan Sofian

    2016-01-01

    Full Text Available Plastic pipes are widely used in many fields for conveying fluid or gaseous products, but the basic constituents of plastic materials make them very sensitive to damage, which calls for reliable and efficient damage detection techniques. The ultrasonic guided wave method is a sensitive damage detection technique based on the propagation of low-frequency excitation in solid structures. Ultrasonic guided wave measurements are performed to investigate the effect of a crack on the frequency signal using Fast Fourier Transform (FFT) analysis. This paper determines the performance of the ultrasonic guided wave method for detecting cracks in buried pipelines. It was found that for an uncracked pipe, the FFT analysis shows one peak, at the operating frequency of the piezoelectric actuator itself, while the FFT analysis for a pipe with a single crack shows two peaks: the operating frequency of the actuator and the resultant frequency from the crack. For a pipe with multiple cracks, the frequency signal shows more than two peaks, depending on the number of cracks. The results presented here may facilitate improvements in the accuracy and precision of pipeline crack detection.
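
    The peak-counting idea described above can be sketched with a synthetic signal and NumPy's FFT; the drive frequency, the crack-induced component, and the peak threshold are assumptions for illustration only.

```python
import numpy as np

fs = 100_000                      # sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)
f_excite = 10_000                 # assumed piezoelectric actuator drive frequency, Hz

clean   = np.sin(2 * np.pi * f_excite * t)
cracked = clean + 0.4 * np.sin(2 * np.pi * 16_500 * t)   # assumed crack-induced component

def spectral_peaks(signal, threshold=0.2):
    """Return the frequencies whose spectral magnitude exceeds a fraction of the maximum."""
    spec = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return freqs[spec > threshold * spec.max()]

print("uncracked pipe peaks (Hz):", spectral_peaks(clean))     # one peak: drive frequency
print("cracked pipe peaks   (Hz):", spectral_peaks(cracked))   # two peaks: drive + crack
```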

  16. Statistical Bayesian method for reliability evaluation based on ADT data

    Science.gov (United States)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict product reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, limitations such as an imprecise solution process and imprecise estimation of the degradation rate still exist, which may affect the accuracy of the acceleration model and the extrapolated values. Moreover, the usual solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen. Second, the initial values of the degradation model and the parameters of the prior and posterior distributions at each stress level are calculated, with updating and iteration of the estimates. Third, the lifetime and reliability values are estimated on the basis of the estimated parameters. Finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is quite effective and accurate in estimating the lifetime and reliability of a product.
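
    The Wiener-process backbone of such an analysis can be sketched as below: simulate degradation increments, estimate the drift and diffusion by their closed-form maximum likelihood estimators, and extrapolate reliability through the inverse Gaussian first-passage-time distribution. All numbers are illustrative assumptions, and the sketch deliberately omits the acceleration model and the Bayesian updating scheme of the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
dt, n_steps, n_units = 10.0, 50, 8          # hours per inspection, inspections, units
true_mu, true_sigma = 0.08, 0.15            # assumed drift and diffusion at use stress
threshold = 20.0                            # assumed failure threshold on the degradation scale

# simulate degradation increments and estimate the parameters from them
incr = rng.normal(true_mu * dt, true_sigma * np.sqrt(dt), (n_units, n_steps))
mu_hat = incr.mean() / dt
sigma_hat = np.sqrt(incr.var(ddof=1) / dt)

def reliability(t):
    """P(first passage of the threshold > t) for a Wiener process (inverse Gaussian law)."""
    a = (threshold - mu_hat * t) / (sigma_hat * np.sqrt(t))
    b = (-threshold - mu_hat * t) / (sigma_hat * np.sqrt(t))
    return norm.cdf(a) - np.exp(2 * mu_hat * threshold / sigma_hat**2) * norm.cdf(b)

for t in (100, 200, 300, 400):
    print(f"t = {t:>4} h   R(t) = {reliability(t):.3f}")
```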

  17. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  18. A method of predicting the reliability of CDM coil insulation

    International Nuclear Information System (INIS)

    Kytasty, A.; Ogle, C.; Arrendale, H.

    1992-01-01

    This paper presents a method of predicting the reliability of the Collider Dipole Magnet (CDM) coil insulation design. The method proposes a probabilistic treatment of electrical test data, stress analysis, material property variability and loading uncertainties to give the reliability estimate. The approach taken to predict the reliability of design-related failure modes of the CDM is to form analytical models of the various possible failure modes and their related mechanisms or causes, and then statistically assess the contributions of the various contributing variables. The probability of a failure mode occurring is interpreted as the number of times one would expect certain extreme situations to combine and randomly occur. One of the more complex failure modes of the CDM is used to illustrate this methodology.
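
    A minimal stress-strength sketch of the probabilistic treatment described above: the failure probability is estimated as the chance that a random load exceeds a random dielectric strength. The distributions and parameters are illustrative assumptions, not CDM data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000

    # Illustrative distributions (assumed, not CDM data): dielectric strength
    # of the insulation and the applied electrical stress.
    strength = rng.normal(loc=30.0, scale=3.0, size=n)   # kV
    stress = rng.normal(loc=18.0, scale=4.0, size=n)     # kV

    p_fail = np.mean(stress >= strength)
    print(f"Estimated failure probability: {p_fail:.2e}")
    print(f"Estimated reliability: {1 - p_fail:.6f}")
    ```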

  19. Developing a reliable signal wire attachment method for rail.

    Science.gov (United States)

    2014-11-01

    The goal of this project was to develop a better attachment method for rail signal wires to improve the reliability of signaling systems. EWI conducted basic research into the failure mode of current attachment methods and developed and tested a ne...

  20. An improved electrochemiluminescence polymerase chain reaction method for highly sensitive detection of plant viruses

    International Nuclear Information System (INIS)

    Tang Yabing; Xing Da; Zhu Debin; Liu Jinfeng

    2007-01-01

    Recently, we have reported an electrochemiluminescence polymerase chain reaction (ECL-PCR) method for detection of genetically modified organisms. In the current study, the ECL-PCR method was further improved by introducing a multi-purpose nucleic acid sequence, specific to the tris(bipyridine) ruthenium (TBR) labeled probe, into the 5' terminal of the primers, and the method was applied to detect plant viruses. A conserved sequence of the plant viruses was amplified by PCR. The product was hybridized with a biotin-labeled probe and a TBR-labeled probe. The hybridization product was separated by streptavidin-coated magnetic beads and detected by measuring the ECL signal of the TBR label. Under the optimized conditions, the experimental results show that the detection limit is 50 fmol of PCR product and the signal-to-noise ratio is in excess of 14.6. The method was used to detect banana streak virus, banana bunchy top virus, and papaya leaf curl virus. The experimental results show that this method can reliably identify virus-infected plant samples. The improved ECL-PCR approach has higher sensitivity and lower cost than the previous approach and can effectively detect plant viruses with simplicity, stability, and high sensitivity.

  1. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With the ever-increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in past decades, its application to large-scale industrial processes is limited because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...

  2. A comparison of photographic, replication and direct clinical examination methods for detecting developmental defects of enamel

    Directory of Open Access Journals (Sweden)

    Pakshir Hamid-Reza

    2011-04-01

    Background Different methods have been used for detecting developmental defects of enamel (DDE). This study aimed to compare photographic and replication methods with the direct clinical examination method for detecting DDE in children's permanent incisors. Methods 110 8-10-year-old schoolchildren were randomly selected from an examined sample of 335 primary Shiraz schoolchildren. The modified DDE index was used in all three methods. Direct examinations were conducted by two calibrated examiners using flat oral mirrors and tongue blades. Photographs were taken using a digital SLR camera (Nikon D-80), macro lens, macro flashes, and matt flash filters. Impressions were taken using addition-curing silicone material and casts were made in orthodontic stone. Impressions and models were both assessed using dental loupes (magnification ×3.5). Each photograph/impression/cast was assessed by two calibrated examiners. Reliability of the methods was assessed using kappa agreement tests. Kappa agreement, McNemar's and two-sample proportion tests were used to compare results obtained by the photographic and replication methods with those obtained by the direct examination method. Results Of the 110 invited children, 90 were photographed and 73 had impressions taken. The photographic method had higher reliability levels than the other two methods and, compared to the direct clinical examination, detected significantly more subjects with DDE (P = 0.002) and 3.1 times more DDE. Conclusion The photographic method was much more sensitive than direct clinical examination in detecting DDE and was the best of the three methods for epidemiological studies. The replication method provided less information about DDE compared to photography. Results of this study have implications for both epidemiological and detailed clinical studies on DDE.
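
    A minimal sketch of the kind of inter-examiner kappa agreement computation mentioned above, using synthetic presence/absence ratings from two examiners; the data and the use of scikit-learn are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(2)

    # Synthetic ratings: 1 = DDE present, 0 = absent, for 200 tooth surfaces.
    examiner_a = rng.integers(0, 2, size=200)
    # Examiner B agrees with A most of the time (assumed 90% agreement).
    flip = rng.random(200) < 0.10
    examiner_b = np.where(flip, 1 - examiner_a, examiner_a)

    kappa = cohen_kappa_score(examiner_a, examiner_b)
    observed_agreement = np.mean(examiner_a == examiner_b)
    print(f"Observed agreement: {observed_agreement:.2f}, Cohen's kappa: {kappa:.2f}")
    ```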

  3. Reliability of surface electromyography timing parameters in gait in cervical spondylotic myelopathy.

    LENUS (Irish Health Repository)

    Malone, Ailish

    2012-02-01

    The aims of this study were to validate a computerised method to detect muscle activity from surface electromyography (SEMG) signals in gait in patients with cervical spondylotic myelopathy (CSM), and to evaluate the test-retest reliability of the activation times designated by this method. SEMG signals were recorded from rectus femoris (RF), biceps femoris (BF), tibialis anterior (TA), and medial gastrocnemius (MG) during gait in 12 participants with CSM on two separate test days. Four computerised activity detection methods, based on the Teager-Kaiser Energy Operator (TKEO), were applied to a subset of signals and compared to visual interpretation of muscle activation. The most accurate method was then applied to all signals for evaluation of test-retest reliability. A detection method based on a combined slope and amplitude threshold showed the highest agreement (87.5%) with visual interpretation. With respect to reliability, the standard error of measurement (SEM) of the timing of RF, TA and MG between test days was 5.5% of stride duration or less, while the SEM of BF was 9.4%. The timing parameters of RF, TA and MG designated by this method were considered sufficiently reliable for use in clinical practice; however, the reliability of BF was questionable.
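
    A minimal sketch of a TKEO-plus-threshold onset detector of the kind compared above; the synthetic burst signal, smoothing window, and threshold multiplier are illustrative assumptions rather than the study's calibrated values.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    fs = 1000                                   # assumed sampling rate (Hz)
    t = np.arange(0, 2.0, 1 / fs)

    # Illustrative SEMG-like trace: baseline noise with a burst of activity.
    emg = 0.05 * rng.normal(size=t.size)
    burst = (t > 0.8) & (t < 1.4)
    emg[burst] += 0.5 * rng.normal(size=burst.sum())

    # Teager-Kaiser Energy Operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]
    tkeo = emg[1:-1] ** 2 - emg[:-2] * emg[2:]

    # Smooth and threshold relative to the baseline energy (first 0.5 s assumed quiet).
    win = 50
    smoothed = np.convolve(np.abs(tkeo), np.ones(win) / win, mode="same")
    baseline = smoothed[: int(0.5 * fs)]
    threshold = baseline.mean() + 8 * baseline.std()    # assumed multiplier

    active = smoothed > threshold
    if active.any():
        onset = t[1:-1][active][0]
        print(f"Detected activation onset near t = {onset:.3f} s")
    else:
        print("No muscle activity detected")
    ```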

  4. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. FJ-2207 measuring instrument detection pipe surface a level of pollution method

    International Nuclear Information System (INIS)

    Wang Jiangong

    2010-01-01

    Detecting α-contamination levels on pipe surfaces is a frequently encountered task in dose-detection work. Because the pipe surface is curved while the measuring probe is flat, accurate measurement is difficult. In this paper, the measurement of α-contamination levels on pipe surfaces with the FJ-2207-type surface contamination meter was studied. The use of the FJ-2207 instrument for detecting α-contamination on pipe surfaces is introduced. The differences between measurements of the same source on curved and flat surfaces were studied, and from these results a correction factor for direct measurement of curved surfaces with this instrument was obtained; correction factors for direct measurement of the commonly used pipe specifications from 32 to 216 are given. The method is convenient and the test results are reliable, so it can serve as a reference for the accurate measurement of α-contamination levels on pipe surfaces. (authors)

  6. PAUT-based defect detection method for submarine pressure hulls

    Directory of Open Access Journals (Sweden)

    Min-jae Jung

    2018-03-01

    A submarine has a pressure hull that can withstand high hydraulic pressure and therefore requires the use of highly advanced shipbuilding technology. When producing a pressure hull, periodic inspection, repair, and maintenance are conducted to maintain its soundness. Of the maintenance methods, Non-Destructive Testing (NDT) is the most effective, because it does not damage the target but sustains its original form and function while inspecting internal and external defects. The NDT process to detect defects in the welded parts of the submarine is applied through Magnetic particle Testing (MT) to detect surface defects and Ultrasonic Testing (UT) and Radiography Testing (RT) to detect internal defects. In comparison with RT, UT encounters difficulties in distinguishing the types of defects, can yield different results depending on the skills of the inspector, and stores no inspection record. At the same time, the use of RT gives rise to issues related to worker safety due to radiation exposure. RT is also difficult to apply from the perspectives of the manufacturing of the submarine and economic feasibility. Therefore, in this study, the Phased Array Ultrasonic Testing (PAUT) method was applied to propose an inspection method that can address the above disadvantages by designing a probe to enhance the precision of detection of hull defects and the reliability of calculations of defect size. Keywords: Submarine pressure hull, Non-destructive testing, Phased array ultrasonic testing

  7. Indian program for development of technologies relevant to reliable, non-intrusive, concealed-contraband detection

    International Nuclear Information System (INIS)

    Auluck, S.K.H.

    2007-01-01

    Generating capability for reliable, non-intrusive detection of concealed-contraband, particularly, organic contraband like explosives and narcotics, has become a national priority. This capability spans a spectrum of technologies. If a technology mission addressing the needs of a highly sophisticated technology like PFNA is set up, the capabilities acquired would be adequate to meet the requirements of many other sets of technologies. This forms the background of the Indian program for development of technologies relevant to reliable, non-intrusive, concealed contraband detection. One of the central themes of the technology development programs would be modularization of the neutron source and detector technologies, so that common elements can be combined in different ways for meeting a variety of application requirements. (author)

  8. The effect of DLC-coating deposition method on the reliability and mechanical properties of abutment's screws.

    Science.gov (United States)

    Bordin, Dimorvan; Coelho, Paulo G; Bergamo, Edmara T P; Bonfante, Estevam A; Witek, Lukasz; Del Bel Cury, Altair A

    2018-04-10

    To characterize the mechanical properties of different coating methods of DLC (diamond-like carbon) onto dental implant abutment screws, and their effect on the probability of survival (reliability). Seventy-five abutment screws were allocated into three groups according to the coating method: control (no coating); UMS, DLC applied through unbalanced magnetron sputtering; and RFPA, DLC applied through a radio frequency plasma-activated process (n = 25/group). Twelve screws (n = 4/group) were used to determine the hardness and Young's modulus (YM). A 3D finite element model composed of titanium substrate, DLC layer and a counterpart was constructed, and the deformation (μm) and shear stress (MPa) were calculated. The remaining screws of each group were torqued into external hexagon abutments and subjected to step-stress accelerated life-testing (SSALT) (n = 21/group). The probability Weibull curves and reliability (probability of survival) were calculated considering missions of 100, 150 and 200 N at 50,000 and 100,000 cycles. The DLC-coated experimental groups showed higher hardness than the control group, and the Weibull shape parameters were greater than 1, indicating that fatigue contributed to failure. High reliability was observed for a mission of 100 N. At 200 N a significant decrease in reliability was detected for all groups (ranging from 39% to 66%). No significant difference was observed among groups regardless of mission. Screw fracture was the chief failure mode. DLC coatings have been used to improve titanium's mechanical properties and increase the reliability of dental implant-supported restorations. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.
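
    A minimal sketch of how reliability at a given mission (load and cycle count) follows from a fitted Weibull life model, as in step-stress analyses of this kind; the characteristic lives and shape parameter below are illustrative assumptions, not the study's fitted values.

    ```python
    import numpy as np

    def weibull_reliability(cycles, eta, beta):
        """Probability of surviving `cycles` under a Weibull(eta, beta) life model."""
        return np.exp(-(cycles / eta) ** beta)

    # Assumed use-level Weibull parameters at two mission loads (illustrative only).
    params = {"100 N": {"eta": 2.0e6, "beta": 1.4},
              "200 N": {"eta": 1.5e5, "beta": 1.4}}

    for load, p in params.items():
        for mission in (50_000, 100_000):
            r = weibull_reliability(mission, p["eta"], p["beta"])
            print(f"Load {load}, {mission:>7} cycles: reliability = {r:.2%}")
    ```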

  9. Reliability analysis for thermal cutting method based non-explosive separation device

    International Nuclear Information System (INIS)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu

    2016-01-01

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn through and unwind, failure of the holder to separate, failure of the balls to separate, and failure of the pin to release. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.

  10. Reliability analysis for thermal cutting method based non-explosive separation device

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu [Korea Aerospace University, Goyang (Korea, Republic of)

    2016-12-15

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn through and unwind, failure of the holder to separate, failure of the balls to separate, and failure of the pin to release. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.
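
    A minimal sketch of the kind of fault-tree arithmetic behind such a figure: independent single-point failure modes combine in series (OR gate), while redundant Ni-Cr coils only cause failure if every coil fails (AND gate). The individual failure probabilities are illustrative assumptions, not the paper's data.

    ```python
    # Series (OR) combination of independent single-point failure modes,
    # with the wire-burn mode made redundant by using several coils (AND gate).
    single_point_modes = {          # assumed per-mission failure probabilities
        "power_supply": 1e-6,
        "holder_separation": 2e-6,
        "ball_separation": 2e-6,
        "pin_release": 1e-6,
    }
    p_wire_coil = 0.05              # assumed failure probability of one Ni-Cr coil
    n_coils = 5

    p_wire_subsystem = p_wire_coil ** n_coils          # all coils must fail

    p_success = 1 - p_wire_subsystem
    for p in single_point_modes.values():
        p_success *= (1 - p)

    print(f"Redundant wire subsystem failure probability: {p_wire_subsystem:.2e}")
    print(f"Overall device reliability: {p_success:.6f}")
    ```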

  11. Machine learning plus optical flow: a simple and sensitive method to detect cardioactive drugs

    Science.gov (United States)

    Lee, Eugene K.; Kurokawa, Yosuke K.; Tu, Robin; George, Steven C.; Khine, Michelle

    2015-07-01

    Current preclinical screening methods do not adequately detect cardiotoxicity. Using human induced pluripotent stem cell-derived cardiomyocytes (iPS-CMs), more physiologically relevant preclinical or patient-specific screening to detect potential cardiotoxic effects of drug candidates may be possible. However, one of the persistent challenges for developing a high-throughput drug screening platform using iPS-CMs is the need to develop a simple and reliable method to measure key electrophysiological and contractile parameters. To address this need, we have developed a platform that combines machine learning paired with brightfield optical flow as a simple and robust tool that can automate the detection of cardiomyocyte drug effects. Using three cardioactive drugs of different mechanisms, including those with primarily electrophysiological effects, we demonstrate the general applicability of this screening method to detect subtle changes in cardiomyocyte contraction. Requiring only brightfield images of cardiomyocyte contractions, we detect changes in cardiomyocyte contraction comparable to - and even superior to - fluorescence readouts. This automated method serves as a widely applicable screening tool to characterize the effects of drugs on cardiomyocyte function.

  12. An Investment Level Decision Method to Secure Long-term Reliability

    Science.gov (United States)

    Bamba, Satoshi; Yabe, Kuniaki; Seki, Tomomichi; Shibaya, Tetsuji

    The slowdown in power-demand growth and in facility replacement leads to aging and lower reliability of power facilities, and aging is followed by a rapid increase in repair and replacement costs when many facilities reach the end of their lifetime. This paper describes a method to estimate future repair and replacement costs by applying a life-cycle cost model and renewal theory to historical data. It also describes a method to decide the optimum investment plan, which replaces facilities in order of cost-effectiveness using a replacement-priority formula, and the minimum investment level needed to maintain reliability. Estimation examples applied to substation facilities show that a reasonable and leveled future cash-out can maintain reliability by lowering the percentage of replacements caused by fatal failures.

  13. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    Directory of Open Access Journals (Sweden)

    Nielsen Morten

    2009-07-01

    Background Estimation of the reliability of specific real value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have implemented a method that predicts the relative surface accessibility of an amino acid and simultaneously predicts the reliability for each prediction, in the form of a Z-score. Results An ensemble of artificial neural networks has been trained on a set of experimentally solved protein structures to predict the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures where reliabilities are obtained by post-processing the output. Conclusion The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability score with the individual predictions. However, our implementation of reliability scores in the form of a Z-score is shown to be the more informative measure for discriminating good predictions from bad ones in the entire range from completely buried to fully exposed amino acids. This is evident when comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0...

  14. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In bearing procurement analysis, both price and reliability must be considered as decision criteria, since price determines the direct (acquisition) cost while bearing reliability determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered when making a bearing procurement analysis. This paper describes a bearing evaluation method based on total cost of ownership analysis that considers both price and maintenance cost as decision criteria. Furthermore, since failure data are usually lacking when the bearing evaluation phase is conducted, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper but less reliable bearing is preferable. This contextuality can give rise to conflict between stakeholders; thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
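
    A minimal sketch of predicting bearing life from the dynamic load rating via the standard basic rating life relation L10 = (C/P)^p million revolutions (p = 3 for ball bearings), and folding the resulting replacement count into a simple cost comparison; prices, loads, and ratings below are illustrative assumptions, not the paper's model.

    ```python
    def l10_hours(C_kN, P_kN, rpm, exponent=3):
        """Basic rating life in operating hours from dynamic load rating C and load P."""
        l10_mrev = (C_kN / P_kN) ** exponent          # millions of revolutions
        return l10_mrev * 1e6 / (rpm * 60)

    # Two candidate bearings (all figures are illustrative assumptions).
    candidates = {
        "cheap":   {"price": 40.0, "C_kN": 25.0},
        "premium": {"price": 95.0, "C_kN": 35.0},
    }
    load_kN, rpm, horizon_h, replacement_cost = 5.0, 1500, 40_000, 120.0

    for name, b in candidates.items():
        life_h = l10_hours(b["C_kN"], load_kN, rpm)
        expected_replacements = horizon_h / life_h
        cost = b["price"] + expected_replacements * (b["price"] + replacement_cost)
        print(f"{name:>7}: L10 = {life_h:7.0f} h, "
              f"replacements = {expected_replacements:.2f}, total cost = {cost:.0f}")
    ```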

  15. Self-Tuning Threshold Method for Real-Time Gait Phase Detection Based on Ground Contact Forces Using FSRs

    Directory of Open Access Journals (Sweden)

    Jing Tang

    2018-02-01

    This paper presents a novel methodology for detecting the gait phase of human walking on level ground. The previous threshold method (TM) sets a threshold to divide the ground contact forces (GCFs) into on-ground and off-ground states. However, previous methods for gait phase detection show no adaptability to different people and different walking speeds. Therefore, this paper presents a self-tuning triple threshold algorithm (STTTA) that calculates adjustable thresholds to adapt to human walking. Two force sensitive resistors (FSRs) were placed on the ball and heel to measure GCFs. Three thresholds (i.e., high-threshold, middle-threshold and low-threshold) were used to search out the maximum and minimum GCFs for the self-adjustment of the thresholds. The high-threshold was the main threshold used to divide the GCFs into on-ground and off-ground states. Then, the gait phases were obtained through the gait phase detection algorithm (GPDA), which provides the rules that determine the calculations for the STTTA. Finally, the reliability of the STTTA is determined by comparing its results with those of the Mariani method, referenced as the timing analysis module (TAM), and the Lopez–Meyer method. Experimental results show that the proposed method can be used to detect gait phases in real time and obtains high reliability when compared with previous methods in the literature. In addition, the proposed method exhibits strong adaptability to different wearers walking at different walking speeds.
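
    A minimal sketch of the self-tuning idea, reduced to a two-threshold (hysteresis) variant: thresholds are recomputed as fractions of the recently observed GCF range, and the main threshold splits on-ground from off-ground samples. The synthetic signal, window length, and fractions are illustrative assumptions, not the paper's calibrated triple-threshold algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    fs = 100                                       # assumed sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)

    # Illustrative total ground contact force: periodic stance/swing plus sensor noise.
    gcf = np.clip(np.sin(2 * np.pi * 1.0 * t), 0, None) * 600 + 20 * rng.normal(size=t.size)

    window = 2 * fs                                # sliding window used for self-tuning (2 s)
    on_ground = np.zeros(t.size, dtype=bool)
    state = False

    for i in range(t.size):
        seg = gcf[max(0, i - window):i + 1]
        f_min, f_max = seg.min(), seg.max()
        # Thresholds self-tune as fractions of the recently observed force range.
        low_thr = f_min + 0.15 * (f_max - f_min)
        high_thr = f_min + 0.50 * (f_max - f_min)
        if not state and gcf[i] > high_thr:        # force rises: on-ground
            state = True
        elif state and gcf[i] < low_thr:           # force drops: off-ground
            state = False
        on_ground[i] = state

    print(f"Estimated stance (on-ground) fraction: {on_ground.mean():.2f}")
    ```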

  16. The Reliability, Impact, and Cost-Effectiveness of Value-Added Teacher Assessment Methods

    Science.gov (United States)

    Yeh, Stuart S.

    2012-01-01

    This article reviews evidence regarding the intertemporal reliability of teacher rankings based on value-added methods. Value-added methods exhibit low reliability, yet are broadly supported by prominent educational researchers and are increasingly being used to evaluate and fire teachers. The article then presents a cost-effectiveness analysis…

  17. Chromogenic in situ hybridization is a reliable assay for detection of ALK rearrangements in adenocarcinomas of the lung.

    Science.gov (United States)

    Schildhaus, Hans-Ulrich; Deml, Karl-Friedrich; Schmitz, Katja; Meiboom, Maren; Binot, Elke; Hauke, Sven; Merkelbach-Bruse, Sabine; Büttner, Reinhard

    2013-11-01

    Reliable detection of anaplastic lymphoma kinase (ALK) rearrangements is a prerequisite for personalized treatment of lung cancer patients, as ALK rearrangements represent a predictive biomarker for therapy with specific tyrosine kinase inhibitors. Currently, fluorescent in situ hybridization (FISH) is considered to be the standard method for assessing formalin-fixed and paraffin-embedded tissue for ALK inversions and translocations. However, FISH requires specialized equipment, the signals fade rapidly, and it is difficult to assess overall morphology and tumor heterogeneity. Chromogenic in situ hybridization (CISH) has been successfully introduced as an alternative test for the detection of several genetic aberrations. This study validates a newly developed ALK CISH assay by comparing FISH and CISH signal patterns in lung cancer samples with and without ALK rearrangements. One hundred adenocarcinomas of the lung were included in this study, among them 17 with known ALK rearrangement. FISH and CISH were carried out and evaluated according to the manufacturers' recommendations. For both assays, tumors were considered positive if ≥15% of tumor cells showed either isolated 3' signals or break-apart patterns or a combination of both. A subset of tumors was examined using a novel EML4 (echinoderm microtubule-associated protein-like 4) CISH probe. Red, green and fusion CISH signals were clear-cut and different signal patterns were easily recognized. The percentage of aberrant tumor cells was highly correlated between FISH and CISH. On the basis of 86 samples that were evaluable by ALK CISH, we found a 100% sensitivity and 100% specificity of this assay. Furthermore, EML4 rearrangements could be recognized by CISH. CISH is a highly reliable, sensitive and specific method for the detection of ALK gene rearrangements in pulmonary adenocarcinomas. Our results suggest that CISH might serve as a suitable alternative to FISH, which is the current gold standard.

  18. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    Bishop, P.G.; Esp, D.G.

    1988-08-01

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are that: The scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely-defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful, through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of systems constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)

  19. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed semi-quantitative method for dermal exposure assessment (DREAM).

  20. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite-element-based engineering practice.

  1. System reliability with correlated components: Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, A.C.W.M.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing system reliability with correlated components, the Equivalent Planes method, is used; this paper examines its accuracy.

  2. System reliability with correlated components : Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, T.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing system reliability with correlated components, the Equivalent Planes method, is used; this paper examines its accuracy.

  3. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via the AutoCAD software and flexible ruler methods. The current study was accomplished in two parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.

  4. A semi-automated method for rapid detection of ripple events on interictal voltage discharges in the scalp electroencephalogram.

    Science.gov (United States)

    Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A

    2017-02-01

    High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
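
    A minimal sketch of the first step described above: flagging candidate intervals where ripple-band power rises well above baseline. The synthetic signal, filter band, and threshold are illustrative assumptions, and the second-stage feature checks (approximately sinusoidal shape, at least three cycles, co-occurring discharge) are omitted.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    rng = np.random.default_rng(11)
    fs = 500                                  # assumed scalp EEG sampling rate (Hz)
    t = np.arange(0, 20, 1 / fs)

    # Illustrative EEG-like trace with a brief high-frequency (ripple) burst.
    eeg = rng.normal(scale=10, size=t.size)
    burst = (t > 12.0) & (t < 12.1)
    eeg[burst] += 60 * np.sin(2 * np.pi * 120 * t[burst])   # ~120 Hz burst

    # Band-pass the ripple band (assumed 80-200 Hz) and take the analytic envelope.
    b, a = butter(4, [80, 200], btype="bandpass", fs=fs)
    envelope = np.abs(hilbert(filtfilt(b, a, eeg)))

    # Candidate intervals: envelope exceeds mean + k * std (assumed k = 6).
    threshold = envelope.mean() + 6 * envelope.std()
    candidates = envelope > threshold
    starts = np.flatnonzero(np.diff(candidates.astype(int)) == 1)
    print(f"Candidate intervals start near t = {np.round(t[starts], 2)} s")
    ```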

  5. Selected Methods For Increases Reliability The Of Electronic Systems Security

    Directory of Open Access Journals (Sweden)

    Paś Jacek

    2015-11-01

    The article presents issues related to different methods of increasing the reliability of electronic security systems (ESS), taking a fire alarm system (SSP) as an example. The reliability of an SSP, in the descriptive sense, is its capacity to preserve the ability to perform its preset function (e.g. protection of an airport, a port, a logistics base, etc.) for a certain time and under certain conditions (e.g. environmental), despite possible non-compliance of a specific subset of the system's elements. An analysis of the available literature on ESS/SSP shows that studies on methods of increasing reliability are not available (several works address similar topics, but with respect to burglary and robbery, i.e. intrusion systems). The approach presented here is based on the analysis of the set of all paths in the system that determine the suitability of the SSP for the fire-event scenario (devices critical for security).

  6. Damage detection in composite materials using Lamb wave methods

    Science.gov (United States)

    Kessler, Seth S.; Spearing, S. Mark; Soutis, Constantinos

    2002-04-01

    Cost-effective and reliable damage detection is critical for the utilization of composite materials. This paper presents part of an experimental and analytical survey of candidate methods for in situ damage detection of composite materials. Experimental results are presented for the application of Lamb wave techniques to quasi-isotropic graphite/epoxy test specimens containing representative damage modes, including delamination, transverse ply cracks and through-holes. Linear wave scans were performed on narrow laminated specimens and sandwich beams with various cores by monitoring the transmitted waves with piezoceramic sensors. Optimal actuator and sensor configurations were devised through experimentation, and various types of driving signal were explored. These experiments provided a procedure capable of easily and accurately determining the time of flight of a Lamb wave pulse between an actuator and sensor. Lamb wave techniques provide more information about damage presence and severity than previously tested methods (frequency response techniques), and provide the possibility of determining damage location due to their local response nature. These methods may prove suitable for structural health monitoring applications since they travel long distances and can be applied with conformable piezoelectric actuators and sensors that require little power.
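
    A minimal sketch of time-of-flight estimation between an excitation and a received pulse using cross-correlation; the toneburst shape, propagation delay, and sampling rate are illustrative assumptions, not the authors' experimental procedure.

    ```python
    import numpy as np

    fs = 1_000_000                      # assumed sampling rate (1 MHz)
    t = np.arange(0, 0.002, 1 / fs)

    # Illustrative 5-cycle Hann-windowed toneburst (a common Lamb wave excitation).
    f0, cycles = 100_000, 5
    burst_t = np.arange(0, cycles / f0, 1 / fs)
    burst = np.hanning(burst_t.size) * np.sin(2 * np.pi * f0 * burst_t)

    excitation = np.zeros(t.size)
    excitation[:burst.size] = burst

    true_delay_s = 250e-6               # assumed propagation delay
    received = np.zeros(t.size)
    shift = int(true_delay_s * fs)
    received[shift:shift + burst.size] = 0.3 * burst
    received += 0.02 * np.random.default_rng(5).normal(size=t.size)

    # Cross-correlate and take the lag of the maximum as the time of flight.
    corr = np.correlate(received, excitation, mode="full")
    lag = np.argmax(corr) - (excitation.size - 1)
    print(f"Estimated time of flight: {lag / fs * 1e6:.1f} µs "
          f"(true: {true_delay_s * 1e6:.0f} µs)")
    ```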

  7. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Chrisos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimates of a response density, which also implies estimates of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which results in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response depends on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. These are two of the 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of the analyses possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been...
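
    A minimal sketch contrasting plain Monte Carlo sampling with Latin hypercube sampling for estimating response density parameters; the response function and input distributions are illustrative assumptions, not a NESSUS model.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    def response(x1, x2):
        """Illustrative nonlinear response of two standard-normal design variables."""
        return x1 ** 2 + 3 * np.sin(x2) + x1 * x2

    def latin_hypercube_normal(n, dims, rng):
        """Latin hypercube sample on [0, 1)^dims, mapped to standard normal variables."""
        u = np.empty((n, dims))
        for d in range(dims):
            # One point per stratum, strata shuffled independently per dimension.
            u[:, d] = (rng.permutation(n) + rng.random(n)) / n
        return norm.ppf(u)

    n = 1000
    samples = {"MC": rng.normal(size=(n, 2)),
               "LHS": latin_hypercube_normal(n, 2, rng)}

    for name, x in samples.items():
        y = response(x[:, 0], x[:, 1])
        print(f"{name:>3}: mean = {y.mean():6.3f}, "
              f"95th percentile = {np.percentile(y, 95):6.3f}")
    ```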

  8. Using DOProC method in reliability assessment of steel elements exposed to fatigue

    Directory of Open Access Journals (Sweden)

    Krejsa Martin

    2017-01-01

    Fatigue crack damage depends on the number of stress range cycles, which acts as a time factor in the course of reliability over the entire designed service life. Three sizes are important for characterizing the propagation of fatigue cracks: the initial size, the detectable size and the acceptable size. The theoretical model of fatigue crack progression can be based on linear fracture mechanics. Depending on the location of the initial crack, the crack may propagate in a structural element, e.g., from the edge or from the surface. When the required degree of reliability is specified, it is possible to determine the time of the first inspection of the structure focused on fatigue damage, and using conditional probability and a Bayesian approach, the times for subsequent inspections can be determined. For probabilistic modelling of fatigue crack progression, an original, newly developed probabilistic method was used: the Direct Optimized Probabilistic Calculation (DOProC), which uses a purely numerical approach based on optimized numerical integration, without any simulation techniques or approximation approaches.
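
    A minimal deterministic sketch of fatigue crack growth from the initial size through the detectable and acceptable sizes using the Paris law da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa); the material constants, geometry factor, and stress range are illustrative assumptions, and the probabilistic (DOProC) treatment is not reproduced here.

    ```python
    import numpy as np

    # Illustrative Paris-law constants and loading (assumed values).
    C, m = 2.0e-12, 3.0          # da/dN in m/cycle with ΔK in MPa·√m
    delta_sigma = 80.0           # stress range per cycle (MPa)
    Y = 1.12                     # geometry factor (assumed constant)

    a_init, a_detect, a_accept = 0.5e-3, 2.0e-3, 20.0e-3   # crack sizes (m)

    def cycles_between(a_from, a_to, dN=1000):
        """Integrate the Paris law numerically from a_from to a_to in blocks of dN cycles."""
        a, n = a_from, 0
        while a < a_to:
            delta_k = Y * delta_sigma * np.sqrt(np.pi * a)
            a += C * delta_k ** m * dN
            n += dN
        return n

    print(f"Cycles from initial to detectable size:    {cycles_between(a_init, a_detect):,}")
    print(f"Cycles from detectable to acceptable size: {cycles_between(a_detect, a_accept):,}")
    ```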

  9. Validity and reliability of methods for the detection of secondary caries around amalgam restorations in primary teeth

    Directory of Open Access Journals (Sweden)

    Mariana Minatel Braga

    2010-03-01

    Secondary caries has been reported as the main reason for restoration replacement. The aim of this in vitro study was to evaluate the performance of different methods - visual inspection, laser fluorescence (DIAGNOdent), radiography and tactile examination - for secondary caries detection in primary molars restored with amalgam. Fifty-four primary molars were photographed and 73 suspect sites adjacent to amalgam restorations were selected. Two examiners independently evaluated these sites using all methods. Agreement between examiners was assessed by the Kappa test. To validate the methods, a caries-detector dye was used after restoration removal. The best cut-off points for the sample were found by a Receiver Operating Characteristic (ROC) analysis, and the area under the ROC curve (Az) and the sensitivity, specificity and accuracy of the methods were calculated for the enamel (D2) and dentine (D3) thresholds. These parameters were found for each method and then compared by the McNemar test. The tactile examination and visual inspection presented the highest inter-examiner agreement for the D2 and D3 thresholds, respectively. The visual inspection also showed better performance than the other methods for both thresholds (Az = 0.861 and Az = 0.841, respectively). In conclusion, the visual inspection presented the best performance for detecting enamel and dentin secondary caries in primary teeth restored with amalgam.
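
    A minimal sketch of the ROC analysis step: computing the area under the curve (Az) and reading off sensitivity and specificity at a best cut-off chosen by Youden's J. The scores and validation labels are synthetic, illustrative data, and the use of scikit-learn is an assumption.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(9)

    # Synthetic validation data: 1 = caries confirmed after restoration removal.
    truth = np.r_[np.ones(40), np.zeros(60)].astype(int)
    # Synthetic method scores (higher = more suspicious), illustrative only.
    scores = np.r_[rng.normal(2.5, 1.0, 40), rng.normal(1.0, 1.0, 60)]

    fpr, tpr, thresholds = roc_curve(truth, scores)
    az = auc(fpr, tpr)

    # Best cut-off by Youden's J statistic (sensitivity + specificity - 1).
    best = np.argmax(tpr - fpr)
    print(f"Az = {az:.3f}")
    print(f"Best cut-off = {thresholds[best]:.2f}: "
          f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
    ```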

  10. A new detection method based on CFAR and DE for OFPS

    Science.gov (United States)

    Qiu, Zezheng; Zheng, Tong; Qu, Hongquan; Pang, Liping

    2016-09-01

    Optical fiber pre-warning systems (OFPS) are widely utilized in pipeline transport, and the intrusion events they detect need to be located. In this system, the original signals consist of noise, interferences, and intrusion signals: the noise forms the background, the interferences are harmless components with high power, and the intrusion signals are the main detection target. Hence, this study focuses on extracting the intrusion signals from the total signal. The proposed method can be divided into two parts: constant false alarm rate (CFAR) detection and dilation and erosion (DE). The former is applied to eliminate noise, and the latter to remove interferences. According to previous research, the characteristics of the noise background are well suited to CFAR spatial detection, and the detection results after CFAR can be presented as a binary image over time and space. In this image, interference regions are relatively disconnected, so they can be eliminated by DE, an operation introduced from image processing. In summary, this novel method based on CFAR and DE can eliminate noise and interferences effectively and shows excellent detection performance. A series of tests was carried out in Men Tou Gou, Beijing, China, and the reliability of the proposed method was verified by these tests.
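
    A minimal 1-D sketch of cell-averaging CFAR followed by a morphological opening (erosion then dilation) to drop isolated detections; the synthetic signal, guard/training window sizes, and threshold factor are illustrative assumptions rather than the OFPS implementation.

    ```python
    import numpy as np
    from scipy.ndimage import binary_dilation, binary_erosion

    rng = np.random.default_rng(4)
    n = 2000
    power = rng.exponential(scale=1.0, size=n)        # illustrative noise power profile
    power[1000:1012] += 15.0                          # sustained "intrusion" segment
    power[300] += 20.0                                # isolated interference-like spike

    guard, train, scale = 2, 16, 4.0                  # assumed CFAR window and factor
    detections = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        left = power[i - guard - train:i - guard]
        right = power[i + guard + 1:i + guard + 1 + train]
        noise_est = np.concatenate([left, right]).mean()
        detections[i] = power[i] > scale * noise_est   # cell-averaging CFAR test

    # Morphological opening (erosion then dilation) removes isolated detections.
    structure = np.ones(3, dtype=bool)
    cleaned = binary_dilation(binary_erosion(detections, structure), structure)

    print(f"Raw CFAR detections: {detections.sum()} cells (includes isolated false alarms)")
    print(f"Cleaned detection indices: {np.flatnonzero(cleaned)}")
    ```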

  11. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  12. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  13. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and verify their applicability to early design stages. Several methods were evaluated on their support to synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  14. Is sequential cranial ultrasound reliable for detection of white matter injury in very preterm infants?

    International Nuclear Information System (INIS)

    Leijser, Lara M.; Steggerda, Sylke J.; Walther, Frans J.; Wezel-Meijler, Gerda van; Bruine, Francisca T. de; Grond, Jeroen van der

    2010-01-01

    Cranial ultrasound (cUS) may not be reliable for detection of diffuse white matter (WM) injury. Our aim was to assess in very preterm infants the reliability of a classification system for WM injury on sequential cUS throughout the neonatal period, using magnetic resonance imaging (MRI) as reference standard. In 110 very preterm infants (gestational age <32 weeks), serial cUS during admission (median 8, range 4-22) and again around term equivalent age (TEA) and a single MRI around TEA were performed. cUS during admission were assessed for presence of WM changes, and contemporaneous cUS and MRI around TEA additionally for abnormality of lateral ventricles. Sequential cUS (from birth up to TEA) and MRI were classified as normal/mildly abnormal, moderately abnormal, or severely abnormal, based on a combination of findings of the WM and lateral ventricles. Predictive values of the cUS classification were calculated. Sequential cUS were classified as normal/mildly abnormal, moderately abnormal, and severely abnormal in, respectively, 22%, 65%, and 13% of infants and MRI in, respectively, 30%, 52%, and 18%. The positive predictive value of the cUS classification for the MRI classification was high for severely abnormal WM (0.79) but lower for normal/mildly abnormal (0.67) and moderately abnormal (0.64) WM. Sequential cUS during the neonatal period detects severely abnormal WM in very preterm infants but is less reliable for mildly and moderately abnormal WM. MRI around TEA seems needed to reliably detect WM injury in very preterm infants. (orig.)

  15. A simplified high-throughput method for pyrethroid knock-down resistance (kdr) detection in Anopheles gambiae

    Science.gov (United States)

    Lynd, Amy; Ranson, Hilary; McCall, P J; Randle, Nadine P; Black, William C; Walker, Edward D; Donnelly, Martin J

    2005-01-01

    Background A single base pair mutation in the sodium channel confers knock-down resistance to pyrethroids in many insect species. Its occurrence in Anopheles mosquitoes may have important implications for malaria vector control, especially considering the current trend for large scale pyrethroid-treated bednet programmes. Screening Anopheles gambiae populations for the kdr mutation has become one of the mainstays of programmes that monitor the development of insecticide resistance. The screening is commonly performed using a multiplex Polymerase Chain Reaction (PCR) which, since it is reliant on a single nucleotide polymorphism, can be unreliable. Here we present a reliable and potentially high throughput method for screening An. gambiae for the kdr mutation. Methods A Hot Ligation Oligonucleotide Assay (HOLA) was developed to detect both the East and West African kdr alleles in the homozygous and heterozygous states, and was optimized for use in low-tech developing world laboratories. Results from the HOLA were compared to results from the multiplex PCR for field and laboratory mosquito specimens to provide verification of the robustness and sensitivity of the technique. Results and Discussion The HOLA assay, developed for detection of the kdr mutation, gives a bright blue colouration for a positive result whilst negative reactions remain colourless. The results are apparent within a few minutes of adding the final substrate and can be scored by eye. Heterozygotes are scored when a sample gives a positive reaction to the susceptible probe and the kdr probe. The technique uses only basic laboratory equipment and skills and can be carried out by anyone familiar with the Enzyme-linked immunosorbent assay (ELISA) technique. A comparison to the multiplex PCR method showed that the HOLA assay was more reliable, and scoring of the plates was less ambiguous. Conclusion The method is capable of detecting both the East and West African kdr alleles in the homozygous and heterozygous states.

  16. A simplified high-throughput method for pyrethroid knock-down resistance (kdr detection in Anopheles gambiae

    Directory of Open Access Journals (Sweden)

    Walker Edward D

    2005-03-01

    Background A single base pair mutation in the sodium channel confers knock-down resistance to pyrethroids in many insect species. Its occurrence in Anopheles mosquitoes may have important implications for malaria vector control, especially considering the current trend for large scale pyrethroid-treated bednet programmes. Screening Anopheles gambiae populations for the kdr mutation has become one of the mainstays of programmes that monitor the development of insecticide resistance. The screening is commonly performed using a multiplex Polymerase Chain Reaction (PCR) which, since it is reliant on a single nucleotide polymorphism, can be unreliable. Here we present a reliable and potentially high throughput method for screening An. gambiae for the kdr mutation. Methods A Hot Ligation Oligonucleotide Assay (HOLA) was developed to detect both the East and West African kdr alleles in the homozygous and heterozygous states, and was optimized for use in low-tech developing world laboratories. Results from the HOLA were compared to results from the multiplex PCR for field and laboratory mosquito specimens to provide verification of the robustness and sensitivity of the technique. Results and Discussion The HOLA assay, developed for detection of the kdr mutation, gives a bright blue colouration for a positive result whilst negative reactions remain colourless. The results are apparent within a few minutes of adding the final substrate and can be scored by eye. Heterozygotes are scored when a sample gives a positive reaction to the susceptible probe and the kdr probe. The technique uses only basic laboratory equipment and skills and can be carried out by anyone familiar with the Enzyme-linked immunosorbent assay (ELISA) technique. A comparison to the multiplex PCR method showed that the HOLA assay was more reliable, and scoring of the plates was less ambiguous. Conclusion The method is capable of detecting both the East and West African kdr alleles in the homozygous and heterozygous states.

  17. IN SEARCH OF A FAST SCREENING METHOD FOR DETECTING THE MALINGERING OF COGNITIVE IMPAIRMENT

    Directory of Open Access Journals (Sweden)

    Amada Ampudia

    2012-07-01

    Forensic settings demand expedient and conclusive forensic psychological assessment. The aim of this study was to design a simple and fast, but reliable, psychometric instrument for detecting the malingering of cognitive impairment. In a quasi-experimental design, 156 individuals were divided into three groups: a normal group with no cognitive impairment; a Mild Cognitive Impairment (MCI) group; and a group of informed malingerers with no MCI who feigned cognitive impairment. Receiver Operating Characteristic (ROC) analysis of the Test of Memory Malingering (TOMM) and of several subtests of the Wechsler Memory Scale (WMS-III) revealed that the WMS-III was as reliable and accurate as the TOMM in discriminating malingerers from honest respondents. The results revealed that the diagnostic accuracy, sensitivity and specificity of the WMS-III Auditory Recognition Delayed of Verbal Paired Associates subtest were similar to those of the TOMM in discriminating malingering from genuine memory impairment. In conclusion, the WMS-III Recognition of Verbal Paired Associates subtest and the TOMM provide a fast, valid and reliable screening method for detecting the malingering of cognitive impairment.

  18. Development of a Tandem Repeat-Based Polymerase Chain Displacement Reaction Method for Highly Sensitive Detection of 'Candidatus Liberibacter asiaticus'.

    Science.gov (United States)

    Lou, Binghai; Song, Yaqin; RoyChowdhury, Moytri; Deng, Chongling; Niu, Ying; Fan, Qijun; Tang, Yan; Zhou, Changyong

    2018-02-01

    Huanglongbing (HLB) is one of the most destructive diseases in citrus production worldwide. Early detection of HLB pathogens can facilitate timely removal of infected citrus trees in the field. However, the low titer and uneven distribution of HLB pathogens in host plants make reliable detection challenging. Therefore, the development of effective detection methods with high sensitivity is imperative. This study reports the development of a novel method, tandem repeat-based polymerase chain displacement reaction (TR-PCDR), for the detection of 'Candidatus Liberibacter asiaticus', a widely distributed HLB-associated bacterium. A uniquely designed primer set (TR2-PCDR-F/TR2-PCDR-1R) and a thermostable Taq DNA polymerase mutant with strand displacement activity were used for TR-PCDR amplification. Performed in a regular thermal cycler, TR-PCDR could produce more than two amplicons after each amplification cycle. The sensitivity of the developed TR-PCDR was 10 copies of the target DNA fragment, a level shown to be 100× higher than conventional PCR and similar to real-time PCR. Data from the detection of 'Ca. L. asiaticus' in field samples using the above three methods also showed similar results. No false-positive TR-PCDR amplification was observed from healthy citrus samples or water controls. These results illustrate that the developed TR-PCDR method can be applied to the reliable, highly sensitive, and cost-effective detection of 'Ca. L. asiaticus'.

  19. Comparison of Sample and Detection Quantification Methods for Salmonella Enterica from Produce

    Science.gov (United States)

    Hummerick, M. P.; Khodadad, C.; Richards, J. T.; Dixit, A.; Spencer, L. M.; Larson, B.; Parrish, C., II; Birmele, M.; Wheeler, Raymond

    2014-01-01

    The purpose of this study was to identify and optimize fast and reliable sampling and detection methods for the identification of pathogens that may be present on produce grown in small vegetable production units on the International Space Station (ISS), i.e., in a field setting. Microbiological testing is necessary before astronauts are allowed to consume produce grown on the ISS, where two vegetable production units, Lada and Veggie, are currently deployed.

  20. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    Science.gov (United States)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.

  1. A generic method for estimating system reliability using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples
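    A minimal sketch of the reliability computation such a BN supports is given below; it is not the authors' implementation. The K2 structure search is omitted: the structure A -> system <- B is fixed, the conditional probability table is estimated by frequency counting from invented historical up/down records, and the two components are treated as independent root nodes.

```python
import itertools
import numpy as np

# Hypothetical historical observations: columns are component A, component B
# and the system state (1 = up, 0 = down).
data = np.array([
    [1, 1, 1], [1, 1, 1], [1, 1, 1], [1, 0, 1], [1, 0, 0],
    [0, 1, 1], [0, 1, 0], [0, 0, 0], [1, 1, 1], [0, 0, 0],
])

def prob(col, value, given=None):
    """Relative frequency P(col = value | given), with given = {column: value}."""
    mask = np.ones(len(data), dtype=bool)
    if given:
        for c, v in given.items():
            mask &= data[:, c] == v
    subset = data[mask]
    return np.mean(subset[:, col] == value) if len(subset) else 0.0

# Marginalise the system node over all parent configurations:
# P(system up) = sum_{a,b} P(sys=1 | A=a, B=b) * P(A=a) * P(B=b)
reliability = sum(
    prob(2, 1, {0: a, 1: b}) * prob(0, a) * prob(1, b)
    for a, b in itertools.product([0, 1], repeat=2)
)
print(f"Estimated system reliability: {reliability:.3f}")
```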

  2. A generic method for estimating system reliability using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Doguc, Ozge [Stevens Institute of Technology, Hoboken, NJ 07030 (United States); Ramirez-Marquez, Jose Emmanuel [Stevens Institute of Technology, Hoboken, NJ 07030 (United States)], E-mail: jmarquez@stevens.edu

    2009-02-15

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.

  3. Thermoluminescence method for detection of irradiated food

    International Nuclear Information System (INIS)

    Pinnioja, S.

    1998-01-01

    intensity than feldspars from cold regions, evidently because a more altered mineral structure is typical in warm water regions. A new autoradiographic method to determine luminescence of irradiated rock surfaces was developed for the study. The method of thermoluminescence analysis has been used for the official control analysis of irradiated food in Finland since 1990. In the course of the study, about 500 analyses were carried out for the Finnish Customs Laboratory. Eighty lots of irradiated herbs or spices and 10 lots of irradiated seafood were found. During the last two years, irradiated green tea in spice mixtures and irradiated frog legs have been detected. No irradiated berry or mushroom products have been found. Screening with a photostimulated luminescence (PSL) instrument, followed by TL analysis to confirm the positive and ambiguous samples, provides a reliable tool for the identification of irradiated food containing adhering or contaminating minerals. The reliability of the TL method was proved in European trials. Standardisation of the method has been undertaken by the European Committee for Standardization (CEN). A TL method based on the determination of TL silicate minerals in dry herbs and spices has recently been accepted as an official CEN standard. (orig.)

  4. Thermoluminescence method for detection of irradiated food

    Energy Technology Data Exchange (ETDEWEB)

    Pinnioja, S

    1998-12-31

    intensity than feldspars from cold regions, evidently because a more altered mineral structure is typical in warm water regions. A new autoradiographic method to determine luminescence of irradiated rock surfaces was developed for the study. The method of thermoluminescence analysis has been used for the official control analysis of irradiated food in Finland since 1990. In the course of the study, about 500 analyses were carried out for the Finnish Customs Laboratory. Eighty lots of irradiated herbs or spices and 10 lots of irradiated seafood were found. During the last two years, irradiated green tea in spice mixtures and irradiated frog legs have been detected. No irradiated berry or mushroom products have been found. Screening with a photostimulated luminescence (PSL) instrument, followed by TL analysis to confirm the positive and ambiguous samples, provides a reliable tool for the identification of irradiated food containing adhering or contaminating minerals. The reliability of the TL method was proved in European trials. Standardisation of the method has been undertaken by the European Committee for Standardization (CEN). A TL method based on the determination of TL silicate minerals in dry herbs and spices has recently been accepted as an official CEN standard. (orig.) 55 refs.

  5. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    Full Text Available A control method based on real-time operational reliability evaluation for a space manipulator is presented for improving the success rate of the manipulator during the execution of a task. In this paper, a method for quantitative analysis of operational reliability is given for a manipulator executing a specified task; then a control model which can regulate the quantitative operational reliability is built. First, the control process is described using a state space equation. Second, process parameters are estimated in real time using a Bayesian method. Third, the expression of the system's real-time operational reliability is deduced based on the state space equation and the process parameters estimated with the Bayesian method. Finally, a control variable regulation strategy which considers the cost of control is given based on the theory of Statistical Process Control. It is shown via simulations that this method effectively improves the operational reliability of the space manipulator control system.

  6. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods to improve the reliability of a system, but mutual coupling of multiple factors is often involved in the design. In this study, the Direct Search Method is introduced into optimum redundancy configuration for design optimization, in which reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft-critical system are computed. The results show that this method is convenient and workable, and, with appropriate modifications, applicable to the redundancy configuration and optimization of various designs. The method therefore has good practical value.
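    The redundancy optimization can be sketched as a simple direct-search loop over integer redundancy levels, maximizing series-system reliability under cost and weight budgets. This is only an illustrative stand-in for the paper's formulation; the reliabilities, costs, weights and budgets are invented.

```python
# Hypothetical subsystems: (component reliability, unit cost, unit weight).
subsystems = [(0.90, 4.0, 2.0), (0.85, 3.0, 1.5), (0.95, 6.0, 2.5)]
COST_BUDGET, WEIGHT_BUDGET = 40.0, 18.0

def system_reliability(n):
    # Series system of parallel redundant groups: R = prod(1 - (1 - r_i)^n_i)
    r = 1.0
    for (ri, _, _), ni in zip(subsystems, n):
        r *= 1.0 - (1.0 - ri) ** ni
    return r

def feasible(n):
    cost = sum(c * ni for (_, c, _), ni in zip(subsystems, n))
    weight = sum(w * ni for (_, _, w), ni in zip(subsystems, n))
    return cost <= COST_BUDGET and weight <= WEIGHT_BUDGET

# Direct search: start from one unit per subsystem and repeatedly try +1 in
# each coordinate (exploratory moves), keeping the best feasible improvement.
n = [1, 1, 1]
improved = True
while improved:
    improved = False
    best = (system_reliability(n), list(n))
    for i in range(len(n)):
        trial = list(n)
        trial[i] += 1
        if feasible(trial) and system_reliability(trial) > best[0]:
            best = (system_reliability(trial), trial)
            improved = True
    n = best[1]

print("Redundancy allocation:", n, "reliability:", round(system_reliability(n), 4))
```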

  7. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive to obtain due to the complexity of the system. Moreover, conventional reliability models have some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability, which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. This approach is based on a data-mining algorithm, K2, to discover the grid system structure from raw historical system data, which makes it possible to find minimum resource spanning trees (MRST) within the grid, and then uses Bayesian networks (BN) to model the MRST and estimate grid service reliability.

  8. Implementation of the SFRA method as valuable tool for detection of power transformer active part deformation

    Directory of Open Access Journals (Sweden)

    Milić Saša D.

    2014-01-01

    Full Text Available The paper presents the SFRA (Sweep Frequency Response Analysis) method for analyzing the frequency response of transformer windings in order to identify potential defects in the geometry of the core and winding. The most frequent problems recognized by SFRA are core shift, shorted or open windings, unwanted contact between core and ground (mass), etc. A comparative analysis of this method with conventional methods was carried out on an in-situ transformer under harsh, real industrial conditions. The benefits of the SFRA method are the high reliability and repeatability of the measurements. The method belongs to the non-invasive category. Due to the high reliability and repeatability of the measurements it is very suitable for detection of changes in the geometry of the coil and the core during prophylactic field testing, or after transporting the transformer.

  9. Visual acuity measures do not reliably detect childhood refractive error--an epidemiological study.

    Directory of Open Access Journals (Sweden)

    Lisa O'Donoghue

    Full Text Available PURPOSE: To investigate the utility of uncorrected visual acuity measures in screening for refractive error in white school children aged 6-7-years and 12-13-years. METHODS: The Northern Ireland Childhood Errors of Refraction (NICER) study used a stratified random cluster design to recruit children from schools in Northern Ireland. Detailed eye examinations included assessment of logMAR visual acuity and cycloplegic autorefraction. Spherical equivalent refractive data from the right eye were used to classify significant refractive error as myopia of at least 1DS, hyperopia as greater than +3.50DS and astigmatism as greater than 1.50DC, whether it occurred in isolation or in association with myopia or hyperopia. RESULTS: Results are presented from 661 white 12-13-year-old and 392 white 6-7-year-old school-children. Using a cut-off of uncorrected visual acuity poorer than 0.20 logMAR to detect significant refractive error gave a sensitivity of 50% and specificity of 92% in 6-7-year-olds and 73% and 93% respectively in 12-13-year-olds. In 12-13-year-old children a cut-off of poorer than 0.20 logMAR had a sensitivity of 92% and a specificity of 91% in detecting myopia and a sensitivity of 41% and a specificity of 84% in detecting hyperopia. CONCLUSIONS: Vision screening using logMAR acuity can reliably detect myopia, but not hyperopia or astigmatism in school-age children. Providers of vision screening programs should be cognisant that where detection of uncorrected hyperopic and/or astigmatic refractive error is an aspiration, current UK protocols will not effectively deliver.
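    The sensitivity and specificity figures reported above come from a simple cross-tabulation of the screening decision against the cycloplegic reference standard; the sketch below shows the arithmetic on invented acuity values (it is not the study's data).

```python
import numpy as np

# Hypothetical paired data per child: uncorrected logMAR acuity and a flag for
# significant refractive error from cycloplegic autorefraction.
acuity = np.array([0.00, 0.10, 0.18, 0.22, 0.30, 0.40, 0.12, 0.26, 0.05, 0.50])
has_refractive_error = np.array([0, 0, 0, 1, 1, 1, 1, 0, 0, 1], dtype=bool)

# Screening rule: refer if acuity is poorer (larger logMAR) than 0.20.
referred = acuity > 0.20

tp = np.sum(referred & has_refractive_error)
fn = np.sum(~referred & has_refractive_error)
tn = np.sum(~referred & ~has_refractive_error)
fp = np.sum(referred & ~has_refractive_error)

print("sensitivity =", tp / (tp + fn))   # proportion of refractive error detected
print("specificity =", tn / (tn + fp))   # proportion of normals correctly passed
```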

  10. A survey of direct inversion methods having possible application to tunnel detection

    International Nuclear Information System (INIS)

    Mager, R.D.

    1985-01-01

    Within recent years there has been considerable interest in the development of geophysical methods for the location of hidden underground tunnels and cavities. Consideration of this problem has been motivated by military applications, such as the detection of shallow man-made tunnels and arms caches, as well as civilian applications such as detection of limestone cavities in karst terrain and the mapping of abandoned mine workings. There are also applications for in-situ coal gasification and for the monitoring of nuclear waste disposal sites. The most reliable method presently used to map these underground anomalies has been direct detection by closely spaced drilling. However, the high cost of drilling renders this method impractical except for detailed and localized mapping, and certainly unfeasible for any type of broad-scale reconnaissance activity. Largely motivated by petroleum and mineral exploration needs, however, the seismic industry has seen a virtual revolution in acquisition and processing techniques within the past ten years. Paralleling these developments have been corresponding developments in acoustical imaging and non-destructive testing. Researchers in the field of inverse scattering have produced a number of new methods for target imaging from backscattered reflection data.

  11. Fatigue crack detection and identification by the elastic wave propagation method

    Science.gov (United States)

    Stawiarski, Adam; Barski, Marek; Pająk, Piotr

    2017-05-01

    In this paper the elastic wave propagation phenomenon is used to detect the initiation of fatigue damage in an isotropic plate with a circular hole. The safety and reliability of structures mostly depend on the effectiveness of the monitoring methods. A Structural Health Monitoring (SHM) system based on the active pitch-catch measurement technique is proposed. Piezoelectric (PZT) elements are used as actuators and sensors in the multipoint measuring system. A comparison of the intact and damaged structures is used by the damage detection algorithm. One part of the SHM system is responsible for detecting fatigue crack initiation; the second part observes the evolution of the damage growth and assesses the size of the defect. Numerical results for the wave propagation phenomenon are used to demonstrate the effectiveness and accuracy of the proposed method. A preliminary experimental analysis was carried out during a tension test of an aluminum plate with a circular hole to determine the efficiency of the measurement technique.

  12. Extended block diagram method for a multi-state system reliability assessment

    International Nuclear Information System (INIS)

    Lisnianski, Anatoly

    2007-01-01

    The presented method extends the classical reliability block diagram method to a repairable multi-state system. It is very suitable for engineering applications since the procedure is well formalized and based on the natural decomposition of the entire multi-state system (the system is represented as a collection of its elements). Until now, the classical block diagram method did not provide the reliability assessment for the repairable multi-state system. The straightforward stochastic process methods are very difficult for engineering application in such cases due to the 'dimension damnation' (the huge number of system states). The suggested method is based on the combined random processes and the universal generating function technique and drastically reduces the number of states in the multi-state model.
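    The universal generating function (UGF) technique mentioned above composes elements as sets of (performance, probability) pairs; the sketch below shows the core composition step for a series and a parallel pair of multi-state elements. The state performances and probabilities are invented, and the repairable (random-process) part of the method is not covered here.

```python
from collections import defaultdict

# A UGF is represented here as {performance_level: probability}.
def compose(u1, u2, op):
    """Combine two UGFs with a structure function op (min for series, sum for parallel flow)."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

# Two multi-state elements (e.g. flow capacities in arbitrary units).
u_a = {0: 0.05, 50: 0.25, 100: 0.70}
u_b = {0: 0.10, 80: 0.90}

series = compose(u_a, u_b, min)                    # capacity limited by the weaker element
parallel = compose(u_a, u_b, lambda a, b: a + b)   # capacities add up

# Reliability with respect to a demand level w: P(performance >= w).
def reliability(u, w):
    return sum(p for g, p in u.items() if g >= w)

print("series   R(w=50)  =", round(reliability(series, 50), 4))
print("parallel R(w=120) =", round(reliability(parallel, 120), 4))
```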

  13. Current perspectives on genetically modified crops and detection methods.

    Science.gov (United States)

    Kamle, Madhu; Kumar, Pradeep; Patra, Jayanta Kumar; Bajpai, Vivek K

    2017-07-01

    Genetically modified (GM) crops are the fastest adopted commodities in the agribiotech industry. This market penetration should provide a sustainable basis for ensuring food supply for growing global populations. The successful completion of two decades of commercial GM crop production (1996-2015) is underscored by the increasing rate of adoption of genetic engineering technology by farmers worldwide. With the advent of introduction of multiple traits stacked together in GM crops for combined herbicide tolerance, insect resistance, drought tolerance or disease resistance, the requirement of reliable and sensitive detection methods for tracing and labeling genetically modified organisms in the food/feed chain has become increasingly important. In addition, several countries have established threshold levels for GM content which trigger legally binding labeling schemes. The labeling of GM crops is mandatory in many countries (such as China, EU, Russia, Australia, New Zealand, Brazil, Israel, Saudi Arabia, Korea, Chile, Philippines, Indonesia, Thailand), whereas in Canada, Hong Kong, USA, South Africa, and Argentina voluntary labeling schemes operate. The rapid adoption of GM crops has increased controversies, and mitigating these issues pertaining to the implementation of effective regulatory measures for the detection of GM crops is essential. DNA-based detection methods have been successfully employed, while the whole genome sequencing using next-generation sequencing (NGS) technologies provides an advanced means for detecting genetically modified organisms and foods/feeds in GM crops. This review article describes the current status of GM crop commercialization and discusses the benefits and shortcomings of common and advanced detection systems for GMs in foods and animal feeds.

  14. Risk-based methods for reliability investments in electric power distribution systems

    Energy Technology Data Exchange (ETDEWEB)

    Alvehag, Karin

    2011-07-01

    Society relies more and more on a continuous supply of electricity. However, while under-investments in reliability lead to an unacceptable number of power interruptions, over-investments result in too high costs for society. To give incentives for a socio-economically optimal level of reliability, quality regulations have been adopted in many European countries. These quality regulations imply new financial risks for the distribution system operator (DSO) since poor reliability can reduce the allowed revenue for the DSO and compensation may have to be paid to affected customers. This thesis develops a method for evaluating the incentives for reliability investments implied by different quality regulation designs. The method can be used to investigate whether socio-economically beneficial projects are also beneficial for a profit-maximizing DSO subject to a particular quality regulation design. To investigate which reinvestment projects are preferable for society and a DSO, risk-based methods are developed. With these methods, the probability of power interruptions and the consequences of these can be simulated. The consequences of interruptions for the DSO will to a large extent depend on the quality regulation. The consequences for the customers, and hence also society, will depend on factors such as the interruption duration and time of occurrence. The proposed risk-based methods consider extreme outage events in the risk assessments by incorporating the impact of severe weather, estimating the full probability distribution of the total reliability cost, and formulating a risk-averse strategy. Results from case studies performed show that quality regulation design has a significant impact on reinvestment project profitability for a DSO. In order to adequately capture the financial risk that the DSO is exposed to, detailed risk-based methods, such as the ones developed in this thesis, are needed. Furthermore, when making investment decisions, a risk
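    A toy Monte Carlo sketch of the kind of risk-based evaluation described in the thesis is shown below: yearly interruptions are sampled with a weather-dependent failure rate and customer compensation is accumulated under a hypothetical duration-threshold regulation rule. All rates, durations, tariffs and the regulation rule itself are invented assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
N_YEARS = 20_000                  # simulated years
BASE_RATE = 2.0                   # expected interruptions per year in normal weather
STORM_PROB = 0.1                  # probability that a year contains severe weather
STORM_MULTIPLIER = 4.0            # failure-rate multiplier in a severe-weather year
COMP_PER_HOUR = 120.0             # hypothetical compensation per interrupted hour
COMP_THRESHOLD_H = 12.0           # compensation paid only beyond this outage duration

total_cost = np.zeros(N_YEARS)
for y in range(N_YEARS):
    rate = BASE_RATE * (STORM_MULTIPLIER if rng.random() < STORM_PROB else 1.0)
    durations = rng.lognormal(mean=1.0, sigma=1.0, size=rng.poisson(rate))  # hours
    compensation = COMP_PER_HOUR * np.clip(durations - COMP_THRESHOLD_H, 0.0, None)
    total_cost[y] = compensation.sum()

# Full distribution of the yearly regulation cost, including the extreme tail.
print("mean yearly cost  :", round(total_cost.mean(), 1))
print("95th percentile   :", round(np.percentile(total_cost, 95), 1))
print("99.5th percentile :", round(np.percentile(total_cost, 99.5), 1))
```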

  15. Early detection of canine hip dysplasia: comparison of two palpation and five radiographic methods

    International Nuclear Information System (INIS)

    Adams, W.M.; Dueland, R.T.; Meinen, J.; O'Brien, R.T.; Giuliano, E.; Nordheim, E.V.

    1998-01-01

    Hip joint laxity was evaluated in four breeds (i.e., greyhound, Labrador retriever, Irish setter, hound mixed-breed) of puppies (n=32) by Ortolani's and Bardens' maneuvers, by subjective assessment of radiographs (Orthopedic Foundation for Animals [OFA] method), and by four radiographic measurement indices. Puppies were studied at four, six-to-10, 16-to-18, and 52 weeks of age. The purpose of this study was to compare palpation and radiographic methods of hip laxity detection in puppies for predicting the development of degenerative joint disease (DJD) by one year of age. Twenty-seven (42%) hips developed DJD. Ortolani's method was not a reliable predictor of hip dysplasia at six-to-10 weeks; it was significantly predictive at 16-to-18 weeks but had a high incidence of false negatives. Bardens' and subjective (OFA) assessment methods were not reliable at six-to-10 or 16-to-18 weeks. Radiographic measurements taken with femurs in a neutral position and hips distracted (distraction index [DI] and Norberg angle) and measurements taken with femurs extended in OFA position (Norberg angle) of six- to 10-week-old puppies accurately predicted DJD occurrence by one year of age (p < 0.01). Distraction index measurement (PennHIP method) was the most accurate in predicting the development of DJD (p < 0.001). Distraction index radiography in puppies six-to-10 and 16-to-18 weeks of age was the most reliable predictor of hip dysplasia. Norberg angle measurement was more reliable during hip distraction than when hips were measured in the OFA position in 16- to 18-week-old puppies, but had similar reliability in six- to 10-week-old puppies.

  16. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
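    A minimal Monte Carlo sampling sketch in the spirit of the book's subject is shown below: mission reliability of a small series-parallel system is estimated from exponentially distributed component lifetimes and checked against the analytical value. The failure rates and mission time are invented.

```python
import math
import numpy as np

rng = np.random.default_rng(42)
N_SAMPLES = 200_000
MISSION_TIME = 1000.0                 # hours

# Component failure rates (per hour): A in series with the parallel pair (B, C).
lam = {"A": 1e-4, "B": 5e-4, "C": 5e-4}

# Sample lifetimes and evaluate the structure function for each sampled history.
t = {name: rng.exponential(1.0 / l, N_SAMPLES) for name, l in lam.items()}
system_up = (t["A"] > MISSION_TIME) & ((t["B"] > MISSION_TIME) | (t["C"] > MISSION_TIME))

estimate = system_up.mean()
std_err = math.sqrt(estimate * (1.0 - estimate) / N_SAMPLES)
print(f"Monte Carlo R(t={MISSION_TIME:g} h) = {estimate:.4f} +/- {std_err:.4f}")

# Analytical check for this simple structure.
ra, rb, rc = (math.exp(-lam[k] * MISSION_TIME) for k in ("A", "B", "C"))
print("analytical:", round(ra * (1.0 - (1.0 - rb) * (1.0 - rc)), 4))
```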

  17. EEMD-based multiscale ICA method for slewing bearing fault detection and diagnosis

    Science.gov (United States)

    Žvokelj, Matej; Zupan, Samo; Prebil, Ivan

    2016-05-01

    A novel multivariate and multiscale statistical process monitoring method is proposed with the aim of detecting incipient failures in large slewing bearings, where subjective influence plays a minor role. The proposed method integrates the strengths of the Independent Component Analysis (ICA) multivariate monitoring approach with the benefits of Ensemble Empirical Mode Decomposition (EEMD), which adaptively decomposes signals into different time scales and can thus cope with multiscale system dynamics. The method, which was named EEMD-based multiscale ICA (EEMD-MSICA), not only enables bearing fault detection but also offers a mechanism of multivariate signal denoising and, in combination with the Envelope Analysis (EA), a diagnostic tool. The multiscale nature of the proposed approach makes the method convenient to cope with data which emanate from bearings in complex real-world rotating machinery and frequently represent the cumulative effect of many underlying phenomena occupying different regions in the time-frequency plane. The efficiency of the proposed method was tested on simulated as well as real vibration and Acoustic Emission (AE) signals obtained through conducting an accelerated run-to-failure lifetime experiment on a purpose-built laboratory slewing bearing test stand. The ability to detect and locate the early-stage rolling-sliding contact fatigue failure of the bearing indicates that AE and vibration signals carry sufficient information on the bearing condition and that the developed EEMD-MSICA method is able to effectively extract it, thereby representing a reliable bearing fault detection and diagnosis strategy.

  18. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments are available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data was extracted by the primary reviewer. The results were synthesized qualitatively using a level of evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. The reliability, validity and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity support the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance.

    Science.gov (United States)

    Clare, John; McKinney, Shawn T; DePue, John E; Loftin, Cynthia S

    2017-10-01

    individuals more readily than passive hair catches. Inability to photographically distinguish individual sex did not appear to induce negative bias in camera density estimates; instead, hair catches appeared to produce detection competition between individuals that may have been a source of negative bias. Our model reformulations broaden the range of circumstances in which analyses incorporating multiple sources of information can be robustly used, and our empirical results demonstrate that using multiple field-methods can enhance inferences regarding ecological parameters of interest and improve understanding of how reliably survey methods sample these parameters. © 2017 by the Ecological Society of America.

  20. Method for assessing software reliability of the document management system using the RFID technology

    Directory of Open Access Journals (Sweden)

    Kiedrowicz Maciej

    2016-01-01

    Full Text Available The deliberations presented in this study refer to the method for assessing software reliability of the document management system, using the RFID technology. A method for determining the reliability structure of the discussed software, understood as the index vector for assessing reliability of its components, was proposed. The model of the analyzed software is the control transfer graph, in which the probability of activating individual components during the system's operation results from the so-called operational profile, which characterizes the actual working environment. The reliability structure is established as a result of the solution of a specific mathematical software task. The knowledge of the reliability structure of the software makes it possible to properly plan the time and financial expenses necessary to build the software, which would meet the reliability requirements. The application of the presented method is illustrated by a numerical example, corresponding to the software reality of the RFID document management system.
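    The abstract does not give the exact formula, so the sketch below uses the classical Cheung-style Markov model of a control transfer graph as a plausible stand-in: the operational profile supplies the transition probabilities, each component's reliability scales its outgoing transitions, and the system reliability is the probability of reaching correct termination. All numbers are invented.

```python
import numpy as np

# Control transfer graph: transition probabilities between components taken from
# the operational profile (each row sums to <= 1; the remainder of a row is the
# probability that the run terminates correctly after that component).
P = np.array([
    [0.0, 0.7, 0.3, 0.0],   # component 1 calls 2 or 3
    [0.0, 0.0, 0.4, 0.6],   # component 2 calls 3 or 4
    [0.0, 0.0, 0.0, 1.0],   # component 3 always calls 4
    [0.0, 0.0, 0.0, 0.0],   # component 4 terminates the run
])
R = np.array([0.999, 0.995, 0.990, 0.998])   # per-component reliabilities

n = len(R)
Q = R[:, None] * P                       # a transfer succeeds only if the caller works
to_success = R * (1.0 - P.sum(axis=1))   # direct transition to correct termination

# Absorption probability in the success state, starting from component 1:
# s = (I - Q)^{-1} @ to_success
s = np.linalg.solve(np.eye(n) - Q, to_success)
print(f"Software reliability for this operational profile: {s[0]:.4f}")
```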

  1. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computer-intensive computer calculations. A computer program has been developed to implement the PFTA.

  2. Optical and non-optical methods for detection and characterization of microparticles and exosomes.

    Science.gov (United States)

    van der Pol, E; Hoekstra, A G; Sturk, A; Otto, C; van Leeuwen, T G; Nieuwland, R

    2010-12-01

    Microparticles and exosomes are cell-derived microvesicles present in body fluids that play a role in coagulation, inflammation, cellular homeostasis and survival, intercellular communication, and transport. Despite increasing scientific and clinical interest, no standard procedures are available for the isolation, detection and characterization of microparticles and exosomes, because their size is below the reach of conventional detection methods. Our objective is to give an overview of currently available and potentially applicable methods for optical and non-optical determination of the size, concentration, morphology, biochemical composition and cellular origin of microparticles and exosomes. The working principle of all methods is briefly discussed, as well as their applications and limitations based on the underlying physical parameters of the technique. For most methods, the expected size distribution for a given microvesicle population is determined. The explanations of the physical background and the outcomes of our calculations provide insights into the capabilities of each method and make a comparison possible between the discussed methods. In conclusion, several (combinations of) methods can detect clinically relevant properties of microparticles and exosomes. These methods should be further explored and validated by comparing measurement results so that accurate, reliable and fast solutions come within reach. © 2010 International Society on Thrombosis and Haemostasis.

  3. An improved AE detection method of rail defect based on multi-level ANC with VSS-LMS

    Science.gov (United States)

    Zhang, Xin; Cui, Yiming; Wang, Yan; Sun, Mingjian; Hu, Hengshan

    2018-01-01

    In order to ensure the safety and reliability of the railway system, the Acoustic Emission (AE) method is employed to investigate rail defect detection. However, little attention has been paid to defect detection at high speed, especially to noise interference suppression. Based on AE technology, this paper presents an improved rail defect detection method using multi-level ANC with VSS-LMS. Multi-level noise cancellation based on SANC and ANC is utilized to eliminate complex noises at high speed, and a tongue-shaped curve with an index adjustment factor is proposed to enhance the performance of the variable step-size algorithm. Defect signals and reference signals are acquired from the rail-wheel test rig. The features of noise signals and defect signals are analyzed for effective detection. The effectiveness of the proposed method is demonstrated by comparison with a previous study, and different filter lengths are investigated to obtain better noise suppression performance. Meanwhile, the detection ability of the proposed method is verified at the top speed of the test rig. The results clearly illustrate that the proposed method is effective in detecting rail defects at high speed, especially with regard to noise interference suppression.
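    A minimal variable step-size LMS noise canceller is sketched below to illustrate the ANC stage; the paper's tongue-shaped step-size curve and the multi-level SANC/ANC cascade are not reproduced, the step-size rule here is a generic error-energy update, and the AE-like signals are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5000
t = np.arange(N)

# Synthetic AE-like defect bursts buried in correlated noise.
defect = 0.3 * np.sin(2 * np.pi * 0.05 * t) * (t % 500 < 20)
noise_src = rng.normal(size=N)                                    # reference noise channel
primary = defect + np.convolve(noise_src, [0.6, 0.3, 0.1])[:N]    # causally coloured noise

def vss_lms(primary, reference, order=8, mu_min=1e-4, mu_max=0.05, alpha=0.97, gamma=1e-3):
    """LMS noise canceller whose step size grows with the instantaneous error energy."""
    w = np.zeros(order)
    mu = mu_min
    out = np.zeros_like(primary)
    for k in range(order, len(primary)):
        x = reference[k - order + 1:k + 1][::-1]   # most recent reference samples
        e = primary[k] - w @ x                     # error = primary minus estimated noise
        mu = float(np.clip(alpha * mu + gamma * e * e, mu_min, mu_max))
        w += 2.0 * mu * e * x
        out[k] = e                                 # the error is the de-noised signal
    return out

cleaned = vss_lms(primary, noise_src)
print("noise power before:", round(np.var(primary[100:] - defect[100:]), 4))
print("noise power after :", round(np.var(cleaned[100:] - defect[100:]), 4))
```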

  4. Space Vehicle Reliability Modeling in DIORAMA

    Energy Technology Data Exchange (ETDEWEB)

    Tornga, Shawn Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-12

    When modeling system performance of space based detection systems it is important to consider spacecraft reliability. As space vehicles age the components become prone to failure for a variety of reasons such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust fuel supplies. Typically failure is divided into two categories: engineering mistakes and technology surprise. This document will report on a method of simulating space vehicle reliability in the DIORAMA framework.

  5. A human reliability based usability evaluation method for safety-critical software

    International Nuclear Information System (INIS)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.; Ragsdale, A.

    2006-01-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with that of the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis to heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability. (authors)

  6. Reliability of salivary testosterone measurements in diagnosis of Polycystic Ovarian Syndrome

    Directory of Open Access Journals (Sweden)

    Omnia Youssef

    2010-07-01

    Conclusion: Determination of salivary testosterone is a reliable method to detect changes in the concentration of available biologically active testosterone in the serum. Salivary testosterone provides a sensitive, simple, reliable, non-invasive and uncomplicated diagnostic approach for PCOS.

  7. Rapid detection of Listeria monocytogenes in raw milk and soft cheese by a redox potential measurement based method combined with real-time PCR.

    Science.gov (United States)

    Erdősi, Orsolya; Szakmár, Katalin; Reichart, Olivér; Szili, Zsuzsanna; László, Noémi; Székely Körmöczy, Péter; Laczay, Péter

    2014-09-01

    The incidence of outbreaks of foodborne listeriosis has indicated the need for a reliable and rapid detection of the microbe in different foodstuffs. A method combining redox potential measurement and real-time polymerase chain reaction (PCR) was developed to detect Listeria monocytogenes in artificially contaminated raw milk and soft cheese. Food samples of 25 g or 25 ml were homogenised in 225 ml of Listeria Enrichment Broth (LEB) with Oxford supplement, and the redox potential measurement technique was applied. For Listeria species the measuring time was maximum 34 h. The absence of L. monocytogenes could reliably be proven by the redox potential measurement method, but Listeria innocua and Bacillus subtilis could not be differentiated from L. monocytogenes on the basis of the redox curves. The presence of L. monocytogenes had to be confirmed by real-time PCR. The combination of these two methods proved to detect < 10 cfu/g of L. monocytogenes in a cost- and time-effective manner. This method can potentially be used as an alternative to the standard nutrient method for the rapid detection of L. monocytogenes in food.

  8. Issues in cognitive reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Hitchler, M.J.; Rumancik, J.A.

    1984-01-01

    This chapter examines some problems in current methods to assess reactor operator reliability at cognitive tasks and discusses new approaches to solve these problems. The two types of human failures are errors in the execution of an intention and errors in the formation/selection of an intention. Topics considered include the types of description, error correction, cognitive performance and response time, the speed-accuracy tradeoff function, function based task analysis, and cognitive task analysis. One problem of human reliability analysis (HRA) techniques in general is the question of what are the units of behavior whose reliability are to be determined. A second problem for HRA is that people often detect and correct their errors. The use of function based analysis, which maps the problem space for plant control, is recommended

  9. How often should we monitor for reliable detection of atrial fibrillation recurrence? Efficiency considerations and implications for study design.

    Directory of Open Access Journals (Sweden)

    Efstratios I Charitos

    Full Text Available Although atrial fibrillation (AF) recurrence is unpredictable in terms of onset and duration, current intermittent rhythm monitoring (IRM) diagnostic modalities are short-termed and discontinuous. The aim of the present study was to investigate the necessary IRM frequency required to reliably detect recurrence of various AF recurrence patterns. The rhythm histories of 647 patients (mean AF burden: 12 ± 22% of monitored time; 687 patient-years) with implantable continuous monitoring devices were reconstructed and analyzed. With the use of computationally intensive simulation, we evaluated the necessary IRM frequency to reliably detect AF recurrence of various AF phenotypes using IRM of various durations. The IRM frequency required for reliable AF detection depends on the amount and temporal aggregation of the AF recurrence (p < 0.0001). Detection of AF recurrence with >95% sensitivity required higher IRM frequencies (>12 24-hour; >6 7-day; >4 14-day; >3 30-day IRMs per year; p < 0.0001) than currently recommended. Lower IRM frequencies will under-detect AF recurrence and introduce significant bias in the evaluation of therapeutic interventions. More frequent but shorter-duration IRMs (24-hour) are significantly more time-effective (sensitivity per monitored time) than a smaller number of longer IRM durations (p < 0.0001). Reliable AF recurrence detection requires higher IRM frequencies than currently recommended. Current IRM frequency recommendations will fail to diagnose a significant proportion of patients. Shorter duration but more frequent IRM strategies are significantly more efficient than longer IRM durations. Unique identifier: NCT00806689.
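    The kind of simulation the study describes can be sketched as follows: a year-long rhythm history with a chosen AF burden and episode clustering is generated, K randomly scheduled 24-hour monitorings are overlaid, and the fraction of simulated patients in whom at least one monitoring captures AF is taken as the detection sensitivity. Burden, episode length and the monitoring schedules below are invented, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
HOURS_PER_YEAR = 365 * 24

def simulate_af_year(burden=0.12, episode_hours=48):
    """Hour-by-hour AF indicator with roughly the requested burden, clustered into episodes."""
    rhythm = np.zeros(HOURS_PER_YEAR, dtype=bool)
    n_episodes = int(burden * HOURS_PER_YEAR / episode_hours)
    for s in rng.integers(0, HOURS_PER_YEAR - episode_hours, size=n_episodes):
        rhythm[s:s + episode_hours] = True
    return rhythm

def detection_probability(monitorings_per_year, window_hours=24, n_patients=2000):
    detected = 0
    for _ in range(n_patients):
        rhythm = simulate_af_year()
        starts = rng.integers(0, HOURS_PER_YEAR - window_hours, size=monitorings_per_year)
        if any(rhythm[s:s + window_hours].any() for s in starts):
            detected += 1
    return detected / n_patients

for k in (4, 12, 26):
    print(f"{k:>2} x 24-h monitorings/year -> detection sensitivity ~ {detection_probability(k):.2f}")
```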

  10. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, a basic rule for solving simultaneous Boolean equations is presented. Next, the analysis procedures for a three-component system with external supports are shown. Third, more detailed discussions are given on the establishment of the logical loop relation. Finally, two typical structures which include more than one logical loop are taken up. Their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.
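    The logical-loop issue can be made concrete with a tiny looped system solved by brute-force enumeration (the paper's exact method resolves such loops analytically; the three-component example below is invented).

```python
import itertools

# Three subsystems that support each other in a loop, each with an external
# support signal s_a, s_b, s_c (True = external support available):
#   A = s_a or B,   B = s_b or C,   C = s_c or A
def consistent_solutions(s_a, s_b, s_c):
    sols = []
    for a, b, c in itertools.product([False, True], repeat=3):
        if a == (s_a or b) and b == (s_b or c) and c == (s_c or a):
            sols.append((a, b, c))
    return sols

for supports in itertools.product([False, True], repeat=3):
    print(supports, "->", consistent_solutions(*supports))
```

    With all external supports absent, the enumeration returns two fixed points (all subsystems failed and all subsystems working); the spurious self-supporting solution is exactly the kind of ambiguity that an exact loop-solving method has to eliminate.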

  11. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  12. AK-SYS: An adaptation of the AK-MCS method for system reliability

    International Nuclear Information System (INIS)

    Fauriat, W.; Gayton, N.

    2014-01-01

    A lot of research work has been proposed over the last two decades to evaluate the probability of failure of a structure involving a very time-consuming mechanical model. Surrogate model approaches based on Kriging, such as the Efficient Global Reliability Analysis (EGRA) or the Active learning and Kriging-based Monte-Carlo Simulation (AK-MCS) methods, are very efficient and each has advantages of its own. EGRA is well suited to evaluating small probabilities, as the surrogate can be used to classify any population. AK-MCS is built in relation to a given population and requires no optimization program for the active learning procedure to be performed. It is therefore easier to implement and more likely to spend computational effort on areas with a significant probability content. When assessing system reliability, analytical approaches and first-order approximation are widely used in the literature. However, in the present paper we rather focus on sampling techniques and, considering the recent adaptation of the EGRA method for systems, a strategy is presented to adapt the AK-MCS method for system reliability. The AK-SYS method, “Active learning and Kriging-based SYStem reliability method”, is presented. Its high efficiency and accuracy are illustrated via various examples
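    A compact AK-MCS-style loop on a toy limit-state function is sketched below, assuming scikit-learn is available; it uses the usual U learning function and the common U >= 2 stopping rule, but it is a simplified illustration rather than the authors' AK-SYS implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def g(x):
    """Toy limit-state function: failure when g(x) <= 0."""
    return 3.0 - x[:, 0] - 0.5 * x[:, 1] ** 2

X_mc = rng.normal(size=(20_000, 2))            # Monte Carlo population (standard normal space)
idx = rng.choice(len(X_mc), size=12, replace=False)
X_doe, y_doe = X_mc[idx], g(X_mc[idx])         # initial design of experiments

gp = GaussianProcessRegressor(kernel=1.0 * RBF(length_scale=1.0), normalize_y=True)
for _ in range(40):                            # active-learning iterations
    gp.fit(X_doe, y_doe)
    mu, sigma = gp.predict(X_mc, return_std=True)
    U = np.abs(mu) / np.maximum(sigma, 1e-12)  # U learning function
    if U.min() >= 2.0:                         # usual stopping criterion
        break
    best = int(np.argmin(U))                   # point whose sign is most uncertain
    X_doe = np.vstack([X_doe, X_mc[best]])
    y_doe = np.append(y_doe, g(X_mc[best:best + 1]))

print("estimated probability of failure:", np.mean(gp.predict(X_mc) < 0))
print("reference (direct MC on g):      ", np.mean(g(X_mc) < 0))
```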

  13. Reliability Analysis of Safety Grade PLC(POSAFE-Q) for Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lyou, J.; Lee, D. Y.; Choi, J. G.; Park, W. M.

    2006-01-01

    The Parts Count Method of the military standard MIL-HDBK-217F has been used for reliability prediction in the nuclear field. This handbook determines the Programmable Logic Controller (PLC) failure rate by summing the failure rates of the individual components included in the PLC. Normally it is easy to predict that the components added for fault detection improve the reliability of the PLC, but applying this handbook yields a poorer reliability estimate because of the increased number of components added for fault detection. To compensate for this discrepancy, a quantitative reliability analysis method using a functional separation model is suggested in this paper. It is applied to the Reactor Protection System (RPS) being developed in Korea to identify any design weak points from a safety point of view.
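    The parts-count arithmetic the abstract refers to can be illustrated as below: the predicted failure rate is the sum of the individual component failure rates, so adding fault-detection circuitry raises the predicted rate even though it improves safety in practice. The component list and failure rates are invented, not values from MIL-HDBK-217F.

```python
import math

# Hypothetical parts list: (name, failure rate in failures per 1e6 hours, quantity).
base_plc = [("CPU", 2.0, 1), ("RAM", 0.5, 4), ("I/O driver", 1.2, 8), ("PSU", 3.0, 1)]
fault_detection_extras = [("watchdog", 0.8, 1), ("comparator", 0.6, 2)]

def parts_count_lambda(parts):
    """Parts-count prediction: total failure rate = sum(lambda_i * quantity_i)."""
    return sum(lam * qty for _, lam, qty in parts)

for label, parts in [("base PLC", base_plc),
                     ("PLC + fault detection", base_plc + fault_detection_extras)]:
    lam_total = parts_count_lambda(parts) * 1e-6          # failures per hour
    mtbf = 1.0 / lam_total
    r_one_year = math.exp(-lam_total * 8760.0)
    print(f"{label:<24} lambda={lam_total:.2e}/h  MTBF={mtbf:,.0f} h  R(1 yr)={r_one_year:.4f}")
```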

  14. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states of failure occurrences are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks.
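    The Markovian half of such an approach reduces, in its simplest form, to a continuous-time Markov chain over up/down states; the sketch below computes transient and steady-state availability for a single machine with invented failure and repair rates, assuming SciPy is available for the matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

LAMBDA = 0.02   # failures per hour
MU = 0.5        # repairs per hour

# CTMC generator over the states [up, down]; rows sum to zero.
Q = np.array([[-LAMBDA, LAMBDA],
              [MU,      -MU]])

p0 = np.array([1.0, 0.0])                 # start in the up state
for t in (1.0, 10.0, 100.0):
    pt = p0 @ expm(Q * t)                 # p(t) = p(0) exp(Q t)
    print(f"A(t={t:>5.0f} h) = {pt[0]:.4f}")

# Steady-state availability: mu / (lambda + mu)
print("steady state  =", MU / (LAMBDA + MU))
```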

  15. Specific PCR-based detection of Alternaria helianthi

    DEFF Research Database (Denmark)

    Udayashankar, A.C.; Nayaka, S. Chandra; Archana, B.

    2012-01-01

    Alternaria helianthi is an important seed-borne pathogenic fungus responsible for blight disease in sunflower. The current detection methods, which are based on culture and morphological identification, are time-consuming, laborious and are not always reliable. A PCR-based diagnostic method...... tested. The detection limit of the PCR method was 10 pg of template DNA. The primers could also detect the pathogen in infected sunflower seed. This species-specific PCR method provides a quick, simple, powerful and reliable alternative to conventional methods in the detection and identification...

  16. Reliable Grid Condition Detection and Control of Single-Phase Distributed Power Generation Systems

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai

    standards addressed to the grid-connected systems will harmonize the combination of the DPGS and the classical power plants. Consequently, the major tasks of this thesis were to develop new grid condition detection techniques and intelligent control in order to allow the DPGS not only to deliver power...... to the utility grid but also to sustain it. This thesis was divided into two main parts, namely "Grid Condition Detection" and "Control of Single-Phase DPGS". In the first part, the main focus was on reliable Phase Locked Loop (PLL) techniques for monitoring the grid voltage and on grid impedance estimation...... techniques. Additionally, a new technique for detecting the islanding mode has been developed and successfully tested. In the second part, the main reported research was concentrated around adaptive current controllers based on the information provided by the grid condition detection techniques. To guarantee...

  17. Reliability Analysis of the Fire System of an Industrial Facility Using the FAMECA Method

    International Nuclear Information System (INIS)

    Sony T, D.T.; Situmorang, Johnny; Ismu W, Puradwi; Demon H; Mulyanto, Dwijo; Kusmono, Slamet; Santa, Sigit Asmara

    2000-01-01

    FAMECA is one of the analysis methods used to determine the system reliability of an industrial facility. The analysis follows a procedure consisting of the identification of component functions, the determination of failure modes, severity levels and the effects of their failure. The reliability value is determined by combining three factors: the severity level, the component failure value and component criticality. A reliability analysis of the fire system of the industrial facility has been performed using the FAMECA method. The critical components identified are the pump, air release valve, check valve, manual test valve, isolation valve, control system, etc.
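    The ranking step can be illustrated with a product-style criticality index in the spirit of FMECA; the severity/detectability scales, the failure rates and the exact combination rule below are illustrative assumptions rather than the study's data.

```python
# Hypothetical FMECA worksheet for the fire-protection system:
# (component, failure mode, severity 1-10, failure rate per 1e6 h, detectability 1-10).
worksheet = [
    ("pump",              "fails to start",    9, 25.0, 3),
    ("air release valve", "stuck closed",      6,  8.0, 6),
    ("check valve",       "reverse leakage",   5,  4.0, 7),
    ("manual test valve", "left closed",       8,  1.0, 9),
    ("isolation valve",   "spurious closure",  9,  3.0, 5),
    ("control system",    "loss of signal",    7, 12.0, 4),
]

def criticality(severity, rate, detectability):
    # Higher severity, higher failure rate and poorer detectability (larger
    # value) all increase the criticality index.
    return severity * rate * detectability

ranked = sorted(worksheet, key=lambda row: criticality(*row[2:]), reverse=True)
for comp, mode, sev, rate, det in ranked:
    print(f"{comp:<18} {mode:<17} criticality = {criticality(sev, rate, det):7.1f}")
```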

  18. A new method of small target detection based on neural network

    Science.gov (United States)

    Hu, Jing; Hu, Yongli; Lu, Xinxin

    2018-02-01

    The detection and tracking of moving dim targets in infrared images have been a research hotspot for many years. The target in each frame occupies only a few pixels, without any shape or structure information. Moreover, an infrared small target is often submerged in a complicated background with a low signal-to-clutter ratio, making detection very difficult. Different backgrounds exhibit different statistical properties, which makes detecting the target extremely complex. If the threshold segmentation is not reasonable, there may be more noise points in the final detection, which is unfavorable for recovering the trajectory of the target. Single-frame target detection may not be able to obtain the desired target and can cause a high false alarm rate. We believe the combination of spatial detection of suspicious targets in each frame and temporal association for target tracking will increase the reliability of tracking dim targets. The detection of dim targets is divided into two parts. In the first part, we adopt a bilateral filtering method for background suppression; after threshold segmentation, the suspicious targets in each frame are extracted. We then use an LSTM (long short-term memory) neural network to predict the coordinates of the target in the next frame. This is a new method based on the movement characteristics of the target in the image sequence, which can respond to changes in the relationship between past and future values. Simulation results demonstrate that the proposed algorithm can effectively predict the trajectory of the moving small target and works efficiently and robustly with a low false alarm rate.

  19. Reliability, validity and minimal detectable change of the Mini-BESTest in Greek participants with chronic stroke.

    Science.gov (United States)

    Lampropoulou, Sofia I; Billis, Evdokia; Gedikoglou, Ingrid A; Michailidou, Christina; Nowicky, Alexander V; Skrinou, Dimitra; Michailidi, Fotini; Chandrinou, Danae; Meligkoni, Margarita

    2018-02-23

    This study aimed to investigate the psychometric characteristics of reliability, validity and ability to detect change of a newly developed balance assessment tool, the Mini-BESTest, in Greek patients with stroke. A prospective, observational design study with test-retest measures was conducted. A convenience sample of 21 Greek patients with chronic stroke (14 male, 7 female; age of 63 ± 16 years) was recruited. Two independent examiners administered the scale for the inter-rater reliability, twice within 10 days for the test-retest reliability. Bland-Altman analysis for repeated measures assessed the absolute reliability, and the Standard Error of Measurement (SEM) and the Minimum Detectable Change at the 95% confidence interval (MDC95%) were established. The Greek Mini-BESTest (Mini-BESTestGR) was correlated with the Greek Berg Balance Scale (BBSGR) for assessing the concurrent validity and with the Timed Up and Go (TUG), the Functional Reach Test (FRT) and the Greek Falls Efficacy Scale-International (FES-IGR) for the convergent validity. The Mini-BESTestGR demonstrated excellent inter-rater reliability (ICC (95% CI) = 0.997 (0.995-0.999), SEM = 0.46) with the scores of the two raters within the limits of agreement (mean dif = -0.143 ± 0.727, p > 0.05) and test-retest reliability (ICC (95% CI) = 0.966 (0.926-0.988), SEM = 1.53). Additionally, the Mini-BESTestGR yielded very strong to moderate correlations with the BBSGR (r = 0.924, p < 0.001). The demonstrated reliability and the equally good validity of the Mini-BESTestGR strongly support its utility in Greek people with chronic stroke. Its ability to identify clinically meaningful changes and falls risk needs further investigation.
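    The link between the reported SEM values and a minimal detectable change follows the standard formulas SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM; the sketch below applies them to the SEMs quoted in the abstract (the sample SD in the last line is hypothetical).

```python
import math

def mdc95(sem):
    """Minimal detectable change at the 95% confidence level."""
    return 1.96 * math.sqrt(2.0) * sem

def sem_from_icc(sd, icc):
    """Standard error of measurement from a sample SD and a reliability coefficient."""
    return sd * math.sqrt(1.0 - icc)

# Using the SEM values reported in the abstract above:
for label, sem in [("inter-rater (SEM = 0.46)", 0.46),
                   ("test-retest (SEM = 1.53)", 1.53)]:
    print(f"{label}: MDC95 = {mdc95(sem):.2f} Mini-BESTest points")

# Generic route when only the SD and ICC are known (the SD here is hypothetical):
print("example SEM from SD=3.0, ICC=0.966:", round(sem_from_icc(3.0, 0.966), 2))
```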

  20. COMPOSITE METHOD OF RELIABILITY RESEARCH FOR HIERARCHICAL MULTILAYER ROUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. B. Tregubov

    2016-09-01

    Full Text Available The paper deals with the idea of a research method for hierarchical multilayer routing systems. The method represents a composition of methods from graph theory, reliability theory, probability theory, etc. These methods are applied to the solution of various particular analysis and optimization tasks and are systemically connected and coordinated with each other through a uniform set-theoretic representation of the object of research. The hierarchical multilayer routing systems are considered as infrastructure facilities (gas and oil pipelines, automobile and railway networks, power supply and communication systems) that distribute material resources, energy or information using hierarchically nested routing functions. For illustration, the theoretical constructions are applied to the example of determining the probability of the up state of a specific infocommunication system. The author shows that a graph representation of the structure of the object of research can be constructively combined with a logical-probabilistic method for analysing its reliability indices through a uniform set-theoretic representation of its elements and of the processes occurring in them.

  1. The reliability, accuracy and minimal detectable difference of a multi-segment kinematic model of the foot-shoe complex.

    Science.gov (United States)

    Bishop, Chris; Paul, Gunther; Thewlis, Dominic

    2013-04-01

    Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when wearing shoes. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sensitive enough to describe the kinematics of the foot-shoe complex and lower leg during walking gait. In order to achieve this, a new marker set was established, consisting of 25 markers applied on the shoe and skin surface, which informed a four-segment kinematic model of the foot-shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe was proven to be good to excellent (ICC=0.75-0.98), indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC=0.68-0.99) than the inexperienced rater (ICC=0.38-0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint; tibiocalcaneal joint--MDD90=2.17-9.36°, tarsometatarsal joint--MDD90=1.03-9.29° and the metatarsophalangeal joint--MDD90=1.75-9.12°. The proposed thresholds are specific to the description of shod motion and can be used in future research designed to compare different footwear. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Numerical simulation for cracks detection using the finite elements method

    Directory of Open Access Journals (Sweden)

    S Bennoud

    2016-09-01

    Full Text Available The means of detection must ensure control either during initial construction or during service of all parts. Non-destructive testing (NDT) gathers the most widespread methods for detecting defects in a part or reviewing the integrity of a structure. In the areas of advanced industry (aeronautics, aerospace, nuclear …), assessing the damage of materials is a key point in controlling the durability and reliability of parts and materials in service. In this context, it is necessary to quantify the damage and identify the different mechanisms responsible for its progress. It is therefore essential to characterize materials and identify the most sensitive indicators attached to damage in order to prevent their destruction and to use them optimally. In this work, a simulation by the finite element method is carried out with the aim of calculating the electromagnetic energy of interaction between the probe and the piece (with/without defect). From the calculated energy, we deduce the real and imaginary components of the impedance, which enables determination of the characteristic parameters of a crack in various metallic parts.
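    The step from computed electromagnetic energy to impedance components is not detailed in the abstract; a common post-processing convention is to take the equivalent coil resistance from the time-averaged dissipated power and the reactance from the time-averaged stored magnetic energy, under sinusoidal excitation. The sketch below follows that convention; the numerical values are placeholders, not results from the paper.

```python
import math

def coil_impedance(p_avg_w, w_mag_avg_j, i_rms_a, freq_hz):
    """Equivalent series R and X of the probe coil from FEM post-processing:
    R from time-averaged dissipated power, X from time-averaged magnetic energy."""
    omega = 2.0 * math.pi * freq_hz
    r = p_avg_w / i_rms_a**2
    x = 2.0 * omega * w_mag_avg_j / i_rms_a**2
    return complex(r, x)

# Placeholder values for the probe above a sound piece and a cracked piece.
z_sound = coil_impedance(p_avg_w=0.012, w_mag_avg_j=2.0e-6, i_rms_a=0.1, freq_hz=100e3)
z_crack = coil_impedance(p_avg_w=0.010, w_mag_avg_j=2.1e-6, i_rms_a=0.1, freq_hz=100e3)
print("impedance change due to the crack:", z_crack - z_sound)
```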

  3. A study in the reliability analysis method for nuclear power plant structures (I)

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Byung Hwan; Choi, Seong Cheol; Shin, Ho Sang; Yang, In Hwan; Kim, Yi Sung; Yu, Young; Kim, Se Hun [Seoul National Univ., Seoul (Korea, Republic of)

    1999-03-15

    Nuclear power plant structures may be exposed to aggressive environmental effects that may cause their strength and stiffness to decrease over their service life. Although the physics of these damage mechanisms is reasonably well understood and quantitative evaluation of their effects on time-dependent structural behavior is possible in some instances, such evaluations are generally very difficult and remain novel. The assessment of existing steel containments in nuclear power plants for continued service must provide quantitative evidence that they are able to withstand future extreme loads during a service period with an acceptable level of reliability. Rational methodologies to perform the reliability assessment can be developed from mechanistic models of structural deterioration, using time-dependent structural reliability analysis to take loading and strength uncertainties into account. The final goal of this study is to develop an analysis method for the reliability of containment structures. The cause and mechanism of corrosion are first clarified and the reliability assessment method has been established. By introducing the equivalent normal distribution, a reliability analysis procedure that can determine failure probabilities has been established. The influence of design variables on reliability and the relation between reliability and service life will be studied in the continued second-year research.
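    The "equivalent normal distribution" step mentioned above is typically the Rackwitz-Fiessler transformation: at a design point, a non-normal variable is replaced by a normal one matching its CDF and PDF there, after which the reliability index and failure probability follow. A minimal sketch is given below for a simple R - S limit state with a lognormal resistance, linearized at a single point rather than iterated to convergence; the distribution parameters are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

def equivalent_normal(dist, x):
    """Rackwitz-Fiessler: normal (mu_N, sigma_N) matching dist's CDF and PDF at x."""
    z = stats.norm.ppf(dist.cdf(x))
    sigma_n = stats.norm.pdf(z) / dist.pdf(x)
    mu_n = x - z * sigma_n
    return mu_n, sigma_n

# Limit state g = R - S with lognormal resistance R and normal load S (illustrative).
R = stats.lognorm(s=0.10, scale=500.0)   # lognormal resistance, median 500
S = stats.norm(loc=350.0, scale=40.0)    # normal load

x_star = R.median()                      # simple linearization point for this sketch
mu_rn, sig_rn = equivalent_normal(R, x_star)
beta = (mu_rn - S.mean()) / np.hypot(sig_rn, S.std())
pf = stats.norm.cdf(-beta)
print(f"reliability index beta = {beta:.2f}, failure probability = {pf:.2e}")
```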

  4. Comparison of detection methods for extended-spectrum beta-lactamases in Escherichia coli strains

    Directory of Open Access Journals (Sweden)

    Ewelina Kałużna

    2014-06-01

    Full Text Available Introduction: Detection of extended-spectrum beta-lactamases (ESBLs could be a major challenge for microbiologists – the difficulties arise mainly from the phenotypic differences among strains.Materials and Methods: Evaluation of ESBLs was performed on 42 strains of E. coli by: 1 DDST on MHA, 2 DDST on MHA with cloxacillin, 3 CT on MHA, according to CLSI, 4 CT on MHA with cloxacillin, 5 Etest ESBL (AB Biodisk, 6 CHROMagarTM ESBL (GRASO, 7 ChromID® ESBL (bioMérieux, and 8 automatic system VITEK2 ESBL test (bioMérieux.Result: Positive results were obtained for 20 strains using method 1, for 18 strains using method 2, 17 by method 3, 14 by method 4, 11 by method 5, 39 by method 6, 40 by method 7, and 15 by method 8. Using Etest ESBL 6.0 non-determinable results were obtained. The most consistent results were obtained when comparing the results of method 3 with results of method 2 (97.6%, and comparing the results obtained using methods 3 and 8 (95.2%.Conclusions: Based on our study we conclude that the chromogenic media can only be used as a screening method for the detection of ESBLs in E. coli rods. Etest is less useful compared to other phenotype methods, due to the impossibility of obtaining results for all the tested strains. Adding cloxacillin to MHA does not increase the frequency of detection of ESBLs in E. coli strains. DDST seems to be the most reliable among phenotypic methods for the detection of ESBLs in E. coli rods.

  5. A comparative study on the HW reliability assessment methods for digital I and C equipment

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hoan Sung; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Lee, G. Y. [Korea Atomic Energy Research Institute, Taejeon (Korea); Kim, M. C. [Korea Advanced Institute of Science and Technology, Taejeon (Korea); Jun, S. T. [KHNP, Taejeon (Korea)

    2002-03-01

    It is necessary to predict or evaluate the reliability of electronic equipment for the probabilistic safety analysis of digital instrumentation and control equipment. However, most databases for reliability prediction have no data for up-to-date equipment, and the failure modes are not classified. Prediction results for a specific component show different values depending on the methods and databases used, and for boards and systems each method likewise gives different values. This study concerns the reliability prediction of the PDC system of Wolsong NPP1 as digital I and C equipment. Various reliability prediction methods and failure databases are used in the calculation of the reliability to compare the sensitivity and accuracy of each model and database. Many considerations for the reliability assessment of digital systems are derived from the results of this study. 14 refs., 19 figs., 15 tabs. (Author)
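    Handbook-style reliability prediction for a board usually sums component failure rates and assumes an exponential time to failure. The sketch below shows only that generic parts-count arithmetic with made-up failure rates; the values are not taken from MIL-HDBK-217 or from the databases compared in the study.

```python
import math

# Hypothetical component failure rates in failures per 10^6 hours (illustrative only).
failure_rates_per_1e6h = {
    "microprocessor": 0.50,
    "dram": 0.20,
    "adc": 0.15,
    "power_regulator": 0.30,
    "connectors": 0.10,
}

lam_board = sum(failure_rates_per_1e6h.values()) / 1e6   # failures per hour
mtbf_hours = 1.0 / lam_board
mission_hours = 8760.0                                   # one year of operation

reliability = math.exp(-lam_board * mission_hours)       # R(t) = exp(-lambda * t)
print(f"board failure rate = {lam_board:.3e} /h, MTBF = {mtbf_hours:,.0f} h")
print(f"one-year reliability R(t) = {reliability:.4f}")
```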

  6. Remote detection device and detection method therefor

    International Nuclear Information System (INIS)

    Kogure, Sumio; Yoshida, Yoji; Matsuo, Takashiro; Takehara, Hidetoshi; Kojima, Shinsaku.

    1997-01-01

    The present invention provides a non-destructive detection device for collectively, efficiently and effectively conducting maintenance and inspection to confirm the integrity of a nuclear reactor by way of a shielding member that shields the radiation generated from the portion to be inspected. Namely, devices for direct visual inspection using an underwater TV camera as a sensor, eddy current testing using a coil as a sensor, and magnetic powder flaw detection are integrated and applied collectively. Specifically, the visual inspection using the TV camera and the eddy current flaw detection are adopted together. Flaw detection with magnetic powder is applied as a means of confirming, by another method, the results of the two other kinds of inspection. With such procedures, detection techniques based on their respective principles are combined, thereby enhancing the accuracy of the evaluation. (I.S.)

  7. Model correction factor method for reliability problems involving integrals of non-Gaussian random fields

    DEFF Research Database (Denmark)

    Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der

    2002-01-01

    The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals ...

  8. A dynamic discretization method for reliability inference in Dynamic Bayesian Networks

    International Nuclear Information System (INIS)

    Zhu, Jiandao; Collette, Matthew

    2015-01-01

    The material and modeling parameters that drive structural reliability analysis for marine structures are subject to a significant uncertainty. This is especially true when time-dependent degradation mechanisms such as structural fatigue cracking are considered. Through inspection and monitoring, information such as crack location and size can be obtained to improve these parameters and the corresponding reliability estimates. Dynamic Bayesian Networks (DBNs) are a powerful and flexible tool to model dynamic system behavior and update reliability and uncertainty analysis with life cycle data for problems such as fatigue cracking. However, a central challenge in using DBNs is the need to discretize certain types of continuous random variables to perform network inference while still accurately tracking low-probability failure events. Most existing discretization methods focus on getting the overall shape of the distribution correct, with less emphasis on the tail region. Therefore, a novel scheme is presented specifically to estimate the likelihood of low-probability failure events. The scheme is an iterative algorithm which dynamically partitions the discretization intervals at each iteration. Through applications to two stochastic crack-growth example problems, the algorithm is shown to be robust and accurate. Comparisons are presented between the proposed approach and existing methods for the discretization problem. - Highlights: • A dynamic discretization method is developed for low-probability events in DBNs. • The method is compared to existing approaches on two crack growth problems. • The method is shown to improve on existing methods for low-probability events

  9. Robust and reliable banknote authentification and print flaw detection with opto-acoustical sensor fusion methods

    Science.gov (United States)

    Lohweg, Volker; Schaede, Johannes; Türke, Thomas

    2006-02-01

    The authenticity checking and inspection of bank notes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of more and more sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As more print techniques and new security features are established, total quality of security, authenticity and bank note printing must be assured. This necessitates an expanded sensorial concept in general. We propose a concept comprising both authenticity checking and inspection methods for pattern recognition and classification of securities and banknotes, which is based on sensor fusion and fuzzy interpretation of data measures. In this approach, different methods of authenticity analysis and print flaw detection are combined, which can be used for vending or sorting machines as well as for printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines the visible camera images with IR-spectral sensitive sensors, acoustical sensors, and other measurements such as the temperature and pressure of printing machines.

  10. Non-coding RNA detection methods combined to improve usability, reproducibility and precision

    Directory of Open Access Journals (Sweden)

    Kreikemeyer Bernd

    2010-09-01

    Full Text Available Abstract Background Non-coding RNAs are gaining more attention as their diverse roles in many cellular processes are discovered. At the same time, the need for efficient computational prediction of ncRNAs increases with the pace of sequencing technology. Existing tools are based on various approaches and techniques, but none of them provides a reliable ncRNA detector yet. Consequently, a natural approach is to combine existing tools. Due to a lack of standard input and output formats, combination and comparison of existing tools is difficult. Also, for genomic scans they often need to be incorporated in detection workflows using custom scripts, which decreases transparency and reproducibility. Results We developed a Java-based framework to integrate existing tools and methods for ncRNA detection. This framework enables users to construct transparent detection workflows and to combine and compare different methods efficiently. We demonstrate the effectiveness of combining detection methods in case studies with the small genomes of Escherichia coli, Listeria monocytogenes and Streptococcus pyogenes. With the combined method, we gained 10% to 20% precision for sensitivities from 30% to 80%. Further, we investigated Streptococcus pyogenes for novel ncRNAs. Using multiple methods--integrated by our framework--we determined four highly probable candidates. We verified all four candidates experimentally using RT-PCR. Conclusions We have created an extensible framework for practical, transparent and reproducible combination and comparison of ncRNA detection methods. We have proven the effectiveness of this approach in tests and by guiding experiments to find new ncRNAs. The software is freely available under the GNU General Public License (GPL), version 3, at http://www.sbi.uni-rostock.de/moses along with source code, screen shots, examples and tutorial material.

  11. Application of reliability analysis methods to the comparison of two safety circuits

    International Nuclear Information System (INIS)

    Signoret, J.-P.

    1975-01-01

    Two circuits of different design, intended for assuming the ''Low Pressure Safety Injection'' function in PWR reactors are analyzed using reliability methods. The reliability analysis of these circuits allows the failure trees to be established and the failure probability derived. The dependence of these results on test use and maintenance is emphasized as well as critical paths. The great number of results obtained may allow a well-informed choice taking account of the reliability wanted for the type of circuits [fr
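    For circuits modelled by fault trees with independent basic events, the top-event probability follows from products at AND gates and complements at OR gates. A tiny recursive evaluator of this kind is sketched below; the example tree and probabilities are invented for illustration and are not taken from the LPSI study.

```python
# Minimal fault-tree evaluation assuming independent basic events.
def gate_probability(node, basic_events):
    """node is either a basic-event name or a tuple ('AND'|'OR', [children])."""
    if isinstance(node, str):
        return basic_events[node]
    kind, children = node
    probs = [gate_probability(c, basic_events) for c in children]
    if kind == "AND":
        p = 1.0
        for q in probs:
            p *= q
        return p
    if kind == "OR":
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError(f"unknown gate type: {kind}")

# Illustrative tree: system fails if both pumps fail OR the common power supply fails.
tree = ("OR", [("AND", ["pump_A_fails", "pump_B_fails"]), "power_supply_fails"])
basic = {"pump_A_fails": 1e-2, "pump_B_fails": 1e-2, "power_supply_fails": 1e-4}
print(f"top-event probability = {gate_probability(tree, basic):.3e}")
```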

  12. Reliability of Lyapunov characteristic exponents computed by the two-particle method

    Science.gov (United States)

    Mei, Lijie; Huang, Li

    2018-03-01

    For highly complex problems, such as the post-Newtonian formulation of compact binaries, the two-particle method may be a better, or even the only, choice to compute the Lyapunov characteristic exponent (LCE). This method avoids the complex calculations of variational equations compared with the variational method. However, the two-particle method sometimes provides spurious estimates to LCEs. In this paper, we first analyze the equivalence in the definition of LCE between the variational and two-particle methods for Hamiltonian systems. Then, we develop a criterion to determine the reliability of LCEs computed by the two-particle method by considering the magnitude of the initial tangent (or separation) vector ξ0 (or δ0), renormalization time interval τ, machine precision ε, and global truncation error ɛT. The reliable Lyapunov characteristic indicators estimated by the two-particle method form a V-shaped region, which is restricted by d0, ε, and ɛT. Finally, the numerical experiments with the Hénon-Heiles system, the spinning compact binaries, and the post-Newtonian circular restricted three-body problem strongly support the theoretical results.
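    A minimal sketch of the two-particle (Benettin-style) estimate discussed above is given below for the Hénon-Heiles system: a reference and a shadow trajectory separated by d0 are integrated over renormalization intervals of length tau, the logarithmic growth of their separation is accumulated, and the shadow is rescaled back to d0 after each interval. The initial condition, d0, tau and tolerances are illustrative assumptions, not the paper's settings.

```python
# Two-particle estimate of the largest Lyapunov exponent for Henon-Heiles.
import numpy as np
from scipy.integrate import solve_ivp

def henon_heiles(_t, s):
    x, y, px, py = s
    return [px, py, -x - 2.0 * x * y, -y - x * x + y * y]

def lce_two_particle(state0, d0=1e-8, tau=1.0, n_renorm=500):
    ref = np.asarray(state0, dtype=float)
    shadow = ref.copy()
    shadow[0] += d0                        # initial separation along x
    log_sum = 0.0
    for _ in range(n_renorm):
        ref = solve_ivp(henon_heiles, (0.0, tau), ref, rtol=1e-10, atol=1e-12).y[:, -1]
        shadow = solve_ivp(henon_heiles, (0.0, tau), shadow, rtol=1e-10, atol=1e-12).y[:, -1]
        delta = shadow - ref
        d1 = np.linalg.norm(delta)
        log_sum += np.log(d1 / d0)
        shadow = ref + delta * (d0 / d1)   # renormalize back to distance d0
    return log_sum / (n_renorm * tau)

# Initial condition (x, y, px, py) at energy E = 1/8, a largely chaotic regime;
# purely illustrative, with no guarantee that this particular orbit is chaotic.
print(lce_two_particle([0.0, 0.0, 0.5, 0.0]))
```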

  13. Reliability Assessment for Low-cost Unmanned Aerial Vehicles

    Science.gov (United States)

    Freeman, Paul Michael

    algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.

  14. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  15. A study on multi-data source fusion method for petroleum pipeline leak detection

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Wei; Zhang, Laibin [Research Center of Oil and Gas Safety Engineering Technology, China University of Petroleum, Beijing, (China)

    2010-07-01

    The detection of leaks on petroleum pipelines is a very important safety issue. Several studies were commissioned to develop new monitoring procedures for leakage detection. This paper sets out a new leak detection process. The approach developed took into consideration steady and transient states. The study investigated leak diagnosis problems in product pipelines using multi-sensor measurements (pressure, flux, density and temperature). The information collected from each sensor was considered as pieces of evidence that describe the operational conditions of the pipeline. The Dempster-Shafer (D-S) theory is used to associate multi-sensor data to pipe health indices. Experimental pressure and flow rate data were recorded using a Pipeline Leak Detection System (PLDS) acquisition card and used to verify the accuracy and reliability of this new detection method. The results showed that the degree of credibility was as high as 0.877. It was also found that multi-feature information fusion improves recognition of pipeline conditions.
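    Dempster's rule of combination for two sensors over the frame {leak, normal}, with residual mass assigned to the whole frame to represent uncertainty, is sketched below. The mass values are invented for illustration and are not the paper's experimental figures.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dict: frozenset -> mass)."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    # Normalize by 1 - K, where K is the total conflicting mass.
    return {s: m / (1.0 - conflict) for s, m in combined.items()}, conflict

LEAK, NORMAL = frozenset({"leak"}), frozenset({"normal"})
THETA = LEAK | NORMAL                      # total ignorance (whole frame)

# Illustrative masses from a pressure-based and a flow-based sensor.
m_pressure = {LEAK: 0.60, NORMAL: 0.10, THETA: 0.30}
m_flow     = {LEAK: 0.70, NORMAL: 0.20, THETA: 0.10}

fused, k = dempster_combine(m_pressure, m_flow)
print("conflict K =", round(k, 3))
print("fused belief in leak =", round(fused[LEAK], 3))
```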

  16. A fast approximation method for reliability analysis of cold-standby systems

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Amari, Suprasad V.

    2012-01-01

    Analyzing reliability of large cold-standby systems has been a complicated and time-consuming task, especially for systems with components having non-exponential time-to-failure distributions. In this paper, an approximation model, which is based on the central limit theorem, is presented for the reliability analysis of binary cold-standby systems. The proposed model can estimate the reliability of large cold-standby systems with binary-state components having arbitrary time-to-failure distributions in an efficient and easy way. The accuracy and efficiency of the proposed method are illustrated using several different types of distributions for both 1-out-of-n and k-out-of-n cold-standby systems.
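    In the simplest case (a 1-out-of-n cold-standby system with perfect switching), the system lifetime is the sum of the component lifetimes and the central limit theorem gives R(t) ≈ 1 − Φ((t − nμ)/(σ√n)). The sketch below compares that approximation with a Monte Carlo estimate for Weibull components; the distribution and all parameter values are illustrative assumptions, and the paper's treatment of k-out-of-n systems is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 1-out-of-n cold standby: components used one after another (perfect switching).
n = 10
shape, scale = 2.0, 1000.0                        # Weibull component lifetimes (hours)
comp = stats.weibull_min(c=shape, scale=scale)
mu, var = comp.mean(), comp.var()

def reliability_clt(t):
    """CLT approximation of P(sum of n component lifetimes > t)."""
    return 1.0 - stats.norm.cdf((t - n * mu) / np.sqrt(n * var))

def reliability_mc(t, samples=200_000):
    lifetimes = comp.rvs(size=(samples, n), random_state=rng).sum(axis=1)
    return float((lifetimes > t).mean())

for t in (7000.0, 9000.0, 11000.0):
    print(f"t={t:7.0f} h  CLT={reliability_clt(t):.4f}  MC={reliability_mc(t):.4f}")
```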

  17. A Novel Reliability Enhanced Handoff Method in Future Wireless Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    Wang YuPeng

    2016-01-01

    Full Text Available As demand increases, future networks will follow the trends of network variety and service flexibility, which require heterogeneous network deployment and reliable communication methods. In practice, most communication failures happen due to bad radio link quality; i.e., high-speed users suffer greatly from radio link failure, which causes communication interrupts and the need for radio link recovery. To make communication more reliable, especially for high-mobility users, we propose a novel handoff mechanism to reduce the occurrence of service interrupts. Based on computer simulation, we find that service reliability is greatly improved.

  18. A Bayesian reliability evaluation method with integrated accelerated degradation testing and field information

    International Nuclear Information System (INIS)

    Wang, Lizhi; Pan, Rong; Li, Xiaoyang; Jiang, Tongmin

    2013-01-01

    Accelerated degradation testing (ADT) is a common approach in reliability prediction, especially for products with high reliability. However, the laboratory condition of ADT often differs from the field condition; thus, to predict field failure, one needs to calibrate the prediction made using ADT data. In this paper a Bayesian evaluation method is proposed to integrate ADT data from the laboratory with failure data from the field. Calibration factors are introduced to calibrate the difference between the lab and field conditions so as to predict a product's actual field reliability more accurately. The information fusion and statistical inference procedure are carried out through a Bayesian approach and Markov chain Monte Carlo methods. The proposed method is demonstrated by two examples and a sensitivity analysis with respect to the prior distribution assumption.

  19. Development of DNA elution method to detect irradiated foodstuff

    International Nuclear Information System (INIS)

    Copin, M.P.; Bourgeois, C.M.

    1991-01-01

    The aim of the work is to develop a reliable method to detect whether a fresh or frozen foodstuff has been irradiated. The DNA molecule is one of the targets of ionizing radiation. The induction of three major classes of lesion has been shown: double-strand breaks, single-strand breaks and base damage. Among the different techniques used to observe and quantify the strand breaks, elution techniques are very interesting. The method proposed consists of a filtration of the DNA at atmospheric pressure and under non-denaturing conditions. The amount of DNA retained on the filter is measured, after suitable labelling, by microfluorometry. A difference in the amount of DNA retained on a 2 μm filter from a lysed muscular tissue sample is observed between a frozen Norway lobster that has been irradiated and one that has not. 7 refs

  20. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data. In fact, it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was applied to the problem of thermal safety analysis of a reactor system. This analysis makes it possible to analyze the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  1. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de

  2. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, boundary conditions, etc. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.

  3. Reliability analysis of safety systems of nuclear power plant and utility experience with reliability safeguarding of systems during specified normal operation

    International Nuclear Information System (INIS)

    Balfanz, H.P.

    1989-01-01

    The paper gives an outline of the methods applied for reliability analysis of safety systems in nuclear power plants. The main tasks are to check the system design for detection of weak points, and to find possibilities for optimizing the strategies for inspection, inspection intervals and maintenance periods. Reliability safeguarding measures include the determination and verification of the boundary conditions of the analysis with regard to the reliability parameters and maintenance parameters used in the analysis, and the analysis of data feedback reflecting the plant response during operation. (orig.) [de

  4. Application of Statistical Methods to Activation Analytical Results near the Limit of Detection

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Wanscher, B.

    1978-01-01

    Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.

  5. Pharyngeal pH alone is not reliable for the detection of pharyngeal reflux events: A study with oesophageal and pharyngeal pH-impedance monitoring

    Science.gov (United States)

    Desjardin, Marie; Roman, Sabine; des Varannes, Stanislas Bruley; Gourcerol, Guillaume; Coffin, Benoit; Ropert, Alain; Mion, François

    2013-01-01

    Background Pharyngeal pH probes and pH-impedance catheters have been developed for the diagnosis of laryngo-pharyngeal reflux. Objective To determine the reliability of pharyngeal pH alone for the detection of pharyngeal reflux events. Methods 24-h pH-impedance recordings performed in 45 healthy subjects with a bifurcated probe for detection of pharyngeal and oesophageal reflux events were reviewed. Pharyngeal pH drops to below 4 and 5 were analysed for the simultaneous occurrence of pharyngeal reflux, gastro-oesophageal reflux, and swallows, according to impedance patterns. Results Only 7.0% of pharyngeal pH drops to below 5 identified with impedance corresponded to pharyngeal reflux, while 92.6% were related to swallows and 10.2 and 13.3% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Of pharyngeal pH drops to below 4, 13.2% were related to pharyngeal reflux, 87.5% were related to swallows, and 18.1 and 21.5% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Conclusions This study demonstrates that pharyngeal pH alone is not reliable for the detection of pharyngeal reflux and that adding distal oesophageal pH analysis is not helpful. The only reliable analysis should take into account impedance patterns demonstrating the presence of pharyngeal reflux event preceded by a distal and proximal reflux event within the oesophagus. PMID:24917995

  6. Doppler method leak detection for LMFBR steam generators. Pt. 3. Investigation of detection sensitivity and method

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi; Kinoshita, Izumi

    2001-01-01

    To prevent the expansion of tube damage and to maintain structural integrity in the steam generators (SGs) of a fast breeder reactor (FBR), it is necessary to detect precisely and immediately any leakage of water from heat transfer tubes. Therefore, the Doppler method was developed. Previous studies have revealed that, in the SG full-sector model that simulates actual SGs, the Doppler method can detect bubbles of 0.4 l/s within a few seconds. However in consideration of the dissolution rate of hydrogen generated by a sodium-water reaction even from a small water leak, it is necessary to detect smaller leakages of water from the heat transfer tubes. The detection sensitivity of the Doppler method and the influence of background noise were experimentally investigated. In-water experiments were performed using the SG model. The results show that the Doppler method can detect bubbles of 0.01 l/s (equivalent to a water leak rate of about 0.01 g/s) within a few seconds and that the background noise has little effect on water leak detection performance. The Doppler method thus has great potential for the detection of water leakage in SGs. (author)

  7. Comparison of three methods for detection of melamine in compost and soil

    International Nuclear Information System (INIS)

    Tian, Yongqiang; Chen, Liming; Gao, Lihong; Wu, Manli; Dick, Warren A.

    2012-01-01

    Recent product recalls and food safety incidents due to melamine (MM) adulteration or contamination have caused a worldwide food security concern. This has led to many methods being developed to detect MM in foods, but few methods have been reported that can rapidly and reliably measure MM in environmental samples. To meet this need, a high performance liquid chromatography (HPLC) with UV detection method, an enzyme-linked immunosorbent assay (ELISA) test kit, and an enzyme-linked rapid colorimetric assay (RCA) test kit were evaluated for their ability to accurately measure MM concentrations in compost and soil samples. All three methods accurately detected MM concentrations if no MM degradation products, such as ammeline (AMN), ammelide (AMD) and cyanuric acid (CA), were present in an aqueous sample. In the presence of these MM degradation products, the HPLC yielded more accurate concentrations than the ELISA method and there was no significant (P > 0.05) difference between the HPLC and RCA methods. However, if samples were purified by SPE or prepared with blocking buffer, the ELISA method accurately measured MM concentrations, even in the presence of the MM degradation products. The HPLC method generally outperformed the RCA method for measuring MM in soil extracts but gave similar results for compost extracts. The number of samples that can be analyzed by the ELISA and RCA methods in a 24-hour time period is much greater than by the HPLC method. Thus the RCA method would seem to be a good screening method for measuring MM in compost and soil samples and the results obtained could then be confirmed by the HPLC method. The HPLC method, however, also allows simultaneous measurement of MM and its degradation products of AMD, AMN and CA. - Highlights: ► We detected melamine in environmental samples by three methods. ► ELISA can measure melamine in an environmental sample if a blocking buffer is used. ► A rapid colorimetric assay (RCA) rapidly screens

  8. Reliability analysis based on a novel density estimation method for structures with correlations

    Directory of Open Access Journals (Sweden)

    Baoyu LI

    2017-06-01

    Full Text Available Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, and the failure probability can be easily obtained by integration over the failure domain. However, efficiently estimating the PDF is still an urgent problem to be solved. The existing fractional-moment-based maximum entropy approach provides a very advanced method for PDF estimation, but its main shortcoming is that it limits the application of the reliability analysis method to structures with independent inputs. In fact, structures with correlated inputs are common in engineering. This paper therefore improves the maximum entropy method and applies the Unscented Transformation (UT) technique to compute the fractional moments of the performance function for structures with correlations, which is a very efficient moment estimation method for models with any inputs. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Moreover, the number of function evaluations of the proposed method in reliability analysis, which is determined by the UT, is very small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
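    A minimal sketch of the unscented-transformation step is given below: sigma points for correlated Gaussian inputs are generated through a Cholesky factor of the covariance and propagated through the performance function to estimate its fractional moments. The performance function, covariance matrix, fractional orders and the Julier-style weighting are illustrative assumptions, not the paper's examples.

```python
import numpy as np

def sigma_points(mean, cov, kappa=0.0):
    """Julier-style unscented-transform points and weights for a Gaussian input vector."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)
    pts = [np.asarray(mean, float)]
    wts = [kappa / (n + kappa)]
    for i in range(n):
        pts.append(mean + L[:, i])
        pts.append(mean - L[:, i])
        wts.extend([1.0 / (2.0 * (n + kappa)), 1.0 / (2.0 * (n + kappa))])
    return np.array(pts), np.array(wts)

def fractional_moments(g, mean, cov, alphas, kappa=0.0):
    """Estimate E[|g(X)|^alpha] for each alpha using the unscented transform."""
    pts, wts = sigma_points(np.asarray(mean, float), np.asarray(cov, float), kappa)
    gvals = np.array([g(x) for x in pts])
    return {a: float(np.sum(wts * np.abs(gvals) ** a)) for a in alphas}

# Illustrative performance function with two correlated normal inputs.
g = lambda x: 3.0 * x[0] - x[1] ** 2 + 18.0
mean = np.array([5.0, 4.0])
cov = np.array([[1.0, 0.6],
                [0.6, 1.5]])          # correlated inputs
print(fractional_moments(g, mean, cov, alphas=[0.5, 1.0, 1.5]))
```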

  9. An overview of reliability methods in mechanical and structural design

    Science.gov (United States)

    Wirsching, P. H.; Ortiz, K.; Lee, S. J.

    1987-01-01

    An evaluation is made of modern methods of fast probability integration and Monte Carlo treatment for the assessment of structural systems' and components' reliability. Fast probability integration methods are noted to be more efficient than Monte Carlo ones. This is judged to be an important consideration when several point probability estimates must be made in order to construct a distribution function. An example illustrating the relative efficiency of the various methods is included.

  10. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  11. Network reliability analysis of complex systems using a non-simulation-based method

    International Nuclear Information System (INIS)

    Kim, Youngsuk; Kang, Won-Hee

    2013-01-01

    Civil infrastructures such as transportation, water supply, sewers, telecommunications, and electrical and gas networks often establish highly complex networks, due to their multiple source and distribution nodes, complex topology, and functional interdependence between network components. To understand the reliability of such complex network system under catastrophic events such as earthquakes and to provide proper emergency management actions under such situation, efficient and accurate reliability analysis methods are necessary. In this paper, a non-simulation-based network reliability analysis method is developed based on the Recursive Decomposition Algorithm (RDA) for risk assessment of generic networks whose operation is defined by the connections of multiple initial and terminal node pairs. The proposed method has two separate decomposition processes for two logical functions, intersection and union, and combinations of these processes are used for the decomposition of any general system event with multiple node pairs. The proposed method is illustrated through numerical network examples with a variety of system definitions, and is applied to a benchmark gas transmission pipe network in Memphis TN to estimate the seismic performance and functional degradation of the network under a set of earthquake scenarios.
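    The Recursive Decomposition Algorithm itself is not reproduced here; as a point of reference, the sketch below computes the same two-terminal reliability quantity exactly by brute-force enumeration of edge states with networkx, which is likewise non-simulation-based but only feasible for small networks. The example network and failure probabilities are invented.

```python
from itertools import product
import networkx as nx

def two_terminal_reliability(edges, p_fail, source, target):
    """Exact P(source connected to target) by enumerating all edge up/down states.
    edges: list of (u, v); p_fail: dict mapping (u, v) -> failure probability."""
    reliability = 0.0
    for state in product([True, False], repeat=len(edges)):   # True = edge up
        prob = 1.0
        g = nx.Graph()
        g.add_nodes_from({n for e in edges for n in e})
        for up, e in zip(state, edges):
            prob *= (1.0 - p_fail[e]) if up else p_fail[e]
            if up:
                g.add_edge(*e)
        if nx.has_path(g, source, target):
            reliability += prob
    return reliability

# Small illustrative lifeline network (a bridge network between nodes s and t).
edges = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]
p_fail = {e: 0.1 for e in edges}
print(f"P(s-t connected) = {two_terminal_reliability(edges, p_fail, 's', 't'):.5f}")
```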

  12. Test-retest reliability of myofascial trigger point detection in hip and thigh areas.

    Science.gov (United States)

    Rozenfeld, E; Finestone, A S; Moran, U; Damri, E; Kalichman, L

    2017-10-01

    Myofascial trigger points (MTrP's) are a primary source of pain in patients with musculoskeletal disorders. Nevertheless, they are frequently underdiagnosed. Reliable MTrP palpation is necessary for their diagnosis and treatment. The few studies that have examined intra-tester reliability of MTrP detection in the upper body provide preliminary evidence that MTrP palpation is reliable. Reliability tests for MTrP palpation on the lower limb have not yet been performed. To evaluate inter- and intra-tester reliability of MTrP recognition in hip and thigh muscles. Reliability study. 21 patients (15 males and 6 females, mean age 21.1 years) referred to the physical therapy clinic, 10 with knee or hip pain and 11 with pain in an upper limb, low back, shin or ankle. Two experienced physical therapists performed the examinations, blinded to the subjects' identity, medical condition and results of the previous MTrP evaluation. Each subject was evaluated four times, twice by each examiner in a random order. Dichotomous findings included a palpable taut band, tenderness, referred pain, and relevance of referred pain to patient's complaint. Based on these, diagnosis of latent MTrP's or active MTrP's was established. The evaluation was performed on both legs and included a total of 16 locations in the following muscles: rectus femoris (proximal), vastus medialis (middle and distal), vastus lateralis (middle and distal) and gluteus medius (anterior, posterior and distal). Inter- and intra-tester reliability (Cohen's kappa (κ)) values for single sites ranged from -0.25 to 0.77. Median intra-tester reliability was 0.45 and 0.46 for latent and active MTrP's, and median inter-tester reliability was 0.51 and 0.64 for latent and active MTrPs, respectively. The examination of the distal vastus medialis was most reliable for latent and active MTrP's (intra-tester k = 0.27-0.77, inter-tester k = 0.77 and intra-tester k = 0.53-0.72, inter-tester k = 0.72, correspondingly
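    The kappa statistics quoted above can be reproduced from paired dichotomous ratings with the usual formula κ = (p_o − p_e)/(1 − p_e). A short sketch is below, with made-up rating vectors for illustration rather than the study's data.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' dichotomous (0/1) findings at the same sites."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_expected = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))
    return (p_observed - p_expected) / (1.0 - p_expected)

# Illustrative MTrP present(1)/absent(0) calls by two examiners at 21 sites.
examiner_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1]
examiner_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1]
print(f"kappa = {cohens_kappa(examiner_a, examiner_b):.2f}")
```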

  13. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    DEFF Research Database (Denmark)

    Petersen, Bent; Petersen, Thomas Nordahl; Andersen, Pernille

    2009-01-01

    The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability...... comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0.79 and 0.74 are obtained using our and the compared method, respectively. This tendency is true for any selected subset.

  14. Study on reliability analysis based on multilevel flow models and fault tree method

    International Nuclear Information System (INIS)

    Chen Qiang; Yang Ming

    2014-01-01

    Multilevel flow models (MFM) and the fault tree method describe system knowledge in different forms, so the two methods express an equivalent logic of system reliability under the same boundary conditions and assumptions. Based on this, and combined with the characteristics of MFM, a method for mapping MFM to fault trees was put forward, thus providing a way to establish fault trees rapidly and to realize qualitative reliability analysis based on MFM. Taking the safety injection system of a pressurized water reactor nuclear power plant as an example, its MFM was established and its reliability was analyzed qualitatively. The analysis result shows that the logic of mapping MFM to fault trees is correct. The MFM is easily understood, created and modified. Compared with traditional fault tree analysis, the workload is greatly reduced and modeling time is saved. (authors)

  15. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    Science.gov (United States)

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, of which 14 on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy and precision. Also, decision limits were calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood method). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' contribute substantially to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that, for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods.

  16. A Comparison of Vibration and Oil Debris Gear Damage Detection Methods Applied to Pitting Damage

    Science.gov (United States)

    Dempsey, Paula J.

    2000-01-01

    Helicopter Health Usage Monitoring Systems (HUMS) must provide reliable, real-time performance monitoring of helicopter operating parameters to prevent damage of flight critical components. Helicopter transmission diagnostics are an important part of a helicopter HUMS. In order to improve the reliability of transmission diagnostics, many researchers propose combining two technologies, vibration and oil monitoring, using data fusion and intelligent systems. Some benefits of combining multiple sensors to make decisions include improved detection capabilities and increased probability the event is detected. However, if the sensors are inaccurate, or the features extracted from the sensors are poor predictors of transmission health, integration of these sensors will decrease the accuracy of damage prediction. For this reason, one must verify the individual integrity of vibration and oil analysis methods prior to integrating the two technologies. This research focuses on comparing the capability of two vibration algorithms, FM4 and NA4, and a commercially available on-line oil debris monitor to detect pitting damage on spur gears in the NASA Glenn Research Center Spur Gear Fatigue Test Rig. Results from this research indicate that the rate of change of debris mass measured by the oil debris monitor is comparable to the vibration algorithms in detecting gear pitting damage.
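    FM4 is the normalized fourth statistical moment (kurtosis) of the difference signal that remains after the regular gear-mesh components are removed from the time-synchronous average. A sketch of that calculation is below, using a synthetic residual since the construction of the difference signal itself is only hinted at in the abstract; the signal and impulse values are invented.

```python
import numpy as np

def fm4(difference_signal):
    """FM4: normalized fourth moment (kurtosis) of the gear difference signal."""
    d = np.asarray(difference_signal, dtype=float)
    d = d - d.mean()
    return len(d) * np.sum(d**4) / np.sum(d**2) ** 2

# Synthetic illustration: a Gaussian residual (FM4 ~ 3) versus the same residual
# with a localized impulse train, as a pitted tooth might produce, which raises FM4.
rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, 4096)
damaged = healthy.copy()
damaged[::512] += 8.0                       # periodic impulses once per revolution
print(f"FM4 healthy ~ {fm4(healthy):.2f}, FM4 damaged ~ {fm4(damaged):.2f}")
```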

  17. A quantitative calculation for software reliability evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young-Jun; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    To meet these regulatory requirements, the software used in the nuclear safety field has been assured through development, validation, safety analysis, and quality assurance activities throughout the entire life cycle, from the planning phase to the installation phase. A variety of activities, such as quality assurance activities, are also required to improve the quality of the software. However, there are limits to how far the quality can be assured in this way. Therefore, efforts continue to calculate the reliability of the software for a quantitative evaluation instead of a qualitative evaluation. In this paper, we propose a quantitative calculation method for the software to be used for a specific operation of the digital controller in an NPP. After injecting random faults into the internal space of a developed controller and calculating the ability to detect the injected faults using diagnostic software, we can evaluate the software reliability of a digital controller in an NPP. We tried to calculate the software reliability of the controller in an NPP using a new method that differs from the traditional approach: it calculates the fault detection coverage after injecting faults into the software memory space rather than assessing activities through the life cycle process. We attempt differentiation by creating a new definition of the fault, imitating the software fault using the hardware, and giving consideration and weights to the injected faults.
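    The quantity estimated by such a campaign is essentially a fault-detection coverage: the fraction of injected faults flagged by the diagnostic software, ideally with a confidence bound from the binomial distribution. A small sketch of that bookkeeping is below; the counts are invented and the Clopper-Pearson bound is one common convention, not necessarily the paper's.

```python
from scipy import stats

def detection_coverage(detected, injected, confidence=0.95):
    """Point estimate and one-sided lower Clopper-Pearson bound for fault coverage."""
    coverage = detected / injected
    lower = stats.beta.ppf(1.0 - confidence, detected, injected - detected + 1)
    return coverage, lower

# Invented campaign: 10,000 faults injected into the controller's memory space,
# 9,930 of them detected by the diagnostic software.
cov, low95 = detection_coverage(detected=9930, injected=10000)
print(f"coverage = {cov:.4f}, 95% lower bound = {low95:.4f}")
```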

  18. A Model to Partly but Reliably Distinguish DDOS Flood Traffic from Aggregated One

    Directory of Open Access Journals (Sweden)

    Ming Li

    2012-01-01

    Full Text Available Reliably distinguishing DDOS flood traffic from aggregated traffic is desperately needed for reliable prevention of DDOS attacks. By reliable distinguishing, we mean that flood traffic can be distinguished from aggregated traffic with a predetermined probability. The basis for reliably distinguishing flood traffic from aggregated traffic is reliable detection of signs of DDOS flood attacks. As is known, reliably distinguishing DDOS flood traffic from aggregated traffic is a tough task, mainly due to the effects of flash-crowd traffic. For this reason, this paper studies reliable detection in an underlying DiffServ network that uses static-priority schedulers. In this network environment, we present a method for reliable detection of signs of DDOS flood attacks for a given class with a given priority. Two assumptions are introduced in this study. One is that flash-crowd traffic does not have all priorities but only some. The other is that attack traffic has all priorities in all classes, since otherwise an attacker cannot completely achieve its DDOS goal. Further, we suppose that the protected site is equipped with a sensor that has a signature library of the legitimate traffic with the priorities that flash-crowd traffic does not have. Based on these, we are able to reliably distinguish attack traffic from aggregated traffic with the priorities that flash-crowd traffic does not have, according to a given detection probability.

  19. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol; Jeong, Dae Hong, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Chemistry Education, Seoul National University, Seoul 151-742 (Korea, Republic of); Kang, Homan [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Yoon-Sik, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); School of Chemical and Biological Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Ho-Young, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Nuclear Medicine, Seoul National University Bundang Hospital, Seongnam 463-707 (Korea, Republic of)

    2015-05-15

    Surface-enhanced Raman scattering techniques have been widely used for bioanalysis due to its high sensitivity and multiplex capacity. However, the point-scanning method using a micro-Raman system, which is the most common method in the literature, has a disadvantage of extremely long measurement time for on-chip immunoassay adopting a large chip area of approximately 1-mm scale and confocal beam point of ca. 1-μm size. Alternative methods such as sampled spot scan with high confocality and large-area scan method with enlarged field of view and low confocality have been utilized in order to minimize the measurement time practically. In this study, we analyzed the two methods in respect of signal-to-noise ratio and sampling-led signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we proposed a methodology for fast and reliable quantitative measurement of the whole chip area. The proposed method adopted a raster scan covering a full area of 100 μm × 100 μm region as a proof-of-concept experiment while accumulating signals in the CCD detector for single spectrum per frame. One single scan with 10 s over 100 μm × 100 μm area yielded much higher sensitivity compared to sampled spot scanning measurements and no signal fluctuations attributed to sampled spot scan. This readout method is able to serve as one of key technologies that will bring quantitative multiplexed detection and analysis into practice.

  20. Larvas output and influence of human factor in reliability of meat inspection by the method of artificial digestion

    Directory of Open Access Journals (Sweden)

    Đorđević Vesna

    2013-01-01

    Full Text Available On the basis of the performed analyses of the factors that contributed to infected meat reaching the food chain, we found that the infection occurred after consuming meat inspected by the method of artificial digestion of collective samples using a magnetic stirrer (MM). In this work, assay results are presented which show how modifications of the method, at the level of final sedimentation, influence the reliability of detecting Trichinella larvae in infected meat samples. It has been shown that the use of inadequate laboratory containers for collecting larvae in the final sedimentation, and changes in the volume of digestive liquid drawn off during the colouring of preparations, can significantly influence inspection results. Errors in larva detection ranged from 4 to 80% in the experimental groups presented, compared with the control group of samples inspected using the MM method carried out completely according to the European Commission procedure No 2075/2005, where no errors in larva number per sample were found. We consider that the results of this work will contribute to improved control of the method's performance, especially at the critical points during inspection of meat samples for Trichinella larvae in Serbia.

  1. Comparison of Methods for Oscillation Detection

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Trangbæk, Klaus

    2006-01-01

    This paper compares a selection of methods for detecting oscillations in control loops. The methods are tested on measurement data from a coal-fired power plant, where some oscillations are occurring. Emphasis is put on being able to detect oscillations without having a system model and without using process knowledge. The tested methods show potential for detecting the oscillations; however, transient components in the signals cause false detections as well, motivating the use of models in order to remove the expected signal behavior.

  2. Methods and Reliability of Radiographic Vertebral Fracture Detection in Older Men: The Osteoporotic Fractures in Men Study

    Science.gov (United States)

    Cawthon, Peggy M.; Haslam, Jane; Fullman, Robin; Peters, Katherine W.; Black, Dennis; Ensrud, Kristine E.; Cummings, Steven R.; Orwoll, Eric S.; Barrett-Connor, Elizabeth; Marshall, Lynn; Steiger, Peter; Schousboe, John T.

    2014-01-01

    We describe the methods and reliability of radiographic vertebral fracture assessment in MrOS, a cohort of community dwelling men aged ≥65 yrs. Lateral spine radiographs were obtained at Visit 1 (2000-2) and 4.6 years later (Visit 2). Using a workflow tool (SpineAnalyzer™, Optasia Medical), a physician reader completed semi-quantitative (SQ) scoring. Prior to SQ scoring, technicians performed “triage” to reduce physician reader workload, whereby clearly normal spine images were eliminated from SQ scoring with all levels assumed to be SQ=0 (no fracture, “triage negative”); spine images with any possible fracture or abnormality were passed to the physician reader as “triage positive” images. Using a quality assurance sample of images (n=20 participants; 8 with baseline only and 12 with baseline and follow-up images) read multiple times, we calculated intra-reader kappa statistics and percent agreement for SQ scores. A subset of 494 participants' images were read regardless of triage classification to calculate the specificity and sensitivity of triage. Technically adequate images were available for 5958 of 5994 participants at Visit 1, and 4399 of 4423 participants at Visit 2. Triage identified 3215 (53.9%) participants with radiographs that required further evaluation by the physician reader. For prevalent fractures at Visit 1 (SQ≥1), intra-reader kappa statistics ranged from 0.79-0.92; percent agreement ranged from 96.9%-98.9%; sensitivity of the triage was 96.8% and specificity of triage was 46.3%. In conclusion, SQ scoring had excellent intra-rater reliability in our study. The triage process reduces expert reader workload without hindering the ability to identify vertebral fractures. PMID:25003811

  3. A Voltage Quality Detection Method

    DEFF Research Database (Denmark)

    Chen, Zhe; Wei, Mu

    2008-01-01

    This paper presents a voltage quality detection method based on a phase-locked loop (PLL) technique. The technique can detect the voltage magnitude and phase angle of each individual phase under both normal and fault power system conditions. The proposed method has the potential to evaluate various...

  4. A dynamic particle filter-support vector regression method for reliability prediction

    International Nuclear Information System (INIS)

    Wei, Zhao; Tao, Tao; ZhuoShu, Ding; Zio, Enrico

    2013-01-01

    Support vector regression (SVR) has been applied to time series prediction and some works have demonstrated the feasibility of its use to forecast system reliability. For accuracy of reliability forecasting, the selection of SVR's parameters is important. The existing research works on SVR's parameters selection divide the example dataset into training and test subsets, and tune the parameters on the training data. However, these fixed parameters can lead to poor prediction capabilities if the data of the test subset differ significantly from those of training. Differently, the novel method proposed in this paper uses particle filtering to estimate the SVR model parameters according to the whole measurement sequence up to the last observation instance. By treating the SVR training model as the observation equation of a particle filter, our method allows updating the SVR model parameters dynamically when a new observation comes. Because of the adaptability of the parameters to dynamic data pattern, the new PF–SVR method has superior prediction performance over that of standard SVR. Four application results show that PF–SVR is more robust than SVR to the decrease of the number of training data and the change of initial SVR parameter values. Also, even if there are trends in the test data different from those in the training data, the method can capture the changes, correct the SVR parameters and obtain good predictions. -- Highlights: •A dynamic PF–SVR method is proposed to predict the system reliability. •The method can adjust the SVR parameters according to the change of data. •The method is robust to the size of training data and initial parameter values. •Some cases based on both artificial and real data are studied. •PF–SVR shows superior prediction performance over standard SVR
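
    The paper's exact formulation is not reproduced in the abstract, so the following is only a rough sketch of the general idea, assuming scikit-learn's SVR and a Gaussian observation-noise model: a small cloud of particles tracks the SVR hyperparameters (C, epsilon, gamma), each particle is re-weighted by how well its retrained SVR predicts the newest observation, and the particles are then resampled and jittered. The lag length, noise level and jitter scale are arbitrary illustration values.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def pf_svr_forecast(series, n_particles=30, lag=3, sigma=0.05):
    """One-step-ahead forecasts of a reliability series with SVR whose
    hyperparameters are tracked by a simple particle filter."""
    # particles hold (log C, log epsilon, log gamma) so jitter keeps them positive
    particles = rng.normal(loc=[0.0, -3.0, 0.0], scale=1.0, size=(n_particles, 3))
    weights = np.full(n_particles, 1.0 / n_particles)
    preds = []

    for t in range(lag, len(series) - 1):
        X = np.array([series[i - lag:i] for i in range(lag, t + 1)])
        y = series[lag:t + 1]
        new_x, new_y = series[t + 1 - lag:t + 1], series[t + 1]

        # each particle retrains an SVR and proposes a forecast of the next point
        forecasts = np.empty(n_particles)
        for k, (logC, logEps, logGam) in enumerate(particles):
            model = SVR(C=np.exp(logC), epsilon=np.exp(logEps), gamma=np.exp(logGam))
            forecasts[k] = model.fit(X, y).predict(new_x.reshape(1, -1))[0]

        # report the weighted forecast before the new observation is used
        preds.append(float(np.dot(weights, forecasts)))

        # update weights with the Gaussian likelihood of each forecast error,
        # then resample (multinomial) and jitter -- the "dynamic" part
        weights = weights * np.exp(-0.5 * ((new_y - forecasts) / sigma) ** 2) + 1e-300
        weights /= weights.sum()
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx] + rng.normal(scale=0.1, size=(n_particles, 3))
        weights = np.full(n_particles, 1.0 / n_particles)

    return np.array(preds)

# toy usage on a slowly degrading reliability curve
time_idx = np.arange(60)
series = np.exp(-time_idx / 80.0) + 0.01 * rng.normal(size=time_idx.size)
print(pf_svr_forecast(series)[-5:])
```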

  5. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  6. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

    A variety of methods have been applied to the study of reliability analysis in which human factors are included, in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for quantitative analysis of reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analysis of various phases of plant operation, as well as to the evaluation of an enhanced man-machine interface in the central control room. (author)

  7. Application of Modal Parameter Estimation Methods for Continuous Wavelet Transform-Based Damage Detection for Beam-Like Structures

    Directory of Open Access Journals (Sweden)

    Zhi Qiu

    2015-02-01

    Full Text Available This paper presents a hybrid damage detection method based on continuous wavelet transform (CWT and modal parameter identification techniques for beam-like structures. First, two kinds of mode shape estimation methods, herein referred to as the quadrature peaks picking (QPP and rational fraction polynomial (RFP methods, are used to identify the first four mode shapes of an intact beam-like structure based on the hammer/accelerometer modal experiment. The results are compared and validated using a numerical simulation with ABAQUS software. In order to determine the damage detection effectiveness between the QPP-based method and the RFP-based method when applying the CWT technique, the first two mode shapes calculated by the QPP and RFP methods are analyzed using CWT. The experiment, performed on different damage scenarios involving beam-like structures, shows that, due to the outstanding advantage of the denoising characteristic of the RFP-based (RFP-CWT technique, the RFP-CWT method gives a clearer indication of the damage location than the conventionally used QPP-based (QPP-CWT method. Finally, an overall evaluation of the damage detection is outlined, as the identification results suggest that the newly proposed RFP-CWT method is accurate and reliable in terms of detection of damage locations on beam-like structures.
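
    A minimal sketch of the underlying CWT step is given below, assuming PyWavelets and a synthetic first mode shape with an artificial local perturbation standing in for damage; the scales, the wavelet choice ('mexh') and the damage model are illustrative and are not those of the experiment described above.

```python
import numpy as np
import pywt

# synthetic first mode shape of a simply supported beam, sampled at 200 points
x = np.linspace(0.0, 1.0, 200)
mode = np.sin(np.pi * x)

# crude stand-in for damage: a small local perturbation around 40% of the span
damaged = mode + 0.01 * np.exp(-((x - 0.4) / 0.01) ** 2)

# continuous wavelet transform of the damaged mode shape over fine-to-medium scales
scales = np.arange(1, 32)
coeffs, _ = pywt.cwt(damaged, scales, "mexh")

# at fine scales the coefficient energy peaks near the damage location
fine_scale_energy = np.sum(coeffs[:8] ** 2, axis=0)
interior = fine_scale_energy[5:-5]                 # ignore edge effects
print("suspected damage near x =", x[5 + np.argmax(interior)])
```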

  8. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute has been conducting a developmental study of the GO-FLOW method, a system reliability analysis method that forms a main part of PSA (Probabilistic Safety Assessment), with various advanced functionalities. The aims were to upgrade the functionality of the GO-FLOW method, to develop an analysis capability integrating dynamic behavior analysis, physical behavior and probabilistic transitions, and to prepare a function for picking out the main accident sequences. In fiscal year 1997, an analysis function was developed in the dynamic event-tree analysis system by adding dependency between headings. With the simulation analysis function for accident sequences, the main accident sequences of the MRX improved ship propulsion reactor could be covered completely. In addition, a function was prepared that allows the input data for an analysis to be set easily by the analyst. (G.K.)

  9. A method of bias correction for maximal reliability with dichotomous measures.

    Science.gov (United States)

    Penev, Spiridon; Raykov, Tenko

    2010-02-01

    This paper is concerned with the reliability of weighted combinations of a given set of dichotomous measures. Maximal reliability for such measures has been discussed in the past, but the pertinent estimator exhibits a considerable bias and mean squared error for moderate sample sizes. We examine this bias, propose a procedure for bias correction, and develop a more accurate asymptotic confidence interval for the resulting estimator. In most empirically relevant cases, the bias correction and mean squared error correction can be performed simultaneously. We propose an approximate (asymptotic) confidence interval for the maximal reliability coefficient, discuss the implementation of this estimator, and investigate the mean squared error of the associated asymptotic approximation. We illustrate the proposed methods using a numerical example.

  10. Leak detection by vibrational diagnostic methods

    International Nuclear Information System (INIS)

    Siklossy, P.

    1983-01-01

    The possibilities and methods of leak detection due to mechanical failures in nuclear power plants are reviewed on the basis of the literature. Great importance is attributed to vibrational diagnostic methods, whose advantageous characteristics enable them to serve as final leak detection methods. The problems of noise analysis, e.g. leak detection by impact sound measurements, probe characteristics, gain problems, probe selection, off-line analysis and correlation functions, types of leak noises etc. are summarized. Leak detection based on noise analysis can be retrofitted to existing power plants, and its maintenance and testing are simple. On the other hand, it requires special training and measuring methods. (Sz.J.)

  11. Enrichment methods to detect bone marrow micrometastases in breast carcinoma patients: clinical relevance

    International Nuclear Information System (INIS)

    Choesmel, Valérie; Pierga, Jean-Yves; Nos, Claude; Vincent-Salomon, Anne; Sigal-Zafrani, Brigitte; Thiery, Jean-Paul; Blin, Nathalie

    2004-01-01

    Improving technologies for the detection and purification of bone marrow (BM) micrometastatic cells in breast cancer patients should lead to earlier prognosis of the risk of relapse and should make it possible to design more appropriate therapies. The technique used has to overcome the challenges resulting from the small number of target cells (one per million hematopoietic cells) and the heterogeneous expression of micrometastatic cell markers. In the present study, we have assessed the clinical relevance of current methods aimed at detecting rare disseminated carcinoma cells. BM aspirates from 32 carcinoma patients were screened for the presence of micrometastatic cells positive for epithelial cell adhesion molecule and positive for cytokeratins, using optimized immunodetection methods. A comparison with data obtained for 46 control BM aspirates and a correlation with the clinical status of patients were performed. We developed a sensitive and efficient immunomagnetic protocol for the enrichment of BM micrometastases. This method was used to divide 32 breast carcinoma patients into three categories according to their epithelial cell adhesion molecule status. These categories were highly correlated with the recently revised American Joint Committee on Cancer staging system for breast cancer, demonstrating the clinical relevance of this simple and reliable immunomagnetic technique. We also evaluated immunocytochemical detection of cytokeratin-positive cells and cytomorphological parameters. Immunocytochemistry-based methods for the detection of BM micrometastases did not provide any information about the clinical status of patients, but helped to refine the immunomagnetic data by confirming the presence of micrometastases in some cases. We also tested a new density gradient centrifugation system, able to enrich the tumor fraction of BM specimens by twofold to threefold as compared with standard Ficoll methods. These improved methods for the detection of

  12. Human reliability-based MC and A methods for evaluating the effectiveness of protecting nuclear material - 59379

    International Nuclear Information System (INIS)

    Duran, Felicia A.; Wyss, Gregory D.

    2012-01-01

    Material control and accountability (MC and A) operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. MC and A activities, from monitoring to inventory measurements, provide critical information about target materials and define security elements that are useful against insider threats. However, these activities have been difficult to characterize in ways that are compatible with the path analysis methods that are used to systematically evaluate the effectiveness of a site's protection system. The path analysis methodology focuses on a systematic, quantitative evaluation of the physical protection component of the system for potential external threats, and often calculates the probability that the physical protection system (PPS) is effective (PE) in defeating an adversary who uses that attack pathway. In previous work, Dawson and Hester observed that many MC and A activities can be considered a type of sensor system with alarm and assessment capabilities that provide recurring opportunities for 'detecting' the status of critical items. This work has extended that characterization of MC and A activities as probabilistic sensors that are interwoven within each protection layer of the PPS. In addition, MC and A activities have similar characteristics to operator tasks performed in a nuclear power plant (NPP) in that the reliability of these activities depends significantly on human performance. Many of the procedures involve human performance in checking for anomalous conditions. Further characterization of MC and A activities as operational procedures that check the status of critical assets provides a basis for applying human reliability analysis (HRA) models and methods to determine probabilities of detection for MC and A protection elements. This paper will discuss the application of HRA methods used in nuclear power plant probabilistic risk assessments to define detection

  13. Classification methods to detect sleep apnea in adults based on respiratory and oximetry signals: a systematic review.

    Science.gov (United States)

    Uddin, M B; Chow, C M; Su, S W

    2018-03-26

    Sleep apnea (SA), a common sleep disorder, can significantly decrease the quality of life, and is closely associated with major health risks such as cardiovascular disease, sudden death, depression, and hypertension. The normal diagnostic process of SA using polysomnography is costly and time consuming. In addition, the accuracy of different classification methods to detect SA varies with the use of different physiological signals. If an effective, reliable, and accurate classification method is developed, then the diagnosis of SA and its associated treatment will be time-efficient and economical. This study aims to systematically review the literature and present an overview of classification methods to detect SA using respiratory and oximetry signals and address the automated detection approach. Sixty-two included studies revealed the application of single and multiple signals (respiratory and oximetry) for the diagnosis of SA. Both airflow and oxygen saturation signals alone were effective in detecting SA in the case of binary decision-making, whereas multiple signals were good for multi-class detection. In addition, some machine learning methods were superior to the other classification methods for SA detection using respiratory and oximetry signals. To deal with the respiratory and oximetry signals, a good choice of classification method as well as the consideration of associated factors would result in high accuracy in the detection of SA. An accurate classification method should provide a high detection rate with an automated (independent of human action) analysis of respiratory and oximetry signals. Future high-quality automated studies using large samples of data from multiple patient groups or record batches are recommended.

  14. Integration of high resolution geophysical methods. Detection of shallow depth bodies of archaeological interest

    Energy Technology Data Exchange (ETDEWEB)

    Cammarano, F.; Piro, S.; Rosso, F.; Versino, L. [Centro Nazionale delle Ricerche, Rome (Italy). Istituto per le Tecnologie Applicate ai Beni Culturali; Mauriello, P. [Naples Univ. 'Federico II' (Italy). Dip. di Scienze Fisiche

    1998-08-01

    A combined survey using ground penetrating radar, self-potential, geoelectrical and magnetic methods has been carried out to detect near-surface tombs in the archaeological test site of the Sabine Necropolis at Colle del Forno, Rome, Italy. A 2D data acquisition mode has been adopted to obtain a 3D image of the investigated volumes. The multi-methodological approach has not only demonstrated the reliability of each method in delineating the spatial behaviour of the governing parameter, but mainly helped to obtain a detailed physical image closely conforming to the target geometry through the whole set of parameters involved.

  15. Integration of high resolution geophysical methods. Detection of shallow depth bodies of archaeological interest

    Directory of Open Access Journals (Sweden)

    F. Rosso

    1998-06-01

    Full Text Available A combined survey using ground penetrating radar, self-potential, geoelectrical and magnetic methods has been carried out to detect near-surface tombs in the archaeological test site of the Sabine Necropolis at Colle del Forno, Rome, Italy. A 2D data acquisition mode has been adopted to obtain a 3D image of the investigated volumes. The multi-methodological approach has not only demonstrated the reliability of each method in delineating the spatial behaviour of the governing parameter, but mainly helped to obtain a detailed physical image closely conforming to the target geometry through the whole set of parameters involved.

  16. Reliability Testing the Die-Attach of CPV Cell Assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Bosco, N.; Sweet, C.; Kurtz, S.

    2011-02-01

    Results and progress are reported for a course of work to establish an efficient reliability test for the die-attach of CPV cell assemblies. Test vehicle design consists of a ~1 cm2 multijunction cell attached to a substrate via several processes. A thermal cycling sequence is developed in a test-to-failure protocol. Methods of detecting a failed or failing joint are prerequisite for this work; therefore both in-situ and non-destructive methods, including infrared imaging techniques, are being explored as a method to quickly detect non-ideal or failing bonds.

  17. Designing a reliable leak bio-detection system for natural gas pipelines

    International Nuclear Information System (INIS)

    Batzias, F.A.; Siontorou, C.G.; Spanidis, P.-M.P.

    2011-01-01

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece.

  18. Designing a reliable leak bio-detection system for natural gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Batzias, F.A., E-mail: fbatzi@unipi.gr [Univ. Piraeus, Dept. Industrial Management and Technology, Karaoli and Dimitriou 80, 18534 Piraeus (Greece); Siontorou, C.G., E-mail: csiontor@unipi.gr [Univ. Piraeus, Dept. Industrial Management and Technology, Karaoli and Dimitriou 80, 18534 Piraeus (Greece); Spanidis, P.-M.P., E-mail: pspani@asprofos.gr [Asprofos Engineering S.A, El. Venizelos 284, 17675 Kallithea (Greece)

    2011-02-15

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece.

  19. Reliability considerations of NDT by probability of detection (POD). Determination using ultrasound phased array. Results from a project in frame of the German nuclear safety research program

    International Nuclear Information System (INIS)

    Kurz, Jochen H.; Dugan, Sandra; Juengert, Anne

    2013-01-01

    Reliable assessment procedures are an important aspect of maintenance concepts. Non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters. NDT methods are able to acquire information on all of these areas. However, it has to be considered that the level of detail information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts e.g. the damage tolerance approach in aerospace already include reliability criteria of NDT methods applied in maintenance plans. NDT is also an essential part during construction and maintenance of nuclear power plants. In Germany, type and extent of inspection are specified in Safety Standards of the Nuclear Safety Standards Commission (KTA). Only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in form of performance demonstrations of the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. The knowledge of the probability of detection (POD) curves of specific flaws in specific testing conditions is often not present. This paper shows the results of a research project designed for POD determination of ultrasound phased array inspections of real and artificial cracks. The continuative objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning is discussed, and results of the ultrasound inspections are presented. In

  20. Feasibility of geophysical methods as a tool to detect urban subsurface cavity

    Science.gov (United States)

    Bang, E.; Kim, C.; Rim, H.; Ryu, D.; Lee, H.; Jeong, S. W.; Jung, B.; Yum, B. W.

    2016-12-01

    The collapse of urban roads has become a social issue in Korea in recent years. Since an underground cavity cannot heal by itself, existing cavities need to be detected before the road collapses. In selecting a detection method, cost, reliability, availability, the skill required for field work and the interpretation procedure must all be considered, because very large areas and long road lengths have to be covered. We constructed a real-scale ground model for this purpose. Its size is about 15m*8m*3m (L*W*D) and sewer pipes are buried at a depth of 1.2m. We modeled the upward migration or enlargement of an underground cavity by digging the ground from inside through a hole in the sewer pipe, in two or three steps with different cavity sizes and depths. We applied five methods to the ground model to monitor ground collapse and detect the underground cavity at each step. The first is the GPR method, which is very popular for this kind of project; GPR provided very good images showing the underground cavity at each step. DC resistivity surveying was also selected because it is a common tool for locating underground anomalies. It provided images showing the underground cavity, but the field setup is not favorable for this type of project. The third method is the microgravity method, which can differentiate a cavity zone from the gravity distribution. Microgravity gave smaller g values around the cavity compared to the normal condition, but it takes a very long time to perform. The fourth method is thermal imaging: the temperature of the ground surface above the cavity differs from that of the surrounding area. We used a multi-copter for rapid thermal imaging and could pick out the area of the underground cavity from the aerial thermal image of the ground surface. The last method we applied is an RFID/magnetic survey. When the ground collapses around an RFID/magnetic tag buried at depth, the tag moves downward, so ground collapse can be recognized by checking the tag detection condition. We could easily pick out the area of ground collapse. When we compared each

  1. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    For the reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation, based on wavelet information entropy extracted from vibration signals of mechanical equipment, is proposed. The method is quite different from traditional reliability evaluation models, which depend on probability statistics of large samples of data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet packet (SGWP). The relative energy in each frequency band of the decomposed signal, i.e. the percentage of the whole signal energy in that band, is taken as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is then derived from the normalized wavelet information entropy. The method has been successfully applied to evaluate the assembled quality reliability of a kind of dismountable disk-drum aero-engine, and the reliability degree indicates the assembled quality satisfactorily.
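
    As a rough illustration of the energy-entropy idea (using an ordinary wavelet packet from PyWavelets rather than the second generation wavelet packet of the paper), the sketch below turns the relative band energies of a vibration signal into a normalized information entropy. The final mapping to a reliability degree is shown here simply as 1 - H, which is an assumption, since the paper's exact transformation is not given in the abstract.

```python
import numpy as np
import pywt

def wavelet_info_entropy(signal, wavelet="db4", level=3):
    """Normalised information entropy of the band energies of a
    wavelet packet decomposition of a vibration signal."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    energies = np.array([np.sum(np.square(node.data)) for node in nodes])
    p = energies / energies.sum()          # relative band energies used as "probabilities"
    entropy = -np.sum(p * np.log(p + 1e-12))
    return entropy / np.log(len(p))        # normalise to the range [0, 1]

# toy usage: an ordered (tonal) signal gives low entropy, broadband noise gives high entropy
t = np.arange(2048) / 2048.0
healthy = np.sin(2 * np.pi * 50 * t)
degraded = healthy + 1.5 * np.random.randn(t.size)
for name, sig in (("healthy", healthy), ("degraded", degraded)):
    h = wavelet_info_entropy(sig)
    print(f"{name}: entropy = {h:.3f}, reliability degree (1 - H) = {1.0 - h:.3f}")
```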

  2. Development of a fluorescent antibody method for the detection of Enterococcus faecium and its potential for coastal aquatic environment monitoring.

    Science.gov (United States)

    Caruso, Gabriella; Monticelli, L S; Caruso, R; Bergamasco, A

    2008-02-01

    A direct, microscopic fluorescent antibody method was developed to detect the occurrence of Enterococcus faecium in coastal aquatic environments and was compared with the conventional membrane filtering method. The "in situ" application of the antibody-based protocol in the analysis of water samples collected from coastal polyhaline habitats demonstrated good sensitivity and ease of implementation. Data obtained with the microscopic technique were in agreement with those obtained from culture counts. The fluorescent antibody method proved to be a rapid and reliable technique for the detection of E. faecium. The advantages and limitations intrinsic to the method are discussed, highlighting the potential of this new technique for monitoring coastal aquatic environments.

  3. Standard method for detecting Bombyx mori nucleopolyhedrovirus disease-resistant silkworm varieties

    Directory of Open Access Journals (Sweden)

    Yang Qiong

    Full Text Available ABSTRACT Bombyx mori nucleopolyhedrovirus (BmNPV) disease is one of the most serious silkworm diseases, and it has caused great economic losses to the sericulture industry. So far, the disease has not been controlled effectively by therapeutic agents. Breeding resistant silkworm varieties may be an effective way to improve resistance to BmNPV and reduce economic losses, and a precise resistance-detection method will help to accelerate the breeding process. For this purpose, we describe here the individual inoculation method (IIM). Details of the IIM include preparation of the BmNPV pathogen, mulberry leaf size, pathogen volume, rearing conditions, course of infection, and breeding conditions. Finally, a resistance comparison experiment was performed using the IIM and the traditional group inoculation method (GIM). The incidence of BmNPV infection and the within-group variance showed that the IIM was more precise and reliable than the GIM.

  4. Planning of operation & maintenance using risk and reliability based methods

    DEFF Research Database (Denmark)

    Florian, Mihai; Sørensen, John Dalsgaard

    2015-01-01

    Operation and maintenance (OM) of offshore wind turbines contributes with a substantial part of the total levelized cost of energy (LCOE). The objective of this paper is to present an application of risk- and reliability-based methods for planning of OM. The theoretical basis is presented...

  5. Germination test as a fast method to detect glyphosate-resistant sourgrass

    Directory of Open Access Journals (Sweden)

    Marcos Altomani Neves Dias

    2015-01-01

    Full Text Available The occurrence of weed species with different levels of resistance to glyphosate has spread increasingly in agricultural areas. In Brazil, sourgrass is among the main species presenting problems in this regard. Thus, fast and reliable methods to detect glyphosate resistance are of special interest for this species, both for research and for rational management purposes. This study was carried out to verify the feasibility of using the germination test to detect glyphosate resistance in sourgrass. The experiment was conducted with two sourgrass biotypes with different levels of susceptibility to glyphosate. The seeds were previously imbibed in solutions containing 0, 0.1875%, 0.25%, 0.75%, 1.5%, 3% and 6% of glyphosate for two periods, five and ten minutes, and then submitted to germination tests. The results indicate that the germination test is a feasible and time-saving approach to evaluate glyphosate-resistant sourgrass, with results available in seven days.

  6. Germination test as a fast method to detect glyphosate-resistant sourgrass

    Directory of Open Access Journals (Sweden)

    Marcos Altomani Neves Dias

    2015-09-01

    Full Text Available The occurrence of weed species with different levels of resistance to glyphosate has spread increasingly in agricultural areas. In Brazil, sourgrass is among the main species presenting problems in this regard. Thus, fast and reliable methods to detect glyphosate resistance are of special interest for this species, both for research and for rational management purposes. This study was carried out to verify the feasibility of using the germination test to detect glyphosate resistance in sourgrass. The experiment was conducted with two sourgrass biotypes with different levels of susceptibility to glyphosate. The seeds were previously imbibed in solutions containing 0, 0.1875%, 0.25%, 0.75%, 1.5%, 3% and 6% of glyphosate for two periods, five and ten minutes, and then submitted to germination tests. The results indicate that the germination test is a feasible and time-saving approach to evaluate glyphosate-resistant sourgrass, with results available in seven days.

  7. Diffusion-weighted MR imaging in postoperative follow-up: Reliability for detection of recurrent cholesteatoma

    Energy Technology Data Exchange (ETDEWEB)

    Cimsit, Nuri Cagatay [Marmara University Hospital, Department of Radiology, Istanbul (Turkey); Engin Sitesi Peker Sokak No:1 D:13, 34330 Levent, Istanbul (Turkey)], E-mail: cagataycimsit@gmail.com; Cimsit, Canan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)], E-mail: ccimsit@ttmail.com; Baysal, Begumhan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)], E-mail: begumbaysal@yahoo.com; Ruhi, Ilteris Cagatay [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)], E-mail: cruhi@yahoo.com; Ozbilgen, Suha [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)], E-mail: sozbilgen@yahoo.com; Aksoy, Elif Ayanoglu [Acibadem Bakirkoy Hospital, Department of ENT, Istanbul (Turkey); Acibadem Hastanesi, KBB Boeluemue, Bakirkoey, Istanbul (Turkey)], E-mail: elifayanoglu@yahoo.com

    2010-04-15

    Introduction: Cholesteatoma is a progressively growing process that destroys the neighboring bony structures, and its treatment is surgical removal. Follow-up is important in the postoperative period, since further surgery is necessary if recurrence is present, but not if only granulation tissue is detected. This study evaluates whether diffusion-weighted MR imaging alone, without the use of a contrast agent, can be a reliable alternative to CT for the follow-up of postoperative patients in detecting recurrent cholesteatoma. Materials and methods: 26 consecutive patients reporting for routine follow-up CT after mastoidectomy were included in the study if there was loss of middle ear aeration on the CT examination. MR images were evaluated for loss of aeration and signal intensity changes on diffusion-weighted sequences. Surgical results were compared with the imaging findings. Results: Interpretation of the MR images was consistent with the loss of aeration detected on CT for all 26 patients. Of the 26 patients examined, 14 were not evaluated as recurrent cholesteatoma, which was verified at surgery (NPV: 100%). Twelve patients were diagnosed as recurrent cholesteatoma and 11 were surgically confirmed as recurrent cholesteatoma (PPV: 91.7%). Four of these 11 patients had a loss-of-aeration area greater than the high signal intensity area on DWI, which was surgically confirmed as granulation tissue or fibrosis accompanying recurrent cholesteatoma. Conclusion: Diffusion-weighted MR for suspected recurrent cholesteatoma is a valuable tool to cut costs and prevent unnecessary second-look surgeries. It has the potential to become the MR sequence of choice to differentiate recurrent cholesteatoma from other causes of loss of aeration in patients with mastoidectomy.

  8. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using three methods, simultaneously and independently by two raters. The analysis of interrater reliability and viability was performed using the intraclass correlation coefficient and Bland Altman Plot Analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis were 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland Altman Plot Analysis showed agreement between the comparisons of the methods and the presence of systematic bias was not observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently from the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flaps' viability.
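
    For readers who want to reproduce this type of agreement analysis, the sketch below computes a single-measure two-way random-effects ICC (the Shrout-Fleiss ICC(2,1) form, which may differ from the exact variant used in the paper) and a Bland-Altman bias with 95% limits of agreement; the rater scores are invented for illustration.

```python
import numpy as np

def icc_2_1(Y):
    """Two-way random effects, absolute agreement, single-measure ICC(2,1)
    (Shrout & Fleiss) for an n-subjects x k-raters matrix."""
    Y = np.asarray(Y, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    row_means, col_means = Y.mean(axis=1), Y.mean(axis=0)
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)          # between-subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)          # between-raters
    sse = np.sum((Y - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                               # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two raters."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# example: percentage of viable flap tissue scored by two raters on 8 flaps
rater1 = [72.1, 65.4, 80.2, 55.0, 90.3, 60.8, 75.5, 68.0]
rater2 = [70.9, 66.0, 79.5, 56.2, 89.8, 62.1, 74.8, 67.1]
print("ICC(2,1):", round(icc_2_1(np.column_stack([rater1, rater2])), 3))
print("Bland-Altman bias, LoA:", bland_altman(rater1, rater2))
```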

  9. A comparison of moving object detection methods for real-time moving object detection

    Science.gov (United States)

    Roshan, Aditya; Zhang, Yun

    2014-06-01

    Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification and face detection to military surveillance. Many methods have been developed for moving object detection, but it is very difficult to find one that works in all situations and with different types of videos. The purpose of this paper is to evaluate existing moving object detection methods that can be implemented in software on a desktop or laptop for real-time object detection. Several moving object detection methods are noted in the literature, but few of them are suitable for real-time operation, and most of those are further limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, the Gaussian mixture model, and wavelet-based and optical-flow-based methods. The work is based on an evaluation of these four methods using two different sets of cameras and two different scenes. The methods have been implemented in MATLAB and the results are compared in terms of completeness of detected objects, noise, sensitivity to lighting changes, processing time, etc. After comparison, it is observed that the optical-flow-based method took the least processing time and successfully detected the boundaries of moving objects, which implies that it can be implemented for real-time moving object detection.
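
    Since the comparison above was run in MATLAB, the snippet below is not the authors' code; it is only a generic OpenCV 4 (Python) sketch of the background-subtraction family of approaches, with an assumed video file name and arbitrary thresholds.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")        # hypothetical input; a camera index also works
subtractor = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=25,
                                                detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                           # foreground mask (shadows = 127)
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:                         # ignore small noise blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("moving objects", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```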

  10. Matrix-based system reliability method and applications to bridge networks

    International Nuclear Information System (INIS)

    Kang, W.-H.; Song Junho; Gardoni, Paolo

    2008-01-01

    Using a matrix-based system reliability (MSR) method, one can estimate the probabilities of complex system events by simple matrix calculations. Unlike existing system reliability methods whose complexity depends highly on that of the system event, the MSR method describes any general system event in a simple matrix form and therefore provides a more convenient way of handling the system event and estimating its probability. Even in the case where one has incomplete information on the component probabilities and/or the statistical dependence thereof, the matrix-based framework enables us to estimate the narrowest bounds on the system failure probability by linear programming. This paper presents the MSR method and applies it to a transportation network consisting of bridge structures. The seismic failure probabilities of bridges are estimated by use of the predictive fragility curves developed by a Bayesian methodology based on experimental data and existing deterministic models of the seismic capacity and demand. Using the MSR method, the probability of disconnection between each city/county and a critical facility is estimated. The probability mass function of the number of failed bridges is computed as well. In order to quantify the relative importance of bridges, the MSR method is used to compute the conditional probabilities of bridge failures given that there is at least one city disconnected from the critical facility. The bounds on the probability of disconnection are also obtained for cases with incomplete information
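
    The abstract's central identity, that the system probability equals c^T p for an event vector c and a vector p of mutually exclusive component-state probabilities, is easy to reproduce on a small example. The sketch below does so for the textbook five-component bridge network under an independence assumption; the survival probabilities are illustrative, and the full MSR machinery for dependence and probability bounds is not reproduced.

```python
import itertools
import numpy as np

# five-component bridge network: 1: s-a, 2: s-b, 3: a-b, 4: a-t, 5: b-t
p_comp = np.array([0.95, 0.90, 0.85, 0.90, 0.95])     # illustrative survival probabilities

def connected(state):
    """True if source s and sink t remain connected for the given up/down state."""
    links = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]
    edges = [e for e, up in zip(links, state) if up]
    reach, changed = {"s"}, True
    while changed:
        changed = False
        for u, v in edges:
            if u in reach and v not in reach:
                reach.add(v); changed = True
            if v in reach and u not in reach:
                reach.add(u); changed = True
    return "t" in reach

# enumerate the 2^5 mutually exclusive component-state vectors
states = list(itertools.product([1, 0], repeat=5))
c = np.array([1.0 if connected(s) else 0.0 for s in states])                 # event vector
p = np.array([np.prod([p_comp[i] if s[i] else 1.0 - p_comp[i] for i in range(5)])
              for s in states])                                              # state probabilities
print("system reliability = c^T p =", float(c @ p))
```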

  11. Method to improve reliability of a fuel cell system using low performance cell detection at low power operation

    Science.gov (United States)

    Choi, Tayoung; Ganapathy, Sriram; Jung, Jaehak; Savage, David R.; Lakshmanan, Balasubramanian; Vecasey, Pamela M.

    2013-04-16

    A system and method for detecting a low performing cell in a fuel cell stack using measured cell voltages. The method includes determining that the fuel cell stack is running, the stack coolant temperature is above a certain temperature and the stack current density is within a relatively low power range. The method further includes calculating the average cell voltage, and determining whether the difference between the average cell voltage and the minimum cell voltage is greater than a predetermined threshold. If the difference between the average cell voltage and the minimum cell voltage is greater than the predetermined threshold and the minimum cell voltage is less than another predetermined threshold, then the method increments a low performing cell timer. A ratio of the low performing cell timer and a system run timer is calculated to identify a low performing cell.
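
    The decision logic described above can be written down almost directly; the sketch below does so in Python, but every numerical threshold, the current-density window and the sampling interval are placeholders, since the patent's calibration values are not given here.

```python
# illustrative thresholds; the real calibration values are not given in the abstract
DELTA_THRESHOLD = 0.10           # V, allowed gap between average and minimum cell voltage
MIN_CELL_THRESHOLD = 0.65        # V, absolute floor for the minimum cell voltage
MIN_COOLANT_TEMP = 60.0          # deg C
LOW_POWER_RANGE = (0.05, 0.20)   # A/cm^2, assumed "relatively low power" window
RATIO_LIMIT = 0.30               # flag a low performing cell above this timer ratio

low_cell_timer = 0.0
run_timer = 0.0

def update(cell_voltages, coolant_temp, current_density, dt=1.0):
    """Call once per sample; returns True when a low performing cell is indicated."""
    global low_cell_timer, run_timer
    run_timer += dt

    stack_running = len(cell_voltages) > 0           # placeholder for a real run signal
    in_window = (coolant_temp > MIN_COOLANT_TEMP and
                 LOW_POWER_RANGE[0] <= current_density <= LOW_POWER_RANGE[1])
    if not (stack_running and in_window):
        return False

    avg_v = sum(cell_voltages) / len(cell_voltages)
    min_v = min(cell_voltages)
    # increment the low-cell timer when the minimum cell lags the average and is low
    if (avg_v - min_v) > DELTA_THRESHOLD and min_v < MIN_CELL_THRESHOLD:
        low_cell_timer += dt

    # the ratio of low-cell time to total run time identifies a low performing cell
    return (low_cell_timer / run_timer) > RATIO_LIMIT
```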

  12. Detection of Yersinia Enterocolitica Species in Pig Tonsils and Raw Pork Meat by the Real-Time Pcr and Culture Methods.

    Science.gov (United States)

    Stachelska, M A

    2017-09-26

    The aim of the present study was to establish a rapid and accurate real-time PCR method to detect pathogenic Yersinia enterocolitica in pork. Yersinia enterocolitica is considered a crucial zoonotic agent that can provoke disease in both humans and animals, and the classical culture methods designed to detect Y. enterocolitica species in food matrices are often very time-consuming. The chromosomal locus _tag CH49_3099 gene, which occurs in pathogenic Y. enterocolitica strains, was applied as the DNA target for the 5' nuclease PCR protocol. The probe was labelled at the 5' end with the fluorescent reporter dye (FAM) and at the 3' end with the quencher dye (TAMRA). The real-time PCR cycling parameters included 41 cycles; a Ct value higher than 40 constituted a negative result. The qualitative real-time PCR method developed for this study gave very specific and reliable results. The detection rate of locus _tag CH49_3099-positive Y. enterocolitica in 150 pig tonsils was 85% with PCR and 32% with the culture method. Both the real-time PCR results and the culture method results were obtained from material that was enriched during overnight incubation. Raw pork meat samples were also examined: among the 80 samples tested, 7 were positive with real-time PCR and 6 were positive with the classical culture method. The application of molecular techniques based on the analysis of DNA sequences, such as real-time PCR, makes it possible to detect this pathogenic bacterium very rapidly and with higher specificity, sensitivity and reliability than classical culture methods.

  13. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods, and a reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
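
    The binomial arithmetic behind such a demonstration is simple to reproduce; the sketch below (using SciPy) computes the probability of passing an n-flaw, m-allowed-miss demonstration as a function of the true POD, plus a similar check for false calls. The false-call numbers are illustrative assumptions.

```python
from scipy.stats import binom

def prob_passing_demo(true_pod, n_flaws, max_misses=0):
    """Probability of passing a point-estimate POD demonstration:
    at least (n_flaws - max_misses) detections out of n_flaws flaws."""
    return float(binom.cdf(max_misses, n_flaws, 1.0 - true_pod))

def prob_false_call(false_call_rate, n_blanks, max_false_calls):
    """Probability of staying within the allowed number of false calls
    when n_blanks unflawed sites are inspected."""
    return float(binom.cdf(max_false_calls, n_blanks, false_call_rate))

# the classic 29-of-29 zero-miss demonstration
for pod in (0.90, 0.95, 0.99):
    print(f"true POD {pod:.2f}: P(pass 29/29) = {prob_passing_demo(pod, 29):.3f}")
# a hypothetical false-call check on 29 blank sites, allowing one false call
print("P(<= 1 false call):", round(prob_false_call(0.02, 29, 1), 3))
```

    Under this arithmetic, a true POD of 0.90 gives a passing probability of roughly 0.047 for the zero-miss 29-flaw set, which is the usual basis for reading a successful 29-of-29 demonstration as evidence of 90% POD at 95% confidence.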

  14. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    International Nuclear Information System (INIS)

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method. - Highlights: • An indirect probability model (IPM) is proposed for mechanism reliability evaluation. • The dynamic equation, degradation function and Kaplan–Meier estimator are used. • Then the simplified form of indirect probability model is proposed. • The experimental results agree well with the predicted results.
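
    Of the three ingredients listed above (dynamic equation, degradation function, Kaplan-Meier estimator), only the Kaplan-Meier step lends itself to a short generic sketch; the one below estimates a reliability curve from possibly right-censored times at which output accuracy was lost, using purely hypothetical data.

```python
import numpy as np

def kaplan_meier(times, event_observed):
    """Kaplan-Meier reliability (survival) estimate.
    times: time of failure or censoring; event_observed: 1 = failure, 0 = censored."""
    times = np.asarray(times, dtype=float)
    event_observed = np.asarray(event_observed, dtype=int)
    order = np.argsort(times)
    times, event_observed = times[order], event_observed[order]

    unique_times, survival, s = [], [], 1.0
    n_at_risk = len(times)
    for t in np.unique(times):
        at_this_t = times == t
        d = int(np.sum(event_observed[at_this_t]))      # failures at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk
        unique_times.append(t)
        survival.append(s)
        n_at_risk -= int(np.sum(at_this_t))             # failures and censorings leave the risk set
    return np.array(unique_times), np.array(survival)

# hypothetical output-accuracy exceedance times of a crank-slider mechanism (hours),
# with some runs censored before the accuracy threshold was crossed
t_obs  = [120, 150, 150, 200, 240, 240, 300, 360, 400, 400]
failed = [1,   1,   0,   1,   1,   0,   1,   0,   1,   0]
for t, r in zip(*kaplan_meier(t_obs, failed)):
    print(f"t = {t:5.0f} h   R(t) = {r:.3f}")
```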

  15. Comparison of three methods for detection of melamine in compost and soil

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Yongqiang [Depatment of Vegetable Science, College of Agronomy and Biotechnology, China Agricultural University, 2 Yuanmingyuan Xilu, Beijing 100193 (China); School of Environment and Natural Resources, The Ohio State University, 1680 Madison Avenue, Wooster, OH 44691 (United States); Chen, Liming [School of Environment and Natural Resources, The Ohio State University, 1680 Madison Avenue, Wooster, OH 44691 (United States); Gao, Lihong [Depatment of Vegetable Science, College of Agronomy and Biotechnology, China Agricultural University, 2 Yuanmingyuan Xilu, Beijing 100193 (China); Wu, Manli [School of Environment and Natural Resources, The Ohio State University, 1680 Madison Avenue, Wooster, OH 44691 (United States); School of Environmental and Municipal Engineering, Xi' an University of Architecture and Technology, No.13 Yanta Road, Xi' an, Shaanxi Province 710055 (China); Dick, Warren A., E-mail: dick.5@osu.edu [School of Environment and Natural Resources, The Ohio State University, 1680 Madison Avenue, Wooster, OH 44691 (United States)

    2012-02-15

    Recent product recalls and food safety incidents due to melamine (MM) adulteration or contamination have caused a worldwide food security concern. This has led to many methods being developed to detect MM in foods, but few methods have been reported that can rapidly and reliably measure MM in environmental samples. To meet this need, a high performance liquid chromatography (HPLC) method with UV detection, an enzyme-linked immunosorbent assay (ELISA) test kit, and an enzyme-linked rapid colorimetric assay (RCA) test kit were evaluated for their ability to accurately measure MM concentrations in compost and soil samples. All three methods accurately detected MM concentrations if no MM degradation products, such as ammeline (AMN), ammelide (AMD) and cyanuric acid (CA), were present in an aqueous sample. In the presence of these MM degradation products, the HPLC method yielded more accurate concentrations than the ELISA method, and there was no significant (P > 0.05) difference between the HPLC and RCA methods. However, if samples were purified by SPE or prepared with blocking buffer, the ELISA method accurately measured MM concentrations even in the presence of the MM degradation products. The HPLC method generally outperformed the RCA method for measuring MM in soil extracts but gave similar results for compost extracts. The number of samples that can be analyzed by the ELISA and RCA methods in a 24-hour period is much greater than by the HPLC method. Thus the RCA method would seem to be a good screening method for measuring MM in compost and soil samples, and the results obtained could then be confirmed by the HPLC method. The HPLC method, however, also allows simultaneous measurement of MM and its degradation products AMD, AMN and CA. - Highlights: • We detected melamine in environmental samples by three methods. • ELISA can measure melamine in an environmental sample if a blocking buffer is used.

  16. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare some methods for the creation of 'precision profiles' (PP) and to clarify their possibilities for determining the analytical reliability of RIA. Only methods without complicated mathematical calculations have been used. The reproducibility was studied in serums with concentrations of the determinable hormone covering the whole range of the calibration curve. The radioimmunoassay was performed with a TSH-RIA set (ex East Germany), and comparative evaluations were made with commercial sets from HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (C.V., %) are used, and their corresponding profiles are compared: a preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory and that the method of its construction does not influence the course of the relationship. The PP allows the determination of the concentration range giving stable results, which improves the efficiency of the analytical work. 16 refs., 4 figs

  17. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Directory of Open Access Journals (Sweden)

    Yury Fedotov

    2016-01-01

    Full Text Available Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by a YAG:Nd laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. The results show that this fluorescence ratio can be considered a reliable characteristic of the plant stress state.

  18. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: a simulation for low penetration and a simulation for high penetration. The load shedding strategy and the simulation process are introduced in detail for each FMEA process. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
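
    The abstract gives no model details, so the following is only a toy sampling sketch of how Monte Carlo simulation yields average interruption frequency and outage duration for a single load point fed through series components; the failure rates, repair times and the one-repair-draw-per-component simplification are illustrative assumptions, and the paper's DG and load-shedding logic is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# illustrative feeder: three series components feeding one load point
failure_rate = np.array([0.10, 0.15, 0.05])    # failures per year
repair_time = np.array([4.0, 6.0, 3.0])        # mean hours per repair

def simulate_year():
    """One sampled year: interruption count and total outage hours at the load point."""
    n_fail = rng.poisson(failure_rate)                       # failures of each component
    # one representative repair duration per component per year (a simplification)
    outage_hours = float(np.sum(n_fail * rng.exponential(repair_time)))
    return int(n_fail.sum()), outage_hours

def mc_reliability(n_years=200_000):
    interruptions, hours = 0, 0.0
    for _ in range(n_years):
        k, h = simulate_year()
        interruptions += k
        hours += h
    return interruptions / n_years, hours / n_years          # failures/yr, outage h/yr

freq, dur = mc_reliability()
print(f"interruption frequency ~ {freq:.3f} /yr, unavailability ~ {dur:.2f} h/yr")
```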

  19. A Systematic Review of Statistical Methods Used to Test for Reliability of Medical Instruments Measuring Continuous Variables

    Directory of Open Access Journals (Sweden)

    Rafdzah Zaki

    2013-06-01

    Full Text Available Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify the statistical methods used to measure the reliability of equipment measuring continuous variables. The study also aims to highlight inappropriate statistical methods used in reliability analyses and their implications for medical practice. Materials and Methods: In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and 42 finally fitted the inclusion criteria. Results: The intra-class correlation coefficient (ICC) was the most popular method, used in 25 (60%) studies, followed by the comparison of means (8 studies, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: This study finds that the intra-class correlation coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to be able to correctly perform analyses in reliability studies.

  20. Rapid methods for detection of bacteria

    DEFF Research Database (Denmark)

    Corfitzen, Charlotte B.; Andersen, B.Ø.; Miller, M.

    2006-01-01

    Traditional methods for detection of bacteria in drinking water, e.g. Heterotrophic Plate Counts (HPC) or Most Probable Number (MPN), take 48-72 hours to give a result. New rapid methods for detection of bacteria are needed to protect consumers against contamination. Two rapid methods...

  1. Detecting binary black holes with efficient and reliable templates

    International Nuclear Information System (INIS)

    Damour, T.; Iyer, B.R.; Sathyaprakash, B.S.

    2001-01-01

    Detecting binary black holes in interferometer data requires an accurate knowledge of the orbital phase evolution of the system. From the point of view of data analysis one also needs fast algorithms to compute the templates that will be employed in searching for black hole binaries. Recently, there has been progress on both these fronts: On one hand, re-summation techniques have made it possible to accelerate the convergence of poorly convergent asymptotic post-Newtonian series and derive waveforms beyond the conventional adiabatic approximation. We now have a waveform model that extends beyond the inspiral regime into the plunge phase followed by the quasi-normal mode ringing. On the other hand, explicit Fourier domain waveforms have been derived that make the generation of waveforms fast enough so as not to be a burden on the computational resources required in filtering the detector data. These new developments should make it possible to efficiently and reliably search for black hole binaries in data from first interferometers. (author)

  2. A Sequential Kriging reliability analysis method with characteristics of adaptive sampling regions and parallelizability

    International Nuclear Information System (INIS)

    Wen, Zhixun; Pei, Haiqing; Liu, Hai; Yue, Zhufeng

    2016-01-01

    The sequential Kriging reliability analysis (SKRA) method has been developed in recent years for nonlinear implicit response functions which are expensive to evaluate. This type of method includes EGRA: the efficient reliability analysis method, and AK-MCS: the active learning reliability method combining Kriging model and Monte Carlo simulation. The purpose of this paper is to improve SKRA by adaptive sampling regions and parallelizability. The adaptive sampling regions strategy is proposed to avoid selecting samples in regions where the probability density is so low that the accuracy of these regions has negligible effects on the results. The size of the sampling regions is adapted according to the failure probability calculated by last iteration. Two parallel strategies are introduced and compared, aimed at selecting multiple sample points at a time. The improvement is verified through several troublesome examples. - Highlights: • The ISKRA method improves the efficiency of SKRA. • Adaptive sampling regions strategy reduces the number of needed samples. • The two parallel strategies reduce the number of needed iterations. • The accuracy of the optimal value impacts the number of samples significantly.

  3. Screening, sensitivity, and uncertainty for the CREAM method of Human Reliability Analysis

    International Nuclear Information System (INIS)

    Bedford, Tim; Bayley, Clare; Revie, Matthew

    2013-01-01

    This paper reports a sensitivity analysis of the Cognitive Reliability and Error Analysis Method for Human Reliability Analysis. We consider three different aspects: the difference between the outputs of the Basic and Extended methods, on the same HRA scenario; the variability in outputs through the choices made for common performance conditions (CPCs); and the variability in outputs through the assignment of choices for cognitive function failures (CFFs). We discuss the problem of interpreting categories when applying the method, compare its quantitative structure to that of first generation methods and also discuss how dependence is modelled within the approach. We show that the control mode intervals used in the Basic method are too narrow to be consistent with the Extended method. This motivates a new screening method that gives improved accuracy with respect to the Basic method, in the sense that it (on average) halves the uncertainty associated with the Basic method. We make some observations on the design of a screening method that are generally applicable in Risk Analysis. Finally, we propose a new method of combining CPC weights with nominal probabilities so that the calculated probabilities are always in range (i.e. between 0 and 1), while satisfying sensible properties that are consistent with the overall CREAM method.

  4. Estimation of reliability on digital plant protection system in nuclear power plants using fault simulation with self-checking

    International Nuclear Information System (INIS)

    Lee, Jun Seok; Kim, Suk Joon; Seong, Poong Hyun

    2004-01-01

    Safety-critical digital systems in nuclear power plants require high design reliability. Reliable software design and accurate prediction methods for system reliability are therefore important problems. In reliability analysis, the error detection coverage of the system is one of the crucial factors; however, it is difficult to evaluate the error detection coverage of digital instrumentation and control systems in nuclear power plants due to the complexity of the system. To evaluate the error detection coverage with high efficiency and low cost, simulation-based fault injection with self-checking is needed for digital instrumentation and control systems in nuclear power plants. The target system is the local coincidence logic in the digital plant protection system, and a simplified software model of this target system is used in this work. A C++-based hardware description of a microcomputer simulator system is used to evaluate the error detection coverage of the system. From the simulation results, it is possible to estimate the error detection coverage of the digital plant protection system in nuclear power plants using the simulation-based fault injection method with self-checking. (author)
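
    A minimal illustration of how simulation-based fault injection with self-checking yields an error detection coverage estimate is sketched below. The tiny state model (four trip channels of a 2-out-of-4 local coincidence logic protected by a parity self-check, plus an unprotected output latch) and the single-bit-flip fault model are assumptions for illustration, not the authors' C++ microcomputer simulator.

```python
# Hedged sketch: Monte Carlo fault injection with a simple self-checking
# mechanism, estimating error detection coverage = P(detected | fault injected).
import random

random.seed(1)

def lcl_2oo4(channels):
    """Local coincidence logic: vote trip when >= 2 of 4 channels trip."""
    return int(sum(channels) >= 2)

def run_with_fault(channels, bit_to_flip):
    """Inject a single-bit fault and report whether the self-check detects it."""
    state = list(channels) + [lcl_2oo4(channels)]   # 4 channel bits + output latch
    parity_ref = sum(channels) % 2                  # self-check covers channel bits only
    state[bit_to_flip] ^= 1                         # fault injection
    return (sum(state[:4]) % 2) != parity_ref       # did the parity check fire?

n_inject, detected_cnt = 100_000, 0
for _ in range(n_inject):
    channels = [random.randint(0, 1) for _ in range(4)]
    bit = random.randrange(5)                       # uniform single-bit fault
    detected_cnt += run_with_fault(channels, bit)

print(f"estimated error detection coverage: {detected_cnt / n_inject:.3f}")
```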

  5. Assessment of Crack Detection in Heavy-Walled Cast Stainless Steel Piping Welds Using Advanced Low-Frequency Ultrasonic Methods

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Michael T.; Crawford, Susan L.; Cumblidge, Stephen E.; Denslow, Kayte M.; Diaz, Aaron A.; Doctor, Steven R.

    2007-03-01

    Studies conducted at the Pacific Northwest National Laboratory in Richland, Washington, have focused on assessing the effectiveness and reliability of novel approaches to nondestructive examination (NDE) for inspecting coarse-grained, cast stainless steel reactor components. The primary objective of this work is to provide information to the U.S. Nuclear Regulatory Commission on the effectiveness and reliability of advanced NDE methods as related to the inservice inspection of safety-related components in pressurized water reactors (PWRs). This report provides progress, recent developments, and results from an assessment of low frequency ultrasonic testing (UT) for detection of inside surface-breaking cracks in cast stainless steel reactor piping weldments as applied from the outside surface of the components. Vintage centrifugally cast stainless steel piping segments were examined to assess the capability of low-frequency UT to adequately penetrate challenging microstructures and determine acoustic propagation limitations or conditions that may interfere with reliable flaw detection. In addition, welded specimens containing mechanical and thermal fatigue cracks were examined. The specimens were fabricated using vintage centrifugally cast and statically cast stainless steel materials, which are typical of configurations installed in PWR primary coolant circuits. Ultrasonic studies on the vintage centrifugally cast stainless steel piping segments were conducted with a 400-kHz synthetic aperture focusing technique and phased array technology applied at 500 kHz, 750 kHz, and 1.0 MHz. Flaw detection and characterization on the welded specimens was performed with the phased array method operating at the frequencies stated above. This report documents the methodologies used and provides results from laboratory studies to assess baseline material noise, crack detection, and length-sizing capability for low-frequency UT in cast stainless steel piping.

  6. Assessment of reliability of Greulich and Pyle (gp) method for ...

    African Journals Online (AJOL)

    Background: Greulich and Pyle standards are the most widely used age estimation standards all over the world. The applicability of the Greulich and Pyle standards to populations which differ from their reference population is often questioned. This study aimed to assess the reliability of Greulich and Pyle (GP) method for ...

  7. Proceeding of 35th domestic symposium on applications of structural reliability and risk assessment methods to nuclear power plants

    International Nuclear Information System (INIS)

    2005-06-01

    As the 35th domestic symposium of the Atomic Energy Research Committee of the Japan Welding Engineering Society, the symposium was held under the title 'Applications of structural reliability/risk assessment methods to nuclear energy'. Six speakers gave lectures titled 'Structural reliability and risk assessment methods', 'Risk-informed regulation of US nuclear energy and role of probabilistic risk assessment', 'Reliability and risk assessment methods in chemical plants', 'Practical structural design methods based on reliability in architectural and civil areas', 'Maintenance activities based on reliability in thermal power plants' and 'LWR maintenance strategies based on Probabilistic Fracture Mechanics'. (T. Tanaka)

  8. Detection of oral HPV infection - Comparison of two different specimen collection methods and two HPV detection methods.

    Science.gov (United States)

    de Souza, Marjorie M A; Hartel, Gunter; Whiteman, David C; Antonsson, Annika

    2018-04-01

    Very little is known about the natural history of oral HPV infection. Several different methods exist to collect oral specimens and detect HPV, but their respective performance characteristics are unknown. We compared two different methods for oral specimen collection (oral saline rinse and commercial saliva kit) from 96 individuals and then analyzed the samples for HPV by two different PCR detection methods (single GP5+/6+ PCR and nested MY09/11 and GP5+/6+ PCR). For the oral rinse samples, the oral HPV prevalence was 10.4% (GP+ PCR; 10% repeatability) vs 11.5% (nested PCR method; 100% repeatability). For the commercial saliva kit samples, the prevalences were 3.1% vs 16.7% with the GP+ PCR vs the nested PCR method (repeatability 100% for both detection methods). Overall the agreement was fair or poor between samples and methods (kappa 0.06-0.36). Standardizing methods of oral sample collection and HPV detection would ensure comparability between future oral HPV studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Towards achieving a reliable leakage detection and localization algorithm for application in water piping networks: an overview

    CSIR Research Space (South Africa)

    Adedeji, KB

    2017-09-01

    Full Text Available Leakage detection and localization in pipelines has become an important aspect of water management systems. Since monitoring leakage in large-scale water distribution networks (WDNs) is a challenging task, the need to develop a reliable and robust...

  10. A real-time insulation detection method for battery packs used in electric vehicles

    Science.gov (United States)

    Tian, Jiaqiang; Wang, Yujie; Yang, Duo; Zhang, Xu; Chen, Zonghai

    2018-05-01

    Due to the energy crisis and environmental pollution, electric vehicles have become more and more popular. Compared to traditional fuel vehicles, electric vehicles are integrated with more high-voltage components, which pose potential insulation-related safety risks. The insulation resistance between the chassis and the direct current bus of the battery pack is easily affected by factors such as temperature, humidity and vibration. In order to ensure the safe and reliable operation of electric vehicles, it is necessary to detect the insulation resistance of the battery pack. This paper proposes an insulation detection scheme based on the low-frequency signal injection method. Considering that the insulation detector can easily be affected by noise, an algorithm based on the Kalman filter is proposed. Moreover, the battery pack is always charging or discharging during driving, which leads to frequent changes in the pack voltage and affects the estimation accuracy of the insulation detector. Therefore, the recursive least squares algorithm is adopted to solve the problem that the detection results of the insulation detector vary abruptly with the voltage of the battery pack. The performance of the proposed method is verified by dynamic and static experiments.
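
    The sketch below illustrates the general shape of such a scheme: a known low-frequency signal is injected through a sampling resistor, the divider voltage gives a raw insulation-resistance reading, and a scalar Kalman filter with a random-walk state smooths the noisy readings. Circuit values, noise levels and filter tuning are illustrative assumptions, not the paper's detector design; the recursive-least-squares compensation for pack-voltage changes is not reproduced here.

```python
# Hedged sketch of low-frequency-injection insulation monitoring with a
# scalar Kalman filter. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)

R_s, u_inj = 1.0e5, 5.0                  # sampling resistor [ohm], injected amplitude [V]
R_iso_true = 2.0e6                       # "true" insulation resistance [ohm]

def raw_reading():
    u_s = u_inj * R_s / (R_s + R_iso_true)        # divider voltage
    u_s += rng.normal(0.0, 0.005)                 # measurement noise on u_s
    return R_s * (u_inj / u_s - 1.0)              # invert divider for R_iso

# scalar Kalman filter with random-walk state x_k = R_iso
x_hat, P = 1.0e6, 1.0e12                 # initial estimate and variance
Q, R = 1.0e8, 5.0e10                     # process / measurement noise variances

for k in range(200):
    z = raw_reading()
    P = P + Q                            # predict (random walk)
    K = P / (P + R)                      # Kalman gain
    x_hat = x_hat + K * (z - x_hat)      # update
    P = (1.0 - K) * P

print(f"filtered R_iso estimate: {x_hat/1e6:.2f} MOhm (true {R_iso_true/1e6:.1f} MOhm)")
```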

  11. Final report : testing and evaluation for solar hot water reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, Thomas P. (University of New Mexico, Albuquerque, NM); He, Hongbo (University of New Mexico, Albuquerque, NM); Menicucci, David F. (Building Specialists, Inc., Albuquerque, NM); Mammoli, Andrea A. (University of New Mexico, Albuquerque, NM); Burch, Jay (National Renewable Energy Laboratory, Golden CO)

    2011-07-01

    Solar hot water (SHW) systems are being installed by the thousands. Tax credits and utility rebate programs are spurring this burgeoning market. However, the reliability of these systems is virtually unknown. Recent work by Sandia National Laboratories (SNL) has shown that few data exist to quantify the mean time to failure of these systems. However, there is keen interest in developing new techniques to measure SHW reliability, particularly among utilities that use ratepayer money to pay the rebates. This document reports on an effort to develop and test new, simplified techniques to directly measure the state of health of fielded SHW systems. One approach was developed by the National Renewable Energy Laboratory (NREL) and is based on the idea that the performance of the solar storage tank can reliably indicate the operational status of the SHW systems. Another approach, developed by the University of New Mexico (UNM), uses adaptive resonance theory, a type of neural network, to detect and predict failures. This method uses the same sensors that are normally used to control the SHW system. The NREL method uses two additional temperature sensors on the solar tank. The theories, development, application, and testing of both methods are described in the report. Testing was performed on the SHW Reliability Testbed at UNM, a highly instrumented SHW system developed jointly by SNL and UNM. The two methods were tested against a number of simulated failures. The results show that both methods show promise for inclusion in conventional SHW controllers, giving them advanced capability in detecting and predicting component failures.

  12. A reliability program approach to operational safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques is being formulated for potential application in the nuclear power industry. Methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed and a review of current nuclear risk-dominant issues conducted. The need for a reliability approach to address dependent system failures, operating and emergency procedures and human performance, and develop a plant-specific performance data base for safety decision making is demonstrated. Current research has concentrated on developing a Reliability Program approach for the operating phase of a nuclear plant's lifecycle. The approach incorporates performance monitoring and evaluation activities with dedicated tasks that integrate these activities with operation, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the Reliability Program. (orig./HP)

  13. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements that contribute to the stability of the tunnel structure are identified in order to recognize various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces a smart approach by which decision-makers will be able to find the overall reliability of the tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to this field, and much effort has been made to use it in tunneling while investigating the reliability of the lining support system for the tunnel structure. Therefore, reliability analysis for evaluating the tunnel support performance is the main idea used in this research. Decomposition approaches are used for producing the system block diagram and determining the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Considering the idea of a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for different structural subsystems and the results of numerical analyses are obtained in

  14. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.

  15. Reliability, standard error, and minimum detectable change of clinical pressure pain threshold testing in people with and without acute neck pain.

    Science.gov (United States)

    Walton, David M; Macdermid, Joy C; Nielson, Warren; Teasell, Robert W; Chiasson, Marco; Brown, Lauren

    2011-09-01

    Clinical measurement. To evaluate the intrarater, interrater, and test-retest reliability of an accessible digital algometer, and to determine the minimum detectable change in normal healthy individuals and a clinical population with neck pain. Pressure pain threshold testing may be a valuable assessment and prognostic indicator for people with neck pain. To date, most of this research has been completed using algometers that are too resource intensive for routine clinical use. Novice raters (physiotherapy students or clinical physiotherapists) were trained to perform algometry testing over 2 clinically relevant sites: the angle of the upper trapezius and the belly of the tibialis anterior. A convenience sample of normal healthy individuals and a clinical sample of people with neck pain were tested by 2 different raters (all participants) and on 2 different days (healthy participants only). Intraclass correlation coefficient (ICC), standard error of measurement, and minimum detectable change were calculated. A total of 60 healthy volunteers and 40 people with neck pain were recruited. Intrarater reliability was almost perfect (ICC = 0.94-0.97), interrater reliability was substantial to near perfect (ICC = 0.79-0.90), and test-retest reliability was substantial (ICC = 0.76-0.79). Smaller change was detectable in the trapezius compared to the tibialis anterior. This study provides evidence that novice raters can perform digital algometry with adequate reliability for research and clinical use in people with and without neck pain.
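
    For readers reproducing this kind of analysis, the standard relationships between the ICC, the standard error of measurement (SEM) and the minimum detectable change at the 95% level (MDC95) are sketched below; the example numbers are invented for illustration and are not the study's data.

```python
# Hedged sketch: SEM and MDC95 from an ICC and a between-subject SD.
import math

def sem(sd_between_subjects, icc):
    """SEM = SD * sqrt(1 - ICC)."""
    return sd_between_subjects * math.sqrt(1.0 - icc)

def mdc95(sem_value):
    """MDC95 = 1.96 * sqrt(2) * SEM (95% confidence, two measurements)."""
    return 1.96 * math.sqrt(2.0) * sem_value

sd, icc = 95.0, 0.94          # e.g. pressure pain threshold in kPa (illustrative)
s = sem(sd, icc)
print(f"SEM   = {s:.1f} kPa")
print(f"MDC95 = {mdc95(s):.1f} kPa")
```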

  16. Can non-destructive inspection be reliable

    International Nuclear Information System (INIS)

    Silk, M.G.; Stoneham, A.M.; Temple, J.A.G.

    1988-01-01

    The paper on inspection is based on the book "The reliability of non-destructive inspection: assessing the assessment of structures under stress" by the present authors (published by Adam Hilger 1987). Emphasis is placed on the reliability of inspection and whether cracks in welds or flaws in components can be detected. The need for non-destructive testing and the historical attitudes to non-destructive testing are outlined, along with the case of failure. Factors influencing reliable inspection are discussed, and defect detection trials involving round robin tests are described. The development of reliable inspection techniques and the costs of reliability and unreliability are also examined. (U.K.)

  17. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification for safety related systems to be a necessary procedure in safety life cycle. PFDavg must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFDavg calculations for its examples, it is difficult for common reliability or safety engineers to understand when they use the standard as guidance in practice. A method using reliability block diagram is investigated in this study in order to provide a clear and feasible way of PFDavg calculation and help those who take IEC 61508-6 as their guidance. The method finds mean down times (MDTs) of both channel and voted group first and then PFDavg. The calculated results of various voted groups are compared with those in IEC61508 part 6 and Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components-applying Markov model to IEC 61508-6. Reliab Eng System Saf 2003;80(2):133-41]. An interesting outcome can be realized from the comparison. Furthermore, although differences in MDT of voted groups exist between IEC 61508-6 and this paper, PFDavg of voted groups are comparatively close. With detailed description, the method of RBD presented can be applied to the quantitative SIL verification, showing a similarity of the method in IEC 61508-6.
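
    As a rough companion to the RBD approach, the snippet below evaluates the familiar simplified PFDavg expressions for a few voted groups, ignoring common-cause failures, diagnostic coverage and repair times; the full IEC 61508-6 treatment (and the MDT-based method of the paper) includes those terms. The failure rate and proof-test interval are illustrative.

```python
# Hedged sketch: textbook simplified PFDavg for common voting architectures,
# neglecting common-cause failures, diagnostics and repair. Illustrative only.
lam_du = 2.0e-6        # dangerous undetected failure rate [1/h]
t1 = 8760.0            # proof test interval [h] (one year)

pfd = {
    "1oo1": lam_du * t1 / 2.0,
    "1oo2": (lam_du * t1) ** 2 / 3.0,
    "2oo3": (lam_du * t1) ** 2,
}

for arch, value in pfd.items():
    print(f"{arch}: PFDavg ~ {value:.2e}")
```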

  18. Enhanced reliability and accuracy for field deployable bioforensic detection and discrimination of Xylella fastidiosa subsp. pauca, causal agent of citrus variegated chlorosis using razor ex technology and TaqMan quantitative PCR.

    Science.gov (United States)

    Ouyang, Ping; Arif, Mohammad; Fletcher, Jacqueline; Melcher, Ulrich; Ochoa Corona, Francisco Manuel

    2013-01-01

    A reliable, accurate and rapid multigene-based assay combining real time quantitative PCR (qPCR) and a Razor Ex BioDetection System (Razor Ex) was validated for detection of Xylella fastidiosa subsp. pauca (Xfp, a xylem-limited bacterium that causes citrus variegated chlorosis [CVC]). CVC, which is exotic to the United States, has spread through South and Central America and could significantly impact U.S. citrus if it arrives. A method for early, accurate and sensitive detection of Xfp in plant tissues is needed by plant health officials for inspection of products from quarantined locations, and by extension specialists for detection, identification and management of disease outbreaks and reservoir hosts. Two sets of specific PCR primers and probes, targeting Xfp genes for fimbrillin and the periplasmic iron-binding protein were designed. A third pair of primers targeting the conserved cobalamin synthesis protein gene was designed to detect all possible X. fastidiosa (Xf) strains. All three primer sets detected as little as 1 fg of plasmid DNA carrying X. fastidiosa target sequences and genomic DNA of Xfp at as little as 1 - 10 fg. The use of Razor Ex facilitates a rapid (about 30 min) in-field assay capability for detection of all Xf strains, and for specific detection of Xfp. Combined use of three primer sets targeting different genes increased the assay accuracy and broadened the range of detection. To our knowledge, this is the first report of a field-deployable rapid and reliable bioforensic detection and discrimination method for a bacterial phytopathogen based on multigene targets.

  19. Some methods for the detection of fissionable matter; Quelques methodes de detection des corps fissiles

    Energy Technology Data Exchange (ETDEWEB)

    Guery, M [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-03-01

    A number of instruments and processes that allow detection of uranium or plutonium in industrial plants, and in particular measurement of solution concentrations, are studied here. Each method has its own field of application and its own performance, which we have tried to define by calculation and by experiment. The following topics are treated: a γ absorptiometer with an Am source, detection tests by neutron multiplication, an apparatus for measuring the α activity of a solution, fissionable matter detection by γ emission, and fissionable matter detection by neutron emission. (author)

  20. An Entropy-Based Network Anomaly Detection Method

    Directory of Open Access Journals (Sweden)

    Przemysław Bereziński

    2015-04-01

    Full Text Available Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection and system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in the network. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, and (iv) evaluation of the method.
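
    A minimal sketch of the entropy-based idea follows: compute the Shannon entropy of a traffic feature (here, destination ports) per time window and flag windows whose entropy falls far outside a baseline band. The synthetic traffic, window size and 3-sigma rule are illustrative assumptions, not the article's dataset or detector.

```python
# Hedged sketch: per-window Shannon entropy of destination ports as an
# anomaly indicator for concentrated, botnet-like traffic.
import numpy as np

rng = np.random.default_rng(0)

def shannon_entropy(values):
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# baseline windows: ports spread over many services
baseline = [shannon_entropy(rng.integers(1, 1024, size=500)) for _ in range(50)]
mu, sigma = np.mean(baseline), np.std(baseline)

# suspect window: traffic concentrated on a single port
suspect = rng.integers(1, 1024, size=500)
suspect[:400] = 6667                       # most flows hit one port
h = shannon_entropy(suspect)

print(f"baseline entropy {mu:.2f} +/- {sigma:.2f} bits, window entropy {h:.2f} bits")
if abs(h - mu) > 3.0 * sigma:
    print("anomaly: entropy outside the 3-sigma band")
```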

  1. Selection and reporting of statistical methods to assess reliability of a diagnostic test: Conformity to recommended methods in a peer-reviewed journal

    International Nuclear Information System (INIS)

    Park, Ji Eun; Sung, Yu Sub; Han, Kyung Hwa

    2017-01-01

    To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology is necessary

  2. Selection and reporting of statistical methods to assess reliability of a diagnostic test: Conformity to recommended methods in a peer-reviewed journal

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ji Eun; Sung, Yu Sub [Dept. of Radiology and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul (Korea, Republic of); Han, Kyung Hwa [Dept. of Radiology, Research Institute of Radiological Science, Yonsei University College of Medicine, Seoul (Korea, Republic of); and others

    2017-11-15

    To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology is necessary.

  3. Designing a reliable leak bio-detection system for natural gas pipelines.

    Science.gov (United States)

    Batzias, F A; Siontorou, C G; Spanidis, P-M P

    2011-02-15

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.

  4. Comprehensive reliability allocation method for CNC lathes based on cubic transformed functions of failure mode and effects analysis

    Science.gov (United States)

    Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin

    2015-03-01

    Reliability allocation of computerized numerical controlled (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. Aiming at solving the problem of CNC lathe reliability allocation, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponentially transformed FMEA method are investigated. Subsequently, a cubic transformed function is established in order to overcome these limitations. Properties of the new transformed functions are discussed by considering the failure severity and the failure occurrence. Designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as an example to verify the new allocation method. Seven criteria are considered to compare the results of the new method with traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.

  5. An accelerated method for the detection of Extended-Spectrum β-Lactamases in urinary isolates of Escherichia coli and Klebsiella pneumoniae

    International Nuclear Information System (INIS)

    Kader, Abdulrahman A.; Kumar, A.; Krishna, A.; Zaman, M.N.

    2006-01-01

    We prospectively studied an accelerated phenotypic method by incorporating the double disk synergy test into the standard Kirby-Bauer disk diffusion susceptibility test, to evaluate a protocol for the rapid detection of extended-spectrum β-lactamases (ESBL) in urinary isolates of Escherichia coli (E. coli) and Klebsiella pneumoniae (K. pneumoniae). All ESBL-positive isolates were confirmed by the standard Clinical Laboratory Standards Institute (CLSI) confirmatory disk diffusion method. Between November 2004 and December 2005, a total of 6988 urine specimens were analyzed, of which 776 (11%) showed significant growth. They included E. coli in 577 cases (74%) and K. pneumoniae in 199 (25.6%). Of these, 63 E. coli (8%) and 15 K. pneumoniae (7.5%) were positive for ESBL by the accelerated and CLSI methods. Compared to the standard CLSI method, the accelerated method reduced the ESBL detection time from two days to one day. We conclude that the accelerated ESBL detection technique used by us in this study is a reliable and rapid method for detecting ESBL in urinary isolates of E. coli and K. pneumoniae. (author)

  6. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  7. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  8. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination or reproducibility of results is very poor. On the other hand, a defect can be detected on each of the subsequent examinations with high reliability and still have poor reproducibility of results.

  9. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    Science.gov (United States)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

    Standalone photovoltaic (SAPV) systems are seen as a promising method of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple, reliable and exhibits good performance over its lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at the different array sizes (areas), performance curves are obtained for optimal design of the SAPV system with a high degree of reliability in terms of autonomy at a specified value of loss of load probability (LOLP). Based on the array to load ratio (ALR) and levelized energy cost (LEC) through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with conventional design using monthly average daily load and insolation.
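
    The kind of calculation that sits behind such performance curves can be sketched as a simple hourly energy-balance simulation returning the loss-of-load probability for a given array area and battery size. The synthetic irradiance, flat load profile and efficiencies below are illustrative assumptions, not the paper's empirical formulae.

```python
# Hedged sketch: hourly energy balance of array + battery over one synthetic
# year, returning LOLP = fraction of hours with unmet load. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def lolp(array_area_m2, battery_kwh, daily_load_kwh=5.0, eta=0.12):
    hours = 8760
    # crude synthetic irradiance [kW/m^2]: clear-sky bell shape * random cloudiness
    t = np.arange(hours)
    solar = np.maximum(0.0, np.sin(np.pi * ((t % 24) - 6) / 12))      # 6h-18h daylight
    solar *= rng.uniform(0.3, 1.0, size=hours)                        # cloud factor
    pv_kw = eta * array_area_m2 * solar
    load_kw = daily_load_kwh / 24.0                                    # flat load profile

    soc, unmet_hours = battery_kwh / 2.0, 0
    for k in range(hours):
        soc += pv_kw[k] - load_kw                                      # 1 h time step
        soc = min(soc, battery_kwh)
        if soc < 0.0:
            unmet_hours += 1
            soc = 0.0
    return unmet_hours / hours

for area in (4.0, 6.0, 8.0):
    print(f"area {area:>4.1f} m^2 -> LOLP = {lolp(area, battery_kwh=10.0):.3f}")
```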

  10. A Hierarchical Reliability Control Method for a Space Manipulator Based on the Strategy of Autonomous Decision-Making

    Directory of Open Access Journals (Sweden)

    Xin Gao

    2016-01-01

    Full Text Available In order to maintain and enhance the operational reliability of a robotic manipulator deployed in space, an operational reliability system control method is presented in this paper. First, a method to divide factors affecting the operational reliability is proposed, which divides the operational reliability factors into task-related factors and cost-related factors. Then the models describing the relationships between the two kinds of factors and control variables are established. Based on this, a multivariable and multiconstraint optimization model is constructed. Second, a hierarchical system control model which incorporates the operational reliability factors is constructed. The control process of the space manipulator is divided into three layers: task planning, path planning, and motion control. Operational reliability related performance parameters are measured and used as the system’s feedback. Taking the factors affecting the operational reliability into consideration, the system can autonomously decide which control layer of the system should be optimized and how to optimize it using a control level adjustment decision module. The operational reliability factors affect these three control levels in the form of control variable constraints. Simulation results demonstrate that the proposed method can achieve a greater probability of meeting the task accuracy requirements, while extending the expected lifetime of the space manipulator.

  11. Comparison of Molecular and Phenotypic Methods for the Detection and Characterization of Carbapenem Resistant Enterobacteriaceae.

    Science.gov (United States)

    Somily, Ali M; Garaween, Ghada A; Abukhalid, Norah; Absar, Muhammad M; Senok, Abiola C

    2016-03-01

    In recent years, there has been a rapid dissemination of carbapenem resistant Enterobacteriaceae (CRE). This study aimed to compare phenotypic and molecular methods for detection and characterization of CRE isolates at a large tertiary care hospital in Saudi Arabia. This study was carried out between January 2011 and November 2013 at the King Khalid University Hospital (KKUH) in Saudi Arabia. Determination of presence of extended-spectrum beta-lactamases (ESBL) and carbapenem resistance was in accordance with Clinical and Laboratory Standards Institute (CLSI) guidelines. Phenotypic classification was done by the MASTDISCS(TM) ID inhibitor combination disk method. Genotypic characterization of ESBL and carbapenemase genes was performed by the Check-MDR CT102. Diversilab rep-PCR was used for the determination of clonal relationship. Of the 883 ESBL-positive Enterobacteriaceae detected during the study period, 14 (1.6%) isolates were carbapenem resistant. Both the molecular genotypic characterization and phenotypic testing were in agreement in the detection of all 8 metalo-beta-lactamases (MBL) producing isolates. Of these 8 MBL-producers, 5 were positive for blaNDM gene and 3 were positive for blaVIM gene. Molecular method identified additional blaOXA gene isolates while MASTDISCS(TM) ID detected one AmpC producer isolate. Both methods agreed in identifying 2 carbapenem resistant isolates which were negative for carbapenemase genes. Diversilab rep-PCR analysis of the 9 Klebsiella pneumoniae isolates revealed polyclonal distribution into eight clusters. MASTDISCS(TM) ID is a reliable simple cheap phenotypic method for detection of majority of carbapenemase genes with the exception of the blaOXA gene. We recommend to use such method in the clinical laboratory.

  12. Reliability analysis of operator's monitoring behavior in digital main control room of nuclear power plants and its application

    International Nuclear Information System (INIS)

    Zhang Li; Hu Hong; Li Pengcheng; Jiang Jianjun; Yi Cannan; Chen Qingqing

    2015-01-01

    In order to build a quantitative model to analyze operators' monitoring behavior reliability of digital main control room of nuclear power plants, based on the analysis of the design characteristics of digital main control room of a nuclear power plant and operator's monitoring behavior, and combining with operators' monitoring behavior process, monitoring behavior reliability was divided into three parts including information transfer reliability among screens, inside-screen information sampling reliability and information detection reliability. Quantitative calculation model of information transfer reliability among screens was established based on Senders's monitoring theory; the inside screen information sampling reliability model was established based on the allocation theory of attention resources; and considering the performance shaping factor causality, a fuzzy Bayesian method was presented to quantify information detection reliability and an example of application was given. The results show that the established model of monitoring behavior reliability gives an objective description for monitoring process, which can quantify the monitoring reliability and overcome the shortcomings of traditional methods. Therefore, it provides theoretical support for operator's monitoring behavior reliability analysis in digital main control room of nuclear power plants and improves the precision of human reliability analysis. (authors)

  13. GMDD: a database of GMO detection methods.

    Science.gov (United States)

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-06-04

    Since more than one hundred events of genetically modified organisms (GMOs) have been developed and approved for commercialization in global area, the GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein and nucleic acid-based detection techniques have been developed and utilized for GMOs identification and quantification. However, the information for harmonization and standardization of GMO analysis methods at global level is needed. GMO Detection method Database (GMDD) has collected almost all the previous developed and reported GMOs detection methods, which have been grouped by different strategies (screen-, gene-, construct-, and event-specific), and also provide a user-friendly search service of the detection methods by GMO event name, exogenous gene, or protein information, etc. In this database, users can obtain the sequences of exogenous integration, which will facilitate PCR primers and probes design. Also the information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is included in this database. Furthermore, registered users can also submit new detection methods and sequences to this database, and the newly submitted information will be released soon after being checked. GMDD contains comprehensive information of GMO detection methods. The database will make the GMOs analysis much easier.

  14. Comparative study of protoporphyrin IX fluorescence image enhancement methods to improve an optical imaging system for oral cancer detection

    Science.gov (United States)

    Jiang, Ching-Fen; Wang, Chih-Yu; Chiang, Chun-Ping

    2011-07-01

    Optoelectronics techniques to induce protoporphyrin IX fluorescence with topically applied 5-aminolevulinic acid on the oral mucosa have been developed to noninvasively detect oral cancer. Fluorescence imaging enables wide-area screening for oral premalignancy, but the lack of an adequate fluorescence enhancement method restricts the clinical imaging application of these techniques. This study aimed to develop a reliable fluorescence enhancement method to improve PpIX fluorescence imaging systems for oral cancer detection. Three contrast features, red-green-blue reflectance difference, R/B ratio, and R/G ratio, were developed first based on the optical properties of the fluorescence images. A comparative study was then carried out with one negative control and four biopsy confirmed clinical cases to validate the optimal image processing method for the detection of the distribution of malignancy. The results showed the superiority of the R/G ratio in terms of yielding a better contrast between normal and neoplastic tissue, and this method was less prone to errors in detection. Quantitative comparison with the clinical diagnoses in the four neoplastic cases showed that the regions of premalignancy obtained using the proposed method accorded with the expert's determination, suggesting the potential clinical application of this method for the detection of oral cancer.
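
    The R/G ratio feature favoured by the study can be expressed very compactly; the sketch below divides the red (PpIX emission) channel by the green channel and normalizes the result for display. The toy image and the percentile-based normalization are illustrative assumptions.

```python
# Hedged sketch: R/G ratio contrast enhancement of an RGB fluorescence image.
import numpy as np

def r_over_g(rgb):
    """rgb: float array (H, W, 3) in [0, 1]; returns a normalized R/G ratio map."""
    r, g = rgb[..., 0], rgb[..., 1]
    ratio = r / np.clip(g, 1e-6, None)                       # avoid division by zero
    ratio = np.clip(ratio, 0.0, np.percentile(ratio, 99))    # suppress outliers
    return ratio / ratio.max()

# toy example: a reddish "lesion" patch on a greenish background
img = np.zeros((64, 64, 3)) + [0.3, 0.5, 0.2]
img[20:40, 20:40] = [0.7, 0.3, 0.2]
enhanced = r_over_g(img)
print("mean ratio, lesion vs background:",
      enhanced[20:40, 20:40].mean().round(2), enhanced[:10, :10].mean().round(2))
```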

  15. Collection of methods for reliability and safety engineering

    International Nuclear Information System (INIS)

    Fussell, J.B.; Rasmuson, D.M.; Wilson, J.R.; Burdick, G.R.; Zipperer, J.C.

    1976-04-01

    The document presented contains five reports each describing a method of reliability and safety engineering. Report I provides a conceptual framework for the study of component malfunctions during system evaluations. Report II provides methods for locating groups of critical component failures such that all the component failures in a given group can be caused to occur by the occurrence of a single separate event. These groups of component failures are called common cause candidates. Report III provides a method for acquiring and storing system-independent component failure logic information. The information stored is influenced by the concepts presented in Report I and also includes information useful in locating common cause candidates. Report IV puts forth methods for analyzing situations that involve systems which change character in a predetermined time sequence. These phased missions techniques are applicable to the hypothetical ''accident chains'' frequently analyzed for nuclear power plants. Report V presents a unified approach to cause-consequence analysis, a method of analysis useful during risk assessments. This approach, as developed by the Danish Atomic Energy Commission, is modified to reflect the format and symbology conventionally used for other types of analysis of nuclear reactor systems

  16. Development of a filter-based method for detecting silver nanoparticles and their heteroaggregation in aqueous environments by surface-enhanced Raman spectroscopy

    International Nuclear Information System (INIS)

    Guo, Huiyuan; Xing, Baoshan; He, Lili

    2016-01-01

    The rising application of silver nanoparticles (AgNPs) and subsequent release into aquatic systems have generated public concerns over their potential risk and harm to aquatic organisms and human health. Effective and practical analytical methods for AgNPs are urgently needed for their risk assessment. In this study we established an innovative approach to detect trace levels of AgNPs in environmental water through integrating a filtration technique into surface-enhanced Raman spectroscopy (SERS) and compared it with previously established centrifuge-based method. The purpose of filtration was to trap and enrich salt-aggregated AgNPs from water samples onto the filter membrane, through which indicator was then passed and complexed with AgNPs. The enhanced SERS signals of indicator could reflect the presence and quantity of AgNPs in the samples. The most favorable benefit of filtration is being able to process large volume samples, which is more practical for water samples, and greatly improves the sensitivity of AgNP detection. In this study, we tested 20 mL AgNPs-containing samples and the filter-based method is able to detect AgNPs as low as 5 μg/L, which is 20 folds lower than the centrifuge-based method. In addition, the speed and precision of the detection were greatly improved. This approach was used to detect trace levels of AgNPs in real environmental water successfully. Meanwhile, the heteroaggregation of AgNPs with minerals in water was reliably monitored by the new method. Overall, a combination of the filtration-SERS approach provides a rapid, simple, and sensitive way to detect AgNPs and analyze their environmental behavior. - Highlights: • We developed a filtration-SERS method for analyzing AgNPs in water. • Detection limit can be improved by increasing sample volume for filtration. • Trace levels of AgNPs in natural water samples can be successfully detected. • Filtration-SERS is more efficient and precise than centrifugation-SERS.

  17. Method of reliability allocation based on fault tree analysis and fuzzy math in nuclear power plants

    International Nuclear Information System (INIS)

    Chen Zhaobing; Deng Jian; Cao Xuewu

    2005-01-01

    Reliability allocation is a kind of a difficult multi-objective optimization problem. It can not only be applied to determine the reliability characteristic of reactor systems, subsystem and main components but also be performed to improve the design, operation and maintenance of nuclear plants. The fuzzy math known as one of the powerful tools for fuzzy optimization and the fault analysis deemed to be one of the effective methods of reliability analysis can be applied to the reliability allocation model so as to work out the problems of fuzzy characteristic of some factors and subsystem's choice respectively in this paper. Thus we develop a failure rate allocation model on the basis of the fault tree analysis and fuzzy math. For the choice of the reliability constraint factors, we choose the six important ones according to practical need for conducting the reliability allocation. The subsystem selected by the top-level fault tree analysis is to avoid allocating reliability for all the equipment and components including the unnecessary parts. During the reliability process, some factors can be calculated or measured quantitatively while others only can be assessed qualitatively by the expert rating method. So we adopt fuzzy decision and dualistic contrast to realize the reliability allocation with the help of fault tree analysis. Finally the example of the emergency diesel generator's reliability allocation is used to illustrate reliability allocation model and improve this model simple and applicable. (authors)

  18. A new fault detection method for computer networks

    International Nuclear Information System (INIS)

    Lu, Lu; Xu, Zhengguo; Wang, Wenhai; Sun, Youxian

    2013-01-01

    Over the past few years, fault detection for computer networks has attracted extensive attentions for its importance in network management. Most existing fault detection methods are based on active probing techniques which can detect the occurrence of faults fast and precisely. But these methods suffer from the limitation of traffic overhead, especially in large scale networks. To relieve traffic overhead induced by active probing based methods, a new fault detection method, whose key is to divide the detection process into multiple stages, is proposed in this paper. During each stage, only a small region of the network is detected by using a small set of probes. Meanwhile, it also ensures that the entire network can be covered after multiple detection stages. This method can guarantee that the traffic used by probes during each detection stage is small sufficiently so that the network can operate without severe disturbance from probes. Several simulation results verify the effectiveness of the proposed method
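
    A toy version of the multi-stage idea is sketched below: the node set is partitioned into small regions and only one region is probed per stage, so per-stage probe traffic stays low while a full cycle of stages covers the whole network. The round-robin schedule and the probe stub are assumptions for illustration, not the paper's probe-selection algorithm.

```python
# Hedged sketch: round-robin staged probing over small regions of a network.
import itertools

def make_stages(nodes, region_size):
    """Partition nodes into regions; each stage probes one region."""
    return [nodes[i:i + region_size] for i in range(0, len(nodes), region_size)]

def probe(node):
    # stand-in for an active probe (e.g., a ping); returns True if healthy
    return node != "r7"                      # pretend r7 is faulty

nodes = [f"r{i}" for i in range(12)]
stages = make_stages(nodes, region_size=3)   # 4 stages of 3 probes each

for stage_no, region in enumerate(itertools.cycle(stages)):
    faulty = [n for n in region if not probe(n)]
    if faulty:
        print(f"stage {stage_no}: suspected faulty node(s) {faulty}")
    if stage_no >= len(stages) - 1:          # one full cycle covers the network
        break
```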

  19. A Spatially Offset Raman Spectroscopy Method for Non-Destructive Detection of Gelatin-Encapsulated Powders

    Directory of Open Access Journals (Sweden)

    Kuanglin Chao

    2017-03-01

    Full Text Available Non-destructive subsurface detection of encapsulated, coated, or seal-packaged foods and pharmaceuticals can help prevent distribution and consumption of counterfeit or hazardous products. This study used a Spatially Offset Raman Spectroscopy (SORS method to detect and identify urea, ibuprofen, and acetaminophen powders contained within one or more (up to eight layers of gelatin capsules to demonstrate subsurface chemical detection and identification. A 785-nm point-scan Raman spectroscopy system was used to acquire spatially offset Raman spectra for an offset range of 0 to 10 mm from the surfaces of 24 encapsulated samples, using a step size of 0.1 mm to obtain 101 spectral measurements per sample. As the offset distance was increased, the spectral contribution from the subsurface powder gradually outweighed that of the surface capsule layers, allowing for detection of the encapsulated powders. Containing mixed contributions from the powder and capsule, the SORS spectra for each sample were resolved into pure component spectra using self-modeling mixture analysis (SMA and the corresponding components were identified using spectral information divergence values. As demonstrated here for detecting chemicals contained inside thick capsule layers, this SORS measurement technique coupled with SMA has the potential to be a reliable non-destructive method for subsurface inspection and authentication of foods, health supplements, and pharmaceutical products that are prepared or packaged with semi-transparent materials.

  20. Reliability of Doppler and stethoscope methods of determining systolic blood pressures: considerations for calculating an ankle-brachial index.

    Science.gov (United States)

    Chesbro, Steven B; Asongwed, Elmira T; Brown, Jamesha; John, Emmanuel B

    2011-01-01

    The purposes of this study were to: (1) identify the interrater and intrarater reliability of systolic blood pressures using a stethoscope and Doppler to determine an ankle-brachial index (ABI), and (2) to determine the correlation between the 2 methods. Peripheral arterial disease (PAD) affects approximately 8 to 12 million people in the United States, and nearly half of those with this disease are asymptomatic. Early detection and prompt treatment of PAD will improve health outcomes. It is important that clinicians perform tests that determine the presence of PAD. Two individual raters trained in ABI procedure measured the systolic blood pressures of 20 individuals' upper and lower extremities. Standard ABI measurement protocols were observed. Raters individually recorded the systolic blood pressures of each extremity using a stethoscope and a Doppler, for a total of 640 independent measures. Interrater reliability of Doppler measurements to determine SBP at the ankle was very strong (intraclass correlation coefficient [ICC], 0.93-0.99) compared to moderate to strong reliability using a stethoscope (ICC, 0.64-0.87). Agreement between the 2 devices to determine SBP was moderate to very weak (ICC, 0.13-0.61). Comparisons of the use of Doppler and stethoscope to determine ABI showed weak to very weak intrarater correlation (ICC, 0.17-0.35). Linear regression analysis of the 2 methods to determine ABI showed positive but weak to very weak correlations (r2 = .013, P = .184). A Doppler ultrasound is recommended over a stethoscope for accuracy in systolic pressure readings for ABI measurements.

  1. Methods for estimating the reliability of the RBMK fuel assemblies and elements

    International Nuclear Information System (INIS)

    Klemin, A.I.; Sitkarev, A.G.

    1985-01-01

    Applied non-parametric methods for calculating point and interval estimates of the basic set of reliability factors for RBMK fuel assemblies and elements are described. The reliability factors considered are the average lifetime at a preset operating time up to unloading due to fuel burnup, as well as the average lifetime during reactor transient operation and during the steady-state fuel reloading mode of reactor operation. The formulae obtained are included in the special standardized engineering documentation.

  2. Assessment of Electronic Circuits Reliability Using Boolean Truth Table Modeling Method

    International Nuclear Information System (INIS)

    EI-Shanshoury, A.I.

    2011-01-01

    This paper explores the use of the Boolean truth table modeling method (BTTM) in the analysis of qualitative data. It is widely used in certain fields, especially in electrical and electronic engineering. Our work focuses on the evaluation of power supply circuit reliability using the BTTM, which involves systematic attempts to falsify and identify hypotheses on the basis of truth tables constructed from qualitative data. Reliability parameters such as the system's failure rates for the power supply case study are estimated. All possible state combinations (operating and failed states) of the major components in the circuit were listed and their effects on the overall system were studied.
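
    To illustrate the truth-table idea, a minimal sketch that enumerates component states and sums the probability of every combination in which a hypothetical power supply (a rectifier in series with two parallel regulators) still operates; the structure function and reliability values are assumptions, not the circuit of the paper:

        from itertools import product

        component_reliability = {"rectifier": 0.98, "regulator_a": 0.95, "regulator_b": 0.95}

        def system_operates(state):
            # Hypothetical structure: rectifier must work, plus at least one regulator.
            return state["rectifier"] and (state["regulator_a"] or state["regulator_b"])

        names = list(component_reliability)
        system_reliability = 0.0
        for states in product([True, False], repeat=len(names)):
            state = dict(zip(names, states))
            # Probability of this exact row of the truth table.
            p = 1.0
            for name, up in state.items():
                r = component_reliability[name]
                p *= r if up else (1.0 - r)
            if system_operates(state):
                system_reliability += p

        print(system_reliability)  # = 0.98 * (1 - 0.05**2) ≈ 0.978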

  3. Leak detection in medium density polyethylene (MDPE) pipe using pressure transient method

    Science.gov (United States)

    Amin, M. M.; Ghazali, M. F.; PiRemli, M. A.; Hamat, A. M. A.; Adnan, N. F.

    2015-12-01

    Water is an essential commodity in daily life, from residential and commercial consumers to industrial users. This study focuses on the detection of leaks in medium density polyethylene (MDPE) pipe using the pressure transient method. The position of the leak in this pipe is analysed using the Ensemble Empirical Mode Decomposition (EEMD) method with signal masking. A water hammer induces a pressure impulse that travels through the pipeline as a surge wave; a solenoid valve is therefore used to create the water hammer in the pipeline. Data from the pressure sensor are collected using DASYLab software, and the pressure signal is decomposed into a series of wave components using the EEMD signal-masking method in MATLAB. The decomposed signals that reflect intrinsic mode functions (IMFs) are carefully selected, and these IMFs are displayed using the Hilbert transform (HT) spectrum and analysed to capture the differences. The analysed data are compared with the actual leak location in terms of percentage error. The recorded error is below 1%, showing that this method is highly reliable and accurate for leak detection.
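
    A minimal Python sketch of this kind of signal-processing chain (assuming the third-party PyEMD package for EEMD and SciPy's Hilbert transform; the input file, sampling rate, wave speed, IMF choice, and reflection-based localization are hypothetical illustrations, not the exact procedure of the paper):

        import numpy as np
        from scipy.signal import hilbert, find_peaks
        from PyEMD import EEMD  # third-party package, assumed available

        fs = 5000.0            # sampling rate [Hz], hypothetical
        wave_speed = 350.0     # pressure wave speed in MDPE pipe [m/s], hypothetical
        pressure = np.load("pressure_trace.npy")  # hypothetical recorded transient

        # Decompose the transient into intrinsic mode functions (IMFs).
        imfs = EEMD().eemd(pressure)
        imf = imfs[0]  # in practice the informative IMF is selected by inspection

        # The Hilbert transform gives the instantaneous amplitude envelope of the IMF.
        envelope = np.abs(hilbert(imf))

        # Illustrative localization: time delay between the injected impulse and its
        # reflection from the leak, picked as the first two prominent envelope peaks.
        peaks, _ = find_peaks(envelope, height=0.3 * envelope.max(), distance=int(0.05 * fs))
        dt = (peaks[1] - peaks[0]) / fs
        leak_distance = wave_speed * dt / 2.0   # the reflection travels there and back
        print(f"estimated leak distance: {leak_distance:.2f} m")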

  4. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect such that new faults may be introduced. In this paper, we first show how to incorporate testing effort function and fault introduction into FDP and then develop FCP as delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
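
    As a hedged illustration of this model family (a generic testing-effort-dependent formulation, not necessarily the exact equations of the paper), expected detections can be written as m_d(t) = a(1 − exp(−b·W(t))) with W(t) the cumulative testing effort, and corrections modeled as detections delayed by a correction effort:

        import numpy as np

        def cumulative_testing_effort(t, alpha=100.0, beta=0.15):
            # Hypothetical logistic testing-effort function W(t).
            return alpha / (1.0 + np.exp(-beta * (t - 30.0)))

        def expected_detected(t, a=120.0, b=0.02):
            # Generic testing-effort-dependent NHPP mean value function m_d(t).
            return a * (1.0 - np.exp(-b * cumulative_testing_effort(t)))

        def expected_corrected(t, delay=5.0, **kw):
            # Correction treated as detection delayed by an (assumed constant) correction effort.
            return expected_detected(max(t - delay, 0.0), **kw)

        for week in (10, 20, 40, 60):
            print(week, round(expected_detected(week), 1), round(expected_corrected(week), 1))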

  5. An attempt to use FMEA method for an approximate reliability assessment of machinery

    Directory of Open Access Journals (Sweden)

    Przystupa Krzysztof

    2017-01-01

    Full Text Available The paper presents a modified FMEA (Failure Mode and Effect Analysis) method to assess the reliability of the components that make up a wrench, type 2145 MAX Impactol TM Driver (Ingersoll Rand Company). This case concerns the analysis of reliability under conditions in which full service data are not known. The aim of the study is to determine the weakest element in the design of the tool.
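
    In a conventional or modified FMEA, each failure mode is scored and ranked by a risk priority number; a minimal sketch of that ranking step (the component names and 1-10 scores are hypothetical, not taken from the wrench study):

        failure_modes = [
            # (component, severity S, occurrence O, detection D), each scored 1-10
            ("hammer mechanism", 8, 4, 6),
            ("anvil",            7, 3, 5),
            ("rotor vanes",      6, 6, 4),
        ]

        # Risk priority number RPN = S * O * D; the highest RPN marks the weakest element.
        ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
        for name, s, o, d in ranked:
            print(f"{name:<20} RPN = {s * o * d}")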

  6. Detection by EPR method of radiation treatment in dried fruits containing crystalline sugar

    International Nuclear Information System (INIS)

    Lehner, K.; Stachowicz, W.

    2006-01-01

    The results of EPR (electron paramagnetic resonance) measurements are presented on the detectability and stability of radiation-induced sugar-derived radicals in samples of dried (dehydrated) fruits available on the market and irradiated with doses of 0.5, 1 and 3 kGy, respectively. The experiments were conducted over 12 months of storage. Measurements were made with an EPR-10 MINI spectrometer (St. Petersburg Instruments Ltd.) operating in the X band (microwave frequency 9.5 GHz). The aim of the work was to demonstrate the reliability and acceptability of the method for routine control of irradiated food. (author)

  7. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information to assess the impacts of climate change at regional and global scales. Statistical downscaling methods are applied to prepare climate model data for applications such as hydrologic and ecologic modelling at a watershed scale. As the reliability and the (spatial and temporal) resolution of statistically downscaled climate data mainly depend on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for the key climate variables that are the main input data to regional modelling systems. However, inconsistencies in these climate products, for example different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing several observation-based daily gridded climate products available in the public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis) and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing them with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation, minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products according to the elevations of the stations, discretized into several classes. According to the rank of the climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system

  8. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    Science.gov (United States)

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method is composed of the AMSAA model and test data conversion technology. Result. The assessment and prediction results for a piece of space-borne equipment conform to expectations. Conclusion. It is suggested that this method should be further researched and popularized.
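
    The AMSAA (Crow) model treats cumulative failures as a power-law process, N(t) ≈ λ·t^β; a minimal sketch of the standard maximum-likelihood estimates from failure times observed over a time-terminated test of duration T (the failure times below are hypothetical):

        import numpy as np

        failure_times = np.array([55., 120., 310., 480., 690., 850.])  # hypothetical, hours
        T = 1000.0  # total (time-terminated) test duration

        n = len(failure_times)
        beta_hat = n / np.sum(np.log(T / failure_times))      # growth (shape) parameter
        lambda_hat = n / T**beta_hat                           # scale parameter

        # Instantaneous failure intensity and MTBF at the end of the test:
        intensity = lambda_hat * beta_hat * T**(beta_hat - 1.0)
        mtbf_current = 1.0 / intensity
        print(f"beta = {beta_hat:.2f} (<1 indicates reliability growth), current MTBF = {mtbf_current:.0f} h")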

  9. Reliability-Based Topology Optimization Using Stochastic Response Surface Method with Sparse Grid Design

    Directory of Open Access Journals (Sweden)

    Qinghai Zhao

    2015-01-01

    Full Text Available A mathematical framework is developed which integrates the reliability concept into topology optimization to solve reliability-based topology optimization (RBTO) problems under uncertainty. Two typical methodologies have been presented and implemented, including the performance measure approach (PMA) and the sequential optimization and reliability assessment (SORA). To enhance the computational efficiency of the reliability analysis, the stochastic response surface method (SRSM) is applied to approximate the true limit state function with respect to the normalized random variables, combined with a reasonable design of experiments generated by sparse grid design (SGD), which has proven to be an effective and special discretization technique. Uncertainties such as material properties and external loads are considered in three numerical examples: a cantilever beam, a loaded knee structure, and a heat conduction problem. Monte Carlo simulations are also performed to verify the accuracy of the failure probabilities computed by the proposed approach. Based on the results, it is demonstrated that application of SRSM with SGD produces an efficient reliability analysis in RBTO which enables a more reliable design than that obtained by deterministic topology optimization (DTO). It is also found that, under identical accuracy, SORA is superior to PMA in view of computational efficiency.

  10. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    International Nuclear Information System (INIS)

    Lee, Seokje; Kim, Ingul; Jang, Moonho; Kim, Jaeki; Moon, Jungwon

    2013-01-01

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well known; specifically, it is known that ply is less reliable than metallic materials and very sensitive to the loading direction. Therefore, it is important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method is identified, and a probabilistic sensitivity analysis is conducted. As a result, we can prove the applicability of the advanced design method for the stabilizer of an underwater vehicle

  11. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seokje; Kim, Ingul [Chungnam National Univ., Daejeon (Korea, Republic of); Jang, Moonho; Kim, Jaeki; Moon, Jungwon [LIG Nex1, Yongin (Korea, Republic of)

    2013-04-15

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well known; specifically, it is known that ply is less reliable than metallic materials and very sensitive to the loading direction. Therefore, it is important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method is identified, and a probabilistic sensitivity analysis is conducted. As a result, we can prove the applicability of the advanced design method for the stabilizer of an underwater vehicle.
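
    A minimal sketch of the response surface idea used in the two records above: fit a quadratic surrogate to a handful of (expensive) finite-element tip-deflection evaluations, then run Monte Carlo on the cheap surrogate to estimate the probability of exceeding an allowable deflection. The fe_tip_deflection function and all numbers are hypothetical stand-ins for the FE code, not the laminate model of the papers:

        import numpy as np

        rng = np.random.default_rng(0)

        def fe_tip_deflection(e1, e2):
            # Hypothetical stand-in for an expensive finite-element run
            # (tip deflection as a function of two ply moduli).
            return 50.0 / e1 + 30.0 / e2

        # Design of experiments around the mean material properties.
        doe = np.array([(e1, e2) for e1 in (120., 130., 140.) for e2 in (8., 10., 12.)])
        responses = np.array([fe_tip_deflection(e1, e2) for e1, e2 in doe])

        # Quadratic response surface: y ≈ c0 + c1*e1 + c2*e2 + c3*e1^2 + c4*e2^2 + c5*e1*e2
        def basis(x):
            e1, e2 = x[:, 0], x[:, 1]
            return np.column_stack([np.ones_like(e1), e1, e2, e1**2, e2**2, e1 * e2])

        coeffs, *_ = np.linalg.lstsq(basis(doe), responses, rcond=None)

        # Monte Carlo on the surrogate: random ply moduli, count exceedances of the limit.
        samples = np.column_stack([rng.normal(130., 6.5, 100_000), rng.normal(10., 1.0, 100_000)])
        deflection = basis(samples) @ coeffs
        print("P(deflection > 3.5) ≈", np.mean(deflection > 3.5))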

  12. Analyzing the reliability of shuffle-exchange networks using reliability block diagrams

    International Nuclear Information System (INIS)

    Bistouni, Fathollah; Jahanshahi, Mohsen

    2014-01-01

    Supercomputers and multi-processor systems are comprised of thousands of processors that need to communicate in an efficient way. One reasonable solution is the utilization of multistage interconnection networks (MINs), where the challenge is to analyze the reliability of such networks. One of the methods to increase the reliability and fault tolerance of MINs is the use of additional switching stages. Recently, the reliability of one of the most common MINs, the shuffle-exchange network (SEN), was evaluated by investigating the impact of increasing the number of switching stages; it was concluded that SEN with one additional stage (SEN+) is more reliable than both SEN and SEN with two additional stages (SEN+2), and that SEN in turn is more reliable than SEN+2. Here we re-evaluate the reliability of these networks; the results of the terminal, broadcast, and network reliability analysis demonstrate that SEN+ and SEN+2 consistently outperform SEN and are very alike in terms of reliability. - Highlights: • The impact of increasing the number of stages on reliability of MINs is investigated. • The RBD method as an accurate method is used for the reliability analysis of MINs. • Complex series–parallel RBDs are used to determine the reliability of the MINs. • All measures of the reliability (i.e. terminal, broadcast, and network reliability) are analyzed. • All reliability equations will be calculated for different size N×N
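
    The series–parallel RBD evaluations referenced above reduce to two combination rules; a minimal generic sketch (the switch reliability and the toy layout are hypothetical, not the SEN equations of the paper):

        from functools import reduce

        def series(*reliabilities):
            # All blocks in series must work.
            return reduce(lambda acc, r: acc * r, reliabilities, 1.0)

        def parallel(*reliabilities):
            # A parallel group fails only if every redundant block fails.
            return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), reliabilities, 1.0)

        # Terminal reliability of a toy path: one switch stage in series with a
        # redundant pair of switches in parallel.
        r_switch = 0.95
        terminal = series(r_switch, parallel(r_switch, r_switch))
        print(terminal)  # 0.95 * (1 - 0.05**2) = 0.947625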

  13. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen a very rapid success in developing numerical methods based on explicit control over approximation errors. It may be said that nowadays a new direction is forming in numerical analysis, the main goal of which is to develop methods of reliable computations. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: solver and checker. In this book, we are chie

  14. Development of an in situ magnetic beads based RT-PCR method for electrochemiluminescent detection of rotavirus

    Science.gov (United States)

    Zhan, Fangfang; Zhou, Xiaoming

    2012-12-01

    Rotaviruses are double-stranded RNA viruses belonging to the family of enteric pathogens and are a major cause of diarrhoeal disease in infants and young children worldwide. Consequently, rapid and accurate detection of rotaviruses is of great importance in controlling and preventing food- and waterborne diseases and outbreaks. Reverse transcription-polymerase chain reaction (RT-PCR) is a reliable method that possesses high specificity and sensitivity and has been widely used for the detection of viruses. Electrochemiluminescence (ECL) can be considered an important and powerful tool in analytical and clinical applications, with high sensitivity, excellent specificity, and low cost. Here we have developed a method for the detection of rotavirus by combining in situ magnetic beads (MBs) based RT-PCR with ECL. RT of rotavirus RNA was carried out in the traditional way and the resulting cDNA was directly amplified on MBs. Forward primers were covalently bound to the MBs and reverse primers were labeled with tris(2,2'-bipyridyl)ruthenium (TBR). During PCR cycling, the TBR-labeled products were directly loaded and enriched on the surface of the MBs. The MB-TBR complexes could then be analyzed by a magnetic ECL platform without any post-modification or post-incubation, which avoids laborious manual operations and achieves rapid yet sensitive detection. In this study, rotavirus from fecal specimens was successfully detected within 2 h, and the limit of detection was estimated to be 10^4 copies/μL. This novel in situ MBs based RT-PCR with ECL detection method can be used for pathogen detection in the food safety field and in clinical diagnosis.

  15. Evaluation of ECT reliability for axial ODSCC in steam generator tubes

    International Nuclear Information System (INIS)

    Lee, Jae Bong; Park, Jai Hak; Kim, Hong Deok; Chung, Han Sub

    2010-01-01

    The integrity of steam generator tubes is usually evaluated based on eddy current test (ECT) results. Because the detection capability of ECT is not perfect, not all of the physical flaws that actually exist in steam generator tubes can be detected by ECT inspection. It is therefore very important to analyze ECT reliability in the integrity assessment of steam generators. The reliability of an ECT inspection system is divided into the reliability of the inspection technique and the reliability of the analyst. The reliability of ECT results is likewise divided into sizing reliability and detection reliability. The reliability of ECT sizing is often characterized as a linear regression model relating true flaw size data to measured flaw size data. The reliability of detection is characterized in terms of probability of detection (POD), which is expressed as a function of flaw size. In this paper the reliability of an ECT inspection system is analyzed quantitatively. The POD of the ECT inspection system for axial outside diameter stress corrosion cracks (ODSCC) in steam generator tubes is evaluated. Using a log-logistic regression model, POD is evaluated from hit (detection) and miss (no detection) binary data obtained from destructive and non-destructive inspections of cracked tubes. Crack length and crack depth are considered as variables in the multivariate log-logistic regression and their effects on detection capability are assessed using a two-dimensional POD (2-D POD) surface. The reliability of detection is also analyzed using the POD for the inspection technique (POD_T) and the POD for the analyst (POD_A).
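
    A minimal sketch of fitting a log-logistic POD curve to hit/miss data and reading off the 90% detectable size; the crack sizes, hit/miss outcomes, and the use of scikit-learn's logistic regression are illustrative assumptions, not the dataset or software of the study:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical inspection results: crack depth (% through-wall) and hit (1) / miss (0).
        depth = np.array([10, 15, 20, 25, 30, 35, 40, 50, 60, 70, 80], dtype=float)
        hit   = np.array([ 0,  0,  0,  1,  0,  1,  1,  1,  1,  1,  1])

        # Log-logistic POD: logit(POD) is linear in ln(depth).
        X = np.log(depth).reshape(-1, 1)
        model = LogisticRegression(C=100.0).fit(X, hit)   # weak regularization keeps the fit stable

        grid = np.linspace(depth.min(), depth.max(), 500)
        pod = model.predict_proba(np.log(grid).reshape(-1, 1))[:, 1]
        idx = min(np.searchsorted(pod, 0.90), len(grid) - 1)
        print(f"a90 ≈ {grid[idx]:.0f}% through-wall")   # smallest size with POD >= 90%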

  16. Quantitative metal magnetic memory reliability modeling for welded joints

    Science.gov (United States)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and make quantitative evaluation difficult. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along longitudinal and horizontal lines with a TSC-2M-8 instrument during tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect a hidden crack earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution, so K_vs is a suitable MMM parameter for establishing a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented for the first time, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with decreasing residual life ratio T, and the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
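
    For reference, the classical stress-strength interference result that such models build on: if strength S and stress L are independent and normally distributed, reliability is R = Φ((μ_S − μ_L)/√(σ_S² + σ_L²)). A minimal sketch of this baseline formula (the numbers are hypothetical, and the paper's improved interference theory is not reproduced here):

        from math import sqrt
        from scipy.stats import norm

        mu_strength, sd_strength = 420.0, 30.0   # MPa, hypothetical
        mu_stress,   sd_stress   = 300.0, 45.0   # MPa, hypothetical

        # Probability that strength exceeds stress for independent normal variables.
        beta = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
        reliability = norm.cdf(beta)
        print(f"beta = {beta:.2f}, R = {reliability:.4f}")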

  17. Detection of HBV Covalently Closed Circular DNA

    Directory of Open Access Journals (Sweden)

    Xiaoling Li

    2017-06-01

    Full Text Available Chronic hepatitis B virus (HBV) infection affects approximately 240 million people worldwide and remains a serious public health concern because its complete cure is impossible with current treatments. Covalently closed circular DNA (cccDNA) in the nucleus of infected cells cannot be eliminated by present therapeutics and may result in persistence and relapse. Drug development targeting cccDNA formation and maintenance is hindered by the lack of efficient cccDNA models and reliable cccDNA detection methods. Southern blotting is regarded as the gold standard for quantitative cccDNA detection, but it is complicated and not suitable for high-throughput drug screening, so more sensitive and simple methods, including polymerase chain reaction (PCR)-based methods, Invader assays, in situ hybridization and surrogates, have been developed for cccDNA detection. However, most methods are not reliable enough, and there are no unified standards for these approaches. This review will summarize available methods for cccDNA detection. It is hoped that more robust methods for cccDNA monitoring will be developed and that standard operation procedures for routine cccDNA detection in scientific research and clinical monitoring will be established.

  18. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org ). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  19. Inter- and intra- observer reliability of risk assessment of repetitive work without an explicit method.

    Science.gov (United States)

    Eliasson, Kristina; Palm, Peter; Nyman, Teresia; Forsman, Mikael

    2017-07-01

    A common way to conduct practical risk assessments is to observe a job and report the observed long-term risks for musculoskeletal disorders. The aim of this study was to evaluate the inter- and intra-observer reliability of ergonomists' risk assessments without the support of an explicit risk assessment method. Twenty-one experienced ergonomists assessed the risk level (low, moderate, high risk) of eight upper body regions, as well as the global risk, of 10 video-recorded work tasks. Intra-observer reliability was assessed by having nine of the ergonomists repeat the procedure at least three weeks after the first assessment. The ergonomists made their risk assessments based on their experience and knowledge. The statistical parameters of reliability included agreement in %, kappa, linearly weighted kappa, intraclass correlation and Kendall's coefficient of concordance. The average inter-observer agreement on the global risk was 53% and the corresponding weighted kappa (K_w) was 0.32, indicating fair reliability. The intra-observer agreement was 61% and 0.41 (K_w). This study indicates that risk assessments of the upper body, without the use of an explicit observational method, have non-acceptable reliability. It is therefore recommended to use systematic risk assessment methods to a higher degree. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
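
    Linearly weighted kappa for ordinal risk levels can be computed directly with scikit-learn; a minimal sketch with hypothetical ratings from two observers (0 = low, 1 = moderate, 2 = high risk):

        from sklearn.metrics import cohen_kappa_score

        rater_1 = [0, 1, 1, 2, 2, 0, 1, 2, 1, 0]   # hypothetical risk ratings
        rater_2 = [0, 1, 2, 2, 1, 0, 1, 2, 2, 0]

        # 'linear' weights penalize disagreements in proportion to their distance
        # on the ordinal scale (low/moderate/high).
        kappa_w = cohen_kappa_score(rater_1, rater_2, weights="linear")
        print(round(kappa_w, 2))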

  20. DEPEND-HRA-A method for consideration of dependency in human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, Marko

    2008-01-01

    A consideration of dependencies between human actions is an important issue within human reliability analysis. A method was developed which integrates the features of existing methods with experience from a full-scope plant simulator. The method is used in a real plant-specific human reliability analysis as part of the probabilistic safety assessment of a nuclear power plant. The method distinguishes dependency for pre-initiator events from dependency for initiator and post-initiator events. It identifies dependencies based on scenarios, where consecutive human actions are modeled, and based on a list of minimal cut sets, which is obtained by running the minimal cut set analysis with high values of the human error probabilities in the evaluation. A large example study, which consisted of a large number of human failure events, demonstrated the applicability of the method. Comparative analyses show that both the selection of the dependency method and the selection of dependency levels within the method largely impact the results of the probabilistic safety assessment. Even if the core damage frequency is not impacted much, the listings of important basic events in terms of risk increase and risk decrease factors may change considerably. More effort is needed on the subject to prepare the background for more detailed guidelines, which will remove subjectivity from the evaluations as far as possible.
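
    One of the existing approaches that such dependency methods build on is the THERP treatment of dependence, in which the conditional human error probability of a subsequent action is adjusted according to the assessed dependency level. A minimal sketch of those standard adjustment formulas (the example HEP value is hypothetical, and this is not necessarily the scheme adopted in DEPEND-HRA):

        def conditional_hep(hep, level):
            # THERP conditional probabilities given the dependence level between actions.
            formulas = {
                "zero":     lambda p: p,
                "low":      lambda p: (1 + 19 * p) / 20,
                "moderate": lambda p: (1 + 6 * p) / 7,
                "high":     lambda p: (1 + p) / 2,
                "complete": lambda p: 1.0,
            }
            return formulas[level](hep)

        base_hep = 1e-3  # hypothetical basic human error probability
        for level in ("zero", "low", "moderate", "high", "complete"):
            print(f"{level:<9} {conditional_hep(base_hep, level):.3e}")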

  1. Brain GABA Detection in vivo with the J-editing 1H MRS Technique: A Comprehensive Methodological Evaluation of Sensitivity Enhancement, Macromolecule Contamination and Test-Retest Reliability

    Science.gov (United States)

    Shungu, Dikoma C.; Mao, Xiangling; Gonzales, Robyn; Soones, Tacara N.; Dyke, Jonathan P.; van der Veen, Jan Willem; Kegeles, Lawrence S.

    2016-01-01

    Abnormalities in brain γ-aminobutyric acid (GABA) have been implicated in various neuropsychiatric and neurological disorders. However, in vivo GABA detection by proton magnetic resonance spectroscopy (1H MRS) presents significant challenges arising from low brain concentration, overlap by much stronger resonances, and contamination by mobile macromolecule (MM) signals. This study addresses these impediments to reliable brain GABA detection with the J-editing difference technique on a 3T MR system in healthy human subjects by (a) assessing the sensitivity gains attainable with an 8-channel phased-array head coil, (b) determining the magnitude and anatomic variation of the contamination of GABA by MM, and (c) estimating the test-retest reliability of measuring GABA with this method. Sensitivity gains and test-retest reliability were examined in the dorsolateral prefrontal cortex (DLPFC), while MM levels were compared across three cortical regions: the DLPFC, the medial prefrontal cortex (MPFC) and the occipital cortex (OCC). A 3-fold higher GABA detection sensitivity was attained with the 8-channel head coil compared to the standard single-channel head coil in DLPFC. Despite significant anatomic variation in GABA+MM and MM across the three brain regions, the MM contribution to GABA+MM was relatively stable across the three voxels, ranging from 41% to 49%, a non-significant regional variation (p = 0.58). The test-retest reliability of GABA measurement, expressed either as ratios to voxel tissue water (W) or total creatine, was found to be very high for both the single-channel coil and the 8-channel phased-array coil. For the 8-channel coil, for example, Pearson's correlation coefficient of test vs. retest for GABA/W was 0.98 (R² = 0.96, p = 0.0007), the percent coefficient of variation (CV) was 1.25%, and the intraclass correlation coefficient (ICC) was 0.98. Similar reliability was also found for the co-edited resonance of combined glutamate and glutamine (Glx) for both coils.

  2. European-American workshop: Determination of reliability and validation methods on NDE. Proceedings

    International Nuclear Information System (INIS)

    1997-01-01

    The invited papers focused on the following issues: 1. The different technical and scientific approaches to the problem of how to guarantee or demonstrate the reliability of NDE: a. Application of established prescriptive standards, b. Probabilities of Detection (POD) and False Alarm (PFA) from blind trials, c. POD and PFA from signal statistics, d. Modeling, e. "Technical Justification"; 2. The dissimilar validation/qualification concepts used in different industries in Europe and North America: a. Nuclear Power Generation, b. Aerospace Industry, c. Offshore Industry and d. Service Companies

  3. Reliability test and failure analysis of high power LED packages

    International Nuclear Information System (INIS)

    Chen Zhaohui; Zhang Qin; Wang Kai; Luo Xiaobing; Liu Sheng

    2011-01-01

    A new application-specific light emitting diode (LED) package (ASLP) with a freeform polycarbonate lens for street lighting is developed, whose manufacturing processes are compatible with a typical LED packaging process. The reliability test methods and failure criteria from different vendors are reviewed and compared, and are found to be quite different; rapid reliability assessment standards are urgently needed for the LED industry. An 85 °C/85% RH test with 700 mA driving current is used to test our LED modules together with those of three other vendors for 1000 h, showing no visible degradation in optical performance for our modules, while the modules of two other vendors show significant degradation. Failure analysis methods such as C-SAM, nano X-ray CT and optical microscopy are used on the LED packages. Failure mechanisms such as delaminations and cracks are detected in the LED packages after the accelerated reliability testing. The finite element simulation method is helpful for the failure analysis and for designing for reliability of the LED packaging. One example is used to show that a module currently used in industry is vulnerable and may not easily pass harsh thermal cycle testing. (semiconductor devices)

  4. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  5. Development of techniques using DNA analysis method for detection/analysis of radiation-induced mutation. Development of an useful probe/primer and improvement of detection efficacy

    International Nuclear Information System (INIS)

    Maekawa, Hideaki; Tsuchida, Kozo; Hashido, Kazuo; Takada, Naoko; Kameoka, Yosuke; Hirata, Makoto

    1999-01-01

    Previously, it was demonstrated that detection of centromeres became easy and reliable through fluorescent staining by the FISH method using a probe for the sequence conserved in α-satellite DNA. It was, however, found inappropriate for detecting dicentrics, because the relative amount of probe DNA differs between chromosomes. A probe which allows homogeneous detection of α-satellite DNA on each chromosome was therefore constructed: a presumed kinetochore-specific sequence, the CENP-B box, was amplified by PCR and the product DNA was used as a probe. However, the variation in the amount of probe DNA among chromosomes was decreased by only about 20%. A program was then constructed for image processing of the results obtained from FISH with the α-satellite DNA probe used as a centromere marker. When compared with detection of abnormal chromosomes stained by the conventional method, the calculation efficiency for detection of centromeres alone was improved by the use of this program, but the calculation needed to discriminate normal from abnormal chromosomes was still complicated and the detection efficiency was little improved. Chromosomal abnormalities in lymphocytes were used to detect the effects of radiation. In this method, the cells must be brought into metaphase, and mutations induced by radiation may often be repaired during this step. To exclude this possibility, DNA extraction was conducted at low temperature immediately after exposure to ¹³⁷Cs, and a rapid genome detection method was established using the genomic DNA. As model genomes, the following three were used: 1) long repeated sequences widely dispersed over the chromosomes, 2) cluster genes, and 3) single-copy genes. The effects of radiation were detectable at 1-2 Gy for the long repeated sequences and at 7 Gy for the cluster genes, respectively, whereas no significant effects were observed at any dose tested for the single-copy genes. Amplification was marked in cells exposed at 1-10 Gy (peak at 4 Gy), suggesting that these regions had

  6. Development of reliability centered maintenance methods and tools

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Dubreuil-Chambardel, A.; Lannoy, A.; Monnier, B.

    1992-12-01

    This paper recalls the development of the RCM (Reliability Centered Maintenance) approach in the nuclear industry and describes the trial study implemented by EDF in the context of the OMF (RCM) Project. The approach developed is currently being applied to about thirty systems (Industrial Project). In parallel, R&D efforts are being maintained to improve the selectivity of the analysis methods. These methods use Probabilistic Safety Study models, thereby guaranteeing better selectivity in the identification of safety-critical elements and enhancing consistency between maintenance and safety studies. They also offer more detailed analysis of operating feedback, invoking for example Bayesian methods combining expert judgement and feedback data. Finally, they propose a functional and material representation of the plant; this dual representation describes both the functions assured by maintenance provisions and the material elements required for their implementation. In the final chapter, the targets of the future OMF workstation are summarized and its insertion in the EDF information system is briefly described. (authors). 5 figs., 2 tabs., 7 refs
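
    As an illustration of combining expert judgement with operating feedback in a Bayesian way (one common conjugate formulation, not necessarily the exact EDF models), a Gamma prior on a component failure rate can be updated with observed failures over cumulative operating time:

        # Expert judgement encoded as a Gamma(alpha, beta) prior on the failure rate
        # (prior mean = alpha / beta failures per hour); values below are hypothetical.
        alpha_prior, beta_prior = 2.0, 4.0e5      # prior mean 5e-6 per hour

        # Operating feedback: observed failures over cumulative exposure time.
        n_failures, exposure_hours = 3, 2.0e5

        # Conjugate update for a Poisson failure process.
        alpha_post = alpha_prior + n_failures
        beta_post = beta_prior + exposure_hours
        posterior_mean = alpha_post / beta_post
        print(f"posterior mean failure rate ≈ {posterior_mean:.2e} per hour")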

  7. Evaluation of the reliability of Levine method of wound swab for ...

    African Journals Online (AJOL)

    The aim of this paper is to evaluate the reliability of Levine swab in accurate identification of microorganisms present in a wound and identify the necessity for further studies in this regard. Methods: A semi structured questionnaire was administered and physical examination was performed on patients with chronic wounds ...

  8. EVALUATION OF ELISA METHOD TO DETECTION OF COW β-LACTOGLOBULIN IN SHEEP MILK AND SHEEP MILK PRODUCTS

    Directory of Open Access Journals (Sweden)

    Juraj Paulov

    2010-11-01

    Full Text Available The aim of this work was to optimize the ELISA method for detecting the adulteration of sheep milk and sheep milk products with cow milk in the laboratory. We focused on laboratory testing of an ELISA kit (β-Lactoglobulin ELISA Set, SEDIUM R&D) for the detection of cow β-Lg in sheep milk, in order to obtain a high-quality, reliable and economically advantageous method suitable for routine use in practice. The results showed that, for reliable determination of adulteration, it is necessary to verify the sensitivity of the applied kit by diluting the samples in accordance with the quantification range declared by the producer in the ELISA kit manual. The starting point for obtaining relevant data was to create separate regression curves with a high determination coefficient, which allowed cow milk additions to be detected quickly and easily in sheep milk, cloddish sheep cheese and Slovak sheep cheese. doi:10.5219/78
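
    A minimal sketch of the calibration-curve step: fit a regression line to the ELISA readings of standards with known cow β-Lg content, then invert it to estimate the cow-milk addition in an unknown sample (all absorbance values and concentrations are hypothetical, and a real assay might use a non-linear calibration):

        import numpy as np

        # Standards: cow beta-lactoglobulin concentration (ng/mL) vs. ELISA absorbance.
        concentration = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
        absorbance = np.array([0.05, 0.21, 0.39, 0.74, 1.45])   # hypothetical readings

        # Linear calibration (regression) curve: A = slope * C + intercept.
        slope, intercept = np.polyfit(concentration, absorbance, 1)
        r_squared = np.corrcoef(concentration, absorbance)[0, 1] ** 2

        # Invert the curve for an unknown sample.
        sample_absorbance = 0.55
        estimated_concentration = (sample_absorbance - intercept) / slope
        print(f"R^2 = {r_squared:.3f}, estimated cow beta-Lg ≈ {estimated_concentration:.1f} ng/mL")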

  9. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint

    Directory of Open Access Journals (Sweden)

    Ang Gong

    2015-12-01

    Full Text Available For Global Navigation Satellite System (GNSS) single frequency, single epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of the baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to rigorously reconstruct the objective function. Then, the searching strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space to ensure that the correct ambiguity candidates are within it, allowing the search to be carried out directly by the least-squares ambiguity decorrelation adjustment (LAMBDA) method. Some of the vector candidates are further eliminated by a derived approximate inequality, which accelerates the searching process. Experimental results show that, compared to the traditional method with only a baseline length constraint, this new method can utilize a priori three-dimensional baseline knowledge to fix the ambiguity reliably and achieve a high success rate. Experimental tests also verify that it is not very sensitive to baseline vector error and can perform robustly when the angular error is not great.

  10. Leak detection method

    International Nuclear Information System (INIS)

    1978-01-01

    This invention provides a method for removing nuclear fuel elements from a fabrication building while at the same time testing the fuel elements for leaks without releasing contaminants from the fabrication building or from the fuel elements. The vacuum source used, leak detecting mechanism and fuel element fabrication building are specified to withstand environmental hazards. (UK)

  11. Features of an advanced human reliability analysis method, AGAPE-ET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun [Korea Atomic Energy Research Institute, Taejeon (Korea, Republic of)

    2005-11-15

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided.

  12. Features of an advanced human reliability analysis method, AGAPE-ET

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun

    2005-01-01

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided

  13. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS...... that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...... competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17) and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  14. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    Full Text Available At present, cloud computing is one of the newest trends in distributed computation and is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development. The optimization of the reliability and performance of a cloud services composition application, which is a typical stochastic optimization problem, faces severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamics. Traditional reliability and performance optimization techniques, for example Markov models and state space analysis, have defects such as being too time consuming, easily causing state space explosion, and relying on assumptions of component execution independence that are not satisfied. To overcome these defects, we propose in this paper a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud service composition applications based on multi-state system theory is presented. Then a reliability and performance definition based on the universal generating function is proposed. Based on this, a fast reliability and performance optimization algorithm is presented. Finally, illustrative examples are given.
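
    A minimal sketch of the universal generating function (UGF) idea for a multi-state system: each service is represented by its performance levels and their probabilities, and a composition operator combines them, e.g. taking the minimum performance for services in series (the state values below are hypothetical):

        from collections import defaultdict

        # A UGF is represented here as {performance_level: probability}.
        service_a = {0: 0.05, 50: 0.15, 100: 0.80}     # e.g. throughput levels
        service_b = {0: 0.02, 80: 0.98}

        def compose(u1, u2, op):
            # Combine two UGFs with a structure operator (min for series performance).
            result = defaultdict(float)
            for g1, p1 in u1.items():
                for g2, p2 in u2.items():
                    result[op(g1, g2)] += p1 * p2
            return dict(result)

        chain = compose(service_a, service_b, min)
        # Probability that the composed service delivers at least 50 units of performance.
        print(sum(p for g, p in chain.items() if g >= 50))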

  15. Coupling finite elements and reliability methods - application to safety evaluation of pressurized water reactor vessels

    International Nuclear Information System (INIS)

    Pitner, P.; Venturini, V.

    1995-02-01

    When reliability studies are extended from deterministic calculations in mechanics, it is necessary to take into account the variability of the input parameters, which is linked to the different sources of uncertainty. Integrals must then be calculated to evaluate the failure risk. This can be performed either by simulation methods or by approximation methods (FORM/SORM). Models in mechanics often require running calculation codes, which must then be coupled with the reliability calculations. These codes can involve long calculation times when they are invoked numerous times in simulation sequences or in complex iterative procedures. The response surface method gives an approximation of the real response from a reduced number of points for which the finite element code is run. Thus, when it is combined with FORM/SORM methods, a coupling can be carried out which gives results in a reasonable calculation time. An application of the response surface method to mechanics-reliability coupling is presented for a mechanical model which calls a finite element code. It corresponds to a probabilistic fracture mechanics study of a pressurized water reactor vessel. (authors). 5 refs., 3 figs.
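
    A minimal sketch of the FORM step on an inexpensive surrogate limit-state function (the analytic g below stands in for a response surface fitted to finite element results; the Hasofer-Lind/Rackwitz-Fiessler iteration and the Pf ≈ Φ(−β) approximation are the standard first-order method, not the specific vessel model):

        import numpy as np
        from scipy.stats import norm

        def g(u):
            # Hypothetical limit state in standard normal space (failure when g <= 0);
            # for this g the design point is (3, 0), so beta = 3.
            return 3.0 - u[0] - 0.1 * u[1] ** 2

        def grad(u, h=1e-6):
            # Finite-difference gradient of g.
            return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h) for e in np.eye(len(u))])

        # HL-RF iteration towards the most probable failure (design) point.
        u = np.zeros(2)
        for _ in range(50):
            a = grad(u)
            u_new = ((a @ u - g(u)) / (a @ a)) * a
            if np.linalg.norm(u_new - u) < 1e-8:
                u = u_new
                break
            u = u_new

        beta = np.linalg.norm(u)
        print(f"reliability index beta = {beta:.3f}, Pf ≈ {norm.cdf(-beta):.2e}")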

  16. A new, rapid and reliable method for the determination of reduced sulphur (S2-) species in natural water discharges

    International Nuclear Information System (INIS)

    Montegrossi, Giordano; Tassi, Franco; Vaselli, Orlando; Bidini, Eva; Minissale, Angelo

    2006-01-01

    The determination of reduced S species in natural waters is particularly difficult due to their high instability and to chemical and physical interferences in the current analytical methods. In this paper a new, rapid and reliable analytical procedure, named the Cd-IC method, is presented for their determination as ΣS²⁻ via oxidation to SO₄²⁻ after chemical trapping with an ammonia-cadmium solution that allows precipitation of all the reduced S species as CdS. The S²⁻-derived SO₄²⁻ is analysed by ion chromatography. The main advantages of this method are: low cost, high stability of the CdS precipitate, absence of interferences, low detection limit (0.01 mg/L as SO₄²⁻ for 10 mL of water) and low analytical error (about 5%). The proposed method has been applied to more than 100 water samples from different natural systems (water discharges and cold wells from volcanic and geothermal areas, crater lakes) in central-southern Italy.

  17. Irradiation of spices and its detection

    International Nuclear Information System (INIS)

    Sjöberg, A.M.; Manninen, M.; Honkanen, E.; Latva-Kala, K.; Pinnioja, S.

    1991-01-01

    In this study, possible methods for detecting irradiation of spices are reviewed. Of the different kinds of techniques, the most promising control methods are thermo‐ and chemiluminescence and the microbiological and viscosimetric methods. The suitability of analytical methods for detecting possible degradation of the main compounds in the aromas of spices during irradiation is also discussed. The irradiation of spices can be detected reliably with thermoluminescence and by chemiluminescence measurements. Advantages of a new thermoluminescence method, based on mineral measurements, are also presented. Spices and their microbial contents and decontamination are discussed. Combined use of the direct epifluorescent filter technique (DEFT) and the aerobic plate count method (APC) is possibly a suitable method for detecting the irradiation of spices. Also, viscosity measurements combined with luminometric methods have been considered as possible detection methods

  18. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    International Nuclear Information System (INIS)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S.; Bond, I. A.; Allen, W.; Monard, L. A. G.; Albrow, M. D.; Fouqué, P.; Dominik, M.; Tsapras, Y.; Udalski, A.; Zellem, R.; Bos, M.; Christie, G. W.; DePoy, D. L.; Dong, Subo; Drummond, J.; Gorbikov, E.; Han, C.

    2013-01-01

    We analyze MOA-2010-BLG-311, a high magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit but with only Δχ² ∼ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.

  19. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    Energy Technology Data Exchange (ETDEWEB)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S. [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Bond, I. A. [Institute for Information and Mathematical Sciences, Massey University, Private Bag 102-904, Auckland 1330 (New Zealand); Allen, W. [Vintage Lane Observatory, Blenheim (New Zealand); Monard, L. A. G. [Bronberg Observatory, Centre for Backyard Astrophysics, Pretoria (South Africa); Albrow, M. D. [Department of Physics and Astronomy, University of Canterbury, Private Bag 4800, Christchurch 8020 (New Zealand); Fouque, P. [IRAP, CNRS, Universite de Toulouse, 14 avenue Edouard Belin, F-31400 Toulouse (France); Dominik, M. [SUPA, University of St. Andrews, School of Physics and Astronomy, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Tsapras, Y. [Las Cumbres Observatory Global Telescope Network, 6740B Cortona Drive, Goleta, CA 93117 (United States); Udalski, A. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Zellem, R. [Department of Planetary Sciences/LPL, University of Arizona, 1629 East University Boulevard, Tucson, AZ 85721 (United States); Bos, M. [Molehill Astronomical Observatory, North Shore City, Auckland (New Zealand); Christie, G. W. [Auckland Observatory, P.O. Box 24-180, Auckland (New Zealand); DePoy, D. L. [Department of Physics, Texas A and M University, 4242 TAMU, College Station, TX 77843-4242 (United States); Dong, Subo [Institute for Advanced Study, Einstein Drive, Princeton, NJ 08540 (United States); Drummond, J. [Possum Observatory, Patutahi (New Zealand); Gorbikov, E. [School of Physics and Astronomy, Raymond and Beverley Sackler Faculty of Exact Sciences, Tel-Aviv University, Tel Aviv 69978 (Israel); Han, C., E-mail: liweih@astro.ucla.edu, E-mail: rzellem@lpl.arizona.edu, E-mail: tim.natusch@aut.ac.nz [Department of Physics, Chungbuk National University, 410 Seongbong-Rho, Hungduk-Gu, Chongju 371-763 (Korea, Republic of); Collaboration: muFUN Collaboration; MOA Collaboration; OGLE Collaboration; PLANET Collaboration; RoboNet Collaboration; MiNDSTEp Consortium; and others

    2013-05-20

    We analyze MOA-2010-BLG-311, a high magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit but with only Δχ² ≈ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.

  20. Evaluation of extraction methods for ochratoxin A detection in cocoa beans employing HPLC.

    Science.gov (United States)

    Mishra, Rupesh K; Catanante, Gaëlle; Hayat, Akhtar; Marty, Jean-Louis

    2016-01-01

    Cocoa is an important ingredient for the chocolate industry and for many food products. However, it is prone to contamination by ochratoxin A (OTA), which is highly toxic and potentially carcinogenic to humans. In this work, four different extraction methods were tested and compared based on their recoveries. The best protocol was established, which involves an organic solvent-free extraction method for the detection of OTA in cocoa beans using 1% sodium hydrogen carbonate (NaHCO3) in water within 30 min. The extraction method is rapid (as compared with existing methods), simple, reliable and practical to perform without complex experimental set-ups. The cocoa samples were freshly extracted and cleaned up using an immunoaffinity column (IAC) for HPLC analysis with a fluorescence detector. Under the optimised conditions, the limit of detection (LOD) and limit of quantification (LOQ) for OTA were 0.62 and 1.25 ng/mL, respectively, in standard solutions. The method could successfully quantify OTA in naturally contaminated samples. Moreover, good recoveries of OTA of up to 86.5% were obtained in artificially spiked cocoa samples, with a maximum relative standard deviation (RSD) of 2.7%. The proposed extraction method could determine OTA at a level of 1.5 µg/kg, which is below the European Union maximum level for cocoa (2 µg/kg). In addition, an efficiency comparison of IAC and molecularly imprinted polymer (MIP) columns was also performed and evaluated.

  1. A robust sub-pixel edge detection method of infrared image based on tremor-based retinal receptive field model

    Science.gov (United States)

    Gao, Kun; Yang, Hu; Chen, Xiaomei; Ni, Guoqiang

    2008-03-01

    Because of the complex thermal objects in an infrared image, the prevalent image edge detection operators are often suitable only for certain scenes and sometimes extract edges that are too wide. From a biological point of view, image edge detection operators work reliably when assuming a convolution-based receptive field architecture. A DoG (Difference-of-Gaussians) model filter based on the ON-center retinal ganglion cell receptive field architecture, with artificial eye tremors introduced, is proposed for image contour detection. Aiming at the blurred edges of an infrared image, subsequent orthogonal polynomial interpolation and sub-pixel level edge detection in the rough edge pixel neighborhood are adopted to locate the foregoing rough edges at sub-pixel level. Numerical simulations show that this method can locate the target edge accurately and robustly.
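
    A minimal sketch of the DoG (center-surround) filtering step using SciPy; the synthetic image, sigma values and threshold are hypothetical choices, and the tremor modulation and sub-pixel interpolation of the paper are not reproduced here:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)
        image = rng.normal(0.0, 0.05, size=(240, 320))
        image[80:160, 120:200] += 1.0          # synthetic warm object on a cool background

        # ON-center receptive field approximated by a Difference-of-Gaussians:
        # narrow (center) Gaussian minus wide (surround) Gaussian.
        center = gaussian_filter(image, sigma=1.0)
        surround = gaussian_filter(image, sigma=2.0)
        dog = center - surround

        # Candidate (rough) edge/contour pixels: strong DoG response.
        threshold = 3.0 * dog.std()
        rough_edges = np.abs(dog) > threshold
        print("candidate edge pixels:", int(rough_edges.sum()))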

  2. Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Wind power converter systems are essential subsystems in both off-shore and on-shore wind turbines. It is the main interface between generator and grid connection. This system is affected by numerous stresses where the main contributors might be defined as vibration and temperature loadings....... The temperature variations induce time-varying stresses and thereby fatigue loads. A probabilistic model is used to model fatigue failure for an electrical component in the power converter system. This model is based on a linear damage accumulation and physics of failure approaches, where a failure criterion...... is defined by the threshold model. The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings. Structural Reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation by means of structural...

  3. The acoustic detection of cavitation in pumps

    International Nuclear Information System (INIS)

    Macleod, I.D.; Gray, B.S.; Taylor, C.G.

    1978-01-01

    A programme was initiated to develop a reliable technique for detecting the onset of acoustic noise from cavitation in a pump and to relate this to cavitation inception data, since significant noise from collapse of vapour bubbles arising from such cavitation would reduce the sensitivity of a noise detection system for boiling of sodium in fast breeder reactors. Factors affecting the detection of cavitation are discussed. The instrumentation and techniques of frequency analysis and pulse detection are described. Two examples are then given of the application of acoustic detection techniques under controlled conditions. It is concluded that acoustic detection can be a reliable method for detecting inception of cavitation in a pump and the required conditions are stated. (U.K.)

  4. A Reliability-Oriented Design Method for Power Electronic Converters

    DEFF Research Database (Denmark)

    Wang, Huai; Zhou, Dao; Blaabjerg, Frede

    2013-01-01

    Reliability is a crucial performance indicator of power electronic systems in terms of availability, mission accomplishment and life cycle cost. A paradigm shift in the research on reliability of power electronics is going on from simple handbook based calculations (e.g. models in MIL-HDBK-217F h...... and reliability prediction models are provided. A case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability critical component IGBT modules....

  5. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    It has been observed and shown that in some examples a sensitivity analysis of the first order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when...... a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel- and Weibull-distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical applications the behaviour...... is probably rather infrequent. A simple example is shown as an illustration and to exemplify that for second order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent....

  6. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of neck and laryngeal muscles in muscle tension dysphonia (MTD). The aim was to review the available laryngeal palpation methods used in patients with MTD for assessment, diagnosis, or documentation of treatment outcomes. A systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and the Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. Five main palpation methods, as well as miscellaneous ones, were identified; they differed in the target anatomical structures, the judgment or grading system, and the tasks used. Only a few scales were available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate tension during both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence for muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  7. Label-free glucose detection using cantilever sensor technology based on gravimetric detection principles.

    Science.gov (United States)

    Hsieh, Shuchen; Hsieh, Shu-Ling; Hsieh, Chiung-Wen; Lin, Po-Chiao; Wu, Chun-Hsin

    2013-01-01

    Efficient maintenance of glucose homeostasis is a major challenge in diabetes therapy, where accurate and reliable glucose level detection is required. Though several methods are currently used, these suffer from impaired response and often unpredictable drift, making them unsuitable for long-term therapeutic practice. In this study, we demonstrate a method that uses a functionalized atomic force microscope (AFM) cantilever as the sensor for reliable glucose detection with sufficient sensitivity and selectivity for clinical use. We first modified the AFM tip with aminopropylsilatrane (APS) and then adsorbed glucose-specific lectin concanavalin A (Con A) onto the surface. The Con A/APS-modified probes were then used to detect glucose by monitoring shifts in the cantilever resonance frequency. To confirm the molecule-specific interaction, AFM topographical images were acquired of identically treated silicon substrates which indicated a specific attachment for glucose-Con A and not for galactose-Con A. These results demonstrate that by monitoring the frequency shift of the AFM cantilever, this sensing system can detect the interaction between Con A and glucose, one of the biomolecule recognition processes, and may assist in the detection and mass quantification of glucose for clinical applications with very high sensitivity.
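
    The gravimetric principle used here is that adsorbed mass lowers the cantilever resonance frequency. A minimal sketch of that relation for an idealised lumped spring-mass model is given below; the spring constant and frequencies are hypothetical and not the authors' calibration.

```python
import math

def added_mass(k, f0, f1):
    """Mass added to a cantilever, estimated from its resonance shift.

    Idealised lumped model: f = (1 / (2*pi)) * sqrt(k / m_eff), hence
    delta_m = k / (4*pi^2) * (1/f1^2 - 1/f0^2).
    k  : effective spring constant (N/m)
    f0 : resonance frequency before adsorption (Hz)
    f1 : resonance frequency after adsorption (Hz), f1 < f0
    """
    return k / (4.0 * math.pi ** 2) * (1.0 / f1 ** 2 - 1.0 / f0 ** 2)

# Hypothetical numbers: k = 40 N/m, a 300 kHz resonance shifted down by 50 Hz.
dm = added_mass(40.0, 300e3, 300e3 - 50.0)   # result in kg
print(f"estimated adsorbed mass: {dm * 1e15:.2f} pg")
```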

  8. [Knowledge of university students in Szeged, Hungary about reliable contraceptive methods and sexually transmitted diseases].

    Science.gov (United States)

    Devosa, Iván; Kozinszky, Zoltán; Vanya, Melinda; Szili, Károly; Fáyné Dombi, Alice; Barabás, Katalin

    2016-04-03

    Promiscuity and lack of use of reliable contraceptive methods increase the probability of sexually transmitted diseases and the risk of unwanted pregnancies, which are quite common among university students. The aim of the study was to assess the knowledge of university students about reliable contraceptive methods and sexually transmitted diseases, and to assess the effectiveness of sexual health education in secondary schools, with a specific focus on education held by peers. An anonymous, self-administered questionnaire survey was carried out in a randomized sample of students at the University of Szeged (n = 472, 298 women and 174 men, average age 21 years) between 2009 and 2011. 62.1% of the respondents declared that peer-led reproductive health education lessons in high schools were a reliable and authentic source of information, 12.3% considered them a less reliable source, and 25.6% regarded school health education as an irrelevant source. Among those who considered peer-led health education a reliable source, there were significantly more females (69.3% vs. 46.6%, p = 0.001), significantly fewer lived in cities (83.6% vs. 94.8%, p = 0.025), and significantly more respondents knew that Candida infection can be transmitted through sexual intercourse (79.5% vs. 63.9%, p = 0.02), compared with those who did not consider peer-led health education a reliable source. The majority of respondents obtained knowledge about sexual issues from the mass media. Young people who considered health education programs reliable were significantly better informed about Candida infection.

  9. Detection methods of irradiated foodstuffs

    Energy Technology Data Exchange (ETDEWEB)

    Ponta, C C; Cutrubinis, M; Georgescu, R [IRASM Center, Horia Hulubei National Institute for Physics and Nuclear Engineering, PO Box MG-6, RO-077125 Magurele-Bucharest (Romania); Mihai, R [Life and Environmental Physics Department, Horia Hulubei National Institute for Physics and Nuclear Engineering, PO Box MG-6, RO-077125 Magurele-Bucharest (Romania); Secu, M [National Institute of Materials Physics, Bucharest (Romania)

    2005-07-01

    food is marketed as irradiated, or if irradiated goods are sold without the appropriate labeling, then detection tests should be able to prove the authenticity of the product. At the moment there is no food control laboratory in Romania able to detect irradiated foodstuffs. The Technological Irradiation Department coordinates and co-finances a research project aimed at establishing the first Laboratory of Irradiated Foodstuffs Detection. The detection methods studied in this project are the ESR methods (for cellulose EN 1787/2000, bone EN 1786/1996 and crystalline sugar EN 13708/2003), the TL method (EN 1788/2001), the PSL method (EN 13751/2002) and the DNA Comet Assay method (EN 13784/2001). These detection methods will be applied to various foodstuffs such as garlic, onion, potatoes, rice, beans, wheat, maize, pistachio, sunflower seeds, raisins, figs, strawberries, chicken, beef, fish, pepper, paprika, thyme, laurel and mushrooms. As an example of the application of a detection method, the ESR spectra of irradiated and non-irradiated paprika, acquired according to the ESR detection method for irradiated foodstuffs containing cellulose, are presented. First, the cellulose signal is much more intense for the irradiated sample than for the non-irradiated one; second, two radiation-specific signals appear symmetrically around the cellulose signal. These two radiation-specific signals prove the irradiation treatment of the paprika. (author)

  10. A Test-Retest Reliability Study of the Whiplash Disability Questionnaire in Patients With Acute Whiplash-Associated Disorders

    DEFF Research Database (Denmark)

    Stupar, Maja; Côté, Pierre; Beaton, Dorcas E

    2015-01-01

    OBJECTIVE: The purpose of this study was to determine the test-retest reliability and the Minimal Detectable Change (MDC) of the Whiplash Disability Questionnaire (WDQ) in individuals with acute whiplash-associated disorders (WADs). METHODS: We performed a test-retest reliability study. We includ...
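
    For context, the Minimal Detectable Change reported in test-retest studies such as this one is usually derived from the intraclass correlation coefficient and the sample standard deviation (SEM = SD·sqrt(1 − ICC), MDC95 = 1.96·sqrt(2)·SEM). A short sketch with hypothetical WDQ numbers:

```python
import math

def mdc95(sd, icc):
    """Minimal detectable change at the 95% confidence level.

    SEM   = SD * sqrt(1 - ICC)
    MDC95 = 1.96 * sqrt(2) * SEM
    """
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# Hypothetical example: SD of baseline WDQ scores = 25 points, ICC = 0.88.
print(f"MDC95 = {mdc95(25.0, 0.88):.1f} points")
```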

  11. A critical evaluation of deterministic methods in size optimisation of reliable and cost effective standalone hybrid renewable energy systems

    International Nuclear Information System (INIS)

    Maheri, Alireza

    2014-01-01

    Reliability of a hybrid renewable energy system (HRES) strongly depends on various uncertainties affecting the amount of power produced by the system. In the design of systems subject to uncertainties, both deterministic and nondeterministic design approaches can be adopted. In a deterministic design approach, the designer considers the presence of uncertainties and incorporates them indirectly into the design by applying safety factors. It is assumed that, by employing suitable safety factors and considering worst-case scenarios, reliable systems can be designed. In fact, the multi-objective optimisation problem with two objectives of reliability and cost is reduced to a single-objective optimisation problem with the objective of cost only. In this paper the competence of deterministic design methods in size optimisation of reliable standalone wind–PV–battery, wind–PV–diesel and wind–PV–battery–diesel configurations is examined. For each configuration, first, using different values of safety factors, the optimal size of the system components which minimises the system cost is found deterministically. Then, for each case, using a Monte Carlo simulation, the effect of safety factors on the reliability and the cost is investigated. In performing the reliability analysis, several reliability measures, namely unmet load, blackout durations (total, maximum and average) and mean time between failures, are considered. It is shown that the traditional methods of considering the effect of uncertainties in deterministic designs, such as designing for an autonomy period and employing safety factors, have either little or unpredictable impact on the actual reliability of the designed wind–PV–battery configuration. In the case of wind–PV–diesel and wind–PV–battery–diesel configurations it is shown that, while using a high-enough margin of safety in sizing the diesel generator leads to reliable systems, the optimum value for this margin of safety leading to a
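
    The Monte Carlo check described above can be illustrated with a heavily reduced model: hourly wind/PV output and demand are perturbed by random errors, a deterministic safety factor scales the installed renewable capacity, and the fraction of simulated hours with unmet load is recorded. All profiles, error levels and safety factors below are hypothetical, and storage and diesel back-up are ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

def unmet_load_fraction(safety_factor, n_runs=2000, hours=24):
    """Fraction of simulated hours in which renewable output misses demand."""
    # Nominal hourly profiles (kW) -- invented shapes.
    demand_nom = 10 + 5 * np.sin(np.linspace(0, 2 * np.pi, hours))
    wind_nom = np.full(hours, 6.0)
    pv_nom = np.clip(8 * np.sin(np.linspace(0, np.pi, hours)), 0, None)

    unmet_hours = 0
    for _ in range(n_runs):
        # Uncertainty: multiplicative errors on resources, a smaller one on demand.
        wind = wind_nom * safety_factor * rng.lognormal(0.0, 0.3, hours)
        pv = pv_nom * safety_factor * rng.lognormal(0.0, 0.2, hours)
        demand = demand_nom * rng.normal(1.0, 0.1, hours)
        unmet_hours += np.sum(wind + pv < demand)
    return unmet_hours / (n_runs * hours)

for sf in (1.0, 1.2, 1.5):
    print(f"safety factor {sf:.1f}: unmet-load fraction = "
          f"{unmet_load_fraction(sf):.3f}")
```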

  12. A reliability design method for a lithium-ion battery pack considering the thermal disequilibrium in electric vehicles

    Science.gov (United States)

    Xia, Quan; Wang, Zili; Ren, Yi; Sun, Bo; Yang, Dezhen; Feng, Qiang

    2018-05-01

    With the rapid development of lithium-ion battery technology in the electric vehicle (EV) industry, the lifetime of the battery cell has increased substantially; however, the reliability of the battery pack is still inadequate. Because of the complexity of the battery pack, a reliability design method for a lithium-ion battery pack considering the thermal disequilibrium is proposed in this paper based on cell redundancy. Based on this method, a three-dimensional electric-thermal-flow-coupled model, a stochastic degradation model of cells under field dynamic conditions and a multi-state system reliability model of a battery pack are established. The relationships between the multi-physics coupling model, the degradation model and the system reliability model are first constructed to analyze the reliability of the battery pack, followed by analysis examples with different redundancy strategies. By comparing the reliability of battery packs with different redundant cell numbers and configurations, several conclusions for the redundancy strategy are obtained. More notably, the reliability does not increase monotonically with the number of redundant cells, because of the thermal disequilibrium effects. In this work, the 6 × 5 parallel-series configuration is the optimal system structure in terms of reliability. In addition, the effects of the cell arrangement and cooling conditions are investigated.
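
    Setting the thermal coupling aside, the baseline effect of cell redundancy on a series-parallel pack can be written down with a simple k-out-of-n binomial model, sketched below with an assumed (hypothetical) cell reliability. This simple model is monotone in the number of redundant cells; the non-monotonic behaviour reported in the paper comes from the thermal disequilibrium that the sketch deliberately ignores.

```python
from math import comb

def group_reliability(n, k, p):
    """P(at least k of n parallel cells survive), cells i.i.d. with reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def pack_reliability(m_series, n_parallel, k_required, p_cell):
    """Series string of m groups; the pack works only if every group works."""
    return group_reliability(n_parallel, k_required, p_cell) ** m_series

# Hypothetical cell reliability over the mission time.
p_cell = 0.95
for n in (4, 5, 6):
    # Six series groups, each needing at least 4 working parallel cells.
    print(f"6 x {n} pack, >=4 cells per group: "
          f"R = {pack_reliability(6, n, 4, p_cell):.4f}")
```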

  13. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  14. A review on exudates detection methods for diabetic retinopathy.

    Science.gov (United States)

    Joshi, Shilpa; Karule, P T

    2018-01-01

    The presence of exudates on the retina is the most characteristic symptom of diabetic retinopathy. As exudates are among the early clinical signs of DR, their detection would be an essential asset to the mass screening task and serve as an important step towards automatic grading and monitoring of the disease. Reliable identification and classification of exudates are of inherent interest in an automated diabetic retinopathy screening system. Here we review the numerous methods used in earlier studies for automatic exudate detection, with the aim of providing decision support in addition to reducing the workload of an ophthalmologist. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  15. Reliability and validity in measurement of true humeral retroversion by a three-dimensional cylinder fitting method.

    Science.gov (United States)

    Saka, Masayuki; Yamauchi, Hiroki; Hoshi, Kenji; Yoshioka, Toru; Hamada, Hidetoshi; Gamada, Kazuyoshi

    2015-05-01

    Humeral retroversion is defined as the orientation of the humeral head relative to the distal humerus. Because none of the previous methods used to measure humeral retroversion strictly follow this definition, values obtained by these techniques vary and may be biased by morphologic variations of the humerus. The purpose of this study was 2-fold: to validate a method to define the axis of the distal humerus with a virtual cylinder and to establish the reliability of 3-dimensional (3D) measurement of humeral retroversion by this cylinder fitting method. Humeral retroversion in 14 baseball players (28 humeri) was measured by the 3D cylinder fitting method. The root mean square error was calculated to compare values obtained by a single tester and by 2 different testers using the embedded coordinate system. To establish the reliability, the intraclass correlation coefficient (ICC) and precision (standard error of measurement [SEM]) were calculated. The root mean square errors for the humeral coordinate system were small. The reliability and precision of the 3D measurement of retroversion yielded an intratester ICC of 0.99 (SEM, 1.0°) and an intertester ICC of 0.96 (SEM, 2.8°). The error in measurements obtained by the distal humerus cylinder fitting method was small enough not to affect retroversion measurement. The 3D measurement of retroversion by this method provides excellent intratester and intertester reliability. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  16. Conceptual transitions in methods of skull-photo superimposition that impact the reliability of identification: a review.

    Science.gov (United States)

    Jayaprakash, Paul T

    2015-01-01

    Establishing identification during skull-photo superimposition relies on correlating the salient morphological features of an unidentified skull with those of a face-image of a suspected dead individual using image overlay processes. Technical progression in the process of overlay has included the incorporation of video cameras, image-mixing devices and software that enables real-time vision-mixing. Conceptual transitions occur in the superimposition methods that involve 'life-size' images, that achieve orientation of the skull to the posture of the face in the photograph and that assess the extent of match. A recent report on the reliability of identification using the superimposition method adopted the currently prevalent methods and suggested an increased rate of failures when skulls were compared with related and unrelated face images. The reported reduction in the reliability of the superimposition method prompted a review of the transition in the concepts that are involved in skull-photo superimposition. The currently popular practices of visualizing the superimposed images at less than 'life-size', of relying on cranial and facial landmarks in the frontal plane when orienting the skull for matching, and of evaluating the match on a morphological basis by relying on mix-mode alone are the major departures in the methodology that may have reduced the identification reliability. The need to reassess the reliability of the method that incorporates the concepts which have been considered appropriate by the practitioners is stressed. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Review on the application of physiological and biomechanical measurement methods in driving fatigue detection

    Directory of Open Access Journals (Sweden)

    Kadek Heri Sanjaya

    2016-07-01

    Previous studies have identified driving fatigue as the main cause of road traffic accidents; the aim of this literature review is therefore to explore the physical and mental characteristics of driving fatigue, as well as the technology available to measure the fatigue process physiologically. We performed electronic searches in the field of fatigue detection methods through keyword tracking. The instruments studied have their own strengths and weaknesses, and some are intrusive while others are non-intrusive. The accuracy and stability of the measurements also vary between instruments. In order to create more reliable fatigue detection methods, it is necessary to involve more instruments within an inter-disciplinary approach. Our intention is for this study to serve as a stepping stone towards a more comprehensive in-vehicle real-time man-machine interaction study. Such a study will not only be useful for preventing traffic accidents but also for bridging man and machine communication in vehicle control, along with developing newer technology in the field of vehicle automation.

  18. Detection of enterotoxigenic Clostridium perfringens in meat samples by using molecular methods.

    Science.gov (United States)

    Kaneko, Ikuko; Miyamoto, Kazuaki; Mimura, Kanako; Yumine, Natsuko; Utsunomiya, Hirotoshi; Akimoto, Shigeru; McClane, Bruce A

    2011-11-01

    To prevent food-borne bacterial diseases and to trace bacterial contamination events to foods, microbial source tracking (MST) methods provide important epidemiological information. To apply molecular methods to MST, it is necessary not only to amplify bacterial cells to detection limit levels but also to prepare DNA with reduced inhibitory compounds and contamination. Isolates carrying the Clostridium perfringens enterotoxin gene (cpe) on the chromosome or a plasmid rank among the most important food-borne pathogens. Previous surveys indicated that cpe-positive C. perfringens isolates are present in only ∼5% of nonoutbreak food samples and then only at low numbers, usually less than 3 cells/g. In this study, four molecular assays for the detection of cpe-positive C. perfringens isolates, i.e., ordinary PCR, nested PCR, real-time PCR, and loop-mediated isothermal amplification (LAMP), were developed and evaluated for their reliability using purified DNA. For use in the artificial contamination of meat samples, DNA templates were prepared by three different commercial DNA preparation kits. The four molecular assays always detected cpe when >10³ cells/g of cpe-positive C. perfringens were present, using any kit. Of three tested commercial DNA preparation kits, the InstaGene matrix kit appeared to be most suitable for the testing of a large number of samples. By using the InstaGene matrix kit, the four molecular assays efficiently detected cpe using DNA prepared from enrichment culture specimens of meat samples contaminated with low numbers of cpe-positive C. perfringens vegetative cells or spores. Overall, the current study developed molecular assay protocols for MST to detect the contamination of foods with low numbers of cells, and at a low frequency, of cpe-positive C. perfringens isolates.

  19. Automatically generated acceptance test: A software reliability experiment

    Science.gov (United States)

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.

  20. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, ranging from birth to death of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis: it provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds along the various stages of the product life cycle and the associated reliability activities. Reliability data on Systems, Structures and Components (SSCs) in nuclear power plants is a key input to probabilistic safety assessment (PSA), reliability centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering to represent manufacturing and delivery times. It is commonly used to model time to fail, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in nuclear power plants, and an example is given to present the results of the new method. The Weibull distribution has a very strong ability to fit reliability data for mechanical equipment in nuclear power plants and is a widely used mathematical model for reliability analysis. The most commonly used forms are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it can reflect the reliability characteristics of the equipment and is closer to the actual situation. (author)
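
    As an illustration of the kind of fitting discussed above, scipy can fit a three-parameter Weibull (shape, location, scale) to a set of failure times by maximum likelihood. The times below are synthetic, not plant data, and this is not the paper's improved formulation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic times-to-failure (hours): three-parameter Weibull with a
# failure-free period (location) of 500 h, scale 2000 h and shape 1.8.
ttf = stats.weibull_min.rvs(1.8, loc=500.0, scale=2000.0, size=200,
                            random_state=rng)

# Maximum-likelihood fit of the three-parameter model.
shape, loc, scale = stats.weibull_min.fit(ttf)
print(f"fitted shape = {shape:.2f}, location = {loc:.0f} h, scale = {scale:.0f} h")

# Reliability (survival probability) at 3000 h under the fitted model.
print(f"R(3000 h) = {stats.weibull_min.sf(3000, shape, loc=loc, scale=scale):.3f}")
```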

  1. Reliable allele detection using SNP-based PCR primers containing Locked Nucleic Acid: application in genetic mapping

    Directory of Open Access Journals (Sweden)

    Trognitz Friederike

    2007-02-01

    Background: The diploid Solanum caripense, a wild relative of potato and tomato, possesses valuable resistance to potato late blight, and we are interested in the genetic basis of this resistance. Due to extremely low levels of genetic variation within the S. caripense genome it proved impossible to generate a dense genetic map and to assign individual Solanum chromosomes through the use of conventional chromosome-specific SSR, RFLP and AFLP markers, as well as gene- or locus-specific markers. The ease of detection of DNA polymorphisms depends on both the frequency and the form of sequence variation. The narrow genetic background of close relatives and inbreds complicates the detection of persisting, reduced polymorphism and is a challenge to the development of reliable molecular markers. Nonetheless, monomorphic DNA fragments representing not directly usable conventional markers can contain considerable variation at the level of single nucleotide polymorphisms (SNPs). This can be used for the design of allele-specific molecular markers. The reproducible detection of allele-specific markers based on SNPs has been a technical challenge. Results: We present a fast and cost-effective protocol for the detection of allele-specific SNPs by applying Sequence Polymorphism-Derived (SPD) markers. These markers proved highly efficient for fingerprinting of individuals possessing a homogeneous genetic background. SPD markers are obtained from within non-informative, conventional molecular marker fragments that are screened for SNPs to design allele-specific PCR primers. The method makes use of primers containing a single, 3'-terminal Locked Nucleic Acid (LNA) base. We demonstrate the applicability of the technique by the successful genetic mapping of allele-specific SNP markers derived from monomorphic Conserved Ortholog Set II (COSII) markers mapped to Solanum chromosomes, in S. caripense. By using SPD markers it was possible for the first time to map the S. caripense alleles

  2. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
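
    Outside SPSS, the same post-EM steps can be sketched directly: given an EM-estimated correlation matrix (however obtained), one can compute a standardized Cronbach's alpha and a rough principal-components-style factor extraction. The matrix below is an assumed, hypothetical EM result, and this is not the FACTOR/RELIABILITY machinery itself.

```python
import numpy as np

# Assumed EM-estimated correlation matrix for four items (hypothetical values).
R = np.array([[1.00, 0.55, 0.48, 0.40],
              [0.55, 1.00, 0.52, 0.45],
              [0.48, 0.52, 1.00, 0.50],
              [0.40, 0.45, 0.50, 1.00]])
k = R.shape[0]

# Standardized Cronbach's alpha from the mean inter-item correlation.
r_bar = R[~np.eye(k, dtype=bool)].mean()
alpha = k * r_bar / (1 + (k - 1) * r_bar)
print(f"standardized alpha = {alpha:.3f}")

# Principal-components-style extraction: loadings for eigenvalues > 1.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print(f"{n_factors} component(s); loadings:\n{np.round(loadings, 3)}")
```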

  3. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

    Drivers' workload tends to increase in complicated traffic environments, for example during a lane change. In such cases, rear collision warning is effective in reducing cognitive workload. On the other hand, it has been pointed out that false or missing alarms caused by sensor errors decrease the driver's trust in the warning system, which can result in low efficiency of the system. Suppose that reliability information for the sensor is provided in real time. In this paper, we propose a new warning method that uses this sensor reliability information to increase the driver's trust in the system even when sensor reliability is low. The effectiveness of the warning method is shown by driving simulator experiments.

  4. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

    Since the 1970s advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and still most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-peak amplitude and duration. Wavelet parameters are very appropriate to AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.

  5. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

    Imprecise probabilities, developed over the last two decades, offer a considerably more general theory with many advantages that make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behaviour of nuclear industry objects to be modelled more comprehensively, and that they make it possible to solve some problems that cannot be solved within the conventional approach. Furthermore, some specific examples are given which show the usefulness of the tool for solving reliability tasks

  6. A new s-adenosylhomocysteine hydrolase-linked method for adenosine detection based on DNA-templated fluorescent Cu/Ag nanoclusters.

    Science.gov (United States)

    Ahn, Jun Ki; Kim, Hyo Yong; Baek, Songyi; Park, Hyun Gyu

    2017-07-15

    We herein describe a novel fluorescent method for the rapid and selective detection of adenosine by utilizing DNA-templated Cu/Ag nanoclusters (NCs) and employing s-adenosylhomocysteine hydrolase (SAHH). SAHH promotes the hydrolysis of s-adenosylhomocysteine (SAH) and consequently produces homocysteine, which quenches the fluorescence signal from the DNA-templated Cu/Ag nanoclusters employed as the signaling probe in this study. Adenosine, on the other hand, significantly inhibits the hydrolysis reaction and prevents the formation of homocysteine. Consequently, a highly enhanced fluorescence signal from the DNA-Cu/Ag NCs is retained, which can be used to identify the presence of adenosine. By employing this design principle, adenosine was sensitively detected down to 19 nM with high specificity over other adenosine analogs such as AMP, ADP, ATP, cAMP, guanosine, cytidine, and uridine. Finally, the diagnostic capability of this method was successfully verified by reliably detecting adenosine present in a real human serum sample. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard for examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was documented twice at less than 50%. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was shown by the kappa values (0.53-0.86). With regard to validity, moderate agreement was reported between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that many other growth indicators should be taken into consideration in evaluating adolescent skeletal maturation.

  8. Improved GLR method to instrument failure detection

    International Nuclear Information System (INIS)

    Jeong, Hak Yeoung; Chang, Soon Heung

    1985-01-01

    The generalized likelihood ratio (GLR) method performs statistical tests on the innovations sequence of a Kalman-Bucy filter state estimator for system failure detection and identification. However, the major drawback of the conventional GLR is that a particular failure type must be hypothesized in each case. In this paper, a method to overcome this drawback is proposed. The improved GLR method is applied to a PWR pressurizer and gives successful results in the detection and identification of any failure. Furthermore, some benefit in the processing time per cycle of failure detection and identification is also obtained. (Author)
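
    A common textbook reduction of the GLR idea is a scalar test for a step bias in the filter's innovation sequence: for a hypothesised onset time, the likelihood ratio maximised over the unknown bias magnitude collapses to a squared cumulative sum. The sketch below implements only that reduced scalar form on assumed white, unit-variance innovations; it is not the Kalman-Bucy filtering or the improved GLR variant of the paper.

```python
import numpy as np

def glr_step_bias(innov, sigma=1.0, window=50):
    """GLR statistic for a step bias in white Gaussian innovations.

    For onset time k, the log-likelihood ratio maximised over the unknown
    bias magnitude is (sum_{t=k..N} innov_t)^2 / (2 * sigma^2 * (N - k + 1)).
    The returned statistic is the maximum over onsets within `window`.
    """
    recent = np.asarray(innov[-window:], dtype=float)
    n = len(recent)
    return max((recent[k:].sum()) ** 2 / (2.0 * sigma ** 2 * (n - k))
               for k in range(n))

rng = np.random.default_rng(2)
clean = rng.normal(0.0, 1.0, 200)      # healthy innovation sequence
faulty = clean.copy()
faulty[150:] += 1.5                    # instrument bias appears at t = 150
print(f"GLR (no fault)  = {glr_step_bias(clean):.1f}")
print(f"GLR (with bias) = {glr_step_bias(faulty):.1f}")
```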

  9. Thermoelectric SQUID method for the detection of segregations

    Science.gov (United States)

    Hinken, Johann H.; Tavrin, Yury

    2000-05-01

    Aero engine turbine discs are highly critical parts. Material inhomogeneities can cause disc fractures during flight, with fatal air disasters as a consequence. Nondestructive testing (NDT) of the discs at various machining steps is necessary and is performed as well as possible. Conventional NDT methods, however, such as eddy current testing and ultrasonic testing, have unacceptable limitations. For example, subsurface segregations often cannot be detected directly, but only indirectly once cracks have already developed from them, which may be too late. A new NDT method, which we call the Thermoelectric SQUID Method, has been developed. It allows for the detection of metallic inclusions within non-ferromagnetic metallic base material. This paper describes the results of a feasibility study on aero engine turbine discs made from Inconel® 718. These contained segregations that had previously been detected by anodic etching. With the Thermoelectric SQUID Method, these segregations were detected again, and further segregations below the surface were found that had not been detected before. For this new NDT method the disc material is quasi-transparent. The Thermoelectric SQUID Method is also useful for detecting distributed and localized inhomogeneities in pure metals such as niobium sheets for particle accelerators.

  10. Distance Measurement Methods for Improved Insider Threat Detection

    Directory of Open Access Journals (Sweden)

    Owen Lo

    2018-01-01

    Insider threats are a considerable problem within cyber security, and it is often difficult to detect these threats using signature detection. Increasingly, machine learning can provide a solution, but such methods often fail to take into account changes in the behaviour of users. This work builds on a published method of detecting insider threats, applies a Hidden Markov method to a CERT data set (CERT r4.2), and analyses a number of distance measures (Damerau–Levenshtein Distance, Cosine Distance, and Jaccard Distance) in order to detect changes of behaviour, which are shown to have success in determining different insider threats.
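
    The three distances named above are easy to state concretely on, for example, a user's daily action sequence. The sketch below implements Jaccard and cosine distance on action counts, and the Damerau-Levenshtein distance (optimal string alignment variant) on the ordered sequence; the toy sequences are invented and the CERT feature engineering of the paper is not reproduced.

```python
from collections import Counter
import math

def jaccard_distance(a, b):
    """1 - |A intersect B| / |A union B| over the sets of distinct actions."""
    sa, sb = set(a), set(b)
    return (1.0 - len(sa & sb) / len(sa | sb)) if (sa | sb) else 0.0

def cosine_distance(a, b):
    """1 - cosine similarity of the action-count vectors."""
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[x] * cb[x] for x in ca)
    norm = math.sqrt(sum(v * v for v in ca.values())) * \
           math.sqrt(sum(v * v for v in cb.values()))
    return (1.0 - dot / norm) if norm else 1.0

def damerau_levenshtein(a, b):
    """Optimal string alignment distance: insert/delete/substitute/transpose."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i
    for j in range(len(b) + 1):
        d[0][j] = j
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[len(a)][len(b)]

# Toy daily action sequences for one user on two different days.
day1 = ["logon", "email", "web", "usb", "logoff"]
day2 = ["logon", "web", "email", "file_copy", "logoff"]
print("Jaccard :", round(jaccard_distance(day1, day2), 3))
print("Cosine  :", round(cosine_distance(day1, day2), 3))
print("DL (OSA):", damerau_levenshtein(day1, day2))
```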

  11. Prevent cervical cancer by screening with reliable human papillomavirus detection and genotyping

    International Nuclear Information System (INIS)

    Ge, Shichao; Gong, Bo; Cai, Xushan; Yang, Xiaoer; Gan, Xiaowei; Tong, Xinghai; Li, Haichuan; Zhu, Meijuan; Yang, Fengyun; Zhou, Hongrong; Hong, Guofan

    2012-01-01

    The incidence of cervical cancer is expected to rise sharply in China. A reliable routine human papillomavirus (HPV) detection and genotyping test to be supplemented by the limited Papanicolaou cytology facilities is urgently needed to help identify the patients with cervical precancer for preventive interventions. To this end, we evaluated a nested polymerase chain reaction (PCR) protocol for detection of HPV L1 gene DNA in cervicovaginal cells. The PCR amplicons were genotyped by direct DNA sequencing. In parallel, split samples were subjected to a Digene HC2 HPV test which has been widely used for “cervical cancer risk” screen. Of the 1826 specimens, 1655 contained sufficient materials for analysis and 657 were truly negative. PCR/DNA sequencing showed 674 infected by a single high-risk HPV, 188 by a single low-risk HPV, and 136 by multiple HPV genotypes with up to five HPV genotypes in one specimen. In comparison, the HC2 test classified 713 specimens as infected by high-risk HPV, and 942 as negative for HPV infections. The high-risk HC2 test correctly detected 388 (57.6%) of the 674 high-risk HPV isolates in clinical specimens, mislabeled 88 (46.8%) of the 188 low-risk HPV isolates as high-risk genotypes, and classified 180 (27.4%) of the 657 “true-negative” samples as being infected by high-risk HPV. It was found to cross-react with 20 low-risk HPV genotypes. We conclude that nested PCR detection of HPV followed by short target DNA sequencing can be used for screening and genotyping to formulate a paradigm in clinical management of HPV-related disorders in a rapidly developing economy

  12. Comparative analysis of methods for detecting interacting loci.

    Science.gov (United States)

    Chen, Li; Yu, Guoqiang; Langefeld, Carl D; Miller, David J; Guy, Richard T; Raghuram, Jayaram; Yuan, Xiguo; Herrington, David M; Wang, Yue

    2011-07-05

    Interactions among genetic loci are believed to play an important role in disease risk. While many methods have been proposed for detecting such interactions, their relative performance remains largely unclear, mainly because different data sources, detection performance criteria, and experimental protocols were used in the papers introducing these methods and in subsequent studies. Moreover, there have been very few studies strictly focused on comparison of existing methods. Given the importance of detecting gene-gene and gene-environment interactions, a rigorous, comprehensive comparison of performance and limitations of available interaction detection methods is warranted. We report a comparison of eight representative methods, of which seven were specifically designed to detect interactions among single nucleotide polymorphisms (SNPs), with the last a popular main-effect testing method used as a baseline for performance evaluation. The selected methods, multifactor dimensionality reduction (MDR), full interaction model (FIM), information gain (IG), Bayesian epistasis association mapping (BEAM), SNP harvester (SH), maximum entropy conditional probability modeling (MECPM), logistic regression with an interaction term (LRIT), and logistic regression (LR) were compared on a large number of simulated data sets, each, consistent with complex disease models, embedding multiple sets of interacting SNPs, under different interaction models. The assessment criteria included several relevant detection power measures, family-wise type I error rate, and computational complexity. There are several important results from this study. First, while some SNPs in interactions with strong effects are successfully detected, most of the methods miss many interacting SNPs at an acceptable rate of false positives. In this study, the best-performing method was MECPM. Second, the statistical significance assessment criteria, used by some of the methods to control the type I error rate

  14. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong; Sundaramoorthi, Ganesh

    2017-11-09

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to the background motion at some unknown time in the video, and the goal is to detect and segment the object as soon as it moves, in an online manner. Due to the unreliability of motion between frames, more than two frames are needed to reliably detect the object. Our method is designed to detect the object(s) with minimum delay, i.e., the fewest frames after the object moves, subject to a constraint on false alarms. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than the existing state of the art.

  17. Fault tolerant control of a three-phase three-wire shunt active filter system based on reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Poure, P. [Laboratoire d' Instrumentation Electronique de Nancy LIEN, EA 3440, Nancy-Universite, Faculte des Sciences et Techniques, BP 239, 54506 Vandoeuvre Cedex (France); Weber, P.; Theilliol, D. [Centre de Recherche en Automatique de Nancy UMR 7039, Nancy-Universite, CNRS, Faculte des Sciences et Techniques, BP 239, 54506 Vandoeuvre Cedex (France); Saadate, S. [Groupe de Recherches en Electrotechnique et Electronique de Nancy UMR 7037, Nancy-Universite, CNRS, Faculte des Sciences et Techniques, BP 239, 54506 Vandoeuvre Cedex (France)

    2009-02-15

    This paper deals with fault-tolerant shunt three-phase three-wire active filter topologies, for which reliability is very important in industrial applications. The determination of the optimal reconfiguration structure, among various candidates with or without redundant components, is discussed on the basis of reliability criteria. First, the reconfiguration of the inverter is detailed and a fast fault diagnosis method for power semiconductor or driver fault detection and compensation is presented. This method avoids false fault detection due to power semiconductor switching. The control architecture and algorithm are studied and a fault-tolerant control strategy is considered. Simulation results for open- and short-circuit cases validate the theoretical study. Finally, the reliability of the studied three-phase three-wire shunt active filter topologies is analyzed to determine the optimal one. (author)

  18. Investigation of Detectability of Elementary Composition of Rainbow trout muscle with EDS (Energy Dispersive Spectroscopy) Method

    Directory of Open Access Journals (Sweden)

    Saltuk Buğrahan CEYHUN

    2017-06-01

    In the present study, the detectability of the elemental composition of rainbow trout muscle using Energy Dispersive Spectroscopy (EDS) was investigated. An EDS system attached to a scanning electron microscope can perform qualitative and semi-quantitative elemental analyses of a selected region of a sample using characteristic X-rays. For this purpose, four point analyses and two mapping analyses were performed on four samples. According to the results, 13 elements were detected, with C, N and O accounting for 87.70%. In conclusion, although the method is sensitive and reliable, it is not adequate for elemental analysis on its own, but it can be used to support analyses performed with systems such as, in particular, atomic absorption and ICP-MS.

  19. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mAs) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mAs) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  20. Application of reflectance spectroscopies (FTIR-ATR & FT-NIR) coupled with multivariate methods for robust in vivo detection of begomovirus infection in papaya leaves

    Science.gov (United States)

    Haq, Quazi M. I.; Mabood, Fazal; Naureen, Zakira; Al-Harrasi, Ahmed; Gilani, Sayed A.; Hussain, Javid; Jabeen, Farah; Khan, Ajmal; Al-Sabari, Ruqaya S. M.; Al-khanbashi, Fatema H. S.; Al-Fahdi, Amira A. M.; Al-Zaabi, Ahoud K. A.; Al-Shuraiqi, Fatma A. M.; Al-Bahaisi, Iman M.

    2018-06-01

    Nucleic acid- and serology-based methods have revolutionized plant disease detection; however, they are not very reliable at the asymptomatic stage, especially in the case of pathogens with systemic infection, and in addition they need at least 1-2 days for sample harvesting, processing, and analysis. In this study, two reflectance spectroscopies, Near Infrared reflectance spectroscopy (NIR) and Fourier-Transform Infrared spectroscopy with Attenuated Total Reflection (FT-IR, ATR), coupled with multivariate exploratory methods such as Principal Component Analysis (PCA) and Partial Least Squares Discriminant Analysis (PLS-DA), have been deployed to detect begomovirus infection in papaya leaves. The application of these techniques demonstrates that they are very useful for robust in vivo detection of plant begomovirus infection. These methods are simple, sensitive, reproducible and precise, and do not require any lengthy sample preparation procedures.
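
    The chemometric pipeline described above (exploratory PCA followed by PLS-DA classification of infected versus healthy leaves) can be sketched with scikit-learn. The spectra below are synthetic stand-ins, the spectral preprocessing used in such studies (SNV, derivatives, etc.) is omitted, and PLS-DA is emulated as PLS regression on a 0/1 class label.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

# Synthetic "spectra": 40 leaves x 200 wavelengths; infected leaves receive a
# small extra absorption band so that the two classes are separable.
n, p = 40, 200
X = rng.normal(0.0, 1.0, (n, p))
y = np.repeat([0, 1], n // 2)                  # 0 = healthy, 1 = infected
band = np.exp(-0.5 * ((np.arange(p) - 120) / 5.0) ** 2)
X[y == 1] += 1.5 * band

# Exploratory PCA on the raw spectra.
pca = PCA(n_components=2).fit(X)
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))

# PLS-DA: PLS regression on the 0/1 label, class decided by a 0.5 threshold.
pls = PLSRegression(n_components=2).fit(X, y)
y_hat = (pls.predict(X).ravel() > 0.5).astype(int)
print("training accuracy:", (y_hat == y).mean())
```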

  1. A new, rapid and reliable method for the determination of reduced sulphur (S²⁻) species in natural water discharges

    Energy Technology Data Exchange (ETDEWEB)

    Montegrossi, Giordano [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy)]. E-mail: giordano@geo.unifi.it; Tassi, Franco [Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Vaselli, Orlando [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy); Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Bidini, Eva [Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Minissale, Angelo [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy)

    2006-05-15

    The determination of reduced S species in natural waters is particularly difficult due to their high instability and to chemical and physical interferences in the current analytical methods. In this paper a new, rapid and reliable analytical procedure is presented, named the Cd-IC method, for their determination as ΣS²⁻ via oxidation to SO₄²⁻ after chemical trapping with an ammonia-cadmium solution that allows precipitation of all the reduced S species as CdS. The S²⁻-derived SO₄²⁻ is analysed by ion chromatography. The main advantages of this method are: low cost, high stability of the CdS precipitate, absence of interferences, low detection limit (0.01 mg/L as SO₄²⁻ for 10 mL of water) and low analytical error (about 5%). The proposed method has been applied to more than 100 water samples from different natural systems (water discharges and cold wells from volcanic and geothermal areas, crater lakes) in central-southern Italy.

  2. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify the usability and applicability of the standard method

  3. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at the Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were interactively undertaken to verify its usability and applicability.

  4. Validity and reliability of the Thai version of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU)

    Directory of Open Access Journals (Sweden)

    Pipanmekaporn T

    2014-05-01

    Full Text Available Tanyong Pipanmekaporn,1 Nahathai Wongpakaran,2 Sirirat Mueankwan,3 Piyawat Dendumrongkul,2 Kaweesak Chittawatanarat,3 Nantiya Khongpheng,3 Nongnut Duangsoy3; 1Department of Anesthesiology, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 2Department of Psychiatry, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 3Division of Surgical Critical Care and Trauma, Department of Surgery, Chiang Mai University Hospital, Chiang Mai, Thailand. Purpose: The purpose of this study was to determine the validity and reliability of the Thai version of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU), when compared to the diagnoses made by delirium experts. Patients and methods: This was a cross-sectional study conducted in both surgical intensive care and subintensive care units in Thailand between February and June 2011. Seventy patients aged 60 years or older who had been admitted to the units were enrolled into the study within the first 48 hours of admission. Each patient was randomly assessed as to whether they had delirium by a nurse using the Thai version of the CAM-ICU algorithm (Thai CAM-ICU) or by a delirium expert using the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision. Results: The prevalence of delirium was found to be 18.6% (n=13) by the delirium experts. The sensitivity of the Thai CAM-ICU algorithm was found to be 92.3% (95% confidence interval [CI] = 64.0%-99.8%), while the specificity was 94.7% (95% CI = 85.4%-98.9%). The instrument displayed good interrater reliability (Cohen's κ = 0.81; 95% CI = 0.64-0.99). The time taken to complete the Thai CAM-ICU was 1 minute (interquartile range, 1-2 minutes). Conclusion: The Thai CAM-ICU demonstrated good validity, reliability, and ease of use when diagnosing delirium in a surgical intensive care unit setting. The use of this diagnostic tool should be encouraged for daily, routine use, so as to promote the early detection of delirium.
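
    The agreement statistics reported above can be reproduced from a simple 2x2 confusion table. The sketch below is illustrative only and uses hypothetical counts (not the study's raw data) to show how sensitivity, specificity and Cohen's kappa are computed in Python.

    # Hypothetical 2x2 table: rows = expert diagnosis, columns = Thai CAM-ICU result
    tp, fn = 12, 1    # delirium confirmed by experts: detected / missed by CAM-ICU
    fp, tn = 3, 54    # no delirium according to experts: false positives / true negatives

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)

    # Cohen's kappa: observed agreement corrected for chance agreement
    n = tp + fn + fp + tn
    p_observed = (tp + tn) / n
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_chance) / (1 - p_chance)

    print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} kappa={kappa:.3f}")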

  5. Microbleed detection using automated segmentation (MIDAS): a new method applicable to standard clinical MR images.

    Science.gov (United States)

    Seghier, Mohamed L; Kolanko, Magdalena A; Leff, Alexander P; Jäger, Hans R; Gregoire, Simone M; Werring, David J

    2011-03-23

    Cerebral microbleeds, visible on gradient-recalled echo (GRE) T2* MRI, have generated increasing interest as an imaging marker of small vessel diseases, with relevance for intracerebral bleeding risk or brain dysfunction. Manual rating methods have limited reliability and are time-consuming. We developed a new method for microbleed detection using automated segmentation (MIDAS) and compared it with a validated visual rating system. In thirty consecutive stroke service patients, standard GRE T2* images were acquired and manually rated for microbleeds by a trained observer. After spatially normalizing each patient's GRE T2* images into a standard stereotaxic space, the automated microbleed detection algorithm (MIDAS) identified cerebral microbleeds by explicitly incorporating an "extra" tissue class for abnormal voxels within a unified segmentation-normalization model. The agreement between manual and automated methods was assessed using the intraclass correlation coefficient (ICC) and Kappa statistic. We found that MIDAS had generally moderate to good agreement with the manual reference method for the presence of lobar microbleeds (Kappa = 0.43, improved to 0.65 after manual exclusion of obvious artefacts). Agreement for the number of microbleeds was very good for lobar regions: (ICC = 0.71, improved to ICC = 0.87). MIDAS successfully detected all patients with multiple (≥2) lobar microbleeds. MIDAS can identify microbleeds on standard MR datasets, and with an additional rapid editing step shows good agreement with a validated visual rating system. MIDAS may be useful in screening for multiple lobar microbleeds.

  6. Testing the sensitivity of the nested PCR method to detect Aspergillus fumigatus in experimentally infected sputum samples

    International Nuclear Information System (INIS)

    Ramadan, A.; Soukkaria, S.

    2013-01-01

    Fungal infections caused by Aspergillus species generally occupy second place among invasive fungal infections worldwide, especially A. fumigatus, which is considered the main cause of invasive aspergillosis (IA). Although IA rarely affects immunocompetent individuals, it can lead to death in immunocompromised patients. It is therefore necessary to diagnose the infection early in order to treat the disease efficiently. However, the conventional diagnostic tools currently used to detect the infection have low sensitivity and reliability. The spread of polymerase chain reaction (PCR) technology as a molecular, highly sensitive technique allowed a comparative study between the sensitivity of the traditional, currently used diagnostic method and nested PCR. The results for sputum samples experimentally infected with different concentrations of A. fumigatus spores, ranging from 10 to 10⁶ spores/ml, showed the high sensitivity and specificity of nested PCR in detecting the lower concentrations, compared with the traditional diagnostic method (culture on Sabouraud media), which was negative at all concentrations. (author)

  7. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semiautomated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (ICC 0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.
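
    For readers unfamiliar with Shrout-Fleiss intraclass correlation coefficients, the following minimal sketch computes a two-way random-effects, absolute-agreement, single-measure ICC, i.e. ICC(2,1), from an n-subjects x k-raters matrix. The ratings below are made up for illustration and are not the study's data.

    import numpy as np

    def icc_2_1(ratings):
        """ICC(2,1): two-way random effects, absolute agreement, single measure."""
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape                                  # n subjects, k raters/sessions
        grand = x.mean()
        ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
        ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
        ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Hypothetical peak contact stress (MPa) for 5 knees rated by 3 raters
    stress = [[4.1, 4.3, 4.2], [3.2, 3.1, 3.3], [5.0, 5.2, 5.1],
              [2.8, 2.9, 2.7], [4.6, 4.5, 4.7]]
    print(f"ICC(2,1) = {icc_2_1(stress):.3f}")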

  8. A comparison of two sleep spindle detection methods based on all night averages: individually adjusted versus fixed frequencies

    Directory of Open Access Journals (Sweden)

    Péter Przemyslaw Ujma

    2015-02-01

    Full Text Available Sleep spindles are frequently studied for their relationship with state and trait cognitive variables, and they are thought to play an important role in sleep-related memory consolidation. Due to their frequent occurrence in NREM sleep, the detection of sleep spindles is only feasible using automatic algorithms, of which a large number is available. We compared subject averages of the spindle parameters computed by a fixed frequency (11-13 Hz for slow spindles, 13-15 Hz for fast spindles automatic detection algorithm and the individual adjustment method (IAM, which uses individual frequency bands for sleep spindle detection. Fast spindle duration and amplitude are strongly correlated in the two algorithms, but there is little overlap in fast spindle density and slow spindle parameters in general. The agreement between fixed and manually determined sleep spindle frequencies is limited, especially in case of slow spindles. This is the most likely reason for the poor agreement between the two detection methods in case of slow spindle parameters. Our results suggest that while various algorithms may reliably detect fast spindles, a more sophisticated algorithm primed to individual spindle frequencies is necessary for the detection of slow spindles as well as individual variations in the number of spindles in general.
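
    As an illustration of the fixed-frequency approach described above, the sketch below band-pass filters an EEG trace in the 13-15 Hz fast-spindle band and flags segments whose amplitude envelope exceeds a threshold for a minimum duration. It is a generic toy detector with assumed parameters (sampling rate, amplitude threshold, minimum duration), not the algorithm evaluated in the study.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def detect_fast_spindles(eeg, fs=250.0, band=(13.0, 15.0), thresh_uv=12.0, min_dur=0.5):
        """Return (start, end) times in seconds of candidate fast spindles."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg)
        envelope = np.abs(hilbert(filtered))            # instantaneous amplitude
        idx = np.flatnonzero(envelope > thresh_uv)
        if idx.size == 0:
            return []
        runs = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)   # contiguous runs
        return [(r[0] / fs, r[-1] / fs) for r in runs if (r[-1] - r[0]) / fs >= min_dur]

    # Toy usage: 30 s of noise with a burst of 14 Hz activity between 10 and 11 s
    fs = 250.0
    t = np.arange(0, 30, 1 / fs)
    eeg = 5 * np.random.randn(t.size)
    burst = (t >= 10) & (t < 11)
    eeg[burst] += 25 * np.sin(2 * np.pi * 14 * t[burst])
    print(detect_fast_spindles(eeg, fs))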

  9. Reliable Detection and Smart Deletion of Malassez Counting Chamber Grid in Microscopic White Light Images for Microbiological Applications.

    Science.gov (United States)

    Denimal, Emmanuel; Marin, Ambroise; Guyot, Stéphane; Journaux, Ludovic; Molin, Paul

    2015-08-01

    In biology, hemocytometers such as Malassez slides are widely used and are effective tools for counting cells manually. In a previous work, a robust algorithm was developed for grid extraction in Malassez slide images. This algorithm was evaluated on a set of 135 images and grids were accurately detected in most cases, but there remained failures for the most difficult images. In this work, we present an optimization of this algorithm that allows for 100% grid detection and a 25% improvement in grid positioning accuracy. These improvements make the algorithm fully reliable for grid detection. This optimization also allows complete erasing of the grid without altering the cells, which eases their segmentation.
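
    The published algorithm is not reproduced here, but a much simpler projection-profile approach conveys the basic idea of locating dark grid lines in a bright-field image: sum pixel intensities along rows and columns and pick the pronounced minima. The function and threshold below are a hypothetical sketch, not the authors' method.

    import numpy as np

    def grid_line_positions(image, frac=0.9):
        """Rough row/column indices of dark grid lines in a grayscale image.
        A row (or column) is flagged when its mean intensity falls below
        `frac` times the global mean, i.e. it is noticeably darker than background."""
        img = np.asarray(image, dtype=float)
        row_lines = np.flatnonzero(img.mean(axis=1) < frac * img.mean())
        col_lines = np.flatnonzero(img.mean(axis=0) < frac * img.mean())
        return row_lines, col_lines

    # Toy image: bright background (200) with dark grid lines (50) every 50 px
    img = np.full((500, 500), 200.0)
    img[::50, :] = 50.0
    img[:, ::50] = 50.0
    rows, cols = grid_line_positions(img)
    print(rows[:5], cols[:5])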

  10. Autism detection in early childhood (ADEC): reliability and validity data for a Level 2 screening tool for autistic disorder.

    Science.gov (United States)

    Nah, Yong-Hwee; Young, Robyn L; Brewer, Neil; Berlingeri, Genna

    2014-03-01

    The Autism Detection in Early Childhood (ADEC; Young, 2007) was developed as a Level 2 clinician-administered autistic disorder (AD) screening tool that was time-efficient, suitable for children under 3 years, easy to administer, and suitable for persons with minimal training and experience with AD. A best estimate clinical Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) diagnosis of AD was made for 70 children using all available information and assessment results, except for the ADEC data. A screening study compared these children on the ADEC with 57 children with other developmental disorders and 64 typically developing children. Results indicated high internal consistency (α = .91). Interrater reliability and test-retest reliability of the ADEC were also adequate. ADEC scores reliably discriminated different diagnostic groups after controlling for nonverbal IQ and Vineland Adaptive Behavior Composite scores. Construct validity (using exploratory factor analysis) and concurrent validity using performance on the Autism Diagnostic Observation Schedule (Lord et al., 2000), the Autism Diagnostic Interview-Revised (Le Couteur, Lord, & Rutter, 2003), and DSM-IV-TR criteria were also demonstrated. Signal detection analysis identified the optimal ADEC cutoff score, with the ADEC identifying all children who had an AD (N = 70, sensitivity = 1.0) but overincluding children with other disabilities (N = 13, specificity ranging from .74 to .90). Together, the reliability and validity data indicate that the ADEC has potential to be established as a suitable and efficient screening tool for infants with AD.

  11. Machine Maintenance Scheduling with Reliability Engineering Method and Maintenance Value Stream Mapping

    Science.gov (United States)

    Sembiring, N.; Nasution, A. H.

    2018-02-01

    Corrective maintenance, i.e. replacing or repairing a machine component after the machine breaks down, is routinely carried out in manufacturing companies. It forces the production process to stop: production time decreases because the maintenance team must replace or repair the damaged machine component. This paper proposes a preventive maintenance schedule for a critical component of a critical machine in a crude palm oil and kernel company, in order to increase maintenance efficiency. Reliability engineering and maintenance value stream mapping are used as the method and tool to analyze the reliability of the component and to reduce waste in the process by segregating value-added and non-value-added activities.

  12. Reliable method for fission source convergence of Monte Carlo criticality calculation with Wielandt's method

    International Nuclear Information System (INIS)

    Yamamoto, Toshihiro; Miyoshi, Yoshinori

    2004-01-01

    A new algorithm of Monte Carlo criticality calculations for implementing Wielandt's method, which is one of the acceleration techniques for deterministic source iteration methods, is developed, and the algorithm can be successfully implemented into the MCNP code. In this algorithm, part of the fission neutrons emitted during random walk processes are tracked within the current cycle, and thus the fission source distribution used in the next cycle spreads more widely. Applying this method intensifies the neutron interaction effect even in a loosely-coupled array where conventional Monte Carlo criticality methods have difficulties, and a converged fission source distribution can be obtained with fewer cycles. Computing time spent for one cycle, however, increases because of tracking fission neutrons within the current cycle, which eventually results in an increase of the total computing time up to convergence. In addition, statistical fluctuations of the fission source distribution in a cycle are worsened by applying Wielandt's method to Monte Carlo criticality calculations. However, since fission source convergence is attained with fewer source iterations, a reliable determination of convergence can easily be made even in a system with slow convergence. This acceleration method is expected to contribute to the prevention of incorrect Monte Carlo criticality calculations. (author)
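
    The Monte Carlo algorithm itself is not reproduced here; as a deterministic illustration of why a Wielandt-type shift accelerates source convergence, the sketch below applies a shifted iteration to a small matrix eigenvalue problem, which converges to the dominant mode far faster than plain power iteration when the dominance ratio is close to one. The matrix and the shift value are invented for illustration; this is a schematic analogue, not the MCNP implementation described in the abstract.

    import numpy as np

    def wielandt_shifted_iteration(A, shift, tol=1e-10, max_iter=500):
        """Shifted (Wielandt-type) iteration x <- (shift*I - A)^{-1} x for the
        dominant eigenpair of A; `shift` is chosen slightly above the dominant eigenvalue."""
        n = A.shape[0]
        x = np.ones(n) / n
        lam = 0.0
        M = shift * np.eye(n) - A
        for _ in range(max_iter):
            y = np.linalg.solve(M, x)
            y /= np.linalg.norm(y)
            lam_new = y @ A @ y                 # Rayleigh quotient (y has unit length)
            if abs(lam_new - lam) < tol:
                return lam_new, y
            x, lam = y, lam_new
        return lam, x

    # Toy 3x3 "fission matrix" analogue with a slowly converging dominant mode
    A = np.array([[1.00, 0.95, 0.00],
                  [0.95, 1.00, 0.05],
                  [0.00, 0.05, 0.30]])
    lam, vec = wielandt_shifted_iteration(A, shift=2.1)
    print(f"dominant eigenvalue ~ {lam:.6f}")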

  13. Development and Establishment of Detection Method of Irradiated Foods

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Lee, Ju Woon; Kim, Dong Ho; Jo, Cheo Run; Kim, Jang Ho; Kim, Kyong Su

    2004-12-01

    The present project was related to the development and establishment of detection techniques for the safety management of gamma-irradiated food, and was particularly conducted to establish standard detection methods for gamma-irradiated dried spices and raw materials, dried meat and fish powder for processed foods, bean paste powder, red pepper paste powder, soy sauce powder, and starch for flavoring ingredients, as described in sections 3, 6 and 7 of the Korean Food Standard. Since the approval of gamma-irradiated food items will be expanded in line with the international trend toward gamma-irradiated food, it was concluded that establishing detailed detection methods for each food group is not efficient for the enactment and enforcement of the related regulations. For this reason, in order to establish the standard detection method, a detection system for gamma-irradiated food suitable for domestic operation was studied using a comparative analysis of domestic and foreign research data, classified by item and method, with the European Standard as a reference. According to the comparative analyses of domestic and foreign research data and regulations on the detection of gamma-irradiated food, it was concluded to be desirable that the optimal detection method should be decided after the principal detection tests, namely physical, chemical, and biological detection methods, are established as standard methods, and that specific details such as the pre-treatment of raw materials, the test methods, and the evaluation of results should be separately prescribed.

  14. Development and Establishment of Detection Method of Irradiated Foods

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Myung Woo; Lee, Ju Woon; Kim, Dong Ho; Jo, Cheo Run; Kim, Jang Ho; Kim, Kyong Su

    2004-12-15

    The present project was related to the development and establishment of detection techniques for the safety management of gamma-irradiated food, and was particularly conducted to establish standard detection methods for gamma-irradiated dried spices and raw materials, dried meat and fish powder for processed foods, bean paste powder, red pepper paste powder, soy sauce powder, and starch for flavoring ingredients, as described in sections 3, 6 and 7 of the Korean Food Standard. Since the approval of gamma-irradiated food items will be expanded in line with the international trend toward gamma-irradiated food, it was concluded that establishing detailed detection methods for each food group is not efficient for the enactment and enforcement of the related regulations. For this reason, in order to establish the standard detection method, a detection system for gamma-irradiated food suitable for domestic operation was studied using a comparative analysis of domestic and foreign research data, classified by item and method, with the European Standard as a reference. According to the comparative analyses of domestic and foreign research data and regulations on the detection of gamma-irradiated food, it was concluded to be desirable that the optimal detection method should be decided after the principal detection tests, namely physical, chemical, and biological detection methods, are established as standard methods, and that specific details such as the pre-treatment of raw materials, the test methods, and the evaluation of results should be separately prescribed.

  15. Using a Hybrid Cost-FMEA Analysis for Wind Turbine Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nacef Tazi

    2017-02-01

    Full Text Available Failure mode and effects analysis (FMEA) has been proven to be an effective methodology to improve system design reliability. However, the standard approach reveals some weaknesses when applied to wind turbine systems. The conventional criticality assessment method has been criticized as having many limitations, such as the weighting of severity and detection factors. In this paper, we aim to overcome these drawbacks and develop a hybrid cost-FMEA by integrating cost factors to assess the criticality; these costs range from replacement costs to expected failure costs. Then, a quantitative comparative study is carried out to point out the average failure rate, the main causes of failure, expected failure costs and failure detection techniques. A dedicated reliability analysis of the gearbox and rotor blades is also presented.
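
    As a minimal illustration of replacing severity/detection weightings with cost factors, the sketch below ranks hypothetical wind turbine subsystems by an expected annual failure cost, computed as failure rate times the sum of replacement and downtime costs. All component names and figures are invented for demonstration and do not come from the study.

    # Hypothetical data: failure rate (failures/year), replacement cost and
    # expected downtime cost per failure (arbitrary currency units)
    components = {
        "gearbox":      {"rate": 0.10, "replacement": 250_000, "downtime": 80_000},
        "rotor blades": {"rate": 0.15, "replacement": 180_000, "downtime": 60_000},
        "generator":    {"rate": 0.12, "replacement": 120_000, "downtime": 40_000},
        "converter":    {"rate": 0.25, "replacement":  40_000, "downtime": 15_000},
    }

    def cost_criticality(spec):
        """Expected yearly failure cost = rate * (replacement + downtime)."""
        return spec["rate"] * (spec["replacement"] + spec["downtime"])

    ranking = sorted(components.items(), key=lambda kv: cost_criticality(kv[1]), reverse=True)
    for name, spec in ranking:
        print(f"{name:13s} expected cost/year = {cost_criticality(spec):10,.0f}")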

  16. A fracture mechanics and reliability based method to assess non-destructive testings for pressure vessels

    International Nuclear Information System (INIS)

    Kitagawa, Hideo; Hisada, Toshiaki

    1979-01-01

    The effects of carrying out preservice and in-service nondestructive tests to secure the soundness, safety and maintainability of pressure vessels have not been evaluated quantitatively, although such tests require large expense and labor. In particular, the problems concerning the timing and interval of in-service inspections lack a reasonable, quantitative evaluation method. In this paper, these problems are treated with an analysis method developed on the basis of reliability technology and probability theory. The growth of surface cracks in pressure vessels was estimated using the results of previous studies. The effects of nondestructive inspection on the defects in pressure vessels were evaluated, and the influences of many factors, such as plate thickness, stress and the accuracy of inspection, on the effects of inspection, as well as a method of evaluating inspections at unequal intervals, were investigated. The reliability analysis taking in-service inspection into consideration, the evaluation of in-service inspection and other affecting factors through typical analysis examples, and the review concerning the timing of inspection are described. The method of analyzing the reliability of pressure vessels, considering the growth of defects and preservice and in-service nondestructive tests, was systematized so as to be practically usable. (Kako, I.)

  17. High Sensitive Methods for Health Monitoring of Compressor Blades and Fatigue Detection

    Science.gov (United States)

    Witoś, Mirosław

    2013-01-01

    The diagnostic and research aspects of compressor blade fatigue detection have been elaborated in the paper. The real maintenance and overhaul problems and the characteristics of different modes of metal blade fatigue (LCF, HCF, and VHCF) have been presented. The polycrystalline defects and impurities influencing the fatigue, along with the related surface finish techniques, are taken into account. Three experimental methods of structural health assessment are considered. The metal magnetic memory (MMM), experimental modal analysis (EMA) and tip timing (TTM) methods provide information on the damage of diagnosed objects, for example, compressor blades. Early damage symptoms, that is, magnetic and modal properties of material strengthening and weakening phases (change of local dislocation density and grain diameter, increase of structural and magnetic anisotropy), have been described. It has been proven that the shape of the resonance characteristic makes it possible to determine whether fatigue or a blade crack is involved. The capabilities of the methods for steel and titanium alloy blades have been illustrated with examples from active and passive experiments. In the conclusion, the MMM, EMA, and TTM methods have been verified, and their potential for reliable diagnosis of compressor blades has been confirmed. PMID:24191135

  18. High Sensitive Methods for Health Monitoring of Compressor Blades and Fatigue Detection

    Directory of Open Access Journals (Sweden)

    Mirosław Witoś

    2013-01-01

    Full Text Available The diagnostic and research aspects of compressor blade fatigue detection have been elaborated in the paper. The real maintenance and overhaul problems and the characteristics of different modes of metal blade fatigue (LCF, HCF, and VHCF) have been presented. The polycrystalline defects and impurities influencing the fatigue, along with the related surface finish techniques, are taken into account. Three experimental methods of structural health assessment are considered. The metal magnetic memory (MMM), experimental modal analysis (EMA) and tip timing (TTM) methods provide information on the damage of diagnosed objects, for example, compressor blades. Early damage symptoms, that is, magnetic and modal properties of material strengthening and weakening phases (change of local dislocation density and grain diameter, increase of structural and magnetic anisotropy), have been described. It has been proven that the shape of the resonance characteristic makes it possible to determine whether fatigue or a blade crack is involved. The capabilities of the methods for steel and titanium alloy blades have been illustrated with examples from active and passive experiments. In the conclusion, the MMM, EMA, and TTM methods have been verified, and their potential for reliable diagnosis of compressor blades has been confirmed.

  19. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  20. Image Processing Methods Usable for Object Detection on the Chessboard

    Directory of Open Access Journals (Sweden)

    Beran Ladislav

    2016-01-01

    Full Text Available Image segmentation and object detection are challenging problems in many areas of research. Although many algorithms for image segmentation have been invented, there is no single simple algorithm for image segmentation and object detection. Our research is based on a combination of several methods for object detection. The first method suitable for image segmentation and object detection is colour detection. This method is very simple, but it has a problem with differing colours: the colour of the segmented object must be precisely determined before any calculations, and in many cases it has to be determined manually. An alternative simple method is based on background removal, i.e. on the difference between a reference image and the image under analysis. In this paper several methods suitable for object detection are described. The research is focused on coloured object detection on a chessboard. The results of this research, combined with neural networks, will be applied to a user-computer checkers game.
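
    The background-removal idea mentioned above can be sketched in a few lines: subtract a reference image of the empty chessboard from the current frame and threshold the absolute difference to obtain a mask of newly placed pieces. The snippet below is a generic illustration with synthetic arrays and an assumed threshold, not the authors' implementation.

    import numpy as np

    def background_removal_mask(reference, current, thresh=30):
        """Binary mask of pixels that differ notably from the reference image."""
        diff = np.abs(current.astype(int) - reference.astype(int))
        return diff > thresh

    # Synthetic example: empty board vs. board with a dark "piece" in one square
    reference = np.full((400, 400), 180, dtype=np.uint8)
    current = reference.copy()
    current[100:140, 100:140] = 40          # a piece appears
    mask = background_removal_mask(reference, current)
    print("changed pixels:", int(mask.sum()))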

  1. Application of fuzzy-MOORA method: Ranking of components for reliability estimation of component-based software systems

    Directory of Open Access Journals (Sweden)

    Zeeshan Ali Siddiqui

    2016-01-01

    Full Text Available Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. As hardware systems are presently constructed from kits of parts, software systems may also be assembled from components. It is more reliable to reuse software than to create it. It is the glue code and the reliability of the individual components that contribute to the reliability of the overall system. Every component contributes to overall system reliability according to the number of times it is used (its usage frequency), and some components are of critical usage. The usage frequency determines the weight of each component, and according to these weights each component contributes to the overall reliability of the system. Therefore, a ranking of components may be obtained by analyzing their reliability impact on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis (Fuzzy-MOORA). The method helps to find the most suitable alternative (software component) from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision-making problems that have imprecise and vague evaluation data. By the use of ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and the dimensionless measurement realizes the ranking of components for estimating CBSS reliability in a non-subjective way. Finally, three case studies are shown to illustrate the use of the proposed technique.
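
    To make the ratio-analysis step concrete, the sketch below implements a crisp (non-fuzzy) MOORA ranking: each criterion column is normalized by its Euclidean norm and weighted, beneficial criteria are summed while cost criteria are subtracted, and components are ranked by the resulting score. The component names, criteria, weights and scores are invented, and the fuzzy extension described in the paper is not reproduced.

    import numpy as np

    def moora_rank(matrix, weights, benefit):
        """Crisp MOORA: vector-normalize, weight, then sum benefit minus cost criteria."""
        X = np.asarray(matrix, dtype=float)
        norm = X / np.sqrt((X ** 2).sum(axis=0))        # ratio-system normalization
        weighted = norm * np.asarray(weights, dtype=float)
        sign = np.where(benefit, 1.0, -1.0)
        scores = (weighted * sign).sum(axis=1)
        return np.argsort(-scores), scores              # descending ranking

    # Hypothetical components scored on: usage frequency, criticality, defect density
    components = ["parser", "auth", "report", "cache"]
    matrix = [[0.9, 0.8, 0.10],
              [0.7, 0.9, 0.05],
              [0.4, 0.5, 0.20],
              [0.6, 0.3, 0.15]]
    weights = [0.5, 0.3, 0.2]
    benefit = [True, True, False]      # defect density is a cost criterion
    order, scores = moora_rank(matrix, weights, benefit)
    for rank, idx in enumerate(order, start=1):
        print(rank, components[idx], round(scores[idx], 4))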

  2. A comparative study of cultural methods for the detection of Salmonella in feed and feed ingredients

    Directory of Open Access Journals (Sweden)

    Haggblom Per

    2009-02-01

    Full Text Available Abstract Background: Animal feed as a source of infection for food-producing animals is much debated. In order to increase our present knowledge about possible feed transmission, it is important to know that the present isolation methods for Salmonella are reliable also for feed materials. In a comparative study, the ability of the standard method used for the isolation of Salmonella in feed in the Nordic countries, the NMKL71 method (Nordic Committee on Food Analysis), was compared to the Modified Semisolid Rappaport Vassiliadis method (MSRV) and the international standard method (EN ISO 6579:2002). Five different feed materials were investigated, namely wheat grain, soybean meal, rape seed meal, palm kernel meal and pellets of pig feed, as well as scrapings from a feed mill elevator. Four different levels of the Salmonella serotypes S. Typhimurium, S. Cubana and S. Yoruba were added to each feed material, respectively. For all methods, pre-enrichment in Buffered Peptone Water (BPW) was carried out, followed by enrichment in the different selective media and finally plating on selective agar media. Results: The results obtained with all three methods showed no differences in detection levels, with an accuracy and sensitivity of 65% and 56%, respectively. However, Müller-Kauffmann tetrathionate-novobiocin broth (MKTTn) performed less well due to many false-negative results on Brilliant Green agar (BGA) plates. Compared to the other feed materials, palm kernel meal showed a higher detection level with all serotypes and methods tested. Conclusion: The results of this study showed that the accuracy, sensitivity and specificity of the investigated cultural methods were equivalent. However, the detection levels for different feed and feed ingredients varied considerably.

  3. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    Science.gov (United States)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems focuses on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines and without analysing the effect on power grid reliability. This leads to lightning protection designs that are excessive for general transmission lines but insufficient for key lines. In order to solve this problem, an analysis method for lightning strikes on transmission lines oriented to power grid reliability is given. Full-wave process theory is used to analyze lightning back-striking, and the leader propagation model is used to describe the shielding failure process of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.

  4. Detection methods for irradiated food

    International Nuclear Information System (INIS)

    Stevenson, M.H.

    1993-01-01

    The plenary lecture gives a brief historical review of the development of methods for the detection of food irradiation and defines the demands on such methods. The methods described in detail are as follows: 1) Physical methods: as examples of luminescence methods, thermoluminescence and chemiluminescence are mentioned; ESR spectroscopy is discussed in detail by means of individual examples (crustaceans, fruits and vegetables, spices and herbs, nuts). 2) Chemical methods: examples given here are methods that make use of radiation-induced alterations in lipids (formation of long-chain hydrocarbons, formation of 2-alkyl butanones) or of radiation-induced alterations in the DNA. 3) Microbiological methods. An extensive bibliography is appended. (VHE) [de]

  5. Validity and reliability of the session-RPE method for quantifying training in Australian football: a comparison of the CR10 and CR100 scales.

    Science.gov (United States)

    Scott, Tannath J; Black, Cameron R; Quinn, John; Coutts, Aaron J

    2013-01-01

    The purpose of this study was to examine and compare the criterion validity and test-retest reliability of the CR10 and CR100 rating of perceived exertion (RPE) scales for team sport athletes who undertake high-intensity, intermittent exercise. Twenty-one male Australian football (AF) players (age: 19.0 ± 1.8 years, body mass: 83.92 ± 7.88 kg) participated in the first part (part A) of this study, which examined the construct validity of the session-RPE (sRPE) method for quantifying training load in AF. Ten male athletes (age: 16.1 ± 0.5 years) participated in the second part of the study (part B), which compared the test-retest reliability of the CR10 and CR100 RPE scales. In part A, the validity of the sRPE method was assessed by examining the relationships between sRPE and objective measures of internal (i.e., heart rate) and external training load (i.e., distance traveled), collected from AF training sessions. Part B of the study assessed the reliability of sRPE by examining its test-retest reliability during three different intensities of controlled intermittent running (10, 11.5, and 13 km·h(-1)). Results from part A demonstrated strong correlations of CR10- and CR100-derived sRPE with measures of internal training load (Banister's TRIMP and Edwards' TRIMP) (CR10: r = 0.83 and 0.83; CR100: r = 0.80 and 0.81). Correlations with measures of external training load (distance, higher-speed running and player load) were also significant for both the CR10 (r = 0.81, 0.71, and 0.83) and the CR100 (r = 0.78, 0.69, and 0.80). Part B revealed poor test-retest reliability for both the CR10 (31.9% CV) and CR100 (38.6% CV) RPE scales after short bouts of intermittent running. Collectively, these results suggest both CR10- and CR100-derived sRPE methods have good construct validity for assessing training load in AF. The poor levels of reliability revealed under field testing indicate that the sRPE method may not be sensitive enough to detect small changes in exercise intensity during brief intermittent running bouts. Despite this limitation

  6. OCT4 and SOX2 are reliable markers in detecting stem cells in odontogenic lesions

    Directory of Open Access Journals (Sweden)

    Abhishek Banerjee

    2016-01-01

    Full Text Available Context (Background): Stem cells are a unique subpopulation of cells in the human body with a capacity to initiate differentiation into various cell lines. Tumor stem cells (TSCs) are a unique subpopulation of cells that possess the ability to initiate a neoplasm and sustain self-renewal. Epithelial stem cell (ESC) markers such as octamer-binding transcription factor 4 (OCT4) and sex-determining region Y (SRY)-box 2 (SOX2) are capable of identifying these stem cells, which are expressed during the early stages of tooth development. Aims: To detect the expression of the stem cell markers OCT4 and SOX2 in normal odontogenic tissues and in odontogenic cysts and tumors. Materials and Methods: Paraffin sections of follicular tissue, radicular cyst, dentigerous cyst, odontogenic keratocyst, ameloblastoma, adenomatoid odontogenic tumor, and ameloblastic carcinoma were obtained from the archives. The sections were subjected to immunohistochemical assay using mouse monoclonal antibodies to OCT4 and SOX2. Statistical Analysis: The results were evaluated by descriptive analysis. Results: The results show the presence of stem cells in the normal and lesional tissues with these stem-cell-identifying markers. SOX2 was found to be more consistent and reliable in the detection of stem cells. Conclusion: The stem cell expression is maintained in the tumor transformation of tissue, which probably suggests that there is no phenotypic change of stem cells in the progression from the normal embryonic state to the tumor component. The quantification and localization reveal interesting trends that indicate the probable role of these cells in the pathogenesis of the lesions.

  7. High resolution melting analysis: a rapid and accurate method to detect CALR mutations.

    Directory of Open Access Journals (Sweden)

    Cristina Bilbao-Sieyro

    Full Text Available The recent discovery of CALR mutations in essential thrombocythemia (ET) and primary myelofibrosis (PMF) patients without JAK2/MPL mutations has emerged as a relevant finding for the molecular diagnosis of these myeloproliferative neoplasms (MPN). We tested the feasibility of high-resolution melting (HRM) as a screening method for the rapid detection of CALR mutations. CALR was studied in wild-type JAK2/MPL patients, including 34 with ET, 21 with persistent thrombocytosis suggestive of MPN and 98 with suspected secondary thrombocytosis. CALR mutation analysis was performed by HRM and Sanger sequencing. We compared the clinical features of CALR-mutated versus 45 JAK2/MPL-mutated subjects with ET. Nineteen samples showed HRM patterns distinct from wild-type. Of them, 18 were mutations and one was a polymorphism, as confirmed by direct sequencing. CALR mutations were present in 44% of ET (15/34), 14% of persistent thrombocytosis suggestive of MPN (3/21) and none of the secondary thrombocytosis (0/98). Of the 18 mutants, 9 were 52 bp deletions, 8 were 5 bp insertions and the other was a complex mutation with an insertion/deletion. No mutations were found after sequencing analysis of 45 samples displaying wild-type HRM curves. The HRM technique was reproducible, no false positives or negatives were detected, and the limit of detection was 3%. This study establishes a sensitive, reliable and rapid HRM method to screen for the presence of CALR mutations.

  8. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  9. Efficient reliability analysis of structures with the rotational quasi-symmetric point- and the maximum entropy methods

    Science.gov (United States)

    Xu, Jun; Dang, Chao; Kong, Fan

    2017-10-01

    This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
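
    The RQ-SPM and maximum entropy steps are not reproduced here, but the Monte Carlo simulation used as the reference in the abstract is easy to sketch: sample the random inputs, evaluate the performance function g, and estimate the probability of failure as the fraction of samples with g < 0. The limit state and distributions below are a simple invented example, not one of the paper's six cases.

    import numpy as np

    def mc_failure_probability(g, sample_inputs, n=1_000_000, seed=0):
        """Crude Monte Carlo estimate of P[g(X) < 0] and its coefficient of variation."""
        rng = np.random.default_rng(seed)
        x = sample_inputs(rng, n)
        pf = (g(x) < 0.0).mean()
        cov = np.sqrt((1 - pf) / (pf * n)) if pf > 0 else np.inf
        return pf, cov

    # Illustrative limit state g = R - S: resistance R ~ N(10, 1.5), load S ~ N(6, 1.0)
    def sample(rng, n):
        r = rng.normal(10.0, 1.5, n)
        s = rng.normal(6.0, 1.0, n)
        return np.column_stack([r, s])

    g = lambda x: x[:, 0] - x[:, 1]
    pf, cov = mc_failure_probability(g, sample)
    print(f"Pf ~ {pf:.2e} (c.o.v. ~ {cov:.2%})")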

  10. Application of reflectance spectroscopies (FTIR-ATR & FT-NIR) coupled with multivariate methods for robust in vivo detection of begomovirus infection in papaya leaves.

    Science.gov (United States)

    Haq, Quazi M I; Mabood, Fazal; Naureen, Zakira; Al-Harrasi, Ahmed; Gilani, Sayed A; Hussain, Javid; Jabeen, Farah; Khan, Ajmal; Al-Sabari, Ruqaya S M; Al-Khanbashi, Fatema H S; Al-Fahdi, Amira A M; Al-Zaabi, Ahoud K A; Al-Shuraiqi, Fatma A M; Al-Bahaisi, Iman M

    2018-06-05

    Nucleic acid- and serology-based methods have revolutionized plant disease detection; however, they are not very reliable at the asymptomatic stage, especially for pathogens with systemic infection, and they need at least 1-2 days for sample harvesting, processing, and analysis. In this study, two reflectance spectroscopies, near-infrared reflectance spectroscopy (NIR) and Fourier-transform infrared spectroscopy with attenuated total reflection (FT-IR, ATR), coupled with multivariate exploratory methods such as principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA), were deployed to detect begomovirus infection in papaya leaves. The application of these techniques demonstrates that they are very useful for robust in vivo detection of plant begomovirus infection. These methods are simple, sensitive, reproducible, and precise, and do not require any lengthy sample preparation procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
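
    A minimal workflow for the PCA / PLS-DA pipeline described above can be sketched with scikit-learn: spectra are inspected with PCA and a PLS regression against a binary class label (infected vs. healthy) serves as a simple PLS-DA classifier. The synthetic spectra below stand in for real NIR/FT-IR measurements, and the pipeline is only a generic illustration of the chemometric approach, not the authors' calibrated models.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n_per_class, n_wavelengths = 30, 200

    # Synthetic "spectra": infected leaves get a small extra absorption band
    healthy = rng.normal(1.0, 0.05, (n_per_class, n_wavelengths))
    infected = rng.normal(1.0, 0.05, (n_per_class, n_wavelengths))
    infected[:, 80:100] += 0.15
    X = np.vstack([healthy, infected])
    y = np.array([0] * n_per_class + [1] * n_per_class)   # 0 = healthy, 1 = infected

    # Exploratory PCA (first two components)
    scores = PCA(n_components=2).fit_transform(X)
    print("PC scores of first sample:", scores[0].round(3))

    # PLS-DA: regress the dummy-coded class on the spectra, threshold at 0.5
    pls = PLSRegression(n_components=2).fit(X, y)
    y_pred = (pls.predict(X).ravel() > 0.5).astype(int)
    print("training accuracy:", (y_pred == y).mean())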

  11. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION IN CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliabilityof information consolidated fi nancial statements of Russian companies. Recommendations aremade under their adaptation and applicationinto commercial organizations. Banish methodindicators are implemented in one of the world’s largest vertically integrated steel and miningcompanies. Audit firms are proposed to usemethods of assessing the reliability of information in the practical application of ISA.

  12. Reliability Analysis of a Composite Wind Turbine Blade Section Using the Model Correction Factor Method: Numerical Study and Validation

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian

    2013-01-01

    by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...

  13. Reliability of an experimental method to analyse the impact point on a golf ball during putting.

    Science.gov (United States)

    Richardson, Ashley K; Mitchell, Andrew C S; Hughes, Gerwyn

    2015-06-01

    This study aimed to examine the reliability of an experimental method identifying the location of the impact point on a golf ball during putting. Forty trials were completed using a mechanical putting robot set to reproduce a putt of 3.2 m, with four different putter-ball combinations. After locating the centre of the dimple pattern (centroid), the following variables were tested: the distance of the impact point from the centroid, the angle of the impact point from the centroid, and the distance of the impact point from the centroid derived from the X, Y coordinates. Good to excellent reliability was demonstrated in all impact variables, reflected in very strong relative (ICC = 0.98-1.00) and absolute reliability (SEM% = 0.9-4.3%). The highest SEM% observed was 7%, for the angle of the impact point from the centroid. In conclusion, the experimental method was shown to be reliable at locating the centroid of a golf ball, therefore allowing for the identification of the point of impact with the putter head, and is suitable for use in subsequent studies.

  14. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-05-14

    This thesis presents a general framework and method for the detection of an object in a video based on apparent motion. The object moves, at some unknown time, differently from the “background” motion, which can be induced by camera motion. The goal of the proposed method is to detect and segment the object as soon as it moves, in an online manner. Since motion estimation can be unreliable between frames, more than two frames are needed to reliably detect the object. Observing more frames before declaring a detection may lead to a more accurate detection and segmentation, since more motion may be observed, leading to a stronger motion cue; however, this comes at the cost of greater delay. The proposed method is designed to detect the object(s) with minimum delay, i.e., the fewest frames after the object moves, while constraining the false alarms, defined as declarations of detection before the object moves or as incorrect or inaccurate segmentation at the detection time. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.
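
    The thesis's segmentation framework is not reproduced here. As a loosely related illustration of the delay versus false-alarm trade-off, the sketch below applies a CUSUM quickest-change detector to a per-frame motion-energy score (e.g. mean absolute frame difference); this is a generic technique, not the thesis's method. The drift and threshold values are assumptions: raising the threshold lowers the false alarm rate at the cost of extra detection delay.

    import numpy as np

    def cusum_detect(scores, drift=0.5, threshold=5.0):
        """Return the first frame index where the CUSUM statistic crosses the threshold,
        or None if no change is declared. `scores` is a per-frame motion-energy sequence."""
        s = 0.0
        for t, x in enumerate(scores):
            s = max(0.0, s + x - drift)      # accumulate evidence above the drift level
            if s > threshold:
                return t
        return None

    # Toy motion-energy trace: background noise, then an object starts moving at frame 60
    rng = np.random.default_rng(3)
    scores = np.concatenate([rng.normal(0.2, 0.1, 60), rng.normal(1.5, 0.3, 40)])
    print("detection at frame:", cusum_detect(scores))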

  15. Improving the reliability of POD curves in NDI methods using a Bayesian inversion approach for uncertainty quantification

    Science.gov (United States)

    Ben Abdessalem, A.; Jenson, F.; Calmon, P.

    2016-02-01

    This contribution provides an example of the possible advantages of adopting a Bayesian inversion approach to uncertainty quantification in nondestructive inspection methods. In such problems, the uncertainty associated with the random parameters is not always known and needs to be characterised from scattering signal measurements. The uncertainties may then be correctly propagated in order to determine a reliable probability of detection curve. To this end, we establish a general Bayesian framework based on a non-parametric maximum likelihood formulation and priors from expert knowledge. However, the resulting inverse problem is time-consuming and computationally intensive. To cope with this difficulty, we replace the real model by a surrogate in order to speed up the model evaluation and make the problem computationally feasible. Least-squares support vector regression is adopted as the metamodelling technique due to its robustness in dealing with non-linear problems. We illustrate the usefulness of this methodology through the inspection of a tube with an enclosed defect using an ultrasonic method.
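
    The Bayesian inversion and surrogate-modelling machinery is beyond a short snippet, but its end product, a probability of detection curve, can be illustrated with a conventional hit/miss logistic model: POD(a) is fitted as a logistic function of log flaw size from binary detect/no-detect outcomes. The data below are simulated and the fit is a standard textbook POD model, not the authors' framework.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)

    # Simulated hit/miss inspections: larger flaws are detected more often
    sizes_mm = rng.uniform(0.2, 5.0, 400)
    true_pod = 1.0 / (1.0 + np.exp(-(np.log(sizes_mm) - np.log(1.0)) / 0.3))
    hits = rng.random(400) < true_pod

    # Fit POD(a) as a logistic function of log(size)
    model = LogisticRegression().fit(np.log(sizes_mm).reshape(-1, 1), hits.astype(int))

    def pod(a_mm):
        return model.predict_proba(np.log(np.atleast_1d(a_mm)).reshape(-1, 1))[:, 1]

    # Flaw size with an estimated 90% probability of detection (a90)
    grid = np.linspace(0.2, 5.0, 1000)
    a90 = grid[np.searchsorted(pod(grid), 0.90)]
    print(f"a90 ~ {a90:.2f} mm")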

  16. Rapid modified QuEChERS method for pesticides detection in honey by high-performance liquid chromatography UV-visible

    Directory of Open Access Journals (Sweden)

    Elisabetta Bonerba

    2014-05-01

    Full Text Available The extensive use of pesticides in agriculture plays an important role in bee die-offs and leads to the presence of residues in hive products, particularly in honey. An accurate and reliable analytical method, based on the QuEChERS extraction technique, has been developed for the quantitative determination, by high-performance liquid chromatography with UV-visible detection, of 5 pesticides (Deltamethrin, Dimethoate, Imidacloprid, Acetamiprid, Chlorfenvinphos) in honey. The method, in accordance with Commission Directive 2002/63/EC and Regulation 882/2004/EC, provided excellent results with respect to linearity (correlation coefficient up to 0.993), limits of detection and quantification (0.005 and 0.01 μg/mL for Dimethoate, Deltamethrin and Chlorfenvinphos; 0.02 and 0.05 μg/mL for Acetamiprid and Imidacloprid), recovery values (86.4 to 96.3%), precision and relative expanded uncertainty of measurement, demonstrating the conformity of this method with the European directives. The proposed method was applied to 23 samples of Apulian honey. None of the investigated pesticides was detected in these samples.

  17. Comparison of various methods of detection of different forms of dengue virus type 2 RNA in cultured cells

    International Nuclear Information System (INIS)

    Liu, H.S.; Lin, Y.L.; Chen, C.C.

    1997-01-01

    In this report, the sensitivity of various methods of detection of dengue virus type 2 (DEN-2) sense, antisense, replicative intermediate (RI) and replicative form (RF) RNAs in infected mosquito Aedes pseudoscutellaris AP-61 and mammalian baby hamster kidney BHK-21 cells is compared. LiCl precipitation was used for the separation of viral RF RNA from RI RNA. Our results show that reverse transcription-polymerase chain reaction (RT-PCR) followed by Southern blot analysis and slot blot hybridisation of LiCl-fractionated RNA were the most sensitive methods of detection of viral RNA and determination of its single-stranded form. Northern blot analysis was the least sensitive method of detection of any form of viral RNA. Using slot blot hybridisation of LiCl-precipitated RNA, viral RI RNA containing de novo synthesised negative-strand viral RNA was first detected 30 min after virus inoculation in both cell lines. This is the earliest time of detection of DEN viral RNA synthesis in host cells so far reported. However, RF RNA could not be detected until 24 hrs post infection (p.i.) in AP-61 cells and 2 days p.i. in BHK-21 cells. The sequential order of the individual forms of viral RNA detected in the infected cells was RI, RF and genomic RNAs. Viral RNA was always detected earlier in AP-61 cells than in BHK-21 cells. Moreover, the level of viral RNA in AP-61 cells was higher than that in BHK-21 cells, suggesting that the virus replicated more actively in AP-61 cells. In conclusion, the LiCl separation of viral RNA followed by slot blot hybridisation was found to be the most sensitive and reliable method of detection of DEN virus RI, RF and genomic RNAs in the infected cells. Moreover, this method can be applied to determine the replication status of any single-stranded RNA virus in the host. (authors)

  18. Efficient Estimation of Extreme Non-linear Roll Motions using the First-order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2007-01-01

    In on-board decision support systems efficient procedures are needed for real-time estimation of the maximum ship responses to be expected within the next few hours, given on-line information on the sea state and user defined ranges of possible headings and speeds. For linear responses standard frequency domain methods can be applied. To non-linear responses like the roll motion, standard methods like direct time domain simulations are not feasible due to the required computational time. However, the statistical distribution of non-linear ship responses can be estimated very accurately using the first-order reliability method (FORM), well-known from structural reliability problems. To illustrate the proposed procedure, the roll motion is modelled by a simplified non-linear procedure taking into account non-linear hydrodynamic damping, time-varying restoring and wave excitation moments...
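
    FORM itself involves an iterative search for the design point; for a linear limit state with Gaussian variables the result is available in closed form, which is enough to convey the idea. The sketch below computes the reliability index beta and the exceedance probability for a toy "capacity minus response" limit state with assumed means and standard deviations; it is not the non-linear roll-motion model of the paper.

    import math

    def linear_gaussian_form(mu_r, sigma_r, mu_s, sigma_s):
        """Reliability index and failure probability for g = R - S,
        with R (capacity) and S (response) independent normal variables."""
        beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)
        pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # Phi(-beta)
        return beta, pf

    # Toy numbers: allowable roll angle 35 +/- 2 deg vs. extreme roll response 20 +/- 5 deg
    beta, pf = linear_gaussian_form(mu_r=35.0, sigma_r=2.0, mu_s=20.0, sigma_s=5.0)
    print(f"beta = {beta:.2f}, P(exceedance) = {pf:.2e}")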

  19. [A Duplex PCR Method for Detection of Babesia caballi and Theileria equi].

    Science.gov (United States)

    Zhang, Yang; Zhang, Yu-ting; Wang, Zhen-bao; Bolati; Li, Hai; Bayinchahan

    2015-04-01

    The aim of this study was to develop a duplex PCR assay for the detection of Babesia caballi and Theileria equi. Two pairs of primers were designed according to the BC48 gene of B. caballi and the 18S rRNA gene of T. equi, and a duplex PCR assay was developed by optimization of the reaction conditions. The specificity, sensitivity and reliability of the method were tested. Horse blood samples from suspected cases were collected from the Yili region and tested by the duplex PCR, microscopy, conventional PCR, and fluorescence quantitative PCR, and the results were compared. Using the duplex PCR assay, specific fragments of 155 bp and 280 bp were amplified from DNA samples of B. caballi and T. equi, respectively. No specific fragment was amplified from DNA samples of B. bigemina, Theileria annulata, Theileria sergenti, Toxoplasma gondii, Neospora caninum, or Trypanosoma evansi. The limit of detection was 4.85 x 10⁵ copies/µl for B. caballi DNA and 4.85 x 10⁴ copies/µl for T. equi DNA. Among the 24 blood samples, 11 were found to be B. caballi-positive by the duplex PCR assay and 18 were T. equi-positive. The coincidence rates of microscopy, conventional PCR, and fluorescence quantitative PCR with the duplex PCR were 91.7% (22/24), 95.8% (23/24), and 95.8% (23/24), respectively. A duplex PCR assay for the simultaneous detection of B. caballi and T. equi has thus been established.

  20. Measurement of thermoluminescence - a new method for detecting radiation treatment of spices. Die Messung der Thermolumineszenz - ein neues Verfahren zur Identifizierung strahlenbehandelter Gewuerze

    Energy Technology Data Exchange (ETDEWEB)

    Heide, L; Boegl, W

    1984-12-01

    In the experiments described in this report, it was examined for 14 different spices to what extent measurements of the thermoluminescence intensity up to 300 °C are suitable for detecting treatment with ionizing radiation. The optimal sample weight of each spice was first determined for the subsequent investigation of the dependence of the thermoluminescence intensity on dose and post-irradiation storage. In most spices, radiation treatment is still detectable after a storage period of more than 2 months. In general it may be stated that thermoluminescence measurement is a reliable method for detecting radiation treatment, as a supplement to chemiluminescence measurement.

  1. Reliability of the Fermilab Antiproton Source

    International Nuclear Information System (INIS)

    Harms, E. Jr.

    1993-05-01

    This paper reports on the reliability of the Fermilab Antiproton Source since it began operation in 1985. Reliability of the complex as a whole as well as subsystem performance is summarized. Also discussed is the trending done to determine causes of significant machine downtime and the actions taken to reduce the incidence of failure. Finally, results of a study to detect previously unidentified reliability limitations are presented.

  2. Multi-immunoreaction-based dual-color capillary electrophoresis for enhanced diagnostic reliability of thyroid gland disease.

    Science.gov (United States)

    Woo, Nain; Kim, Su-Kang; Kang, Seong Ho

    2017-08-04

    Thyroid-stimulating hormone (TSH) secretion plays a critical role in regulating thyroid gland function and circulating thyroid hormones (i.e., thyroxine (T4) and triiodothyronine (T3)). A novel multi-immunoreaction-based dual-color capillary electrophoresis (CE) technique was investigated in this study to assess its reliability in diagnosing thyroid gland disease via simultaneous detection of TSH, T3, and T4 in a single CE run. Compared to the conventional immunoreaction technique, multi-immunoreaction of biotinylated streptavidin antibodies increased the selectivity and sensitivity for the individual hormones in human blood samples. Dual-color laser-induced fluorescence (LIF) detection-based CE, performed in a running buffer of 25 mM Na₂B₄O₇-NaOH (pH 9.3), allowed fast, simultaneous quantitative analysis of the three target thyroid hormones using different excitation wavelengths within 3.2 min. This process had excellent sensitivity, with detection limits of 0.05-5.32 fM, which represents a 1000-100,000 times higher detection sensitivity than previous methods. Method validation against an enzyme-linked immunosorbent assay applied to human blood samples showed that the CE method was not significantly different at the 98% confidence level. Therefore, the developed CE-LIF method has the advantages of higher detection sensitivity, faster analysis time, and smaller sample amounts compared to conventional methods. The combined multi-immunoreaction and dual-color CE-LIF method should provide increased diagnostic reliability for thyroid gland disease compared to conventional methods, based on its highly sensitive detection of thyroid hormones using a single injection and high-throughput screening. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Novel Method For Low-Rate DDoS Attack Detection

    Science.gov (United States)

    Chistokhodova, A. A.; Sidorov, I. D.

    2018-05-01

    The relevance of this work stems from the increasing number of advanced DDoS attack types, in particular low-rate HTTP floods, whose power and complexity grew significantly over the last year. The article is devoted to the analysis of DDoS attack detection methods and their modifications with the purpose of increasing the accuracy of DDoS attack detection, and details the features of low-rate attacks in comparison with conventional DDoS attacks. During the analysis, significant shortcomings of the available methods for detecting low-rate DDoS attacks were found. The result of the study is therefore an informal description of a new method for detecting low-rate denial-of-service attacks, together with the architecture of a test bed for evaluating the method. At the current stage of the study, the efficiency of an existing method can be improved by using a classifier with memory as well as additional information.

  4. A Reliable Method to Measure Lip Height Using Photogrammetry in Unilateral Cleft Lip Patients.

    Science.gov (United States)

    van der Zeeuw, Frederique; Murabit, Amera; Volcano, Johnny; Torensma, Bart; Patel, Brijesh; Hay, Norman; Thorburn, Guy; Morris, Paul; Sommerlad, Brian; Gnarra, Maria; van der Horst, Chantal; Kangesu, Loshan

    2015-09-01

    There is still no reliable tool to determine the outcome of the repaired unilateral cleft lip (UCL). The aim of this study was therefore to develop an accurate, reliable tool to measure vertical lip height from photographs. The authors measured the vertical height of the cutaneous and vermilion parts of the lip in 72 anterior-posterior view photographs of 17 patients with repairs to a UCL. Points on the lip's white roll and vermilion were marked on both the cleft and the noncleft sides of each image. Two new concepts were tested. First, photographs were standardized using the horizontal (medial to lateral) eye fissure width (EFW) for calibration. Second, the authors tested the interpupillary line (IPL) and the alar base line (ABL) for their reliability as horizontal lines of reference. Measurements were taken by 2 independent researchers, at 2 different time points each. Overall, 2304 data points were obtained and analyzed. The results showed that the method was very effective in comparing the height of the lip on the cleft side with that on the noncleft side. When using the IPL, inter- and intra-rater reliability was 0.99 to 1.0; with the ABL it varied from 0.91 to 0.99, with one exception at 0.84. The IPL was easier to define and gave more consistent measurements, because in some subjects the overhanging nasal tip obscured the alar base and the reconstructed alar base was sometimes indistinct. However, measurements from the IPL can only give the percentage difference between the left and right sides of the lip, whereas those from the ABL can also give exact measurements. Patient examples were given that show how the measurements correlate with clinical assessment. The authors propose this method of photogrammetry, with the innovative use of the IPL as a reliable horizontal plane and the EFW for calibration, as a useful and reliable tool to assess the outcome of UCL repair.
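
    The calibration and ratio logic described above can be illustrated with a short numerical sketch. This is not the authors' software; the landmark names, example pixel coordinates, and the assumed eye fissure width are hypothetical, and the image is assumed to be rotated so that the IPL is horizontal.

```python
import math

# Hypothetical landmark coordinates (pixels) from one anterior-posterior photograph,
# assuming the image has been rotated so the interpupillary line (IPL) is horizontal.
pupil_right, pupil_left = (120.0, 200.0), (220.0, 200.0)
medial_canthus, lateral_canthus = (130.0, 201.0), (168.0, 200.0)   # eye fissure endpoints
white_roll_cleft, white_roll_noncleft = (160.0, 420.0), (190.0, 417.0)

def dist(p, q):
    """Euclidean distance between two (x, y) pixel points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# 1) Calibration: pixels -> mm, using an assumed eye fissure width (EFW) of 28 mm.
EFW_MM = 28.0
mm_per_px = EFW_MM / dist(medial_canthus, lateral_canthus)

# 2) Vertical lip height on each side, measured from the IPL down to the white roll.
ipl_y = (pupil_right[1] + pupil_left[1]) / 2.0
h_cleft = (white_roll_cleft[1] - ipl_y) * mm_per_px
h_noncleft = (white_roll_noncleft[1] - ipl_y) * mm_per_px

# 3) As noted above, the IPL yields a relative (percentage) comparison of the two sides.
print(f"cleft/noncleft vertical height ratio: {100 * h_cleft / h_noncleft:.1f}%")
```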

  5. Retinal microaneurysms detection using local convergence index features

    NARCIS (Netherlands)

    Dashtbozorg, B.; Zhang, J.; Huang, F.; ter Haar Romeny, B.M.

    2018-01-01

    Retinal microaneurysms (MAs) are the earliest clinical sign of diabetic retinopathy disease. Detection of microaneurysms is crucial for the early diagnosis of diabetic retinopathy and prevention of blindness. In this paper, a novel and reliable method for automatic detection of microaneurysms in

  6. Retinal microaneurysms detection using local convergence index features

    NARCIS (Netherlands)

    Dasht Bozorg, B.; Zhang, J.; ter Haar Romeny, B.M.

    2017-01-01

    Retinal microaneurysms are the earliest clinical sign of diabetic retinopathy disease. Detection of microaneurysms is crucial for the early diagnosis of diabetic retinopathy and prevention of blindness. In this paper, a novel and reliable method for automatic detection of microaneurysms in retinal

  7. Data-Driven Method for Wind Turbine Yaw Angle Sensor Zero-Point Shifting Fault Detection

    Directory of Open Access Journals (Sweden)

    Yan Pei

    2018-03-01

    Full Text Available Wind turbine yaw control plays an important role in increasing wind turbine production and also in protecting the wind turbine. Accurate measurement of the yaw angle is the basis of an effective wind turbine yaw controller, and the accuracy of yaw angle measurement is affected significantly by the problem of zero-point shifting. Hence, it is essential to evaluate the zero-point shifting error of wind turbines on-line in order to improve the reliability of yaw angle measurement in real time. In particular, qualitative evaluation of the zero-point shifting error could help wind farm operators carry out prompt and cost-effective maintenance on yaw angle sensors. With the aim of qualitatively evaluating the zero-point shifting error, the yaw angle sensor zero-point shifting fault is first defined in this paper. A data-driven method is then proposed to detect the zero-point shifting fault based on Supervisory Control and Data Acquisition (SCADA) data. The zero-point shifting fault is detected by analyzing the power performance under different yaw angles. The SCADA data are partitioned into different bins according to both wind speed and yaw angle in order to evaluate the power performance in detail, and an indicator is proposed for power performance evaluation under each yaw angle. The yaw angle with the largest indicator is considered to be the yaw angle measurement error in this work, and a zero-point shifting fault triggers an alarm if the error is larger than a predefined threshold. Case studies from several actual wind farms demonstrated the effectiveness of the proposed method in detecting the zero-point shifting fault and in improving wind turbine performance. The results could help wind farm operators make prompt adjustments when a large yaw angle measurement error exists.
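
    The abstract describes the pipeline only verbally; the sketch below shows one plausible reading of it, not the authors' exact implementation. The SCADA column names, bin widths, indicator definition (mean power per yaw-angle bin averaged over shared wind-speed bins), and alarm threshold are all assumptions.

```python
import numpy as np
import pandas as pd

def detect_zero_point_shift(scada: pd.DataFrame, yaw_bin=2.0, ws_bin=0.5,
                            threshold_deg=8.0):
    """Flag a yaw-sensor zero-point shift from SCADA data.

    Expects columns 'wind_speed' (m/s), 'yaw_error' (deg, measured misalignment)
    and 'power' (kW). The yaw angle whose bins show the best power performance
    is taken as the estimate of the measurement error.
    """
    df = scada.copy()
    df["ws_bin"] = (df["wind_speed"] / ws_bin).round() * ws_bin
    df["yaw_bin"] = (df["yaw_error"] / yaw_bin).round() * yaw_bin

    # Mean power in each (wind speed, yaw angle) cell, then average the cells of
    # each yaw bin so that all yaw bins are compared on an equal footing.
    cells = df.groupby(["ws_bin", "yaw_bin"])["power"].mean().reset_index()
    indicator = cells.groupby("yaw_bin")["power"].mean()

    error_estimate = indicator.idxmax()          # yaw angle with best performance
    alarm = abs(error_estimate) > threshold_deg  # large offset => zero-point shift
    return error_estimate, alarm

# Synthetic usage example with a true zero-point offset of +10 degrees.
rng = np.random.default_rng(0)
ws = rng.uniform(4, 12, 5000)
yaw = rng.uniform(-20, 20, 5000)
power = 0.3 * ws**3 * np.cos(np.radians(yaw - 10.0))**3 + rng.normal(0, 5, 5000)
print(detect_zero_point_shift(pd.DataFrame(
    {"wind_speed": ws, "yaw_error": yaw, "power": power})))
```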

  8. A Novel Unscheduled Islanding Detection Method for Microgrid

    Directory of Open Access Journals (Sweden)

    Li Hui

    2018-01-01

    Full Text Available The microgrid, with its intelligent and flexible control characteristics, conforms to the trend toward sustainable development of electric power. When the microgrid enters an unplanned islanded state, successful detection of the island is a prerequisite for safe operation; for the energy storage inverter, the key equipment in the microgrid system, islanding protection is therefore one of the necessary functions. In this paper, an improved islanding detection method based on active frequency drift and q-axis reactive power perturbation is proposed. The method has the advantages of faster detection speed and minor influence on power quality, so the energy storage inverter delivers better output power quality in grid-connected operation, and the islanding state can be detected quickly during the transition from grid-connected to islanded mode. Finally, the validity and superiority of the improved islanding detection method are verified by simulation experiments.

  9. A rapid reliability estimation method for directed acyclic lifeline networks with statistically dependent components

    International Nuclear Information System (INIS)

    Kang, Won-Hee; Kliese, Alyce

    2014-01-01

    Lifeline networks, such as transportation, water supply, sewers, telecommunications, and electrical and gas networks, are essential elements for the economic and societal functions of urban areas, but their components are highly susceptible to natural or man-made hazards. In this context, it is essential to provide effective pre-disaster hazard mitigation strategies and prompt post-disaster risk management efforts based on rapid system reliability assessment. This paper proposes a rapid reliability estimation method for node-pair connectivity analysis of lifeline networks, especially when the network components are statistically correlated. Recursive procedures are proposed to compound all network nodes until they become a single super node representing the connectivity between the origin and destination nodes. The proposed method is applied to numerical network examples and to benchmark interconnected power and water networks in Memphis, Shelby County. The connectivity analysis results show the proposed method's reasonable accuracy and remarkable efficiency compared to Monte Carlo simulations.
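
    The recursive compounding procedure itself is not spelled out in the abstract, but the Monte Carlo simulation it is benchmarked against is easy to sketch. The example below estimates origin-destination connectivity for a toy network and, for simplicity, assumes independent component failures, whereas the paper's focus is on statistically dependent ones.

```python
import random
from collections import defaultdict, deque

def connectivity_mc(edges, origin, destination, n_samples=20000, seed=1):
    """Monte Carlo estimate of origin-destination (node-pair) connectivity.

    edges: list of (u, v, survival_probability) tuples, one per component.
    Components are sampled independently here, which is a simplification.
    """
    rng = random.Random(seed)
    connected = 0
    for _ in range(n_samples):
        graph = defaultdict(list)
        for u, v, p in edges:
            if rng.random() < p:            # this component survives the hazard
                graph[u].append(v)
                graph[v].append(u)
        seen, queue = {origin}, deque([origin])   # breadth-first search
        while queue:
            node = queue.popleft()
            for nxt in graph[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        connected += destination in seen
    return connected / n_samples

# Tiny example: two parallel two-edge paths from node 0 to node 3.
edges = [(0, 1, 0.9), (1, 3, 0.9), (0, 2, 0.8), (2, 3, 0.8)]
print(f"P(0 connected to 3) ~ {connectivity_mc(edges, 0, 3):.3f}")  # exact: 0.9316
```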

  10. Accuracy and Reliability of the Klales et al. (2012) Morphoscopic Pelvic Sexing Method.

    Science.gov (United States)

    Lesciotto, Kate M; Doershuk, Lily J

    2018-01-01

    Klales et al. (2012) devised an ordinal scoring system for the morphoscopic pelvic traits described by Phenice (1969) and used for sex estimation of skeletal remains. The aim of this study was to test the accuracy and reliability of the Klales method using a large sample from the Hamann-Todd collection (n = 279). Two observers were blinded to sex, ancestry, and age and used the Klales et al. method to estimate the sex of each individual. Sex was correctly estimated for females with over 95% accuracy; however, the male allocation accuracy was approximately 50%. Weighted Cohen's kappa and intraclass correlation coefficient analysis for evaluating intra- and interobserver error showed moderate to substantial agreement for all traits. Although each trait can be reliably scored using the Klales method, low accuracy rates and high sex bias indicate better trait descriptions and visual guides are necessary to more accurately reflect the range of morphological variation. © 2017 American Academy of Forensic Sciences.
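
    For readers unfamiliar with the agreement statistics reported above, the sketch below shows how a weighted Cohen's kappa is computed for ordinal trait scores such as those used in the Klales et al. system; the scores are invented for illustration and are not the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal scores (1-5) assigned to one pelvic trait by two observers.
observer_1 = [1, 2, 3, 5, 4, 2, 1, 3, 4, 5, 2, 3]
observer_2 = [1, 3, 3, 4, 4, 2, 2, 3, 5, 5, 2, 4]

# Linear weights penalise disagreements in proportion to their distance on the
# ordinal scale, which suits ordinal trait scores better than unweighted kappa.
kappa = cohen_kappa_score(observer_1, observer_2, weights="linear")
print(f"weighted kappa = {kappa:.2f}")
# By the usual Landis-Koch convention, 0.61-0.80 is read as substantial agreement.
```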

  11. Detecting Android Malwares with High-Efficient Hybrid Analyzing Methods

    Directory of Open Access Journals (Sweden)

    Yu Liu

    2018-01-01

    Full Text Available In order to tackle the security issues caused by malware on Android OS, we propose a highly efficient hybrid detection scheme for Android malware. Our scheme employs different analysis methods (static and dynamic) to construct a flexible detection pipeline. In this paper, we propose detection techniques such as the Com+ feature, built on traditional Permission and API call features, to improve the performance of static detection. The collapsing issue of traditional function call graph-based malware detection is also avoided, as we adopt feature selection and clustering to unify function call graph features of various dimensions into the same dimension. In order to verify the performance of our scheme, we built an open-access malware dataset for our experiments. The experimental results showed that the suggested scheme achieves high malware-detection accuracy and could be used to establish Android malware-detection cloud services, which can automatically adopt high-efficiency analysis methods according to the properties of the Android applications.

  12. Inverse Reliability Task: Artificial Neural Networks and Reliability-Based Optimization Approaches

    OpenAIRE

    Lehký, David; Slowik, Ondřej; Novák, Drahomír

    2014-01-01

    Part 7: Genetic Algorithms; International audience; The paper presents two alternative approaches to solve inverse reliability task – to determine the design parameters to achieve desired target reliabilities. The first approach is based on utilization of artificial neural networks and small-sample simulation Latin hypercube sampling. The second approach considers inverse reliability task as reliability-based optimization task using double-loop method and also small-sample simulation. Efficie...

  13. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  14. Reliability design of a critical facility: An application of PRA methods

    International Nuclear Information System (INIS)

    Souza Vieira Neto, A.; Souza Borges, W. de

    1987-01-01

    Although general agreement concerning the enforcement of reliability (probabilistic) design criteria for nuclear utilities is yet to be achieved, PRA methodology can still be used successfully as a project design and review tool aimed at improving a system's prospective performance or minimizing expected accident consequences. In this paper, the potential of such an application of PRA methods is examined in the special case of a critical design project currently being developed in Brazil. (orig.)

  15. Cancer Detection and Diagnosis Methods - Annual Plan

    Science.gov (United States)

    Early cancer detection is a proven life-saving strategy. Learn about the research opportunities NCI supports, including liquid biopsies and other less-invasive methods, for detecting early cancers and precancerous growths.

  16. Test-Retest Reliability and Minimal Detectable Change of Randomized Dichotic Digits in Learning-Disabled Children: Implications for Dichotic Listening Training.

    Science.gov (United States)

    Mahdavi, Mohammad Ebrahim; Pourbakht, Akram; Parand, Akram; Jalaie, Shohreh

    2018-03-01

    Evaluation of dichotic listening to digits is a common part of many studies for the diagnosis and management of auditory processing disorders in children. Previous researchers have verified the test-retest relative reliability of dichotic digits results in normal children and adults. However, detecting intervention-related changes in the ear scores after dichotic listening training requires information regarding the trial-to-trial typical variation of individual ear scores, which is estimated using indices of absolute reliability. Previous studies have not addressed the absolute reliability of dichotic listening results. To compare the results of the Persian randomized dichotic digits test (PRDDT) and its relative and absolute indices of reliability between typically achieving (TA) and learning-disabled (LD) children. A repeated measures observational study. Fifteen LD children were recruited from a previously performed study, with an age range of 7-12 yr. The control group consisted of 15 TA schoolchildren with an age range of 8-11 yr. The PRDDT was administered to the children under a free recall condition in two test sessions 7-12 days apart. We compared the average of the ear scores and the ear advantage between TA and LD children. Relative indices of reliability included Pearson's correlation and intraclass correlation (ICC(2,1)) coefficients, and absolute reliability was evaluated by calculating the standard error of measurement (SEM) and minimal detectable change (MDC) using the raw ear scores. The Pearson correlation coefficient indicated that in both groups of children the ear scores of the test and retest sessions were strongly and positively (greater than +0.8) correlated. The ear scores showed excellent ICC coefficients of consistency (0.78-0.82) and fair to excellent ICC coefficients of absolute agreement (0.62-0.74) in TA children, and excellent ICC coefficients of consistency and absolute agreement in LD children (0.76-0.87). SEM and SEM% of the ear scores in TA

  17. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes the outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how a degraded plant condition affects the operator's cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the developed HRA method. HEPs of the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and comparisons of the results obtained using these two methods are made to depict the differences between the methods and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of the factors to be considered in the evaluation of human errors, incorporation of degraded plant safety conditions into HRA, and investigation of HEPs affected by the contents of operator tasks were made to improve the HRA method, which integrates an operator cognitive action model into the ATHENA method. In addition, the detailed procedures of the improved method were delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  18. Reliability, validity, and minimal detectable change of the push-off test scores in assessing upper extremity weight-bearing ability.

    Science.gov (United States)

    Mehta, Saurabh P; George, Hannah R; Goering, Christian A; Shafer, Danielle R; Koester, Alan; Novotny, Steven

    2017-11-01

    Clinical measurement study. The push-off test (POT) was recently conceived and found to be reliable and valid for assessing weight bearing through an injured wrist or elbow. However, further research with a larger sample can lend credence to the preliminary findings supporting the use of the POT. This study examined the interrater reliability, construct validity, and measurement error of the POT in patients with wrist conditions. Participants with musculoskeletal (MSK) wrist conditions were recruited. Performance on the POT, grip strength, and isometric strength of the wrist extensors were assessed. The shortened version of the Disabilities of the Arm, Shoulder and Hand questionnaire and a numeric pain rating scale were completed. The intraclass correlation coefficient (ICC) assessed the interrater reliability of the POT. Pearson correlation coefficients (r) examined the concurrent relationships between the POT and the other measures. The standard error of measurement (SEM) and the minimal detectable change at the 90% confidence interval (MDC90) were assessed as the measurement error and the index of true change for the POT. A total of 50 participants with different elbow or wrist conditions (age: 48.1 ± 16.6 years) were included in this study. The results strongly supported the interrater reliability (ICC: 0.96 and 0.93 for the affected and unaffected sides, respectively) of the POT in patients with wrist MSK conditions. The POT showed convergent relationships with grip strength on the injured side (r = 0.89) and with wrist extensor strength (r = 0.7). The POT showed a small SEM (1.9 kg), and the MDC90 for the POT was 4.4 kg for this sample. This study provides additional evidence to support the reliability and validity of the POT. It is the first study to provide values for the measurement error and true change in POT scores in patients with wrist MSK conditions. Further research should examine the
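
    The measurement-error quantities reported above follow standard formulas, SEM = SD * sqrt(1 - ICC) and MDC90 = SEM * 1.645 * sqrt(2). The sketch below reproduces the reported MDC90 of roughly 4.4 kg from the reported SEM of 1.9 kg; the SD and ICC used in the SEM line are illustrative values, not figures taken from the study.

```python
import math

def sem(sd: float, icc: float) -> float:
    """Standard error of measurement from the sample SD and the reliability (ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc(sem_value: float, z: float = 1.645) -> float:
    """Minimal detectable change; z = 1.645 corresponds to the 90% confidence level."""
    return sem_value * z * math.sqrt(2.0)

print(f"MDC90 = {mdc(1.9):.1f} kg")               # reported SEM of 1.9 kg -> ~4.4 kg
print(f"SEM   = {sem(sd=7.2, icc=0.93):.1f} kg")  # illustrative SD/ICC -> ~1.9 kg
```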

  19. Using graph models for evaluating in-core monitoring systems reliability by the method of imitating simulation

    International Nuclear Information System (INIS)

    Golovanov, M.N.; Zyuzin, N.N.; Levin, G.L.; Chesnokov, A.N.

    1987-01-01

    An approach for estimating the reliability factors of complex redundant systems at early stages of development using the method of imitating simulation is considered. Different types of models, with their merits and shortcomings, are given. Features of in-core monitoring systems are described, and the advisability of applying graph models and elements of graph theory for estimating the reliability of such systems is shown. The results of an investigation of the reliability factors of the reactor monitoring, control and core local protection subsystem are presented.

  20. How Well Can We Detect Lineage-Specific Diversification-Rate Shifts? A Simulation Study of Sequential AIC Methods.

    Science.gov (United States)

    May, Michael R; Moore, Brian R

    2016-11-01

    Evolutionary biologists have long been fascinated by the extreme differences in species numbers across branches of the Tree of Life. This has motivated the development of statistical methods for detecting shifts in the rate of lineage diversification across the branches of phylogenetic trees. One of the most frequently used methods, MEDUSA, explores a set of diversification-rate models, where each model assigns branches of the phylogeny to a set of diversification-rate categories. Each model is first fit to the data, and the Akaike information criterion (AIC) is then used to identify the optimal diversification model. Surprisingly, the statistical behavior of this popular method is uncharacterized, which is a concern in light of: (1) the poor performance of the AIC as a means of choosing among models in other phylogenetic contexts; (2) the ad hoc algorithm used to visit diversification models; and (3) errors that we reveal in the likelihood function used to fit diversification models to the phylogenetic data. Here, we perform an extensive simulation study demonstrating that MEDUSA (1) has a high false-discovery rate (on average, spurious diversification-rate shifts are identified [Formula: see text] of the time), and (2) provides biased estimates of diversification-rate parameters. Understanding the statistical behavior of MEDUSA is critical both to empirical researchers, in order to clarify whether these methods can make reliable inferences from empirical datasets, and to theoretical biologists, in order to clarify the specific problems that need to be solved to develop more reliable approaches for detecting shifts in the rate of lineage diversification. [Akaike information criterion; extinction; lineage-specific diversification rates; phylogenetic model selection; speciation.] © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
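
    Because the critique above hinges on how the AIC is used to choose among diversification-rate models, a generic reminder of that selection step may help. The log-likelihoods and parameter counts below are placeholders; fitting an actual birth-death model, as MEDUSA does, is beyond the scope of this sketch.

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike information criterion: AIC = 2k - 2 ln L."""
    return 2 * n_params - 2 * log_likelihood

# Placeholder fits: (model, maximised log-likelihood, number of free parameters).
# Each extra rate shift adds parameters (new speciation/extinction rates).
fits = [("no shift", -812.4, 2), ("1 shift", -806.9, 5), ("2 shifts", -805.8, 8)]

scores = {name: aic(lnL, k) for name, lnL, k in fits}
best = min(scores, key=scores.get)
for name, score in scores.items():
    print(f"{name:9s} AIC = {score:7.1f}  dAIC = {score - scores[best]:5.1f}")
print("selected model:", best)
```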

  1. Reliability of Power Electronic Converter Systems

    DEFF Research Database (Denmark)

    Drawing on the experience of an international team of experts, this book explores the reliability of PECS, covering topics including: an introduction to reliability engineering in power electronic converter systems; anomaly detection and remaining-life prediction for power electronics; reliability of DC-link capacitors in power electronic converters; reliability of power electronics packaging; modeling for life-time prediction of power semiconductor modules; minimization of DC-link capacitance in power electronic converter systems; wind turbine systems; smart control strategies for improved reliability of power electronics systems; lifetime modelling; power module lifetime test and state monitoring; tools for performance and reliability analysis of power electronics systems; and fault ... for advancing the reliability, availability, system robustness, and maintainability of PECS at different levels of complexity.

  2. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at the level of Category II. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  3. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at the level of Category II. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  4. Detection method of internal leakage from valve using acoustic method

    International Nuclear Information System (INIS)

    Kumagai, Horomichi

    1990-01-01

    The purpose of this study is to assess the suitability of an acoustic method for detecting internal leakage of valves at power plants. Experiments were carried out on the characteristics of the acoustic noise caused by a simulated leak flow. From the experimental results, the mechanism of the acoustic noise generated by the flow, the relation between acoustic intensity and leak flow velocity, and the characteristics of the acoustic frequency spectrum were clarified. The acoustic method was applied to valves on site, and the background noise was measured under abnormal plant conditions. When the background level is higher than the acoustic signal, the difference between the background noise frequency spectrum and the acoustic signal spectrum provides a very useful means of leak detection. (author)

  5. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
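
    As a concrete companion to the discussion above, here is a minimal forward simulation (plain, unweighted Monte Carlo, without the adjoint-based weighting discussed in the paper) of a single repairable component modelled as a two-state continuous-time Markov process; the failure and repair rates are arbitrary example values.

```python
import random

def mean_unavailability(failure_rate, repair_rate, t_end, n_hist=20000, seed=7):
    """Estimate mean unavailability over [0, t_end] for a repairable component.

    The two-state continuous-time Markov process is simulated directly:
    holding times are exponential with the rate of the current state.
    """
    rng = random.Random(seed)
    total_down = 0.0
    for _ in range(n_hist):
        t, up = 0.0, True                 # each history starts in the 'up' state
        while t < t_end:
            rate = failure_rate if up else repair_rate
            dwell = min(rng.expovariate(rate), t_end - t)
            if not up:
                total_down += dwell       # accumulate time spent failed
            t += dwell
            up = not up
    return total_down / (n_hist * t_end)

# Example: lambda = 1e-3 /h, mu = 0.1 /h over a 1000 h interval.  The asymptotic
# unavailability lambda / (lambda + mu) ~ 0.0099 serves as a sanity check.
print(f"mean unavailability ~ {mean_unavailability(1e-3, 0.1, 1000.0):.4f}")
```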

  6. Raman spectroscopy-based detection of chemical contaminants in food powders

    Science.gov (United States)

    Raman spectroscopy technique has proven to be a reliable method for qualitative detection of chemical contaminants in food ingredients and products. For quantitative imaging-based detection, each contaminant particle in a food sample must be detected and it is important to determine the necessary sp...

  7. Method of detecting failed fuels

    International Nuclear Information System (INIS)

    Ishizaki, Hideaki; Suzumura, Takeshi.

    1982-01-01

    Purpose: To enable setting of an adequate temperature for the high-temperature pure water to be filled, by detecting the outlet temperature of the high-temperature pure-water filling tube to a fuel assembly and controlling the heating of the pure water, and to detect failed fuel through sampling of that water. Method: A temperature sensor is provided on the water tube connected to the sipping cap used for filling high-temperature pure water, the temperature of the water at the tube outlet is detected, and the reading is confirmed on a temperature indicator. A heater is controlled on the basis of this reading, pure water at an adequate high temperature is filled into the fuel assembly, and the pure water is then replaced with coolant and sampled, which settles the adequate temperature of the high-temperature coolant used for detecting failure of the fuel assembly. As a result, the sipping effect does not decrease, and failed fuel can be detected precisely. (Yoshihara, H.)

  8. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report gives an overview of the types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. Many HRA methods have been developed, particularly with regard to safety assessments for nuclear power plants. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the utilization of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of Expert System Technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.) [de

  9. Study on Feasibility of Applying Function Approximation Moment Method to Achieve Reliability-Based Design Optimization

    International Nuclear Information System (INIS)

    Huh, Jae Sung; Kwak, Byung Man

    2011-01-01

    Robust optimization and reliability-based design optimization are among the methodologies employed to take the uncertainties of a system into account at the design stage. For applying such methodologies to industrial problems, accurate and efficient methods for estimating statistical moments and failure probability are required; furthermore, the results of the sensitivity analysis, which provides the search direction during the optimization process, should also be accurate. The aim of this study is to employ the function approximation moment method in the sensitivity analysis formulation, which is expressed in integral form, to verify the accuracy of the sensitivity results, and to solve a typical reliability-based design optimization problem. These results are compared with those of other moment methods, and the feasibility of the function approximation moment method is verified. The sensitivity analysis formulation in integral form is efficient for evaluating sensitivity because no additional function evaluations are needed once the failure probability or statistical moments have been calculated.

  10. System and Method for Multi-Wavelength Optical Signal Detection

    Science.gov (United States)

    McGlone, Thomas D. (Inventor)

    2017-01-01

    The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.

  11. A method and application study on holistic decision tree for human reliability analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Sun Feng; Zhong Shan; Wu Zhiyu

    2008-01-01

    The paper introduces the Holistic Decision Tree (HDT) method, a human reliability analysis method mainly used in nuclear power plant safety assessment, and explains how to apply it. The focus is primarily on providing the basic framework and some background of the HDT method and the steps to perform it. Influence factors and quality descriptors were formed through interviews with operators at the Qinshan Nuclear Power Plant, and HDT analyses were performed for SGTR and SLOCA based on this information. The HDT model uses a graphic tree structure to express the error rate as a function of the influence factors. The HDT method is capable of dealing with the uncertainty in HRA, and it is reliable and practical. (authors)

  12. Models on reliability of non-destructive testing

    International Nuclear Information System (INIS)

    Simola, K.; Pulkkinen, U.

    1998-01-01

    The reliability of ultrasonic inspections has been studied in, e.g., the international PISC (Programme for the Inspection of Steel Components) exercises. These exercises have produced a large amount of information on the effect of various factors on the reliability of inspections. The information obtained from reliability experiments is used to model the dependency of the flaw detection probability on various factors and to evaluate the performance of inspection equipment, including sizing accuracy. The information from experiments is utilised most effectively when mathematical models are applied. Here, some statistical models for the reliability of non-destructive tests are introduced. In order to demonstrate the use of inspection reliability models, they have been applied to the inspection results for intergranular stress corrosion cracking (IGSCC) type flaws in the PISC III exercise (PISC 1995). The models are applied both to the flaw detection frequency data of all inspection teams and to the flaw sizing data of one participating team. (author)
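
    A common way to model the dependency of flaw detection probability on flaw size, as discussed above, is a hit/miss POD curve of the form POD(a) = 1 / (1 + exp(-(b0 + b1 ln a))) fitted by logistic regression. The sketch below uses synthetic inspection data and is not tied to the PISC III results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic hit/miss data: flaw depth in mm and 1 = flaw detected by the team.
rng = np.random.default_rng(42)
depth = rng.uniform(0.5, 10.0, 300)
true_pod = 1.0 / (1.0 + np.exp(-(np.log(depth) - np.log(3.0)) / 0.25))
hit = (rng.random(300) < true_pod).astype(int)

# Fit POD(a) = 1 / (1 + exp(-(b0 + b1 * ln a))) via logistic regression on ln(a).
model = LogisticRegression().fit(np.log(depth).reshape(-1, 1), hit)
b0, b1 = model.intercept_[0], model.coef_[0][0]

# Flaw size detected with 90% probability (often quoted as a90).
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
print(f"estimated a90 ~ {a90:.2f} mm")
```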

  13. Steam leak detection method in pipeline using histogram analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Se Oh; Jeon, Hyeong Seop; Son, Ki Sung; Chae, Gyung Sun [Saean Engineering Corp, Seoul (Korea, Republic of); Park, Jong Won [Dept. of Information Communications Engineering, Chungnam National University, Daejeon (Korea, Republic of)

    2015-10-15

    Leak detection in a pipeline usually involves acoustic emission sensors, such as contact-type sensors. These contact-type sensors are difficult to install and cannot operate in areas with high temperature and radiation. Therefore, many researchers have recently studied leak detection using a camera, which has the advantages of long-distance monitoring and wide-area surveillance. However, the conventional leak detection method based on difference images often mistakes the vibration of a structure for a leak. In this paper, we propose a method for steam leakage detection that uses the moving average of difference images and histogram analysis. The proposed method can separate leakage from the vibration of a structure. The working performance of the proposed method is verified by comparison with experimental results.
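
    The abstract does not give implementation details, so the sketch below is only one plausible reading of its two ingredients, a moving average of difference images and a histogram criterion; the threshold values and the spread measure are assumptions. The idea is that a persistent, localised bright region in the averaged difference image suggests leakage, while frame-wide low-amplitude differences suggest structural vibration.

```python
import numpy as np

def leak_features(frames, window=10, diff_thresh=5.0):
    """Return (active_fraction, histogram_spread) for a grayscale clip.

    frames: array of shape (n_frames, H, W), intensity values 0-255.
    A leak is assumed to give a small but strongly active region; vibration
    gives a broad, weakly active difference pattern.
    """
    frames = frames.astype(np.float32)
    diffs = np.abs(np.diff(frames, axis=0))       # frame-to-frame differences
    avg_diff = diffs[-window:].mean(axis=0)       # moving average of the last N diffs

    hist, _ = np.histogram(avg_diff, bins=32, range=(0, 255))
    active_fraction = float((avg_diff > diff_thresh).mean())
    idx = np.arange(32)
    centre = np.average(idx, weights=hist + 1)    # +1 avoids a zero total weight
    spread = float(np.average(np.abs(idx - centre), weights=hist + 1))
    return active_fraction, spread

# Synthetic example: a fluctuating bright plume appears halfway through the clip.
rng = np.random.default_rng(3)
clip = rng.normal(100.0, 2.0, (20, 64, 64))
clip[10:, 5:20, 5:20] += 60.0 * rng.random((10, 15, 15))
print(leak_features(clip))
```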

  14. Transition from Partial Factors Method to Simulation-Based Reliability Assessment in Structural Design

    Czech Academy of Sciences Publication Activity Database

    Marek, Pavel; Guštar, M.; Permaul, K.

    1999-01-01

    Roč. 14, č. 1 (1999), s. 105-118 ISSN 0266-8920 R&D Projects: GA ČR GA103/94/0562; GA ČR GV103/96/K034 Keywords : reliability * safety * failure * durability * Monte Carlo method Subject RIV: JM - Building Engineering Impact factor: 0.522, year: 1999

  15. A review on automated pavement distress detection methods

    NARCIS (Netherlands)

    Coenen, Tom B.J.; Golroo, Amir

    2017-01-01

    In recent years, extensive research has been conducted on pavement distress detection. A large part of these studies applied automated methods to capture different distresses. In this paper, a literature review on the distresses and related detection methods are presented. This review also includes

  16. Method for predicting peptide detection in mass spectrometry

    Science.gov (United States)

    Kangas, Lars [West Richland, WA; Smith, Richard D [Richland, WA; Petritis, Konstantinos [Richland, WA

    2010-07-13

    A method of predicting whether a peptide present in a biological sample will be detected by analysis with a mass spectrometer. The method uses at least one mass spectrometer to perform repeated analysis of a sample containing peptides from proteins with known amino acids. The method then generates a data set of peptides identified as contained within the sample by the repeated analysis. The method then calculates the probability that a specific peptide in the data set was detected in the repeated analysis. The method then creates a plurality of vectors, where each vector has a plurality of dimensions, and each dimension represents a property of one or more of the amino acids present in each peptide and adjacent peptides in the data set. Using these vectors, the method then generates an algorithm from the plurality of vectors and the calculated probabilities that specific peptides in the data set were detected in the repeated analysis. The algorithm is thus capable of calculating the probability that a hypothetical peptide represented as a vector will be detected by a mass spectrometry based proteomic platform, given that the peptide is present in a sample introduced into a mass spectrometer.
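
    The patented method above is, in essence, a supervised learner over amino-acid-property feature vectors. The toy analogue below is not the patented algorithm: the feature set is deliberately crude (length, mean Kyte-Doolittle hydropathy, count of basic residues) and the detection labels are invented.

```python
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
KD = dict(zip(AMINO_ACIDS,   # Kyte-Doolittle hydropathy values per residue
              [1.8, 2.5, -3.5, -3.5, 2.8, -0.4, -3.2, 4.5, -3.9, 3.8,
               1.9, -3.5, -1.6, -3.5, -4.5, -0.8, -0.7, 4.2, -0.9, -1.3]))

def features(peptide: str):
    """Length, mean hydropathy and number of basic residues (K, R, H)."""
    return [len(peptide),
            sum(KD[a] for a in peptide) / len(peptide),
            sum(peptide.count(a) for a in "KRH")]

# Peptides labelled 1 if repeatedly detected in the training runs (labels invented).
peptides = ["LVNELTEFAK", "AEFVEVTK", "HPYFYAPELLYYANK", "DDNPNLPR",
            "YLYEIAR", "QTALVELLK", "KVPQVSTPTLVEVSR", "CCTESLVNR"]
labels = [1, 1, 0, 0, 1, 1, 0, 0]

model = LogisticRegression(max_iter=1000).fit([features(p) for p in peptides], labels)
print(model.predict_proba([features("LGEYGFQNALIVR")])[0][1])  # P(detected)
```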

  17. Intra-observer reliability and agreement of manual and digital orthodontic model analysis.

    Science.gov (United States)

    Koretsi, Vasiliki; Tingelhoff, Linda; Proff, Peter; Kirschneck, Christian

    2018-01-23

    Digital orthodontic model analysis is gaining acceptance in orthodontics, but its reliability depends on the digitalisation hardware and software used. We therefore investigated the intra-observer reliability and the agreement/conformity of a particular digital model analysis workflow in relation to traditional manual plaster model analysis. Forty-eight plaster casts of the upper/lower dentition were collected. Virtual models were obtained with orthoX®scan (Dentaurum) and analysed with ivoris®analyze3D (Computer konkret). Manual model analyses were done with a dial caliper (0.1 mm). Common parameters were measured on each plaster cast and its virtual counterpart five times each by an experienced observer. We assessed intra-observer reliability within each method (ICC), agreement/conformity between methods (Bland-Altman analyses and Lin's concordance correlation), and changing bias (regression analyses). Intra-observer reliability was substantial within each method (ICC ≥ 0.7), except for five manual outcomes (12.8 per cent). Bias between methods was statistically significant but less than 0.5 mm for 87.2 per cent of the outcomes; in general, larger tooth sizes were measured digitally. The total differences for the maxilla and mandible had wide limits of agreement (-3.25/6.15 and -2.31/4.57 mm), but the bias between methods was mostly smaller than the intra-observer variation within each method, with substantial conformity of manual and digital measurements in general. No changing bias was detected. Although both workflows were reliable, the investigated digital workflow proved to be more reliable and yielded on average larger tooth sizes. Averaged differences between methods were within 0.5 mm for directly measured outcomes, but wide ranges are expected for some computed space parameters due to cumulative error. © The Author 2017. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com
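
    The agreement analysis referred to above rests on Bland-Altman bias and limits of agreement, which are quick to compute; the paired values below are invented, not the study's measurements.

```python
import numpy as np

# Paired measurements of one parameter (mm): manual dial caliper vs digital software.
manual  = np.array([6.1, 7.4, 5.9, 8.2, 6.8, 7.0, 5.5, 7.9])
digital = np.array([6.3, 7.6, 6.0, 8.5, 7.1, 7.2, 5.6, 8.3])

diff = digital - manual
bias = diff.mean()                       # systematic difference between methods
half_width = 1.96 * diff.std(ddof=1)     # 95% limits of agreement assume normal diffs
print(f"bias = {bias:.2f} mm, limits of agreement "
      f"[{bias - half_width:.2f}, {bias + half_width:.2f}] mm")
```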

  18. Development of a Reliability Program approach to assuring operational nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques used in other high technology industries is being formulated for potential application in the nuclear power industry. Research findings are discussed. The reliability methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed with several reliability concepts (e.g., quantitative reliability goals, reliability centered maintenance) appearing to be directly transferable. Other tasks in the RP development effort involved the benchmarking and evaluation of the existing nuclear regulations and practices relevant to safety/reliability integration. A review of current risk-dominant issues was also conducted using results from existing probabilistic risk assessment studies. The ongoing RP development tasks have concentrated on defining a RP for the operating phase of a nuclear plant's lifecycle. The RP approach incorporates safety systems risk/reliability analysis and performance monitoring activities with dedicated tasks that integrate these activities with operating, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the RP

  19. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  20. Test-retest reliability and agreement of the SPI-Questionnaire to detect symptoms of digital ischemia in elite volleyball players.

    Science.gov (United States)

    van de Pol, Daan; Zacharian, Tigran; Maas, Mario; Kuijer, P Paul F M

    2017-06-01

    The Shoulder posterior circumflex humeral artery Pathology and digital Ischemia - questionnaire (SPI-Q) has been developed to enable periodic surveillance of elite volleyball players, who are at risk for digital ischemia. Prior to implementation, assessing reliability is mandatory. Therefore, the test-retest reliability and agreement of the SPI-Q were evaluated among the population at risk. A questionnaire survey was performed with a 2-week interval among 65 elite male volleyball players assessing symptoms of cold, pale and blue digits in the dominant hand during or after practice or competition using a 4-point Likert scale (never, sometimes, often and always). Kappa (κ) and percentage of agreement (POA) were calculated for individual symptoms, and to distinguish symptomatic and asymptomatic players. For the individual symptoms, κ ranged from "poor" (0.25) to "good" (0.63), and POA ranged from "moderate" (78%) to "good" (97%). To classify symptomatic players, the SPI-Q showed "good" reliability (κ = 0.83; 95%CI 0.69-0.97) and "good" agreement (POA = 92%). The current study has proven the SPI-Q to be reliable for detecting elite male indoor volleyball players with symptoms of digital ischemia.