WorldWideScience

Sample records for reliable detection methods

  1. Reliably detectable flaw size for NDE methods that use calibration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-04-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh18232 POD software provide the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, in which calibration is used. NDE calibration standards contain artificial flaws of known size, such as electro-discharge machined (EDM) notches and flat-bottom hole (FBH) reflectors, which are used to set instrument sensitivity for the detection of real flaws. Real flaws, such as cracks and crack-like flaws, are the targets of these NDE methods. A reliably detectable crack size is required for the safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process to determine the reliably detectable flaw size.
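The hit/miss POD idea behind this kind of analysis can be illustrated with a minimal sketch (this is not MIL-HDBK-1823's actual statistical procedure, which fits parametric POD curves with confidence bounds): bin the flaw sizes, compute the empirical detection rate per bin, and read off the smallest size whose POD reaches 90%. All names and data below are illustrative.

```python
def empirical_pod(flaw_sizes, hits, bin_edges):
    """Empirical probability of detection per flaw-size bin.

    flaw_sizes: flaw sizes (e.g. crack lengths, mm)
    hits: parallel list of 1 (detected) / 0 (missed)
    bin_edges: ascending bin boundaries
    Returns a list of (bin_center, pod) for bins that contain data.
    """
    pods = []
    for lo, hi in zip(bin_edges, bin_edges[1:]):
        in_bin = [h for a, h in zip(flaw_sizes, hits) if lo <= a < hi]
        if in_bin:
            pods.append(((lo + hi) / 2, sum(in_bin) / len(in_bin)))
    return pods

def a90(pod_curve):
    """Smallest bin center whose empirical POD reaches 0.90, or None."""
    for center, pod in pod_curve:
        if pod >= 0.90:
            return center
    return None
```

A real analysis would fit a logistic or probit POD model and report the a90/95 value (the size detected 90% of the time with 95% confidence); this sketch only shows the underlying counting.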

  2. Objective Methods for Reliable Detection of Concealed Depression

    Directory of Open Access Journals (Sweden)

    Cynthia Solomon

    2015-04-01

    Recent research has shown that it is possible to automatically detect clinical depression from audio-visual recordings. Before considering integration into a clinical pathway, a key question is whether such systems can be easily fooled. This work explores the potential of acoustic features to detect clinical depression in adults both when they act normally and when they are asked to conceal their depression. Nine adults diagnosed with mild to moderate depression, as per the Beck Depression Inventory (BDI-II) and Patient Health Questionnaire (PHQ-9), were asked a series of questions and asked to read an excerpt from a novel aloud under two experimental conditions. In one, participants were asked to act naturally; in the other, to suppress anything that they felt would be indicative of their depression. Acoustic features were then extracted from these data and analysed using paired t-tests to determine any statistically significant differences between healthy and depressed participants. Most features that were found to be significantly different during normal behaviour remained so during concealed behaviour. In leave-one-subject-out automatic classification studies of the 9 depressed subjects and 8 matched healthy controls, 88% classification accuracy and 89% sensitivity were achieved. Results remained relatively robust during concealed behaviour, with classifiers trained only on non-concealed data achieving 81% detection accuracy and 75% sensitivity when tested on concealed data. These results indicate good potential for building deception-proof automatic depression monitoring systems.
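The paired t-test used to compare features across conditions is short enough to sketch from the standard library (the study presumably used a statistics package; this hand-rolled version just shows the computation on invented data):

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic and degrees of freedom for two matched samples.

    x and y are equal-length sequences of per-subject measurements
    (e.g. a feature under normal vs. concealed conditions).
    """
    d = [a - b for a, b in zip(x, y)]   # per-subject differences
    n = len(d)
    t = mean(d) / (stdev(d) / math.sqrt(n))
    return t, n - 1                     # compare t against t-distribution with n-1 df
```

The resulting t value would then be compared against the t-distribution with n-1 degrees of freedom to obtain a p-value.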

  3. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of the authors' damage detection method is given, with a focus on damage localization. The method compares a histogram derived from measurement results against a large series of calculated histograms, namely the damage location indexes for all locations along the beam. We tested dissimilarity estimators including the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found that the Minkowski distance provides the best results. It was tested for numerous locations, using real measurement results as well as results artificially degraded by noise, proving its reliability.
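The core comparison step, picking the damage location whose calculated histogram is closest to the measured one under a Minkowski-form distance, can be sketched as follows (illustrative only; the authors' estimator details and normalization may differ):

```python
def minkowski(h1, h2, p=2):
    """Minkowski-form distance between two equal-length histograms.

    p=1 gives the city-block distance, p=2 the Euclidean distance.
    """
    return sum(abs(a - b) ** p for a, b in zip(h1, h2)) ** (1 / p)

def best_location(measured, candidates):
    """Index of the candidate histogram closest to the measured one."""
    return min(range(len(candidates)),
               key=lambda i: minkowski(measured, candidates[i]))
```

With the candidate list holding one damage-location-index histogram per position along the beam, the returned index points at the most likely damage location.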

  4. A Method for Improving Reliability of Radiation Detection using Deep Learning Framework

    International Nuclear Information System (INIS)

    Chang, Hojong; Kim, Tae-Ho; Han, Byunghun; Kim, Hyunduk; Kim, Ki-duk

    2017-01-01

    Radiation detection is an essential technology for the overall field of radiation and nuclear engineering. Previously, radiation detection relied on preparing, in advance, a table mapping input spectra to output spectra, which requires simulating numerous predicted output spectra using parameters that model the spectrum. In this paper, we propose a new technique to improve the performance of radiation detectors, whose software has stagnated for some time and carries the possible intrinsic errors of simulation. In the proposed method, a deep neural network is used to predict the input source from the output spectrum measured by the radiation detector. With a highly complex model, we expect that the complex pattern between the data and the label can be captured well. Furthermore, since a radiation detector should be calibrated regularly and in advance, we propose a method to calibrate radiation detectors using a GAN. We hope that the power of deep learning may also reach radiation detectors and bring major improvements to the field. With an improved radiation detector, the reliability of detection would be more certain, and many tasks remain to be solved using deep learning in the nuclear engineering community.

  5. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and the nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
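A leak-rate estimation problem of the kind mentioned lends itself to the standard Gamma-Poisson conjugate update; the sketch below is a generic illustration of that Bayesian mechanics, not the course's actual analysis, and all prior parameters and data are invented:

```python
def gamma_poisson_update(alpha, beta, events, exposure):
    """Conjugate Bayesian update for an event rate (e.g. leaks per year).

    Prior: rate ~ Gamma(alpha, beta); data: `events` observed over
    `exposure` units of time. Returns the posterior (alpha, beta)
    and the posterior mean rate.
    """
    a = alpha + events      # shape accumulates observed events
    b = beta + exposure     # rate accumulates observation time
    return a, b, a / b

# e.g. vague prior Gamma(1, 1), then 2 leaks observed in 9 years:
# posterior is Gamma(3, 10), posterior mean rate 0.3 leaks/year.
```

The posterior mean smoothly blends the prior with the data, which is the practical appeal of Bayesian reliability estimation when event counts are small.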

  6. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty in predicting the behaviour and properties of a structure, given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that captures the most essential features of the nature of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text.

  7. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models.

    Science.gov (United States)

    Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E

    2018-05-10

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. This paper presents the design and implementation of a self-tuning methodology to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool, which emulates sensor interaction in complex driving environments, is selected for this purpose. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving assistance user scenario connecting a network of five LiDAR sensors is designed and implemented to validate the accuracy of the computational-intelligence-based framework. The results demonstrate that the self-tuning method is an appropriate strategy for increasing the reliability of the sensor network while minimizing detection thresholds.
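The Q-learning stage, choosing how many sensors to deploy, can be sketched as a single-state Q-learner over a toy reward that trades detection reliability against per-sensor cost. The reward model and all constants below are invented for illustration, not the authors' formulation:

```python
import random

def q_learn_sensor_count(reward_fn, n_max=5, episodes=2000,
                         alpha=0.1, epsilon=0.2, seed=0):
    """Single-state, epsilon-greedy Q-learning over sensor counts 1..n_max.

    Each episode picks a sensor count, receives reward_fn(n), and nudges
    the action value toward it. Returns the count with the best learned value.
    """
    rng = random.Random(seed)
    q = [0.0] * (n_max + 1)  # q[n] is the value of using n sensors; q[0] unused
    for _ in range(episodes):
        if rng.random() < epsilon:                       # explore
            n = rng.randint(1, n_max)
        else:                                            # exploit
            n = max(range(1, n_max + 1), key=lambda k: q[k])
        q[n] += alpha * (reward_fn(n) - q[n])            # TD-style update
    return max(range(1, n_max + 1), key=lambda k: q[k])

# Toy reward: detection reliability saturates with n, each sensor has a cost.
reward = lambda n: (1 - 0.3 ** n) - 0.1 * n
```

For this toy reward the learner settles on the count where adding another sensor no longer pays for its cost; the real framework learns from simulated obstacle-localization outcomes rather than a closed-form reward.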

  8. Robust and reliable banknote authentication and print flaw detection with opto-acoustical sensor fusion methods

    Science.gov (United States)

    Lohweg, Volker; Schaede, Johannes; Türke, Thomas

    2006-02-01

    The authenticity checking and inspection of banknotes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of ever more sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As more print techniques and new security features are established, total quality in security, authenticity and banknote printing must be assured, which necessitates a broader sensorial concept. We propose a concept covering both authenticity checking and inspection methods for pattern recognition and classification of securities and banknotes, based on sensor fusion and fuzzy interpretation of data measures. The approach combines different methods of authenticity analysis and print flaw detection, and can be used in vending or sorting machines as well as in printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines visible camera images with IR-spectrum-sensitive sensors, acoustical sensors and other measurements such as the temperature and pressure of printing machines.

  9. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Directory of Open Access Journals (Sweden)

    Yury Fedotov

    2016-01-01

    Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by an Nd:YAG laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. The results indicate that this fluorescence ratio can be considered a reliable characteristic of plant stress state.

  10. Multivariate normative comparison, a novel method for more reliably detecting cognitive impairment in HIV infection

    NARCIS (Netherlands)

    Su, Tanja; Schouten, Judith; Geurtsen, Gert J.; Wit, Ferdinand W.; Stolte, Ineke G.; Prins, Maria; Portegies, Peter; Caan, Matthan W. A.; Reiss, Peter; Majoie, Charles B.; Schmand, Ben A.

    2015-01-01

    The objective of this study is to assess whether multivariate normative comparison (MNC) improves detection of HIV-1-associated neurocognitive disorder (HAND) as compared with Frascati and Gisslén criteria. One-hundred and three HIV-1-infected men with suppressed viremia on combination

  11. Simultaneous amplification of two bacterial genes: more reliable method of Helicobacter pylori detection in microbial rich dental plaque samples.

    Science.gov (United States)

    Chaudhry, Saima; Idrees, Muhammad; Izhar, Mateen; Butt, Arshad Kamal; Khan, Ayyaz Ali

    2011-01-01

    Polymerase chain reaction (PCR) assay is considered superior to other methods for detection of Helicobacter pylori (H. pylori) in the oral cavity; however, it also has limitations when the sample under study is microbe-rich dental plaque. The gene targeted and the number of primers used for bacterial detection in dental plaque samples can significantly affect the results obtained, as a number of closely related bacterial species reside in the plaque biofilm. Also, due to the high recombination rate of H. pylori, some of the genes might be down-regulated or absent. The present study was conducted to determine the frequency of H. pylori colonization of dental plaque by simultaneously amplifying two genes of the bacterium. One hundred dental plaque specimens were collected from dyspeptic patients before their upper gastrointestinal endoscopy, and the presence of H. pylori was determined through a PCR assay using primers targeting two different genes of the bacterium. Eighty-nine of the 100 samples were included in the final analysis. With simultaneous amplification of two bacterial genes, 51.6% of the dental plaque samples were positive for H. pylori, while this prevalence increased to 73% when only one gene amplification was used for bacterial identification. Detection of H. pylori in dental plaque samples is more reliable when two genes of the bacterium are simultaneously amplified rather than one.

  12. Is air-displacement plethysmography a reliable method of detecting ongoing changes in percent body fat within obese children involved in a weight management program?

    DEFF Research Database (Denmark)

    Ewane, Cecile; McConkey, Stacy A; Kreiter, Clarence D

    2010-01-01

    (percent body fat) over time. The gold standard method, hydrodensitometry, has severe limitations for the pediatric population. OBJECTIVE: This study examines the reliability of air-displacement plethysmography (ADP) in detecting percent body fat changes within obese children over time. METHODS: Percent body fat by ADP, weight, and body mass index (BMI) were measured for eight obese children aged 5-12 years enrolled in a weight management program over a 12-month period. These measurements were taken at initial evaluation, 1.5 months, 3 months, 6 months, and 12 months to monitor the progress of the subjects and detect any changes in these measures over time. Statistical analysis was used to determine the reliability of the data collected. RESULTS: The reliability estimate for percent body fat by ADP was 0.78. This was much lower than the reliability of BMI, 0.98, and weight measurements, 0...

  13. A novel method for rapid and reliable detection of complex vertebral malformation and bovine leukocyte adhesion deficiency in Holstein cattle

    Directory of Open Access Journals (Sweden)

    Zhang Yi

    2012-07-01

    Background: Complex vertebral malformation (CVM) and bovine leukocyte adhesion deficiency (BLAD) are two autosomal recessive lethal genetic defects that occur frequently in Holstein cattle and are identifiable by single nucleotide polymorphisms. The objective of this study was to develop a rapid and reliable genotyping assay to screen active Holstein sires and determine the carrier frequency of CVM and BLAD in the Chinese dairy cattle population. Results: We developed real-time PCR-based assays for discrimination of wild-type and defective alleles, so that carriers can be detected. Only one step was required after DNA extraction from the sample, and the time required was about 2 hours. A total of 587 Chinese Holstein bulls were assayed, and fifty-six CVM carriers and eight BLAD carriers were identified, corresponding to heterozygous carrier frequencies of 9.54% and 1.36%, respectively. The pedigree analysis showed that most of the carriers could be traced back to common ancestors: Osborndale Ivanhoe for BLAD and Pennstate Ivanhoe Star for CVM. Conclusions: These results demonstrate that real-time PCR is a simple, rapid and reliable assay for detecting the BLAD and CVM defective alleles. The high frequency of the CVM allele suggests that implementing a routine testing system is necessary to gradually eradicate the deleterious gene from the Chinese Holstein population.

  14. Validity and reliability of methods for the detection of secondary caries around amalgam restorations in primary teeth

    Directory of Open Access Journals (Sweden)

    Mariana Minatel Braga

    2010-03-01

    Secondary caries has been reported as the main reason for restoration replacement. The aim of this in vitro study was to evaluate the performance of different methods (visual inspection, laser fluorescence (DIAGNOdent), radiography and tactile examination) for secondary caries detection in primary molars restored with amalgam. Fifty-four primary molars were photographed and 73 suspect sites adjacent to amalgam restorations were selected. Two examiners independently evaluated these sites using all methods. Agreement between examiners was assessed by the kappa test. To validate the methods, a caries-detector dye was used after restoration removal. The best cut-off points for the sample were found by receiver operating characteristic (ROC) analysis, and the area under the ROC curve (Az), sensitivity, specificity and accuracy of the methods were calculated at the enamel (D2) and dentine (D3) thresholds. These parameters were found for each method and then compared by the McNemar test. The tactile examination and visual inspection presented the highest inter-examiner agreement at the D2 and D3 thresholds, respectively. Visual inspection also showed better performance than the other methods at both thresholds (Az = 0.861 and Az = 0.841, respectively). In conclusion, visual inspection presented the best performance for detecting enamel and dentin secondary caries in primary teeth restored with amalgam.
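The Az and sensitivity/specificity figures reported in studies like this come from standard ROC computations, which are compact enough to sketch from scratch (the data below are invented, not the study's):

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic.

    scores_pos: detector scores for truly carious sites
    scores_neg: detector scores for sound sites
    Each concordant pair counts 1, each tie counts 0.5.
    """
    wins = sum((sp > sn) + 0.5 * (sp == sn)
               for sp in scores_pos for sn in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def sens_spec(scores_pos, scores_neg, cutoff):
    """Sensitivity and specificity for the rule 'caries if score >= cutoff'."""
    sens = sum(s >= cutoff for s in scores_pos) / len(scores_pos)
    spec = sum(s < cutoff for s in scores_neg) / len(scores_neg)
    return sens, spec
```

Sweeping the cutoff over all observed scores and picking the point closest to perfect sensitivity and specificity is one common way to choose the "best cut-off" the abstract mentions.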

  15. Method to improve reliability of a fuel cell system using low performance cell detection at low power operation

    Science.gov (United States)

    Choi, Tayoung; Ganapathy, Sriram; Jung, Jaehak; Savage, David R.; Lakshmanan, Balasubramanian; Vecasey, Pamela M.

    2013-04-16

    A system and method for detecting a low-performing cell in a fuel cell stack using measured cell voltages. The method includes determining that the fuel cell stack is running, that the stack coolant temperature is above a certain temperature, and that the stack current density is within a relatively low power range. The method further includes calculating the average cell voltage and determining whether the difference between the average cell voltage and the minimum cell voltage is greater than a predetermined threshold. If that difference is greater than the predetermined threshold and the minimum cell voltage is less than another predetermined threshold, the method increments a low-performing-cell timer. A ratio of the low-performing-cell timer to a system run timer is calculated to identify a low-performing cell.
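The timer logic described above reads as a simple per-sample state update, which can be sketched as follows. Every threshold value here is an invented placeholder; the patent's calibrated values are not given in the abstract:

```python
def update_low_cell_timer(cell_voltages, current_density, coolant_temp,
                          low_cell_timer, run_timer, dt=1.0,
                          delta_thresh=0.15, vmin_thresh=0.30,
                          j_low=0.05, j_high=0.20, t_min=40.0):
    """One evaluation step of the low-performing-cell logic.

    cell_voltages: per-cell voltages (V); current_density in A/cm^2;
    coolant_temp in deg C; timers in seconds. All thresholds are
    illustrative placeholders. Returns (low_cell_timer, run_timer, ratio).
    """
    run_timer += dt
    # Only evaluate while warm and in the low-power operating window.
    if coolant_temp > t_min and j_low <= current_density <= j_high:
        avg_v = sum(cell_voltages) / len(cell_voltages)
        min_v = min(cell_voltages)
        # A cell is suspect when it sags well below the stack average
        # AND below an absolute floor.
        if (avg_v - min_v) > delta_thresh and min_v < vmin_thresh:
            low_cell_timer += dt
    return low_cell_timer, run_timer, low_cell_timer / run_timer
```

A persistently high ratio of suspect time to run time would then flag the cell for service, which is the identification step the abstract describes.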

  16. Methods and Reliability of Radiographic Vertebral Fracture Detection in Older Men: The Osteoporotic Fractures in Men Study

    Science.gov (United States)

    Cawthon, Peggy M.; Haslam, Jane; Fullman, Robin; Peters, Katherine W.; Black, Dennis; Ensrud, Kristine E.; Cummings, Steven R.; Orwoll, Eric S.; Barrett-Connor, Elizabeth; Marshall, Lynn; Steiger, Peter; Schousboe, John T.

    2014-01-01

    We describe the methods and reliability of radiographic vertebral fracture assessment in MrOS, a cohort of community-dwelling men aged ≥65 years. Lateral spine radiographs were obtained at Visit 1 (2000-2) and 4.6 years later (Visit 2). Using a workflow tool (SpineAnalyzer™, Optasia Medical), a physician reader completed semi-quantitative (SQ) scoring. Prior to SQ scoring, technicians performed “triage” to reduce the physician reader's workload: clearly normal spine images were eliminated from SQ scoring, with all levels assumed to be SQ=0 (no fracture, “triage negative”), while spine images with any possible fracture or abnormality were passed to the physician reader as “triage positive” images. Using a quality assurance sample of images (n=20 participants; 8 with baseline only and 12 with baseline and follow-up images) read multiple times, we calculated intra-reader kappa statistics and percent agreement for SQ scores. A subset of 494 participants' images was read regardless of triage classification to calculate the specificity and sensitivity of triage. Technically adequate images were available for 5958 of 5994 participants at Visit 1, and 4399 of 4423 participants at Visit 2. Triage identified 3215 (53.9%) participants with radiographs that required further evaluation by the physician reader. For prevalent fractures at Visit 1 (SQ≥1), intra-reader kappa statistics ranged from 0.79 to 0.92; percent agreement ranged from 96.9% to 98.9%; sensitivity of triage was 96.8% and specificity of triage was 46.3%. In conclusion, SQ scoring had excellent intra-reader reliability in our study. The triage process reduces expert reader workload without hindering the ability to identify vertebral fractures. PMID:25003811
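The intra-reader kappa statistics reported here follow Cohen's standard chance-corrected agreement formula, which is short enough to sketch directly (the ratings below are illustrative, not study data):

```python
def cohen_kappa(r1, r2):
    """Cohen's kappa for two ratings of the same items.

    r1, r2: equal-length sequences of category labels (e.g. SQ scores
    from two reads of the same radiographs). Returns (po - pe)/(1 - pe),
    where po is observed agreement and pe is chance agreement.
    """
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    pe = sum((list(r1).count(c) / n) * (list(r2).count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)
```

Values near 0.8-0.9, as in this study, are conventionally read as excellent agreement beyond chance.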

  17. A New Method to Detect and Correct the Critical Errors and Determine the Software-Reliability in Critical Software-System

    International Nuclear Information System (INIS)

    Krini, Ossmane; Börcsök, Josef

    2012-01-01

    In order to use electronic systems comprising software and hardware components in safety-related and highly safety-related applications, it is necessary to meet the marginal risk numbers required by standards and legislative provisions. Existing processes and mathematical models are used to verify these risk numbers. On the hardware side, various accepted mathematical models, processes, and methods exist to provide the required proof. To this day, however, no closed models or mathematical procedures are known that allow a dependable prediction of software reliability. This work presents a method for prognosticating the number of residual critical errors in software. Conventional models lack this ability, and at present there are no methods that forecast critical errors. The new method shows that an estimate of the residual number of critical errors in software systems is possible by combining prediction models, a ratio of critical errors, and the total error number. Subsequently, the expected value function for critical errors at any point in time can be derived from the new solution method, provided the detection rate has been calculated using an appropriate estimation method. The presented method also makes it possible to estimate the critical failure rate. The approach is modelled on a real process and therefore describes the two essential processes: detection and correction.

  18. Reliability of leak detection systems in LWRs

    International Nuclear Information System (INIS)

    Kupperman, D.S.

    1986-10-01

    In this paper, NRC guidelines for leak detection are reviewed, current practices described, potential safety-related problems discussed, and potential improvements in leak detection technology (with emphasis on acoustic methods) evaluated.

  19. The reliability and accuracy of two methods for proximal caries detection and depth on directly visible proximal surfaces: an in vitro study

    DEFF Research Database (Denmark)

    Ekstrand, K R; Alloza, Alvaro Luna; Promisiero, L

    2011-01-01

    This study aimed to determine the reliability and accuracy of the ICDAS and radiographs in detecting and estimating the depth of proximal lesions on extracted teeth. The lesions were visible to the naked eye. Three trained examiners scored a total of 132 sound/carious proximal surfaces from 106 p...

  20. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro has established a reliability program in support of its substantial nuclear program. The application of the reliability program to achieve both production and safety goals is described. The value of such a program is evident in the record of Ontario Hydro's operating nuclear stations. The factors that have contributed to the success of the reliability program are identified as: line management's commitment to reliability; selective and judicious application of reliability methods; establishment of performance goals and monitoring of in-service performance; and the collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  1. New Multiplexing Tools for Reliable GMO Detection

    NARCIS (Netherlands)

    Pla, M.; Nadal, A.; Baeten, V.; Bahrdt, C.; Berben, G.; Bertheau, Y.; Coll, A.; Dijk, van J.P.; Dobnik, D.; Fernandez-Pierna, J.A.; Gruden, K.; Hamels, S.; Holck, A.; Holst-Jensen, A.; Janssen, E.; Kok, E.J.; Paz, La J.L.; Laval, V.; Leimanis, S.; Malcevschi, A.; Marmiroli, N.; Morisset, D.; Prins, T.W.; Remacle, J.; Ujhelyi, G.; Wulff, D.

    2012-01-01

    Among the available methods for GMO detection, enforcement and routine laboratories in practice use PCR, based on the detection of transgenic DNA. The cost required for GMO analysis is constantly increasing due to the progress of GMO commercialization, with inclusion of a higher diversity of species,

  2. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  3. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Eurocode describes the 'reliability index' as a measure of structural reliability, related to the 'probability of failure'. This paper focuses on the assessment of this index for a reinforced concrete bridge pier. Reliability concepts are rarely used explicitly in the design of structures, but the problems of structural engineering are better understood through them. The main methods for estimating the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo simulation is used in this paper because it offers a very good tool for estimating probability in multivariate functions. Complicated probability and statistics problems are solved through computer-aided simulation of a large number of tests. The procedure of structural reliability assessment for the bridge pier and a comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
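The Monte Carlo estimate of the failure probability, and the reliability index β = -Φ⁻¹(p_f) derived from it, can be sketched with a toy limit state (the resistance and load distributions below are invented for illustration, not the paper's bridge-pier model):

```python
import random
from statistics import NormalDist

def mc_reliability(limit_state, sample, n=100_000, seed=1):
    """Monte Carlo estimate of failure probability and reliability index.

    limit_state(x) < 0 means failure; sample(rng) draws one random input.
    Returns (pf, beta) with beta = -inverse_normal_cdf(pf).
    """
    rng = random.Random(seed)
    failures = sum(limit_state(sample(rng)) < 0 for _ in range(n))
    pf = failures / n
    beta = -NormalDist().inv_cdf(pf) if 0 < pf < 1 else float("inf")
    return pf, beta

# Toy limit state g = R - S with resistance R ~ N(10, 1) and load S ~ N(6, 1);
# the exact result is pf = Phi(-4/sqrt(2)) ~ 0.0023, beta ~ 2.83.
```

Eurocode target values (e.g. β around 3.8 for a 50-year reference period of an ordinary structure) would then be compared against the estimated β.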

  4. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating software reliability for nuclear safety software is proposed. The method is based on the software reliability growth model (SRGM), in which the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects from a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. This method is shown to be capable of accurately estimating the remaining number of software defects of the on-demand type that directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining it is proposed.
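A common concrete instance of an SRGM with non-homogeneous Poisson process failures is the Goel-Okumoto model, sketched below purely as an illustration; the paper's own modeling schemes and parameter estimates may differ:

```python
import math

def go_mean_value(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a(1 - exp(-b t)).

    a: expected total number of defects; b: per-defect detection rate;
    t: cumulative test time. Returns expected failures observed by t.
    """
    return a * (1 - math.exp(-b * t))

def remaining_defects(t, a, b):
    """Expected defects not yet found at time t: a - m(t)."""
    return a - go_mean_value(t, a, b)
```

In a Bayesian treatment like the one described, a and b would carry posterior distributions updated from the observed failure data rather than point values.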

  5. Spices, irradiation and detection methods

    International Nuclear Information System (INIS)

    Sjoeberg, A.M.; Manninen, M.

    1991-01-01

    This paper concerns microbiological aspects of spices and microbiological methods for detecting irradiated food. The proposed method is a combination of the direct epifluorescence filter technique (DEFT) and the aerobic plate count (APC). The evidence for irradiation of spices is based on the demonstration of a higher DEFT count than APC. The principle was first tested in our earlier investigation of the detection of irradiation in whole spices. The combined DEFT+APC procedure was found to give a fairly reliable indication of whether or not a whole spice sample had been irradiated. The results are given. (8 figs, 22 refs)
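The DEFT+APC decision rule, flag a sample when the total count (DEFT) exceeds the viable count (APC) by a wide margin, can be sketched as a log-difference test. The 2-log threshold below is an assumed illustration, not a value established by the paper:

```python
import math

def deft_apc_flag(deft_count, apc_count, log_diff_threshold=2.0):
    """Flag a spice sample as likely irradiated when the DEFT count
    exceeds the aerobic plate count by more than log_diff_threshold
    log10 units (irradiation kills cells without removing them, so
    DEFT stays high while APC drops). Threshold is illustrative.
    """
    if deft_count <= 0 or apc_count <= 0:
        return False
    return math.log10(deft_count) - math.log10(apc_count) > log_diff_threshold
```

In practice the comparison would be made per gram of sample and interpreted alongside the microbiological context, since low APC can have causes other than irradiation.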

  6. Reliability of Estimation Pile Load Capacity Methods

    Directory of Open Access Journals (Sweden)

    Yudhi Lastiasih

    2014-04-01

    It is not known how accurate any of the numerous existing methods for predicting pile capacity are when compared with the actual ultimate capacity of piles tested to failure. The authors of the present paper have conducted such an analysis, based on 130 data sets of field loading tests. Of these 130 data sets, only 44 could be analysed, of which 15 were from piles actually loaded to failure. The pile prediction methods used were: Brinch Hansen's method (1963), Chin's method (1970), Decourt's extrapolation method (1999), Mazurkiewicz's method (1972), Van der Veen's method (1953), and the quadratic hyperbolic method proposed by Lastiasih et al. (2012). All of the above methods were found to be sufficiently reliable when applied to data from pile loading tests loaded to failure. However, when applied to data from pile loading tests that did not reach failure, the methods yielding lower values of the correction factor N are recommended. Finally, the empirical method of Reese and O'Neill (1988) was found to be reliable enough to estimate the Qult of a pile foundation from soil data only.
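Chin's method (1970), one of the extrapolation techniques compared here, assumes a hyperbolic load-settlement curve: plotting settlement/load against settlement gives a straight line whose inverse slope estimates the ultimate capacity. A minimal sketch with synthetic, exactly hyperbolic data (not the paper's field-test data):

```python
def chin_ultimate_load(loads, settlements):
    """Chin (1970) extrapolation of ultimate pile capacity.

    Fits the line s/Q = C1*s + C2 by least squares over the test points
    (Q = applied load, s = settlement); the ultimate load is 1/C1.
    """
    xs = list(settlements)
    ys = [s / q for s, q in zip(settlements, loads)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 1 / slope
```

Because the method extrapolates, it can estimate Qult even from tests stopped short of failure, which is exactly the situation the paper examines.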

  7. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  8. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick-section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments and then, independently, we assess the likely variability in team-to-team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick-section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)

  9. Crack detecting method

    International Nuclear Information System (INIS)

    Narita, Michiko; Aida, Shigekazu

    1998-01-01

    A penetration liquid, or a slow-drying penetration liquid prepared by mixing a penetration liquid with a slow-drying liquid, is filled into an artificial crack formed in a member to be inspected, such as a component of boiler power generation facilities or nuclear power facilities. A developing liquid is applied to the periphery of the artificial crack on the surface of the member. As the slow-drying liquid, an oil having a viscosity of 56 is preferably used. Loads are applied repeatedly to the member, and when a crack initiates at the artificial crack, the penetration liquid penetrates into it. The penetration liquid in the crack is then developed by the developing liquid previously applied to the periphery of the artificial crack on the surface of the member. When a crack occurs, it is developed clearly even if its opening is small, so the crack can be recognized visually and reliably. (I.N.)

  10. Reliability of non-destructive testing methods

    International Nuclear Information System (INIS)

    Broekhoven, M.J.G.

    1988-01-01

    This contribution regards the results of an evaluation of the reliability of radiography (X-rays and gamma-rays), manual, and mechanized/automated ultrasonic examination by generally accepted codes/rules, with respect to detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates, 30 and 50 mm thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical for steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. Besides, some results are presented obtained from research on advanced UT techniques, viz. the time-of-flight diffraction and flaw-tip deflection technique. (author)

  11. Reliability of non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Broekhoven, M J.G. [Ministry of Social Affairs, (Netherlands)

    1988-12-31

    This contribution regards the results of an evaluation of the reliability of radiography (X-rays and gamma-rays), manual, and mechanized/automated ultrasonic examination by generally accepted codes/rules, with respect to detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates, 30 and 50 mm thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical for steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. Besides, some results are presented obtained from research on advanced UT techniques, viz. the time-of-flight diffraction and flaw-tip deflection technique. (author). 4 refs.

  12. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  13. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module, NESSUS/FEM, is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module, NESSUS/FPI, estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
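
    As a minimal sketch of the quantity that NESSUS/FPI approximates (this is plain Monte Carlo, not the fast probability integration algorithm itself, and the limit state and distributions are invented for illustration):

```python
# Minimal Monte Carlo illustration of the failure-probability calculation
# that tools such as NESSUS/FPI perform far more efficiently; this is NOT
# the FPI algorithm, just the quantity it approximates.
import math
import random

def mc_failure_probability(n=200_000, seed=1):
    rng = random.Random(seed)
    # Limit state g = R - S: resistance R ~ N(5, 1), load S ~ N(3, 1).
    failures = sum(rng.gauss(5, 1) - rng.gauss(3, 1) < 0 for _ in range(n))
    return failures / n

# Exact result for comparison: beta = (5-3)/sqrt(1+1), Pf = Phi(-beta)
beta = 2 / math.sqrt(2)
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2))
print(f"MC: {mc_failure_probability():.4f}  exact: {pf_exact:.4f}")
```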

  14. Leak detection method

    International Nuclear Information System (INIS)

    1978-01-01

    This invention provides a method for removing nuclear fuel elements from a fabrication building while at the same time testing the fuel elements for leaks without releasing contaminants from the fabrication building or from the fuel elements. The vacuum source used, leak detecting mechanism and fuel element fabrication building are specified to withstand environmental hazards. (UK)

  15. Reliability testing of tendon disease using two different scanning methods in patients with rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bruyn, George A W; Möller, Ingrid; Garrido, Jesus

    2012-01-01

    To assess the intra- and interobserver reliability of musculoskeletal ultrasonography (US) in detecting inflammatory and destructive tendon abnormalities in patients with RA using two different scanning methods.

  16. Reliability of nucleic acid amplification methods for detection of Chlamydia trachomatis in urine: results of the first international collaborative quality control study among 96 laboratories

    NARCIS (Netherlands)

    R.P.A.J. Verkooyen (Roel); G.T. Noordhoek; P.E. Klapper; J. Reid; J. Schirm; G.M. Cleator; M. Ieven; G. Hoddevik

    2003-01-01

    The first European Quality Control Concerted Action study was organized to assess the ability of laboratories to detect Chlamydia trachomatis in a panel of urine samples by nucleic acid amplification tests (NATs). The panel consisted of lyophilized urine samples,

  17. Melting curve analysis after T allele enrichment (MelcaTle) as a highly sensitive and reliable method for detecting the JAK2V617F mutation

    Directory of Open Access Journals (Sweden)

    Soji Morishita

    Full Text Available Detection of the JAK2V617F mutation is essential for diagnosing patients with classical myeloproliferative neoplasms (MPNs). However, detection of the low-frequency JAK2V617F mutation is a challenging task due to the necessity of discriminating between true-positive and false-positive results. Here, we have developed a highly sensitive and accurate assay for the detection of JAK2V617F and named it melting curve analysis after T allele enrichment (MelcaTle). MelcaTle comprises three steps: (1) two cycles of JAK2V617F allele enrichment by PCR amplification followed by BsaXI digestion, (2) selective amplification of the JAK2V617F allele in the presence of a bridged nucleic acid (BNA) probe, and (3) a melting curve assay using a BODIPY-FL-labeled oligonucleotide. Using this assay, we successfully detected nearly a single copy of the JAK2V617F allele, without false-positive signals, using 10 ng of genomic DNA standard. Furthermore, MelcaTle showed no positive signals in 90 assays screening healthy individuals for JAK2V617F. When applying MelcaTle to 27 patients who were initially classified as JAK2V617F-positive on the basis of allele-specific PCR analysis and were thus suspected as having MPNs, we found that two of the patients were actually JAK2V617F-negative. A more careful clinical data analysis revealed that these two patients had developed transient erythrocytosis of unknown etiology but not polycythemia vera, a subtype of MPNs. These findings indicate that the newly developed MelcaTle assay should markedly improve the diagnosis of JAK2V617F-positive MPNs.

  18. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    Full Text Available OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) to detect plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. CONCLUSION: The Turesky index and images with FC with disclosing present good reliability and discriminatory power in quantifying dental plaque.

  19. A simple and reliable method to detect gamma irradiated lentil (Lens culinaris Medik.) seeds by germination efficiency and seedling growth test

    International Nuclear Information System (INIS)

    Chaudhuri, Sadhan K.

    2002-01-01

    Germination efficiency and root/shoot length of the germinated seedling are proposed as criteria to identify irradiated lentil seeds. Germination percentage was reduced above 0.2 kGy, and lentil seeds were unable to germinate above a 1.0 kGy dose. The critical dose that prevented root elongation varied from 0.1 to 0.5 kGy. The sensitivity of lentil seeds to gamma irradiation was inversely proportional to the moisture content of the seeds. Radiation effects could be detected in seeds even after 12 months of storage following gamma irradiation.

  20. An Evaluation Method of Equipment Reliability Configuration Management

    Science.gov (United States)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. The evaluation method of equipment reliability configuration management determines the reliability management capability of an equipment development company. Reliability is not only designed in, but must also be managed to be achieved. This paper evaluates reliability management capability by means of a reliability configuration capability maturity model (RCM-CMM) evaluation method.

  1. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  2. The CT (Hounsfield unit) number of brain tissue in healthy infants. A new reliable method for detection of possible degenerative disease.

    Science.gov (United States)

    Boris, P; Bundgaard, F; Olsen, A

    1987-01-01

    It is difficult to correlate CT Hounsfield unit (H. U.) numbers from one CT investigation to another and from one CT scanner to another, especially when dealing with small changes in the brain substance, as in degenerative brain diseases in children. By subtracting the mean value of cerebrospinal fluid (CSF) from the mean value of grey and white matter, it is possible to remove most of the errors due, for example, to maladjustments, short and long-term drift, X-ray fan, and detector asymmetry. Measurements of white and grey matter using these methods showed CT H. U. numbers changing from 15 H. U. to 22 H. U. in white matter and 23 H. U. to 30 H. U. in grey matter in 86 healthy infants aged 0-5 years. In all measurements, the difference between grey and white matter was exactly 8 H. U. The method has proven to be highly accurate and reproducible.

  3. Remote detection device and detection method therefor

    International Nuclear Information System (INIS)

    Kogure, Sumio; Yoshida, Yoji; Matsuo, Takashiro; Takehara, Hidetoshi; Kojima, Shinsaku.

    1997-01-01

    The present invention provides a non-destructive detection device for collectively, efficiently and effectively conducting maintenance and inspection to confirm the integrity of a nuclear reactor through a shielding member that shields the radiation emitted from the portion to be inspected. Namely, direct visual inspection using an underwater TV camera as the sensor, eddy current testing using a coil as the sensor, and magnetic powder flaw detection are integrated and applied collectively. Specifically, visual inspection using the TV camera and eddy current flaw detection are adopted together, and flaw detection with magnetic powder is applied as a means of confirming the results of the two other kinds of detection. With such procedures, detection techniques based on different physical principles are combined, thereby enhancing the accuracy of the evaluation. (I.S.)

  4. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine

    DEFF Research Database (Denmark)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon

    2013-01-01

    as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine...

  5. Failed fuel detection method

    International Nuclear Information System (INIS)

    Utamura, Motoaki; Urata, Megumu.

    1976-01-01

    Object: To detect failed fuel elements in a reactor with high precision by measuring the radioactivity concentrations of more than one nuclide of the fission products (131I and 132I, for example) contained in each sample of coolant in a fuel channel. Method: The radioactivity concentrations in the sampled coolant are obtained from gamma spectra measured by a pulse height analyser after suitable cooling periods according to the half-lives of the fission products to be measured. The first measurement, for 132I, is made two hours after sampling, and the second, for 131I, is started one day after the sampling. A fuel element showing high radioactivity concentrations for both 131I and 132I is expected with certainty to have failed.
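
    The choice of counting times follows directly from the half-lives (roughly 2.3 h for 132I and 8.02 d for 131I); a simple decay-fraction calculation, not part of the patent text, shows why after one day 131I can be measured essentially free of 132I interference:

```python
# Illustration of why the two counting times work: simple decay-correction
# arithmetic with the nuclides' half-lives (131I ~ 8.02 d, 132I ~ 2.3 h).
# Not from the patent text; a generic A(t) = A0 * 2**(-t / T_half) sketch.
def remaining_fraction(t_hours, half_life_hours):
    return 2.0 ** (-t_hours / half_life_hours)

# After the 24 h cooling period used for the 131I measurement:
f_131 = remaining_fraction(24.0, 8.02 * 24)   # barely decayed
f_132 = remaining_fraction(24.0, 2.3)         # almost completely gone
print(f"131I left: {f_131:.2f}, 132I left: {f_132:.1e}")
```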

  6. Reliability evaluation of the Savannah River reactor leak detection system

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Sindelar, R.L.; Wallace, I.T.

    1991-01-01

    The Savannah River reactors have been in operation since the mid-1950s. The primary degradation mode for the primary coolant loop piping is intergranular stress corrosion cracking. The leak-before-break (LBB) capability of the primary system piping has been demonstrated as part of an overall structural integrity evaluation. One element of the LBB analyses is a reliability evaluation of the leak detection system. The most sensitive element of the leak detection system is the airborne tritium monitors. The presence of small amounts of tritium in the heavy water coolant provides the basis for a very sensitive system of leak detection. The reliability of the tritium monitors to properly identify a crack leaking at a rate of either 50 or 300 lb/day (0.004 or 0.023 gpm, respectively) has been characterized. These leak rates correspond to action points for which specific operator actions are required. High reliability has been demonstrated using standard fault tree techniques. The probability of not detecting a leak within an assumed mission time of 24 hours is estimated to be approximately 5 x 10⁻⁵ per demand. This result is obtained for both leak rates considered. The methodology and assumptions used to obtain this result are described in this paper. 3 refs., 1 fig., 1 tab
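
    A toy version of the fault-tree arithmetic behind such a per-demand estimate (the monitor unavailabilities and common-cause probability below are hypothetical, not the paper's data):

```python
# Toy fault-tree combination of the kind used in such an evaluation:
# a leak goes undetected only if all redundant monitors fail (AND gate),
# or a common-cause event disables the whole system (OR gate).
# The numbers below are hypothetical, not the paper's data.

def and_gate(probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

monitor_unavailability = [7e-3, 7e-3]   # two independent tritium monitors
common_cause = 1e-5                     # e.g. a shared sampling line blocked
p_missed = or_gate([and_gate(monitor_unavailability), common_cause])
print(f"{p_missed:.2e}")                # on the order of 5e-5 per demand
```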

  7. Reliability of leak detection systems in light water reactors

    International Nuclear Information System (INIS)

    Kupperman, D.S.

    1987-01-01

    US Nuclear Regulatory Commission Guide 1.45 recommends the use of at least three different detection methods in reactors to detect leakage. Monitoring of both sump-flow and airborne particulate radioactivity is recommended. A third method can involve either monitoring of condensate flow rate from air coolers or monitoring of airborne gaseous radioactivity. Although the methods currently used for leak detection reflect the state of the art, other techniques may be developed and used. Since the recommendations of Regulatory Guide 1.45 are not mandatory, the technical specifications for 74 operating plants have been reviewed to determine the types of leak detection methods employed. In addition, Licensee Event Report (LER) Compilations from June 1985 to June 1986 have been reviewed to help establish actual capabilities for detecting leaks and determining their source. Work at Argonne National Laboratory has demonstrated that improvements in leak detection, location, and sizing are possible with advanced acoustic leak detection technology

  8. Extrapolation Method for System Reliability Assessment

    DEFF Research Database (Denmark)

    Qin, Jianjun; Nishijima, Kazuyoshi; Faber, Michael Havbro

    2012-01-01

    The present paper presents a new scheme for probability integral solution for system reliability analysis, which takes basis in the approaches by Naess et al. (2009) and Bucher (2009). The idea is to evaluate the probability integral by extrapolation, based on a sequence of MC approximations of integrals with scaled domains. The performance of this class of approximation depends on the approach applied for the scaling and the functional form utilized for the extrapolation. A scheme for this task is derived here taking basis in the theory of asymptotic solutions to multinormal probability integrals. It is shown that the proposed scheme is efficient and adds to generality for this class of approximations for probability integrals.
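
    A minimal sketch of the extrapolation idea, in the spirit of Naess et al. (2009) rather than the paper's actual scheme: crude Monte Carlo estimates of scaled, more frequent failure events are fitted and extrapolated to the rare event of interest. Here failure is X > t for a standard normal X, and the fitted functional form exploits the fact that, for a normal tail, log p is nearly linear in the square of the scaling factor:

```python
# Extrapolation sketch (spirit of Naess et al. 2009, not the paper's scheme):
# cheap MC estimates of scaled failure events X > lam*t are fitted and
# extrapolated to the rare event X > t (lam = 1).
import math
import random

rng = random.Random(7)
t, n = 4.0, 200_000
samples = [rng.gauss(0, 1) for _ in range(n)]

lams = [0.5, 0.6, 0.7, 0.8]
log_p = [math.log(sum(x > lam * t for x in samples) / n) for lam in lams]

# For a normal tail log p is nearly linear in lam**2, so fit
# log p = a + b*lam**2 and extrapolate to lam = 1 (the real event).
xs = [lam ** 2 for lam in lams]
xbar, ybar = sum(xs) / 4, sum(log_p) / 4
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, log_p)) / \
    sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar
p_extrap = math.exp(a + b)              # estimate of P(X > t)
p_exact = 0.5 * math.erfc(t / math.sqrt(2))
print(f"extrapolated: {p_extrap:.1e}  exact: {p_exact:.1e}")
```

    The example recovers a probability of order 3e-5 from samples that almost never realize the actual event; the quality of the result depends on how well the fitted form matches the true tail, which is exactly the point the abstract makes about the choice of scaling and functional form.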

  9. Fault detection and reliability, knowledge based and other approaches

    International Nuclear Information System (INIS)

    Singh, M.G.; Hindi, K.S.; Tzafestas, S.G.

    1987-01-01

    These proceedings are split up into four major parts in order to reflect the most significant aspects of reliability and fault detection as viewed at present. The first part deals with knowledge-based systems and comprises eleven contributions from leading experts in the field. The emphasis here is primarily on the use of artificial intelligence, expert systems and other knowledge-based systems for fault detection and reliability. The second part is devoted to fault detection of technological systems and comprises thirteen contributions dealing with applications of fault detection techniques to various technological systems such as gas networks, electric power systems, nuclear reactors and assembly cells. The third part of the proceedings, which consists of seven contributions, treats robust, fault tolerant and intelligent controllers and covers methodological issues as well as several applications ranging from nuclear power plants to industrial robots to steel grinding. The fourth part treats fault tolerant digital techniques and comprises five contributions. Two papers, one on reactor noise analysis, the other on reactor control system design, are indexed separately. (author)

  10. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital I&C applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes a reliability evaluation method based on the V&V defect density characteristics of every stage of the software development life cycle, by which the operating reliability level of the software can be predicted before its delivery, helping to improve the reliability of NPP safety-important software. (authors)

  11. RELIABILITY OF THE DETECTION OF THE BARYON ACOUSTIC PEAK

    International Nuclear Information System (INIS)

    MartInez, Vicent J.; Arnalte-Mur, Pablo; De la Cruz, Pablo; Saar, Enn; Tempel, Elmo; Pons-BorderIa, MarIa Jesus; Paredes, Silvestre; Fernandez-Soto, Alberto

    2009-01-01

    The correlation function of the distribution of matter in the universe shows, at large scales, baryon acoustic oscillations, which were imprinted prior to recombination. This feature was first detected in the correlation function of the luminous red galaxies of the Sloan Digital Sky Survey (SDSS). Recently, the final release (DR7) of the SDSS has been made available, and the useful volume is about two times bigger than in the old sample. We present here, for the first time, the redshift-space correlation function of this sample at large scales, together with that for one shallower, but denser, volume-limited subsample drawn from the Two-Degree Field Redshift Survey. We test the reliability of the detection of the acoustic peak at about 100 h⁻¹ Mpc and the behavior of the correlation function at larger scales by means of careful estimation of errors. We confirm the presence of the peak in the latest data, although broader than in previous detections.

  12. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system to optimise the frequency of foreseen checks, and evaluates their influence on the performance of different detection topologies. Given the uncertainty of the failure rates of the components combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  13. Nonlinear Multiantenna Detection Methods

    Directory of Open Access Journals (Sweden)

    Chen Sheng

    2004-01-01

    Full Text Available A nonlinear detection technique designed for multiple-antenna assisted receivers employed in space-division multiple-access systems is investigated. We derive the optimal solution of the nonlinear spatial-processing assisted receiver for binary phase shift keying signalling, which we refer to as the Bayesian detector. It is shown that this optimal Bayesian receiver significantly outperforms the standard linear beamforming assisted receiver in terms of a reduced bit error rate, at the expense of an increased complexity, while the achievable system capacity is substantially enhanced with the advent of employing nonlinear detection. Specifically, when the spatial separation expressed in terms of the angle of arrival between the desired and interfering signals is below a certain threshold, a linear beamformer would fail to separate them, while a nonlinear detection assisted receiver is still capable of performing adequately. The adaptive implementation of the optimal Bayesian detector can be realized using a radial basis function network. Two techniques are presented for constructing block-data-based adaptive nonlinear multiple-antenna assisted receivers. One of them is based on the relevance vector machine invoked for classification, while the other on the orthogonal forward selection procedure combined with the Fisher ratio class-separability measure. A recursive sample-by-sample adaptation procedure is also proposed for training nonlinear detectors based on an amalgam of enhanced K-means clustering techniques and the recursive least squares algorithm.

  14. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  15. The reliability of magnetic resonance imaging in traumatic brain injury lesion detection

    NARCIS (Netherlands)

    Geurts, B.H.J.; Andriessen, T.M.J.C.; Goraj, B.M.; Vos, P.E.

    2012-01-01

    Objective: This study compares inter-rater reliability, lesion detection and clinical relevance of T2-weighted imaging (T2WI), Fluid Attenuated Inversion Recovery (FLAIR), T2*-gradient recalled echo (T2*-GRE) and Susceptibility Weighted Imaging (SWI) in Traumatic Brain Injury (TBI). Methods: Three

  16. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
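
    For context, a sketch of the classical HL-RF iteration that FORM uses to locate the most probable point (the paper's improved stability transformation method and SR1 Hessian update are not reproduced here; the limit-state function is a benchmark-style example chosen so that beta = 2.5):

```python
# Classical HL-RF most-probable-point search for FORM; an illustrative
# sketch, not the paper's improved stability transformation method.
import math

def g(u):
    # Example limit state in standard-normal space (beta = 2.5 by design).
    return 2.5 - (u[0] + u[1]) / math.sqrt(2) + 0.1 * (u[0] - u[1]) ** 2

def grad_g(u):
    d = 0.2 * (u[0] - u[1])
    return [d - 1 / math.sqrt(2), -d - 1 / math.sqrt(2)]

def hlrf(g, grad_g, u=(0.0, 0.0), tol=1e-8, max_iter=100):
    """u_{k+1} = [(grad.u_k - g(u_k)) / |grad|^2] * grad."""
    u = list(u)
    for _ in range(max_iter):
        gv, gr = g(u), grad_g(u)
        norm2 = sum(c * c for c in gr)
        step = (sum(c * x for c, x in zip(gr, u)) - gv) / norm2
        u_new = [step * c for c in gr]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            return u_new
        u = u_new
    return u

u_star = hlrf(g, grad_g)
beta = math.sqrt(sum(c * c for c in u_star))
pf = 0.5 * math.erfc(beta / math.sqrt(2))   # first-order Pf = Phi(-beta)
print(f"beta = {beta:.3f}, Pf = {pf:.2e}")
```

    The SORM correction the abstract refers to would then adjust this first-order Pf using the curvature (Hessian) of g at the located MPP, which is where the expensive Hessian calculations arise.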

  17. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    Full Text Available A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. The available first- and second-order reliability methods (FORM/SORM) are challenging to apply when estimating the reliability of such systems, and because of its inefficiency the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach is proposed, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes. The method is illustrated with the help of several numerical examples.

  18. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how in reliability and safety design and analysis techniques at VTT has been established over several years of analyzing the reliability of the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed further for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety continues in a number of research and development projects

  19. A Voltage Quality Detection Method

    DEFF Research Database (Denmark)

    Chen, Zhe; Wei, Mu

    2008-01-01

    This paper presents a voltage quality detection method based on a phase-locked loop (PLL) technique. The technique can detect the voltage magnitude and phase angle of each individual phase under both normal and fault power system conditions. The proposed method has the potential to evaluate various...

  20. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  1. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2010-01-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  2. A method to assign failure rates for piping reliability assessments

    International Nuclear Information System (INIS)

    Gamble, R.M.; Tagart, S.W. Jr.

    1991-01-01

    This paper reports on a simplified method that has been developed to assign failure rates for use in reliability and risk studies of piping. The method can be applied on a line-by-line basis by identifying line- and location-specific attributes that can lead to piping unreliability from in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability was also performed. The data from this survey provide a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies
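    The line-by-line assignment described above can be pictured as a generic base failure rate scaled by multipliers for the attributes present on a given line. The function and the attribute names/values below are purely hypothetical illustrations, not the paper's actual model or data.

    ```python
    def line_failure_rate(base_rate, attribute_factors):
        """Line-specific failure rate: a generic base rate scaled by a
        multiplier for each degradation-related attribute present on the line."""
        rate = base_rate
        for factor in attribute_factors.values():
            rate *= factor
        return rate

    # Hypothetical attributes and factors (illustrative numbers only).
    rate = line_failure_rate(1e-6, {"thermal_fatigue": 5.0, "water_hammer_zone": 2.0})
    ```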

  3. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis of data obtained by computer simulation of the neutron transport process using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been carried out. The influence of the physical dimensions of the materials and of the sample size on the reliability level of the results was investigated. The objective was to optimize the sample size in order to obtain reliable results while minimizing computation time. (author). 5 refs, 8 figs
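    The sample-size trade-off the study investigates can be illustrated with a toy single-collision transmission model (an assumption for illustration, far simpler than a real transport simulation): the standard error of a Monte Carlo estimate shrinks roughly as 1/√N, so larger samples buy reliability at the cost of computation time.

    ```python
    import math
    import random

    def transmission_probability(thickness, mean_free_path, n_samples, seed=0):
        """Estimate the chance a particle crosses a slab without interacting.

        Toy single-collision model: sample exponential free paths and count
        how many exceed the slab thickness. (A real transport code would track
        scattering; this only illustrates the sampling statistics.)
        """
        rng = random.Random(seed)
        hits = sum(1 for _ in range(n_samples)
                   if rng.expovariate(1.0 / mean_free_path) > thickness)
        p_hat = hits / n_samples
        std_err = math.sqrt(p_hat * (1.0 - p_hat) / n_samples)
        return p_hat, std_err

    # Analytic answer for this toy model is exp(-thickness / mean_free_path).
    p_small, se_small = transmission_probability(2.0, 1.0, 1_000)
    p_large, se_large = transmission_probability(2.0, 1.0, 100_000)
    ```

    The 100-fold larger sample gives roughly a 10-fold smaller standard error, which is the kind of cost/reliability curve one optimizes over.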

  4. Modifying nodal pricing method considering market participants optimality and reliability

    Directory of Open Access Journals (Sweden)

    A. R. Soofiabadi

    2015-06-01

    Full Text Available This paper develops a method for nodal pricing and a market clearing mechanism that consider the reliability of the system. The effects of component reliability on electricity price, market participants’ profit, and system social welfare are considered. The paper considers reliability both in evaluating market participants’ optimality and in fair pricing and market clearing. To achieve fair pricing, the nodal price is obtained through a two-stage optimization problem, and to achieve a fair market clearing mechanism, a comprehensive criterion is introduced for evaluating the optimality of market participants. The social welfare and efficiency of the system are increased under the proposed modified nodal pricing method.

  5. Scenario based approach to structural damage detection and its value in a risk and reliability perspective

    DEFF Research Database (Denmark)

    Hovgaard, Mads Knude; Hansen, Jannick Balleby; Brincker, Rune

    2013-01-01

    A scenario- and vibration-based structural damage detection method is demonstrated through simulation. The method is Finite Element (FE) based. The value of the monitoring is calculated using structural reliability theory. A high cycle fatigue crack propagation model is assumed as the damage mechanism … with- and without monitoring. Monte Carlo Sampling (MCS) is used to estimate the probabilities, and the tower of an onshore NREL 5MW wind turbine is given as a calculation case

  6. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating the software reliability of nuclear safety software is proposed in this paper. The method is based on the software reliability growth model (SRGM), in which the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters, incorporating software test cases as a covariate in the model. The models were found capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability can be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
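    A non-homogeneous Poisson process SRGM is often written with a mean value function such as the Goel-Okumoto form m(t) = a(1 − e^(−bt)); whether this paper uses exactly that form is an assumption. The sketch below, with purely hypothetical parameters, shows how such a model yields an estimate of the defects still remaining.

    ```python
    import math

    def go_mean_value(t, a, b):
        """Goel-Okumoto NHPP mean value function: expected failures observed by time t."""
        return a * (1.0 - math.exp(-b * t))

    def remaining_defects(t, a, b):
        """Expected defects still latent after testing up to time t."""
        return a - go_mean_value(t, a, b)

    # Hypothetical parameters: a = 50 expected total defects, b = 0.05 per-unit-time
    # detection rate (illustrative values, not estimates from the paper's data).
    found_by_30 = go_mean_value(30.0, 50.0, 0.05)
    left_after_30 = remaining_defects(30.0, 50.0, 0.05)
    ```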

  7. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated for their support of synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected

  8. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problems posed by the variety of uncertainty variables that coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  9. Detection methods for irradiated food

    International Nuclear Information System (INIS)

    Stevenson, M.H.

    1993-01-01

    The plenary lecture gives a brief historical review of the development of methods for the detection of irradiated food and defines the demands on such methods. The methods described in detail are as follows: 1) Physical methods: thermoluminescence and chemiluminescence are mentioned as examples of luminescence methods; ESR spectroscopy is discussed in detail by means of individual examples (crustaceans, fruits and vegetables, spices and herbs, nuts). 2) Chemical methods: examples given are methods that make use of radiation-induced alterations in lipids (formation of long-chain hydrocarbons, formation of 2-alkylcyclobutanones) and radiation-induced alterations in the DNA, respectively. 3) Microbiological methods. An extensive bibliography is appended. (VHE) [de

  10. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. It may be due to variations in the geometry of the part, in material properties, or to a lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions in the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the Sequential Optimization and Reliability Assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment, which are decoupled within each cycle. This leads to quick improvement of the design from one cycle to the next and increases computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations
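    The SORA cycle of decoupled deterministic optimization and reliability assessment can be sketched in one dimension. The problem below (minimize d subject to P(X > d) ≤ Φ(−β_t) for a normal X) is a made-up toy, not the flanging application; it only shows the shift-and-reassess structure of the cycles.

    ```python
    from statistics import NormalDist

    def sora_1d(beta_target, mu_x=0.0, sigma_x=1.0, cycles=5):
        """Minimal 1-D SORA sketch: minimize d subject to P(X > d) <= Phi(-beta_target).

        Each cycle solves a deterministic problem with the probabilistic
        constraint shifted to the previous cycle's most probable point (MPP),
        then reassesses reliability to update the shift.
        """
        shift = 0.0
        d = mu_x
        for _ in range(cycles):
            # Deterministic cycle: min d  s.t.  d - (mu_x + shift) >= 0.
            d = mu_x + shift
            # Reliability assessment: for X ~ N(mu_x, sigma_x), the MPP at the
            # target reliability level sits at mu_x + beta_target * sigma_x.
            new_shift = beta_target * sigma_x
            if abs(new_shift - shift) < 1e-12:
                break
            shift = new_shift
        p_fail = 1.0 - NormalDist(mu_x, sigma_x).cdf(d)
        return d, p_fail

    # With beta_target = 3 the design converges to d = 3 (in standard units).
    d_opt, p_fail = sora_1d(3.0)
    ```

    In this linear toy the shift converges after one cycle; the appeal of SORA is that the same decoupling avoids a nested double loop on real, expensive models.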

  11. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications, providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, and stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  12. Level III Reliability methods feasible for complex structures

    NARCIS (Netherlands)

    Waarts, P.H.; Boer, A. de

    2001-01-01

    The paper describes a comparison between three types of reliability methods: a code-type level I method as used by a designer, a full level I method, and a level III method. Two cases that are typical of civil engineering practice, a cable-stayed bridge subjected to traffic load and the installation of a soil retaining sheet

  13. Developing a reliable signal wire attachment method for rail.

    Science.gov (United States)

    2014-11-01

    The goal of this project was to develop a better attachment method for rail signal wires to improve the reliability of signaling systems. EWI conducted basic research into the failure mode of current attachment methods and developed and tested a ne...

  14. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given that many different modes of failure are usually possible, achieving this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss) and structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
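    System failure probabilities with multiple limit states, as described above, are commonly estimated by crude Monte Carlo sampling over all modes at once. A minimal series-system sketch with assumed load and resistance distributions (the limit states and parameters are illustrative inventions, not from the paper):

    ```python
    import random

    def system_failure_probability(n_samples=100_000, seed=1):
        """Crude Monte Carlo estimate of a series system's failure probability.

        Two illustrative limit states on a random load L and resistance R:
        g1 = R - L (strength) and g2 = 2.5 - L (a deflection-style cap).
        The series system fails when either limit state goes negative.
        """
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_samples):
            load = rng.gauss(1.0, 0.3)
            resistance = rng.gauss(2.0, 0.2)
            if resistance - load < 0.0 or 2.5 - load < 0.0:
                failures += 1
        return failures / n_samples

    p_f = system_failure_probability()  # roughly 3e-3 for these assumed distributions
    ```

    Sampling all modes jointly captures their correlation through the shared load, which a per-mode calculation would miss.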

  15. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative...

  16. HUMAN RELIABILITY ANALYSIS DENGAN PENDEKATAN COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Full Text Available Work accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents are caused by human error, which occurs under the influence of the physical and non-physical work environment. This study uses scenarios to predict and reduce the likelihood of human error with the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method for obtaining the Cognitive Failure Probability (CFP), which can be determined in two ways: the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing error in grinding and welding work are the adequacy of the organization, the adequacy of the Man-Machine Interface (MMI) and operational support, the availability of procedures and plans, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning, with a CFP of 0.3, and in welding work it is the cognitive aspect execution, with a CFP of 0.18. To reduce cognitive error in grinding and welding work, the recommendations are regular training, more detailed work instructions, and familiarization with the tools. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error

  17. Detecting binary black holes with efficient and reliable templates

    International Nuclear Information System (INIS)

    Damour, T.; Iyer, B.R.; Sathyaprakash, B.S.

    2001-01-01

    Detecting binary black holes in interferometer data requires an accurate knowledge of the orbital phase evolution of the system. From the point of view of data analysis one also needs fast algorithms to compute the templates that will be employed in searching for black hole binaries. Recently, there has been progress on both these fronts: On one hand, re-summation techniques have made it possible to accelerate the convergence of poorly convergent asymptotic post-Newtonian series and derive waveforms beyond the conventional adiabatic approximation. We now have a waveform model that extends beyond the inspiral regime into the plunge phase followed by the quasi-normal mode ringing. On the other hand, explicit Fourier domain waveforms have been derived that make the generation of waveforms fast enough so as not to be a burden on the computational resources required in filtering the detector data. These new developments should make it possible to efficiently and reliably search for black hole binaries in data from first interferometers. (author)

  18. Data processing of qualitative results from an interlaboratory comparison for the detection of "Flavescence dorée" phytoplasma: How the use of statistics can improve the reliability of the method validation process in plant pathology.

    Science.gov (United States)

    Chabirand, Aude; Loiseau, Marianne; Renaudin, Isabelle; Poliakoff, Françoise

    2017-01-01

    A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of "Flavescence dorée" (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes' theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007), Pelletier (2009) and under patent oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper and their

  19. Data processing of qualitative results from an interlaboratory comparison for the detection of "Flavescence dorée" phytoplasma: How the use of statistics can improve the reliability of the method validation process in plant pathology.

    Directory of Open Access Journals (Sweden)

    Aude Chabirand

    Full Text Available A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of "Flavescence dorée" (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes' theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007), Pelletier (2009) and under patent oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper
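    The sensitivity/specificity calculations and the Bayes-theorem link to prevalence described in the two records above can be sketched as follows; the counts and the 5% prevalence are hypothetical illustrations, not the study's data.

    ```python
    def diagnostic_summary(tp, fn, tn, fp, prevalence):
        """Sensitivity, specificity, and Bayes-updated predictive values.

        PPV/NPV follow from Bayes' theorem at a given field prevalence, which
        is why the most appropriate protocol can depend on how common the
        pathogen actually is.
        """
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        ppv = sens * prevalence / (sens * prevalence + (1.0 - spec) * (1.0 - prevalence))
        npv = spec * (1.0 - prevalence) / (spec * (1.0 - prevalence) + (1.0 - sens) * prevalence)
        return sens, spec, ppv, npv

    # Hypothetical ring-test counts: 95/100 known positives detected,
    # 98/100 known negatives cleared, assumed 5% field prevalence.
    sens, spec, ppv, npv = diagnostic_summary(95, 5, 98, 2, prevalence=0.05)
    ```

    At low prevalence even a specific test yields a modest positive predictive value, which is exactly the effect that makes prevalence-aware protocol selection worthwhile.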

  20. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
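    One classical approach to the test reliability discussed above is internal consistency, commonly summarized by Cronbach's alpha (that this chapter covers alpha specifically is an assumption). A self-contained sketch with made-up item scores:

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha from item-score columns (one list of scores per item)."""
        k = len(items)
        n = len(items[0])

        def var(xs):  # unbiased sample variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        totals = [sum(col[i] for col in items) for i in range(n)]
        return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))

    # Made-up scores for 4 respondents on a 3-item scale.
    items = [[1, 2, 3, 4], [2, 2, 4, 4], [1, 3, 3, 5]]
    alpha = cronbach_alpha(items)
    ```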

  1. An overview of reliability methods in mechanical and structural design

    Science.gov (United States)

    Wirsching, P. H.; Ortiz, K.; Lee, S. J.

    1987-01-01

    An evaluation is made of modern methods of fast probability integration and Monte Carlo treatment for the assessment of structural systems' and components' reliability. Fast probability integration methods are noted to be more efficient than Monte Carlo ones. This is judged to be an important consideration when several point probability estimates must be made in order to construct a distribution function. An example illustrating the relative efficiency of the various methods is included.
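    Fast probability integration typically means a FORM-style search for the most probable failure point. A minimal Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space (an illustrative sketch with an assumed linear limit state, not the paper's code):

    ```python
    import math

    def form_beta(g, grad_g, n_dims, tol=1e-8, max_iter=100):
        """Hasofer-Lind/Rackwitz-Fiessler iteration for the FORM reliability index.

        Works in standard normal u-space and returns beta, the distance from
        the origin to the most probable point on the limit state g(u) = 0.
        """
        u = [0.0] * n_dims
        for _ in range(max_iter):
            gv = g(u)
            gr = grad_g(u)
            norm2 = sum(c * c for c in gr)
            dot = sum(c * ui for c, ui in zip(gr, u))
            u_new = [((dot - gv) / norm2) * c for c in gr]
            if math.dist(u, u_new) < tol:
                u = u_new
                break
            u = u_new
        return math.sqrt(sum(ui * ui for ui in u))

    # Linear limit state g(u) = 3 - u1 - u2: exact beta is 3 / sqrt(2).
    beta = form_beta(lambda u: 3.0 - u[0] - u[1], lambda u: [-1.0, -1.0], 2)
    ```

    Each point probability estimate costs only a handful of gradient evaluations here, which is why such methods are preferred over Monte Carlo when a whole distribution function must be constructed.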

  2. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for pipe-wall thickness measurements using the probability of detection (POD). Pipe thicknesses are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by the inspectors' human factors and include some errors, because the inspectors differ in experience and inspection frequency. In order to ensure the reliability of the inspection results, POD is first used to evaluate experimental results of pipe-wall thickness inspection. We verify that the results differ between inspectors, including qualified inspectors. Second, two human factors that affect POD are identified. Finally, it is confirmed that POD can identify the human factors and ensure reliability for pipe-wall thickness inspections. (author)
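    POD analysis of hit/miss inspection data is often summarized by a log-logistic POD curve and the flaw size a90 detected with 90% probability. The model form is the common textbook choice and the coefficients below are hypothetical, not fitted to this paper's data.

    ```python
    import math

    def pod(a, b0, b1):
        """Log-logistic hit/miss POD model: probability of detecting a flaw of size a."""
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))

    def a_for_pod(target, b0, b1):
        """Invert the model: flaw size detected with probability `target` (e.g. a90)."""
        return math.exp((math.log(target / (1.0 - target)) - b0) / b1)

    # Hypothetical regression coefficients (a real study would fit these to
    # hit/miss inspection data, e.g. with mh1823-style software).
    b0, b1 = 1.0, 2.0
    a90 = a_for_pod(0.90, b0, b1)
    ```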

  3. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    There have been many attempts to estimate a reliability function. In the 20th ESReDA seminar, a new nonparametric method was proposed, whose major point is how to use censored data efficiently. Generally, there are three kinds of approaches to estimating a reliability function in a nonparametric way: the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. These three methods have limitations, so we suggest an advanced method that reflects censoring information more efficiently. In many instances there is a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by differentiation. It is well known that the three methods generally used to estimate a reliability function nonparametrically have uniquely existing maximum likelihood estimators. Hence, the MLE of the new method is derived in this study. The procedure to calculate the MLE is similar to that of the PL estimator; the difference is that in the new method the mass (or weight) of each observation influences the others, whereas in the PL estimator it does not
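    The Product-Limit (Kaplan-Meier) estimator that the new method builds on can be sketched directly; the censored sample below is made up for illustration.

    ```python
    def product_limit(samples):
        """Kaplan-Meier (Product-Limit) reliability estimate from censored data.

        `samples` is a list of (time, observed) pairs; observed=False marks a
        right-censored unit (still working when observation stopped). Returns
        the stepwise reliability estimate at each failure time.
        """
        ordered = sorted(samples, key=lambda tf: (tf[0], not tf[1]))
        n_at_risk = len(ordered)
        reliability = 1.0
        curve = []
        for time, observed in ordered:
            if observed:
                reliability *= (n_at_risk - 1) / n_at_risk
                curve.append((time, reliability))
            n_at_risk -= 1
        return curve

    # Made-up sample: failures at t = 2, 4, 5; censored units at t = 3 and 6.
    data = [(2.0, True), (3.0, False), (4.0, True), (5.0, True), (6.0, False)]
    curve = product_limit(data)
    ```

    Note how a censored unit reduces the at-risk count without stepping the curve down; that is exactly the censoring treatment the abstract's comparison revolves around.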

  4. Reliability methods in nuclear power plant ageing management

    International Nuclear Information System (INIS)

    Simola, K.

    1999-01-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant- specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  5. Reliability methods in nuclear power plant ageing management

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K. [VTT Automation, Espoo (Finland). Industrial Automation

    1999-07-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant- specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  6. Selected Methods for Increasing the Reliability of Electronic Security Systems

    Directory of Open Access Journals (Sweden)

    Paś Jacek

    2015-11-01

    The article presents issues related to different methods of increasing the reliability of electronic security systems (ESS), using a fire alarm system (SSP) as an example. Reliability of the SSP, in the descriptive sense, is its capacity to keep performing a preset function (e.g. fire protection of an airport, a port, a logistics base, etc.) for a certain time and under certain conditions (e.g. environmental), despite possible failures of a specific subset of the system's elements. An analysis of the available ESS-SSP literature shows no studies on methods of increasing reliability (several works address similar topics, but with respect to burglary and robbery, i.e. intrusion). The approach is based on analysing the set of all paths that keep the SSP fit for the fire-event scenario, identifying the devices that are critical for security.

  7. A method of predicting the reliability of CDM coil insulation

    International Nuclear Information System (INIS)

    Kytasty, A.; Ogle, C.; Arrendale, H.

    1992-01-01

    This paper presents a method of predicting the reliability of the Collider Dipole Magnet (CDM) coil insulation design. The method proposes a probabilistic treatment of electrical test data, stress analysis, material property variability and loading uncertainties to give the reliability estimate. The approach taken to predict the reliability of design-related failure modes of the CDM is to form analytical models of the various possible failure modes and their related mechanisms or causes, and then statistically assess the contributions of the relevant variables. The probability of the failure mode occurring is interpreted as the number of times one would expect certain extreme situations to combine and randomly occur. One of the more complex failure modes of the CDM will be used to illustrate this methodology.
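A probabilistic failure-mode assessment of this kind is often sketched as a stress-strength interference model: failure occurs when an extreme load and an extreme material weakness randomly combine. The sketch below is a minimal Monte Carlo version under that assumption; the normal distributions and all parameter values are illustrative, not CDM data.

```python
import random

def failure_probability(n_trials=100_000, seed=42):
    """Monte Carlo estimate of one failure mode's probability:
    the mode occurs on a trial when the sampled applied stress
    exceeds the sampled insulation strength."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        stress = rng.gauss(60.0, 8.0)      # hypothetical load distribution
        strength = rng.gauss(100.0, 10.0)  # hypothetical strength distribution
        if stress > strength:
            failures += 1
    return failures / n_trials
```

For these assumed normals, the exact interference probability is Phi(-40 / sqrt(8^2 + 10^2)), on the order of 10^-3, which the simulation approximates; in practice the distributions would come from electrical test data and stress analysis.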

  8. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from the safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  9. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  10. Planning of operation & maintenance using risk and reliability based methods

    DEFF Research Database (Denmark)

    Florian, Mihai; Sørensen, John Dalsgaard

    2015-01-01

    Operation and maintenance (OM) of offshore wind turbines contributes with a substantial part of the total levelized cost of energy (LCOE). The objective of this paper is to present an application of risk- and reliability-based methods for planning of OM. The theoretical basis is presented...

  11. Assessment of reliability of Greulich and Pyle (gp) method for ...

    African Journals Online (AJOL)

    Background: Greulich and Pyle standards are the most widely used age estimation standards all over the world. The applicability of the Greulich and Pyle standards to populations which differ from their reference population is often questioned. This study aimed to assess the reliability of Greulich and Pyle (GP) method for ...

  12. Statistical Bayesian method for reliability evaluation based on ADT data

    Science.gov (United States)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict products' reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, limitations remain, such as an imprecise solution process and inaccurate estimation of the degradation rate, which may affect the accuracy of the acceleration model and the extrapolated values. Moreover, the existing Bayesian solution to this problem loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen. Second, the initial values of the degradation model and the parameters of the prior and posterior distributions under each stress level are calculated, with updating and iteration of the estimates. Third, the lifetime and reliability values are estimated on the basis of the estimated parameters. Finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is quite effective and accurate in estimating the lifetime and reliability of a product.
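As a rough illustration of the Wiener-process degradation model named in the abstract (not the authors' full Bayesian inference), the sketch below simulates a drift-plus-diffusion degradation path and recovers the drift parameter from the observed increments; all parameter values are made up.

```python
import random
import statistics

def simulate_wiener_path(mu, sigma, dt, n_steps, rng):
    """Degradation path X(t) = mu*t + sigma*B(t), sampled every dt.
    Each increment is Normal(mu*dt, sigma^2 * dt)."""
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += mu * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def estimate_drift(path, dt):
    """Point estimate of the drift: mean observed increment divided by dt."""
    increments = [b - a for a, b in zip(path, path[1:])]
    return statistics.mean(increments) / dt

rng = random.Random(1)
path = simulate_wiener_path(mu=2.0, sigma=0.5, dt=0.1, n_steps=1000, rng=rng)
mu_hat = estimate_drift(path, dt=0.1)  # should land close to the true drift 2.0
```

In an ADT setting, a drift estimated at each stress level would then be extrapolated to normal conditions through the acceleration model; only the per-level estimation step is sketched here.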

  13. Detection methods for irradiated foods

    International Nuclear Information System (INIS)

    Dyakova, A.; Tsvetkova, E.; Nikolova, R.

    2005-01-01

    With the growing worldwide application of irradiation as a food-industry technology for increasing food safety, a need has arisen for methods of identifying irradiated products. Such methods are required to control the international trade of irradiated foods: to ensure that legally imposed food laws are not violated, to supervise correct labeling, and to avoid multiple irradiation. This summary introduces the physical, chemical and biological methods for detection of irradiated foods, as well as the principles on which they are based.

  14. Adjunct methods for caries detection

    DEFF Research Database (Denmark)

    Twetman, Svante; Axelsson, Susanna Bihari; Dahlén, Gunnar

    2012-01-01

    Abstract Objective. To assess the diagnostic accuracy of adjunct methods used to detect and quantify dental caries. Study design. A systematic literature search for relevant papers was conducted with pre-determined inclusion and exclusion criteria. Abstracts and full text articles were assessed...

  15. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  16. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  17. Survey of industry methods for producing highly reliable software

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.

    1994-11-01

    The Nuclear Reactor Regulation Office of the US Nuclear Regulatory Commission is charged with assessing the safety of new instrument and control designs for nuclear power plants which may use computer-based reactor protection systems. Lawrence Livermore National Laboratory has evaluated the latest techniques in software reliability for measurement, estimation, error detection, and prediction that can be used during the software life cycle as a means of risk assessment for reactor protection systems. One aspect of this task has been a survey of the software industry to collect information to help identify the design factors used to improve the reliability and safety of software. The intent was to discover what practices really work in industry and what design factors are used by industry to achieve highly reliable software. The results of the survey are documented in this report. Three companies participated in the survey: Computer Sciences Corporation, International Business Machines (Federal Systems Company), and TRW. Discussions were also held with NASA Software Engineering Lab/University of Maryland/CSC, and the AIAA Software Reliability Project

  18. Survey of methods used to asses human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  19. NDE reliability and probability of detection (POD) evolution and paradigm shift

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surendra [NDE Engineering, Materials and Process Engineering, Honeywell Aerospace, Phoenix, AZ 85034 (United States)

    2014-02-18

    The subject of NDE reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed “Have Cracks – Will Travel”, or in short “Have Cracks”, by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, starting from the introduction of fracture mechanics and Damage Tolerant Design (DTD), through the statistical framework of Berens and Hovey in 1981 for POD estimation, to MIL-HDBK-1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective, i.e., improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. Therefore, it is essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offer no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantification of human factors. Furthermore, reliability and POD have often been treated as synonymous, but POD is not NDE reliability. POD is a subset of reliability that consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME) including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones for the last several decades and discusses the rationale for using
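A minimal hit/miss POD fit makes the "fitting data to a model" phase concrete. The sketch below fits the common log-logistic POD model by gradient ascent on the Bernoulli log-likelihood; the flaw sizes and hit/miss outcomes are synthetic, and this is a simplified illustration, not the MIL-HDBK-1823 procedure itself.

```python
import math

def pod(a, b0, b1):
    """Log-logistic POD model: POD(a) = 1 / (1 + exp(-(b0 + b1*ln a)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))

def fit_pod(sizes, hits, lr=0.5, iters=5000):
    """Fit (b0, b1) to hit/miss data by gradient ascent on the
    Bernoulli log-likelihood (log-likelihood is concave, so plain
    gradient ascent with a small step converges)."""
    b0, b1 = 0.0, 1.0
    xs = [math.log(a) for a in sizes]
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, hits):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic inspections: flaw size in mm, 1 = detected, 0 = missed.
sizes = [0.5, 0.7, 0.9, 1.1, 1.3, 1.5, 1.7, 1.9, 2.1, 2.3]
hits  = [0,   0,   1,   0,   1,   1,   0,   1,   1,   1]
b0, b1 = fit_pod(sizes, hits)
```

The a90/95 figure quoted as the "reliably detectable flaw size" would come from the fitted curve plus a 95% confidence bound, which is beyond this sketch.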

  20. A generic method for estimating system reliability using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples
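The inference step is easy to see on a toy network once a structure (whether K2-learned or expert-built) and its conditional probability tables exist: system reliability is the marginal probability that the top node works. The three components and all numbers below are hypothetical.

```python
from itertools import product

# Hypothetical 3-node BN: power supply (P) feeds a pump (M),
# and the cooling function (C) needs both parents working.
p_power = 0.99
p_pump_given_power = {True: 0.95, False: 0.0}   # pump cannot run unpowered
p_cool_given = {(True, True): 0.98, (True, False): 0.0,
                (False, True): 0.0, (False, False): 0.0}

def system_reliability():
    """Marginal probability that cooling works, by exact enumeration
    over all joint states of the Bayesian network."""
    total = 0.0
    for power, pump, cool in product([True, False], repeat=3):
        p = p_power if power else 1.0 - p_power
        pp = p_pump_given_power[power]
        p *= pp if pump else 1.0 - pp
        pc = p_cool_given[(power, pump)]
        p *= pc if cool else 1.0 - pc
        if cool:
            total += p
    return total
```

For this series-like toy structure the enumeration reduces to 0.99 × 0.95 × 0.98; exact enumeration is exponential in the number of nodes, which is why real BN tools use factored inference instead.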

  1. A generic method for estimating system reliability using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Doguc, Ozge [Stevens Institute of Technology, Hoboken, NJ 07030 (United States); Ramirez-Marquez, Jose Emmanuel [Stevens Institute of Technology, Hoboken, NJ 07030 (United States)], E-mail: jmarquez@stevens.edu

    2009-02-15

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.

  2. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    Science.gov (United States)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  3. A novel approach for reliable detection of cathepsin S activities in mouse antigen presenting cells.

    Science.gov (United States)

    Steimle, Alex; Kalbacher, Hubert; Maurer, Andreas; Beifuss, Brigitte; Bender, Annika; Schäfer, Andrea; Müller, Ricarda; Autenrieth, Ingo B; Frick, Julia-Stefanie

    2016-05-01

    Cathepsin S (CTSS) is a eukaryotic protease mostly expressed in professional antigen presenting cells (APCs). Since CTSS activity regulation plays a role in the pathogenesis of various autoimmune diseases like multiple sclerosis, atherosclerosis, Sjögren's syndrome and psoriasis, as well as in cancer progression, there is an ongoing interest in the reliable detection of cathepsin S activity. Various applications have been invented for specific detection of this enzyme. However, most of them have only been shown to be suitable for human samples, do not deliver quantitative results, or require technical equipment that is not commonly available in a standard laboratory. We have tested a fluorogenic substrate, Mca-GRWPPMGLPWE-Lys(Dnp)-DArg-NH2, that has been described to specifically detect CTSS activities in human APCs, for its potential use with mouse samples. We have modified the protocol and thereby offer a cheap, easy, reproducible and quick activity assay to detect CTSS activities in mouse APCs. Since most basic research on CTSS is performed in mice, this method closes a gap and offers a possibility for reliable and quantitative CTSS activity detection that can be performed in almost every laboratory. Copyright © 2016. Published by Elsevier B.V.

  4. Method of detecting failed fuels

    International Nuclear Information System (INIS)

    Ishizaki, Hideaki; Suzumura, Takeshi.

    1982-01-01

    Purpose: To enable setting an adequate temperature for the high-temperature pure water filled into a fuel assembly, by detecting the temperature at the outlet of the filling tube to control the heating of the water, and to detect failed fuel by sampling that water. Method: A temperature sensor is provided on a water tube connected to a sipping cap used for filling high-temperature pure water, so that the water temperature at the tube outlet is detected and confirmed on a temperature indicator. A heater is controlled on the basis of this reading, pure water at an adequate high temperature is filled into the fuel assembly, and the pure water is replaced with coolant. The water is then sampled to establish the adequate temperature of the high-temperature coolant used for detecting fuel-assembly failure. As a result, the sipping effect does not decrease, and failed fuel can be detected precisely. (Yoshihara, H.)

  5. Limitations in simulator time-based human reliability analysis methods

    International Nuclear Information System (INIS)

    Wreathall, J.

    1989-01-01

    Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failures to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical.

  6. Molecular methods for the detection of mutations.

    Science.gov (United States)

    Monteiro, C; Marcelino, L A; Conde, A R; Saraiva, C; Giphart-Gassler, M; De Nooij-van Dalen, A G; Van Buuren-van Seggelen, V; Van der Keur, M; May, C A; Cole, J; Lehmann, A R; Steinsgrimsdottir, H; Beare, D; Capulas, E; Armour, J A

    2000-01-01

    We report the results of a collaborative study aimed at developing reliable, direct assays for mutation in human cells. The project used common lymphoblastoid cell lines, both with and without mutagen treatment, as a shared resource to validate the development of new molecular methods for the detection of low-level mutations in the presence of a large excess of normal alleles. As the "gold standard," hprt mutation frequencies were also measured on the same samples. The methods under development included i) the restriction site mutation (RSM) assay, in which mutations lead to the destruction of a restriction site; ii) minisatellite length-change mutation, in which mutations lead to alleles containing new numbers of tandem repeat units; iii) loss of heterozygosity for HLA epitopes, in which antibodies can be used to direct selection for mutant cells; iv) the multiple fluorescence-based long linker arm nucleotides assay (mf-LLA) technology, for the detection of substitutional mutations; v) detection of alterations in the TP53 locus using a (CA) array as the target for the screening; and vi) PCR analysis of lymphocytes for the presence of the BCL2 t(14:18) translocation. The relative merits of these molecular methods are discussed, and a comparison made with more "traditional" methods.

  7. Detection methods of irradiated foodstuffs

    Energy Technology Data Exchange (ETDEWEB)

    Ponta, C C; Cutrubinis, M; Georgescu, R [IRASM Center, Horia Hulubei National Institute for Physics and Nuclear Engineering, PO Box MG-6, RO-077125 Magurele-Bucharest (Romania); Mihai, R [Life and Environmental Physics Department, Horia Hulubei National Institute for Physics and Nuclear Engineering, PO Box MG-6, RO-077125 Magurele-Bucharest (Romania); Secu, M [National Institute of Materials Physics, Bucharest (Romania)

    2005-07-01

    food is marketed as irradiated or if irradiated goods are sold without the appropriate labeling, then detection tests should be able to prove the authenticity of the product. For the moment in Romania there is not any food control laboratory able to detect irradiated foodstuffs. The Technological Irradiation Department coordinates and co-finances a research project aimed at establishing the first Laboratory of Irradiated Foodstuffs Detection. The detection methods studied in this project are the ESR methods (for cellulose EN 1787/2000, bone EN 1786/1996 and crystalline sugar EN 13708/2003), the TL method (EN 1788/2001), the PSL method (EN 13751/2002) and the DNA Comet Assay method (EN 13784/2001). The above detection methods will be applied to various foodstuffs such as: garlic, onion, potatoes, rice, beans, wheat, maize, pistachio, sunflower seeds, raisins, figs, strawberries, chicken, beef, fish, pepper, paprika, thyme, laurel and mushrooms. As an example of the application of a detection method, the ESR spectra of irradiated and non-irradiated paprika, acquired according to the ESR detection method for irradiated foodstuffs containing cellulose, are presented. First, it can be noticed that the intensity of the cellulose signal is much higher for the irradiated sample than for the non-irradiated one, and second, two radiation-specific signals appear, symmetrical to the cellulose signal. These two radiation-specific signals prove the irradiation treatment of the paprika. (author)

  8. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, a basic rule for solving simultaneous Boolean equations is presented. Next, the analysis procedure is shown for a three-component system with external supports. Third, more detailed discussions are given on the establishment of logical loop relations. Finally, two typical structures that each include more than one logical loop are taken up; their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.
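As a toy version of the simultaneous-equation idea (the paper's exact method and GO-FLOW charts are not reproduced here), consider a hypothetical mutual-support loop: A works if its own external support works or B backs it up, and vice versa. Such a Boolean system can be resolved as a least fixed point, iterating from "nothing works" until the equations stabilize:

```python
def solve_loop(ext_a, ext_b):
    """Least-fixed-point solution of the simultaneous Boolean equations
        A = ext_a or B
        B = ext_b or A
    starting from (False, False) and iterating until stable."""
    a = b = False
    while True:
        na = ext_a or b
        nb = ext_b or a
        if (na, nb) == (a, b):
            return a, b
        a, b = na, nb
```

Starting from False matters: starting from (True, True) would also be a fixed point of this loop, corresponding to the two components "supporting" each other with no external source, which is exactly the spurious solution that loop-resolution rules exist to exclude.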

  9. Bedside ultrasound reliability in locating catheter and detecting complications

    Directory of Open Access Journals (Sweden)

    Payman Moharamzadeh

    2016-10-01

    Introduction: Central venous catheterization is one of the most common medical procedures and is associated with complications such as misplacement and pneumothorax. Chest X-ray is a common means of evaluating for these complications. However, given the patient's additional radiation exposure, the time required, and the low diagnostic value for detecting pneumothorax in the supine patient, the present study examines the diagnostic value of bedside ultrasound in locating the catheter tip and detecting pneumothorax. Materials and methods: In this cross-sectional study, all referred patients requiring central venous catheterization were examined. Central venous catheterization was performed by a trained emergency medicine specialist, and the location of the catheter and the presence of pneumothorax were assessed and compared using two modalities: ultrasound and X-ray (as the reference standard). Sensitivity, specificity, and positive and negative predictive values were reported. Results: A total of 200 non-trauma patients were included in the study (58% men). Cohen's kappa agreement coefficients for catheter location and pneumothorax diagnosis were 0.49 (95% CI: 0.43-0.55) and 0.89 (P<0.001; 95% CI: 97.8-100), respectively. Ultrasound sensitivity and specificity in diagnosing pneumothorax were 75% (95% CI: 35.6-95.5) and 100% (95% CI: 97.6-100), respectively. Conclusion: The present study showed a low diagnostic value of ultrasound in determining catheter location and in detecting pneumothorax. In light of previous studies, research in this field is still ongoing. Keywords: Central venous catheterization; complications; bedside ultrasound; radiography
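The reported accuracy figures follow from a standard 2x2 comparison against the reference standard. The sketch below computes them from hypothetical cell counts chosen to reproduce the abstract's 75% sensitivity and 100% specificity; the paper's actual counts are not given.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table comparing
    an index test (ultrasound) against the reference standard (X-ray)."""
    sensitivity = tp / (tp + fn)  # detected pneumothoraces / all pneumothoraces
    specificity = tn / (tn + fp)  # correct negatives / all without pneumothorax
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for 200 patients: 6 true positives, 2 false
# negatives, 0 false positives, 192 true negatives.
sens, spec, ppv, npv = diagnostic_metrics(tp=6, fp=0, fn=2, tn=192)
```

Note how a very wide sensitivity CI (35.6-95.5 in the abstract) is the expected consequence of so few positive cases, whatever the point estimate.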

  10. Method of detecting irradiated pepper

    International Nuclear Information System (INIS)

    Doumaru, Takaaki; Furuta, Masakazu; Katayama, Tadashi; Toratani, Hirokazu; Takeda, Atsuhiko

    1989-01-01

    Spices, represented by pepper, are generally contaminated by microorganisms, and some sterilizing treatment is indispensable before they can be used as foodstuffs. However, heating is not suitable for spices, so ethylene oxide gas sterilization has been carried out instead, but its carcinogenicity is a problem. Food irradiation is a technology for killing the microorganisms and noxious insects that cause rotting and spoiling of foods and for preventing germination; it is an energy-conserving method without the fear of residual chemicals and is therefore well suited to the sterilization of spices. For irradiation below 10 kGy, no toxicity test is required for any food, and the irradiation of spices is permitted in 20 countries. However, in order to establish an international distribution system for irradiated foods, informing consumers and developing means of detecting irradiation are important subjects. The authors used pepper and examined whether the hydrogen generated by irradiation remains in the seeds and can be detected. The experimental method and the results are reported. From the unirradiated samples, hydrogen was scarcely detected, while the quantity of hydrogen generated was proportional to dose. The only measuring instrument needed is a gas chromatograph. (K.I.)

  11. COMPOSITE METHOD OF RELIABILITY RESEARCH FOR HIERARCHICAL MULTILAYER ROUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. B. Tregubov

    2016-09-01

    Full Text Available The paper presents the idea of a research method for hierarchical multilayer routing systems. The method is a composition of methods from graph theory, reliability theory, probability theory, etc. These methods are applied to the solution of various particular analysis and optimization tasks and are systemically connected and coordinated with each other through a uniform set-theoretic representation of the object of research. Hierarchical multilayer routing systems are considered as infrastructure facilities (gas and oil pipelines, automobile and railway networks, power supply and communication systems) that distribute material resources, energy or information using hierarchically nested routing functions. For clarity, the theoretical constructions are illustrated by determining the probability of the up state of a specific infocommunication system. The author shows the possibility of constructively combining the graph representation of the structure of the object of research with a logical-probabilistic method for analyzing its reliability indices, through a uniform set-theoretic representation of its elements and the processes proceeding in them.

  12. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    Science.gov (United States)

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system that simultaneously targets many sequences in a single reaction. The system included probes for a screening gene, a species reference gene, a specific gene, a construct-specific gene, an event-specific gene, and internal and negative control genes. 18S rRNA was combined with the species reference genes as an internal control to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structural genes could be detected and identified simultaneously for Roundup Ready soybean on a single microarray. The microarray's specificity was validated by its ability to discriminate the two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false positives and false negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  13. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system, taking into account uncertainties in the design variables; common results are estimates of a response density and, by implication, of its parameters. Common density parameters include the mean value, the standard deviation, and specific percentiles of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important because their results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this comes at the expense of added computational time compared with a single deterministic analysis, which yields one value of the response out of the many that make up the response density. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response depends on many random variables. Hence, both methods are very robust; however, they are computationally expensive for estimating response density parameters. Both are among the 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed with funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; probabilistic fluid dynamics, fracture mechanics, and heat transfer are therefore only a few of the analyses possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS, and part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been
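    As a hedged illustration of why LHS is attractive, a minimal Latin hypercube sampler can be written in a few lines. This is the generic textbook construction, not the NESSUS implementation:

    ```python
    # Minimal Latin hypercube sampler on the unit hypercube: one point per
    # equal-probability stratum in each dimension, with strata randomly
    # paired across dimensions.
    import random

    def latin_hypercube(n_samples, n_dims, rng):
        cols = []
        for _ in range(n_dims):
            # one point inside each of the n strata, then shuffle the strata
            col = [(i + rng.random()) / n_samples for i in range(n_samples)]
            rng.shuffle(col)
            cols.append(col)
        return [[col[i] for col in cols] for i in range(n_samples)]

    rng = random.Random(0)
    pts = latin_hypercube(10, 2, rng)
    ```

    Unlike plain Monte Carlo, every marginal stratum is sampled exactly once, which is why LHS typically needs far fewer samples to estimate means and percentiles of a response density.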

  14. Soybean allergen detection methods--a comparison study

    DEFF Research Database (Denmark)

    Pedersen, M. Højgaard; Holzhauser, T.; Bisson, C.

    2008-01-01

    Soybean containing products are widely consumed, thus reliable methods for detection of soy in foods are needed in order to make appropriate risk assessment studies to adequately protect soy allergic patients. Six methods were compared using eight food products with a declared content of soy...

  15. PV Systems Reliability Final Technical Report: Ground Fault Detection

    Energy Technology Data Exchange (ETDEWEB)

    Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flicker, Jack David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    We have examined ground faults in photovoltaic (PV) arrays and the efficacy of fuses, residual current detection (RCD), current sense monitoring/relays (CSM), isolation/insulation resistance (Riso) monitoring, and Ground Fault Detection and Isolation (GFID), using simulations based on a SPICE (Simulation Program with Integrated Circuit Emphasis) ground fault circuit model, experimental ground faults installed on real arrays, and theoretical equations.

  16. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  17. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification for safety-related systems as a necessary procedure in the safety life cycle. PFDavg must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFDavg calculations for its examples, it is difficult for reliability or safety engineers to understand when they use the standard as guidance in practice. A method using reliability block diagrams (RBD) is investigated in this study in order to provide a clear and feasible way of calculating PFDavg and to help those who take IEC 61508-6 as their guidance. The method first finds the mean down times (MDTs) of both the channel and the voted group, and then PFDavg. The calculated results for various voted groups are compared with those in IEC 61508-6 and in Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components - applying Markov model to IEC 61508-6. Reliab Eng Syst Saf 2003;80(2):133-41]. An interesting outcome emerges from the comparison: although differences in the MDT of voted groups exist between IEC 61508-6 and this paper, the PFDavg values of the voted groups are comparatively close. With its detailed description, the RBD method presented can be applied to quantitative SIL verification, showing its similarity to the method in IEC 61508-6.
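    As an illustration of the quantities involved, widely quoted simplified approximations for the PFDavg of a single channel (1oo1) and a 1oo2 voted group can be sketched as below. These formulas ignore diagnostics, common cause failures and repair, so they are a rough sketch rather than the paper's RBD method or the full IEC 61508-6 equations:

    ```python
    # Simplified PFDavg approximations (diagnostics, common cause and
    # repair omitted). lmbda_du is the dangerous-undetected failure rate
    # per hour; the proof test interval T1 is in hours.

    def pfd_avg_1oo1(lmbda_du, proof_test_interval):
        # a single channel is down, on average, for T1/2 after an
        # undetected dangerous failure
        return lmbda_du * proof_test_interval / 2

    def pfd_avg_1oo2(lmbda_du, proof_test_interval):
        # both channels of a 1oo2 voted group must be down: ~ (lambda*T1)^2 / 3
        return (lmbda_du * proof_test_interval) ** 2 / 3

    pfd1 = pfd_avg_1oo1(1e-6, 8760)  # lambda_DU = 1e-6/h, yearly proof test
    pfd2 = pfd_avg_1oo2(1e-6, 8760)
    ```

    With these assumed numbers the single channel lands in the SIL 2 band (1e-3 to 1e-2) while the 1oo2 group is orders of magnitude lower, which is the qualitative effect the voted-group comparisons in the paper quantify more carefully.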

  18. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute has been conducting a developmental study that adds various advanced functionalities to the GO-FLOW method, a system reliability analysis method that occupies a main part of PSA (Probabilistic Safety Assessment). The aim was to upgrade the functionality of the GO-FLOW method, to develop an analytical function integrated with dynamic behavior analysis, physical behavior and probable subject transfer, and to prepare a function for picking out the main accident sequences. In fiscal year 1997, an analytical function was developed for the dynamic event-tree analytical system by adding dependency between headings. In the simulation analytical function for accident sequences, the main accident sequences of the MRX improved marine propulsion reactor could be covered completely. In addition, a function was prepared that allows an analyst to easily set the input data for an analysis. (G.K.)

  19. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA, which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator's cognitive process. This report summarizes outcomes obtained from the improvement of the HRA method, which was enhanced to evaluate how degraded plant conditions affect the operator's cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the HRA method developed. HEPs for the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results obtained with the two methods are compared to identify their differences and the issues to be solved. The important conclusions are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Factors to be considered in the evaluation of human errors were clarified, degraded plant safety conditions were incorporated into the HRA, and HEPs affected by the contents of operator tasks were investigated in order to improve the HRA method, which integrates an operator cognitive action model into the ATHENA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. 
These cases were also evaluated using

  20. Method to detect biological particles

    International Nuclear Information System (INIS)

    Giaever, I.

    1976-01-01

    A medical-diagnostic method to detect immunological as well as other specific reactions is described. According to the invention, first reactive particles (e.g. antibodies) are adsorbed on the surface of a solid, non-reactive substrate. The coated substrate is exposed to a solution assumed to contain the second biological particles (e.g. antigens), which are specific to the first and form complexes with them. Radioactive labelling (e.g. with iodine-125) of the second biological particle is then carried out, directly or indirectly. Cleavage follows labelling in order to separate the second biological particles from the first; a specific splitting agent can selectively break the bond between the two types of particle. The splitting-agent solution is finally separated off and examined for the presence of the label. (VJ)

  1. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    Directory of Open Access Journals (Sweden)

    Hai An

    2016-08-01

    Full Text Available To address the coexistence of a variety of uncertainty variables in engineering structure reliability analysis, a new hybrid reliability index for evaluating structural hybrid reliability, based on the random-fuzzy-interval model, is proposed in this article, together with a convergent solving method. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index is defined on the basis of the random-fuzzy-interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and the index is solved using a modified limit-step-length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. Finally, a numerical example demonstrates that the hybrid reliability index is applicable to the wear reliability assessment of mechanisms in which truncated random variables, fuzzy random variables, and interval variables coexist, and also shows the good convergence of the iterative algorithm proposed in this article.
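    The paper's modified limit-step-length algorithm is not reproduced here, but the baseline it builds on, the classic HL-RF iteration for a reliability index in standard normal space, can be sketched as follows. The linear limit state is a made-up example whose answer is known exactly (beta = 3/sqrt(2)):

    ```python
    # Classic HL-RF iteration for the first-order reliability index beta.
    # The limit state below is an invented linear example, not one of the
    # paper's calculation cases.
    import math

    def hlrf(g, grad_g, u0, tol=1e-10, max_iter=100):
        """Iterate u_{k+1} = [(grad.u - g)/|grad|^2] * grad in u-space."""
        u = list(u0)
        for _ in range(max_iter):
            grad = grad_g(u)
            norm2 = sum(gi * gi for gi in grad)
            coeff = (sum(gi * ui for gi, ui in zip(grad, u)) - g(u)) / norm2
            u_new = [coeff * gi for gi in grad]
            if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
                u = u_new
                break
            u = u_new
        beta = math.sqrt(sum(ui * ui for ui in u))
        return beta, u

    # limit state g(u) = 3 - u1 - u2 (linear, so beta = 3/sqrt(2) exactly)
    g = lambda u: 3 - u[0] - u[1]
    grad_g = lambda u: [-1.0, -1.0]
    beta, u_star = hlrf(g, grad_g, [0.0, 0.0])
    ```

    For nonlinear limit states the plain HL-RF step can oscillate, which is exactly the failure mode that modified step-length schemes like the one in the article are designed to damp.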

  2. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  3. System principles, mathematical models and methods to ensure high reliability of safety systems

    Science.gov (United States)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of components designed for the detection, localization, tracking, collection, and processing of information from monitoring, telemetry, and control systems. They are required to be highly reliable in order to correctly perform data aggregation, processing and analysis for subsequent decision-making support. In the design and construction phases of manufacturing such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the available component types and various constraints on resources, should be considered. Different component types perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of task performance and eliminates common cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models and algorithms can be used to solve optimal redundancy problems on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
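    A toy version of the optimal redundancy problem with type variety can be sketched by brute force. The component types, costs and failure probabilities below are invented for illustration, common cause modeling is omitted, and the exhaustive search is far simpler than the two-level large-dimension formulations the abstract refers to:

    ```python
    # Brute-force optimal redundancy with several component types:
    # maximize parallel-redundancy reliability under a cost budget.
    # All types, costs and failure probabilities are invented.
    from itertools import product

    TYPES = {"optical": (3, 0.05), "acoustic": (2, 0.10), "seismic": (1, 0.20)}
    BUDGET = 6

    best = None
    for counts in product(range(4), repeat=len(TYPES)):
        cost = sum(c * TYPES[t][0] for c, t in zip(counts, TYPES))
        if cost > BUDGET or sum(counts) == 0:
            continue
        # detection fails only if every redundant unit fails
        p_miss = 1.0
        for c, t in zip(counts, TYPES):
            p_miss *= TYPES[t][1] ** c
        reliability = 1.0 - p_miss
        if best is None or reliability > best[0]:
            best = (reliability, dict(zip(TYPES, counts)))

    reliability, allocation = best
    ```

    Even this toy shows the trade-off the type-variety principle formalizes: under a fixed budget, a mix of cheap and expensive detector types can beat spending the whole budget on any single type.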

  4. Structural reliability calculation method based on the dual neural network and direct integration method.

    Science.gov (United States)

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty has received wide attention from engineers and scholars because it reflects structural characteristics and actual bearing conditions. The direct integration method, which starts from the definition of reliability, is easy to understand, but mathematical difficulties remain in the calculation of the multiple integrals involved. Therefore, a dual neural network method is proposed in this paper for calculating these multiple integrals. The dual neural network consists of two neural networks: network A is used to learn the integrand function, and network B is used to simulate the original (antiderivative) function. Based on the derivative relationship between the network output and the network input, network B is derived from network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of the reliability calculation. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean value first-order second moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.
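    The derivative relationship the abstract describes can be illustrated without any training: take a tiny one-hidden-layer network as "network B", differentiate it analytically to get "network A", and observe that the integral of A reduces to two evaluations of B. The weights here are arbitrary fixed numbers, purely for illustration:

    ```python
    # Dual-network structure: net_B plays the original (antiderivative)
    # function, and net_A is its exact analytic derivative, i.e. the
    # integrand. Weights are arbitrary, not trained.
    import math

    W1 = [0.7, -1.3, 0.4]
    B1 = [0.1, 0.5, -0.2]
    W2 = [1.1, 0.6, -0.9]

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def net_B(x):
        return sum(w2 * sigmoid(w1 * x + b) for w1, b, w2 in zip(W1, B1, W2))

    def net_A(x):
        total = 0.0
        for w1, b, w2 in zip(W1, B1, W2):
            s = sigmoid(w1 * x + b)
            total += w2 * w1 * s * (1 - s)  # d/dx of w2 * sigmoid(w1*x + b)
        return total

    # Because A = B', the integral of A over [a, b] is just B(b) - B(a):
    a, b = -1.0, 2.0
    integral_via_B = net_B(b) - net_B(a)

    # cross-check with a crude trapezoid rule on A
    n = 2000
    h = (b - a) / n
    trap = sum(0.5 * (net_A(a + i * h) + net_A(a + (i + 1) * h)) * h
               for i in range(n))
    ```

    In the paper's setting it is A that is trained on samples of the integrand and B that is constructed to satisfy this derivative relation, so that the multiple integral collapses to boundary evaluations of B.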

  5. Collection of methods for reliability and safety engineering

    International Nuclear Information System (INIS)

    Fussell, J.B.; Rasmuson, D.M.; Wilson, J.R.; Burdick, G.R.; Zipperer, J.C.

    1976-04-01

    The document presented contains five reports each describing a method of reliability and safety engineering. Report I provides a conceptual framework for the study of component malfunctions during system evaluations. Report II provides methods for locating groups of critical component failures such that all the component failures in a given group can be caused to occur by the occurrence of a single separate event. These groups of component failures are called common cause candidates. Report III provides a method for acquiring and storing system-independent component failure logic information. The information stored is influenced by the concepts presented in Report I and also includes information useful in locating common cause candidates. Report IV puts forth methods for analyzing situations that involve systems which change character in a predetermined time sequence. These phased missions techniques are applicable to the hypothetical ''accident chains'' frequently analyzed for nuclear power plants. Report V presents a unified approach to cause-consequence analysis, a method of analysis useful during risk assessments. This approach, as developed by the Danish Atomic Energy Commission, is modified to reflect the format and symbology conventionally used for other types of analysis of nuclear reactor systems

  6. Research Note The reliability of a field test kit for the detection and ...

    African Journals Online (AJOL)

    Research Note The reliability of a field test kit for the detection and the persistence of ... The objectives were to test a field kit for practicality and reliability, to assess the spread of the bacteria among ...

  7. A Reliability-Oriented Design Method for Power Electronic Converters

    DEFF Research Database (Denmark)

    Wang, Huai; Zhou, Dao; Blaabjerg, Frede

    2013-01-01

    Reliability is a crucial performance indicator of power electronic systems in terms of availability, mission accomplishment and life cycle cost. A paradigm shift in the research on reliability of power electronics is going on from simple handbook based calculations (e.g. models in MIL-HDBK-217F h...... and reliability prediction models are provided. A case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability critical component IGBT modules....

  8. Reliability and applications of statistical methods based on oligonucleotide frequencies in bacterial and archaeal genomes

    DEFF Research Database (Denmark)

    Bohlin, J; Skjerve, E; Ussery, David

    2008-01-01

    with here are mainly used to examine similarities between archaeal and bacterial DNA from different genomes. These methods compare observed genomic frequencies of fixed-sized oligonucleotides with expected values, which can be determined by genomic nucleotide content, smaller oligonucleotide frequencies......, or be based on specific statistical distributions. Advantages with these statistical methods include measurements of phylogenetic relationship with relatively small pieces of DNA sampled from almost anywhere within genomes, detection of foreign/conserved DNA, and homology searches. Our aim was to explore...... the reliability and best suited applications for some popular methods, which include relative oligonucleotide frequencies (ROF), di- to hexanucleotide zero'th order Markov methods (ZOM) and 2.order Markov chain Method (MCM). Tests were performed on distant homology searches with large DNA sequences, detection...

  9. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare methods for creating 'precision profiles' (PP) and to clarify their possibilities for determining the analytical reliability of RIA. Only methods without complicated mathematical calculations have been used. Reproducibility was studied in serums with concentrations of the hormone of interest spanning the whole range of the calibration curve. The radioimmunoassay was performed with a TSH-RIA kit (former East Germany), and comparative evaluations with commercial kits from HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (C.V., %) are used, and their corresponding profiles are compared: a preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory and that the method of its construction does not influence the course of the relationship. The PP makes it possible to determine a concentration range giving stable results, which improves the efficiency of the analytical work. 16 refs., 4 figs
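    A precision profile is, at bottom, the coefficient of variation plotted against concentration. A minimal sketch with invented replicate data (not the study's measurements) shows how the profile delimits a working range:

    ```python
    # Precision profile sketch: CV% from replicate measurements at each
    # concentration, and the working range where CV% stays within an
    # acceptance limit. All numbers are invented for illustration.
    import statistics

    replicates = {           # concentration (IU/l) -> replicate results
        0.5:  [0.42, 0.55, 0.61, 0.48],
        2.0:  [1.95, 2.10, 2.04, 1.92],
        8.0:  [7.9, 8.2, 8.1, 7.8],
        30.0: [27.5, 33.0, 31.0, 25.0],
    }

    def cv_percent(values):
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    profile = {c: cv_percent(v) for c, v in sorted(replicates.items())}
    working_range = [c for c, cv in profile.items() if cv <= 10.0]
    ```

    Here precision degrades at both ends of the calibration curve, which is the typical U-shape that makes the profile useful for deciding which results are reportable.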

  10. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
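    As a minimal forward Monte Carlo sketch in the spirit of the paper (a single repairable component rather than a full system; the rates are illustrative), the transient unavailability of a two-state continuous-time Markov process can be estimated and checked against the closed-form answer:

    ```python
    # Forward Monte Carlo for a repairable two-state Markov component:
    # failure rate lam (up -> down), repair rate mu (down -> up).
    # Estimates the probability of being failed at t_end.
    import math
    import random

    def simulate_unavailability(lam, mu, t_end, n_trials, rng):
        failed_at_end = 0
        for _ in range(n_trials):
            t, up = 0.0, True
            while True:
                rate = lam if up else mu
                t += rng.expovariate(rate)  # exponential holding time
                if t > t_end:
                    break                   # state holds past t_end
                up = not up                 # transition
            if not up:
                failed_at_end += 1
        return failed_at_end / n_trials

    rng = random.Random(42)
    lam, mu, T = 1.0, 4.0, 2.0
    estimate = simulate_unavailability(lam, mu, T, 20000, rng)
    # exact transient unavailability of the two-state Markov process
    exact = lam / (lam + mu) * (1.0 - math.exp(-(lam + mu) * T))
    ```

    The sampling error here shrinks only as 1/sqrt(n_trials), which is exactly why the paper's adjoint-based weighting matters: with a good adjoint function the variance of the estimator can be driven toward zero.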

  11. Reliability considerations of electronics components for the deep underwater muon and neutrino detection system

    International Nuclear Information System (INIS)

    Leskovar, B.

    1980-02-01

    The reliability of some electronics components for the Deep Underwater Muon and Neutrino Detection (DUMAND) System is discussed. An introductory overview of engineering concepts and techniques for reliability assessment is given. Component reliability is discussed in the context of the major factors causing failures, particularly with respect to physical and chemical causes, process technology, and testing and screening procedures. Failure rates are presented for discrete devices and for integrated circuits as well as for basic electronics components. Furthermore, the military reliability specifications and standards for semiconductor devices are reviewed

  12. Development of reliability centered maintenance methods and tools

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Dubreuil-Chambardel, A.; Lannoy, A.; Monnier, B.

    1992-12-01

    This paper recalls the development of the RCM (Reliability Centered Maintenance) approach in the nuclear industry and describes the trial study implemented by EDF in the context of the OMF (RCM) Project. The approach developed is currently being applied to about thirty systems (Industrial Project). In parallel, R&D efforts are being maintained to improve the selectivity of the analysis methods. These methods use probabilistic safety study models, thereby guaranteeing better selectivity in the identification of safety-critical elements and enhancing consistency between maintenance and safety studies. They also offer more detailed analysis of operating feedback, invoking, for example, Bayesian methods that combine expert judgement and feedback data. Finally, they propose a functional and material representation of the plant; this dual representation describes both the functions assured by maintenance provisions and the material elements required for their implementation. In the final chapter, the targets of the future OMF workstation are summarized and its insertion into the EDF information system is briefly described. (authors). 5 figs., 2 tabs., 7 refs

  13. Particle detection systems and methods

    Science.gov (United States)

    Morris, Christopher L.; Makela, Mark F.

    2010-05-11

    Techniques, apparatus and systems for detecting particles such as muons and neutrons. In one implementation, a particle detection system employs a plurality of drift cells, which can be for example sealed gas-filled drift tubes, arranged on sides of a volume to be scanned to track incoming and outgoing charged particles, such as cosmic ray-produced muons. The drift cells can include a neutron sensitive medium to enable concurrent counting of neutrons. The system can selectively detect devices or materials, such as iron, lead, gold, uranium, plutonium, and/or tungsten, occupying the volume from multiple scattering of the charged particles passing through the volume and can concurrently detect any unshielded neutron sources occupying the volume from neutrons emitted therefrom. If necessary, the drift cells can be used to also detect gamma rays. The system can be employed to inspect occupied vehicles at border crossings for nuclear threat objects.

  14. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  15. The psychophysiological assessment method for pilot's professional reliability.

    Science.gov (United States)

    Zhang, L M; Yu, L S; Wang, K N; Jing, B S; Fang, C

    1997-05-01

    Previous research has shown that a pilot's professional reliability depends on two relative factors: the pilot's functional state and the demands of task workload. The Psychophysiological Reserve Capacity (PRC) is defined as a pilot's ability to accomplish additive tasks without reducing the performance of the primary task (flight task). We hypothesized that the PRC was a mirror of the pilot's functional state. The purpose of this study was to probe the psychophysiological method for evaluating a pilot's professional reliability on a simulator. The PRC Comprehensive Evaluating System (PRCCES) used in the experiment included four subsystems: a) quantitative evaluation system for pilot's performance on simulator; b) secondary task display and quantitative estimating system; c) multiphysiological data monitoring and statistical system; and d) comprehensive evaluation system for pilot PRC. Two studies were performed. In study one, 63 healthy and 13 hospitalized pilots participated. Each pilot performed a double 180 degrees circuit flight program with and without a secondary task (three-digit operation). The operator performance, score of secondary task and cost of physiological effort were measured and compared by PRCCES in the two conditions. Then, each pilot's flight skill in training was subjectively scored by instructor pilot ratings. In study two, 7 healthy pilots volunteered to take part in the experiment on the effects of sleep deprivation on pilots' PRC. Each participant had PRC tested pre- and post-8 h sleep deprivation. The results show that the PRC values of healthy pilots were positively correlated with abilities of flexibility, operating and correcting deviation, attention distribution, and accuracy of instrument flight in the air (r = 0.27-0.40, p < 0.05), and negatively correlated with emotional anxiety in flight (r = -0.40, p < 0.05). 
The values of PRC in healthy pilots (0.61 +/- 0.17) were significantly higher than those of hospitalized pilots.

  16. Thermoluminescence method for detection of irradiated food

    International Nuclear Information System (INIS)

    Pinnioja, S.

    1998-01-01

    intensity than feldspars from cold regions, evidently because a more altered mineral structure is typical in warm water regions. A new autoradiographic method to determine luminescence of irradiated rock surfaces was developed for the study. The method of thermoluminescence analysis has been used for the official control analysis of irradiated food in Finland since 1990. In the course of the study, about 500 analyses were carried out for the Finnish Customs Laboratory. Eighty lots of irradiated herbs or spices and 10 lots of irradiated seafood were found. During the last two years, irradiated green tea in spice mixtures and irradiated frog legs have been detected. No irradiated berry or mushroom products have been found. Screening with a photostimulated luminescence (PSL) instrument, followed by TL analysis to confirm the positive and ambiguous samples, provides a reliable tool for the identification of irradiated food containing adhering or contaminating minerals. The reliability of the TL method was proved in European trials. Standardisation of the method has been undertaken by the European Committee for Standardization (CEN). A TL method based on the determination of TL silicate minerals in dry herbs and spices has recently been accepted as an official CEN standard. (orig.)
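    The two-stage screening logic described above — a PSL screen, followed by TL analysis to confirm positive and ambiguous samples — can be sketched as a simple decision function. The numeric thresholds below are placeholders for illustration, not the values prescribed by the CEN standards.

```python
def classify_sample(psl_counts, tl_glow_ratio,
                    psl_lower=700, psl_upper=5000, tl_threshold=0.1):
    # Two-stage decision sketch: samples below the lower PSL threshold are
    # cleared; intermediate or positive PSL results go to TL confirmation,
    # where the glow-curve ratio decides. Threshold values are illustrative.
    if psl_counts < psl_lower:
        return "not irradiated"
    # Intermediate (psl_lower..psl_upper) or positive PSL: confirm with TL.
    return "irradiated" if tl_glow_ratio > tl_threshold else "not irradiated"
```

    In practice the intermediate PSL band would trigger the confirmatory TL measurement rather than be resolved by the screen alone; the sketch folds both cases into one confirmation step.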

  18. Comparison of Methods for Oscillation Detection

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Trangbæk, Klaus

    2006-01-01

    This paper compares a selection of methods for detecting oscillations in control loops. The methods are tested on measurement data from a coal-fired power plant, where some oscillations are occurring. Emphasis is put on being able to detect oscillations without having a system model and without using process knowledge. The tested methods show potential for detecting the oscillations; however, transient components in the signals cause false detections as well, motivating the use of models to remove the expected signal behavior.
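    One model-free oscillation test of the kind compared in such studies measures the regularity of the zero crossings of the signal's autocorrelation function: a sustained oscillation yields evenly spaced crossings, while noise and transients do not. A minimal sketch follows Thornhill-style regularity indices; the decision threshold of 1 is a convention, not a value from this paper.

```python
def autocorr(x, max_lag):
    # Sample autocorrelation (not per-lag normalized; adequate for locating
    # zero crossings).
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    return [sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag)) / var
            for lag in range(max_lag)]

def oscillation_index(signal, max_lag=200):
    # Regularity of the zero crossings of the ACF: a sustained oscillation
    # gives evenly spaced crossings (small spread relative to their mean
    # interval); values above 1 conventionally indicate oscillation.
    acf = autocorr(signal, max_lag)
    crossings = [i for i in range(1, len(acf)) if acf[i - 1] * acf[i] < 0]
    if len(crossings) < 3:
        return 0.0
    intervals = [b - a for a, b in zip(crossings, crossings[1:])]
    mean_i = sum(intervals) / len(intervals)
    sd_i = (sum((v - mean_i) ** 2 for v in intervals) / len(intervals)) ** 0.5
    return float('inf') if sd_i == 0 else mean_i / (3 * sd_i)
```

    Transients that produce a few irregular crossings keep the index low, which is exactly the failure mode the paper notes for methods applied without a model.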

  19. The Reliability and Effectiveness of a Radar-Based Animal Detection System

    Science.gov (United States)

    2017-09-22

    This document contains data on the reliability and effectiveness of an animal detection system along U.S. Hwy 95 near Bonners Ferry, Idaho. The system uses a Doppler radar to detect large mammals (e.g., deer and elk) when they approach the highway. T...

  1. Suitability of the thermoluminescence method for detection of irradiated foods

    International Nuclear Information System (INIS)

    Pinnioja, S.

    1993-01-01

    Irradiated foods can be detected by thermoluminescence (TL) of contaminating minerals. Altogether, about 300 lots of herbs, spices, berries, mushrooms and seafood were studied by the TL method. Irradiated herbs and spices were easily differentiated from unirradiated ones two years after irradiation with a 10 kGy dose. The mineral composition of seafood was variable; while calcite was suitable for TL analysis, aragonite and smectite gave unreliable results. Control analyses during two years confirmed the reliability of the TL method. (author)

  2. Development on methods for evaluating structure reliability of piping components

    International Nuclear Information System (INIS)

    Schimpfke, T.; Grebner, H.; Peschke, J.; Sievers, J.

    2003-01-01

    In the frame of the German reactor safety research program of the Federal Ministry of Economics and Labour, GRS has started to develop an analysis code named PROST (PRObabilistic STructure analysis) for estimating the leak and break probabilities of piping systems in nuclear power plants. The development is based on the experience achieved with applications of the publicly available US code PRAISE 3.10 (Piping Reliability Analysis Including Seismic Events), which was supplemented by additional features regarding the statistical evaluation and the crack orientation. PROST is designed to be more flexible to changes and supplementations. Up to now, it can be used for calculating fatigue problems. The paper mentions the main capabilities and theoretical background of the present PROST development and presents a parametric study on the influence of the method of stress intensity factor and limit load calculation, and of the statistical evaluation options, on the leak probability of an exemplary pipe with a postulated axial crack distribution. Furthermore, the resulting leak probability of an exemplary pipe with a postulated circumferential crack distribution is compared with the results of the modified PRAISE computer program. The intention of this investigation is to show trends. Therefore, the resulting absolute values for probabilities should not be considered as realistic evaluations. (author)

  3. Methods for qualification of highly reliable software - international procedure

    International Nuclear Information System (INIS)

    Kersken, M.

    1997-01-01

    Despite the advantages of computer-assisted safety technology, some uneasiness can still be observed with respect to the novel processes, resulting from the absence of a body of generally accepted and uncontentious qualification guides (regulatory provisions, standards) for the safety evaluation of the computer codes applied. Guaranteeing adequate protection of the population, operators and plant components is an essential aspect in this context, too - as it is in general with reliability and risk assessment of novel technology - so that, with appropriate legislation still missing, there currently is a licensing risk involved in the introduction of digital safety systems. Nevertheless, there is a degree of agreement within the international community and among utility operators about which standards and measures should be applied for the qualification of software of relevance to plant safety. The standard IEC 880 /IEC 86/ in particular, in its original version, or national documents based on this standard, is applied in all countries using or planning to install those systems. A novel supplement to this standard, document /IEC 96/, is in the process of finalization and defines the requirements to be met by modern methods of software engineering. (orig./DG) [de

  4. Cancer Detection and Diagnosis Methods - Annual Plan

    Science.gov (United States)

    Early cancer detection is a proven life-saving strategy. Learn about the research opportunities NCI supports, including liquid biopsies and other less-invasive methods, for detecting early cancers and precancerous growths.

  5. Visual acuity measures do not reliably detect childhood refractive error--an epidemiological study.

    Directory of Open Access Journals (Sweden)

    Lisa O'Donoghue

    PURPOSE: To investigate the utility of uncorrected visual acuity measures in screening for refractive error in white school children aged 6-7-years and 12-13-years. METHODS: The Northern Ireland Childhood Errors of Refraction (NICER study used a stratified random cluster design to recruit children from schools in Northern Ireland. Detailed eye examinations included assessment of logMAR visual acuity and cycloplegic autorefraction. Spherical equivalent refractive data from the right eye were used to classify significant refractive error as myopia of at least 1DS, hyperopia as greater than +3.50DS and astigmatism as greater than 1.50DC, whether it occurred in isolation or in association with myopia or hyperopia. RESULTS: Results are presented from 661 white 12-13-year-old and 392 white 6-7-year-old school-children. Using a cut-off of uncorrected visual acuity poorer than 0.20 logMAR to detect significant refractive error gave a sensitivity of 50% and specificity of 92% in 6-7-year-olds and 73% and 93% respectively in 12-13-year-olds. In 12-13-year-old children a cut-off of poorer than 0.20 logMAR had a sensitivity of 92% and a specificity of 91% in detecting myopia and a sensitivity of 41% and a specificity of 84% in detecting hyperopia. CONCLUSIONS: Vision screening using logMAR acuity can reliably detect myopia, but not hyperopia or astigmatism in school-age children. Providers of vision screening programs should be cognisant that where detection of uncorrected hyperopic and/or astigmatic refractive error is an aspiration, current UK protocols will not effectively deliver.
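    The sensitivity and specificity figures quoted above come from a standard 2×2 screening table: children are flagged when uncorrected acuity is numerically poorer than the cutoff, then cross-tabulated against cycloplegic refraction as the reference standard. A minimal sketch (the example data are made up for illustration):

```python
def sensitivity_specificity(records, cutoff=0.20):
    # records: (uncorrected_acuity_logmar, has_refractive_error) pairs.
    # A child is flagged when acuity is poorer (numerically greater in
    # logMAR) than the cutoff; the reference standard gives has_refractive_error.
    tp = fp = tn = fn = 0
    for acuity, has_error in records:
        flagged = acuity > cutoff
        if flagged and has_error:
            tp += 1
        elif flagged:
            fp += 1
        elif has_error:
            fn += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)
```

    Sweeping the cutoff over a range of logMAR values and plotting the resulting (1 − specificity, sensitivity) pairs yields the ROC curve for the screening protocol.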

  6. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

    Imprecise probabilities, developed over the last two decades, offer a considerably more general theory with many advantages that make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behavior of nuclear industry objects to be modeled more comprehensively, and that they make it possible to solve some problems that cannot be solved within the conventional approach. Furthermore, some specific examples are given which show the usefulness of the tool for solving reliability tasks.

  7. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and reliability analyst. The report outlines a set of procedures which implements a philosophy that requires increased involvement by the designer in reliability analysis. Discussions of each method reviewed include examples of its application.

  8. Recent and innovative methods for detection of bacteremia and fungemia

    International Nuclear Information System (INIS)

    Reller, L.B.

    1983-01-01

    Advances continue to be made in methods for more reliable or more rapid means of detecting bacteremia and fungemia. The importance of blood sample volume and broth dilution has been established in controlled studies. New technology includes the use of resins that remove antimicrobials from blood samples, detection of radioactivity from organisms given radiolabeled substrate, use of dyes that stain microbial DNA and RNA, use of slides coated with growth media, and lysis-centrifugation for trapping microorganisms. Technology now being considered includes counterimmunoelectrophoresis, head-space gas chromatography, electrical impedance, microcalorimetry, and the use of lasers to detect pH changes and turbidity.

  9. Method of core thermodynamic reliability determination in pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ackermann, G.; Horche, W. (Ingenieurhochschule Zittau (German Democratic Republic). Sektion Kraftwerksanlagenbau und Energieumwandlung)

    1983-01-01

    A statistical model appropriate to determine the thermodynamic reliability and the power-limiting parameter of PWR cores is described for cases of accidental transients. The model is compared with the hot channel model hitherto applied.

  11. Rapid methods for detection of bacteria

    DEFF Research Database (Denmark)

    Corfitzen, Charlotte B.; Andersen, B.Ø.; Miller, M.

    2006-01-01

    Traditional methods for detection of bacteria in drinking water e.g. Heterotrophic Plate Counts (HPC) or Most Probable Number (MNP) take 48-72 hours to give the result. New rapid methods for detection of bacteria are needed to protect the consumers against contaminations. Two rapid methods...

  12. Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Wind power converter systems are essential subsystems in both off-shore and on-shore wind turbines. The converter is the main interface between the generator and the grid connection. This system is affected by numerous stresses, where the main contributors might be defined as vibration and temperature loadings. The temperature variations induce time-varying stresses and thereby fatigue loads. A probabilistic model is used to model fatigue failure for an electrical component in the power converter system. This model is based on linear damage accumulation and physics-of-failure approaches, where a failure criterion is defined by the threshold model. The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings. Structural reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation is performed by means of structural reliability methods.
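    The linear damage accumulation mentioned above is Miner's rule: each block of thermal cycles consumes life in proportion to its cycle count divided by the cycles-to-failure at that stress range. A sketch with an illustrative Basquin-type life curve (the constants below are assumptions, not fitted solder-joint data):

```python
def basquin_cycles_to_failure(delta_t, a=1e12, b=4.0):
    # Basquin/Coffin-Manson-style life curve for a thermal cycle of range
    # delta_t (kelvin): N_f = a * delta_t**-b. Constants are illustrative.
    return a * delta_t ** -b

def miner_damage(cycle_counts):
    # Linear (Miner's rule) damage accumulation over counted cycle blocks;
    # cycle_counts maps temperature range (K) -> number of cycles, and
    # failure is predicted when the accumulated sum reaches 1.0.
    return sum(n / basquin_cycles_to_failure(dt) for dt, n in cycle_counts.items())
```

    In a structural-reliability setting the curve constants and the cycle counts would themselves be random variables, and the failure probability is P(damage ≥ 1) under those uncertainties.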

  13. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    International Nuclear Information System (INIS)

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm² areas and ≥2% in ∼20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified.
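    The detection strategy described — pixel intensity deviations with median filtering to suppress isolated background noise, then thresholding — can be sketched in one dimension. This is an illustrative simplification of the paper's 2-D processing; the 2% threshold mirrors the quoted detectability level for extended areas.

```python
def median_filter(row, k=1):
    # 1-D median filter of half-width k (a stand-in for the 2-D median
    # filtering used on portal images).
    out = []
    for i in range(len(row)):
        window = sorted(row[max(0, i - k): i + k + 1])
        out.append(window[len(window) // 2])
    return out

def detect_anomalies(reference, measured, threshold=0.02):
    # Flag pixels whose relative intensity deviation exceeds the threshold
    # after median filtering suppresses isolated noise spikes; only
    # contiguous deviations survive the filter.
    pid = [(m - r) / r for r, m in zip(reference, measured)]
    smoothed = median_filter(pid)
    return [i for i, d in enumerate(smoothed) if abs(d) > threshold]
```

    A single-pixel spike is removed by the median window, while a contiguous anomaly wider than the window is preserved and flagged, which is why detectability in the paper scales with anomaly area.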

  14. Leak detection by vibrational diagnostic methods

    International Nuclear Information System (INIS)

    Siklossy, P.

    1983-01-01

    The possibilities and methods of leak detection due to mechanical failures in nuclear power plants are reviewed on the basis of the literature. Great importance is attributed to vibrational diagnostic methods for their advantageous characteristics, which enable them to serve as final leak detection methods. The problems of noise analysis, e.g. leak detection by impact sound measurements, probe characteristics, gain problems, probe selection, off-line analysis and correlation functions, types of leak noises etc., are summarized. Leak detection based on noise analysis can be retrofitted to power plants. Its maintenance and testing are simple. On the other hand, it requires special training and measuring methods. (Sz.J.)

  15. Characteristics and application study of AP1000 NPPs equipment reliability classification method

    International Nuclear Information System (INIS)

    Guan Gao

    2013-01-01

    AP1000 nuclear power plants apply an integrated approach to establish equipment reliability classification, which includes probabilistic risk assessment techniques, maintenance rule administration, power production reliability classification and a functional equipment group bounding method, and eventually classifies equipment reliability into four levels. This classification process and its results are very different from classical RCM and streamlined RCM. This study examined the characteristics of the AP1000 equipment reliability classification approach, concluded that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommended using a combined RCM method to establish the future equipment reliability program of AP1000 nuclear power plants. (authors)

  16. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de
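    For a system described by its minimal path sets (component sets whose joint functioning makes the system work), the reliability is the probability of the union of the path-set events, computable by inclusion-exclusion when component failures are independent. A minimal sketch of that calculation (exponential in the number of path sets, so practical only for small systems):

```python
from itertools import combinations

def system_reliability(path_sets, p):
    # Probability that at least one minimal path set has all of its
    # components working, by inclusion-exclusion over the path-set events
    # (component failures assumed independent).
    # path_sets: list of sets of component names; p: component reliabilities.
    total = 0.0
    for r in range(1, len(path_sets) + 1):
        for combo in combinations(path_sets, r):
            comps = set().union(*combo)
            prob = 1.0
            for c in comps:
                prob *= p[c]
            total += (-1) ** (r + 1) * prob
    return total
```

    For example, two components in parallel have path sets [{'A'}, {'B'}], and a two-component series system has the single path set [{'A', 'B'}]; the "relevant path" methods of the paper address exactly the combinatorial growth this brute-force version suffers from.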

  17. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed

  18. A Type-2 fuzzy data fusion approach for building reliable weighted protein interaction networks with application in protein complex detection.

    Science.gov (United States)

    Mehranfar, Adele; Ghadiri, Nasser; Kouhsar, Morteza; Golshani, Ashkan

    2017-09-01

    Detecting protein complexes is an important task in analyzing protein interaction networks. Although many algorithms predict protein complexes in different ways, surveys of interaction networks indicate that about 50% of detected interactions are false positives. Consequently, the accuracy of existing methods needs to be improved. In this paper we propose a novel algorithm to detect protein complexes in 'noisy' protein interaction data. First, we integrate several biological data sources to determine the reliability of each interaction and determine more accurate weights for the interactions. A data fusion component is used for this step, based on the interval type-2 fuzzy voter that provides an efficient combination of the information sources. This fusion component detects the errors and diminishes their effect on the detection of protein complexes. Thus, in the first step, reliability scores are assigned to every interaction in the network. In the second step, we propose a general protein complex detection algorithm by exploiting and adopting the strong points of other algorithms and existing hypotheses regarding real complexes. Finally, the proposed method has been applied to the yeast interaction datasets. The results show that our framework has a better performance regarding precision and F-measure than the existing approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
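    The fusion step — combining evidence from several biological data sources into one reliability weight per interaction — can be illustrated with a plain weighted average. This is a deliberate simplification: the paper uses an interval type-2 fuzzy voter, which additionally models uncertainty in the source memberships themselves.

```python
def fuse_interaction_scores(source_scores, source_weights):
    # Weighted-average fusion of per-source evidence scores in [0, 1] into a
    # single reliability weight for one interaction. A type-1 stand-in for
    # the paper's interval type-2 fuzzy voter.
    num = sum(w * s for s, w in zip(source_scores, source_weights))
    return num / sum(source_weights)
```

    Downstream, the complex-detection algorithm would operate on the weighted network, so interactions with low fused reliability contribute little to candidate complexes.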

  19. Advances in developing rapid, reliable and portable detection systems for alcohol.

    Science.gov (United States)

    Thungon, Phurpa Dema; Kakoti, Ankana; Ngashangva, Lightson; Goswami, Pranab

    2017-11-15

    Development of portable, reliable, sensitive, simple, and inexpensive detection systems for alcohol has been a persistent demand not only in the traditional brewing, pharmaceutical, food and clinical industries but also in the rapidly growing alcohol-based fuel industries. Highly sensitive, selective, and reliable alcohol detection is currently achievable typically through sophisticated instrument-based analyses confined mostly to state-of-the-art analytical laboratory facilities. With the growing demand for rapid and reliable alcohol detection systems, an all-round attempt has been made over the past decade encompassing various disciplines from the basic and engineering sciences. Of late, research on developing small-scale portable alcohol detection systems has accelerated with the advent of emerging miniaturization techniques, advanced materials and sensing platforms such as lab-on-chip, lab-on-CD, lab-on-paper etc. With these new interdisciplinary approaches, along with support from the parallel growth of knowledge on rapid detection systems being pursued for various targets, progress on translating proof-of-concepts to commercially viable and environmentally friendly portable alcohol detection systems is gaining pace. Here, we summarize the progress made over the years on alcohol detection systems, with a focus on recent advancement towards developing portable, simple and efficient alcohol sensors. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. A Reliability Assessment Method for the VHTR Safety Systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok; Jae, Moo Sung; Kim, Yong Wan

    2011-01-01

    The passive safety system of the very high temperature reactor, which has attracted worldwide attention, is introduced to improve the safety of next-generation nuclear power plant designs. Passive system functionality does not rely on an external source of energy but on an intelligent use of natural phenomena, such as gravity, conduction and radiation, which are always present. Because of these features, it is difficult to evaluate passive safety with risk analysis methodologies developed for active system failures, so a new reliability methodology has to be considered. In this study, a preliminary evaluation and conceptualization are attempted by applying the load and capacity concept from the reliability physics model, designing a new passive system analysis methodology, and applying it on a trial basis to a paper plant.
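    The load-and-capacity concept from reliability physics treats failure as the event that a stochastic load exceeds a stochastic capacity. For independent normally distributed load and capacity the failure probability has a closed form; a minimal sketch (the example parameters are arbitrary):

```python
from math import erf, sqrt

def normal_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def interference_failure_prob(mu_load, sd_load, mu_cap, sd_cap):
    # Classic load-capacity interference for independent normal load L and
    # capacity C: P(failure) = P(L > C) = 1 - Phi((mu_C - mu_L) / sqrt(sd_L^2 + sd_C^2)).
    margin_mean = mu_cap - mu_load
    margin_sd = sqrt(sd_load ** 2 + sd_cap ** 2)
    return 1.0 - normal_cdf(margin_mean / margin_sd)
```

    For non-normal or correlated load and capacity the integral has no closed form, and Monte Carlo sampling of the two distributions is the usual fallback.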

  1. Method of detecting genetic deletions identified with chromosomal abnormalities

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Joe W; Pinkel, Daniel; Tkachuk, Douglas

    2013-11-26

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML), and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  2. Electromagnetic Methods of Lightning Detection

    Science.gov (United States)

    Rakov, V. A.

    2013-11-01

    Both cloud-to-ground and cloud lightning discharges involve a number of processes that produce electromagnetic field signatures in different regions of the spectrum. Salient characteristics of measured wideband electric and magnetic fields generated by various lightning processes at distances ranging from tens to a few hundreds of kilometers (when at least the initial part of the signal is essentially radiation while being not influenced by ionospheric reflections) are reviewed. An overview of the various lightning locating techniques, including magnetic direction finding, time-of-arrival technique, and interferometry, is given. Lightning location on global scale, when radio-frequency electromagnetic signals are dominated by ionospheric reflections, is also considered. Lightning locating system performance characteristics, including flash and stroke detection efficiencies, percentage of misclassified events, location accuracy, and peak current estimation errors, are discussed. Both cloud and cloud-to-ground flashes are considered. Representative examples of modern lightning locating systems are reviewed. Besides general characterization of each system, the available information on its performance characteristics is given with emphasis on those based on formal ground-truth studies published in the peer-reviewed literature.

  3. Reliability-Based Shape Optimization using Stochastic Finite Element Methods

    DEFF Research Database (Denmark)

    Enevoldsen, Ib; Sørensen, John Dalsgaard; Sigurdsson, G.

    1991-01-01

    stochastic fields (e.g. loads and material parameters such as Young's modulus and the Poisson ratio). In this case stochastic finite element techniques combined with FORM analysis can be used to obtain measures of the reliability of the structural systems, see Der Kiureghian & Ke (6) and Liu & Der Kiureghian...

  4. Between-day reliability of a method for non-invasive estimation of muscle composition.

    Science.gov (United States)

    Simunič, Boštjan

    2012-08-01

    Tensiomyography is a method for valid and non-invasive estimation of skeletal muscle fibre type composition. The validity of selected temporal tensiomyographic measures has been well established recently; there is, however, no evidence regarding the method's between-day reliability. Therefore it is the aim of this paper to establish the between-day repeatability of tensiomyographic measures in three skeletal muscles. For three consecutive days, 10 healthy male volunteers (mean±SD: age 24.6 ± 3.0 years; height 177.9 ± 3.9 cm; weight 72.4 ± 5.2 kg) were examined in a supine position. Four temporal measures (delay, contraction, sustain, and half-relaxation time) and maximal amplitude were extracted from the displacement-time tensiomyogram. A reliability analysis was performed with calculations of bias, random error, coefficient of variation (CV), standard error of measurement, and intra-class correlation coefficient (ICC) with a 95% confidence interval. The analysis of ICCs demonstrated excellent agreement (ICCs were over 0.94 for 14 out of 15 tested parameters). However, a lower CV was observed in half-relaxation time, presumably because of the specifics of the parameter definition itself. These data indicate that for the three muscles tested, tensiomyographic measurements were reproducible across consecutive test days. Furthermore, we identified the most probable origin of the lower reliability detected in half-relaxation time. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Reliability and minimal detectable difference in multisegment foot kinematics during shod walking and running.

    Science.gov (United States)

    Milner, Clare E; Brindle, Richard A

    2016-01-01

    There has been increased interest recently in measuring kinematics within the foot during gait. While several multisegment foot models have appeared in the literature, the Oxford foot model has been used frequently for both walking and running. Several studies have reported the reliability of the Oxford foot model, but most studies to date have reported reliability for barefoot walking. The purpose of this study was to determine between-day (intra-rater) and within-session (inter-trial) reliability of the modified Oxford foot model during shod walking and running and to calculate the minimum detectable difference for common variables of interest. Healthy adult male runners participated. Participants ran and walked in the gait laboratory for five trials of each. Three-dimensional gait analysis was conducted, and foot and ankle joint angle time series data were calculated. Participants returned for a second gait analysis at least 5 days later. Intraclass correlation coefficients and minimum detectable differences were determined for walking and for running, to indicate both within-session and between-day reliability. Overall, relative variables were more reliable than absolute variables, and within-session reliability was greater than between-day reliability. Between-day intraclass correlation coefficients were comparable to those reported previously for adults walking barefoot. Incorporating a shoe while maintaining marker placement directly on the skin for each segment extends the use of the Oxford foot model. These reliability data for walking and running will aid in the determination of meaningful differences in studies which use this model during shod gait. Copyright © 2015 Elsevier B.V. All rights reserved.
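    Minimum detectable difference (often called MDC95) is typically derived from the ICC and the between-subject standard deviation. A minimal sketch of that standard formula follows; the example values are illustrative, and the paper's exact computation may differ:

```python
import math

def sem(sd_between, icc):
    """Standard error of measurement from between-subject SD and ICC."""
    return sd_between * math.sqrt(1.0 - icc)

def minimum_detectable_difference(sd_between, icc, z=1.96):
    """Smallest between-day change exceeding measurement error (95% level)."""
    return z * math.sqrt(2.0) * sem(sd_between, icc)
```

    For example, a joint angle variable with a between-subject SD of 5.0 degrees and an ICC of 0.90 gives an MDD of about 4.4 degrees.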

  6. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMOs) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon-specific endogenous reference genes, GMO constructs, screening targets, construct-specific and event-specific targets, and donor organisms. This assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  7. Diffusion-weighted MR imaging in postoperative follow-up: Reliability for detection of recurrent cholesteatoma

    Energy Technology Data Exchange (ETDEWEB)

    Cimsit, Nuri Cagatay [Marmara University Hospital, Department of Radiology, Istanbul (Turkey); Engin Sitesi Peker Sokak No:1 D:13, 34330 Levent, Istanbul (Turkey)], E-mail: cagataycimsit@gmail.com; Cimsit, Canan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)], E-mail: ccimsit@ttmail.com; Baysal, Begumhan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)], E-mail: begumbaysal@yahoo.com; Ruhi, Ilteris Cagatay [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)], E-mail: cruhi@yahoo.com; Ozbilgen, Suha [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)], E-mail: sozbilgen@yahoo.com; Aksoy, Elif Ayanoglu [Acibadem Bakirkoy Hospital, Department of ENT, Istanbul (Turkey); Acibadem Hastanesi, KBB Boeluemue, Bakirkoey, Istanbul (Turkey)], E-mail: elifayanoglu@yahoo.com

    2010-04-15

    Introduction: Cholesteatoma is a progressively growing process that destroys the neighboring bony structures; treatment is surgical removal. Follow-up is important in the postoperative period, since further surgery is necessary if recurrence is present, but not if granulation tissue is detected. This study evaluates whether diffusion-weighted MR imaging alone can be a reliable alternative to CT, without use of contrast agent, for follow-up of postoperative patients in detecting recurrent cholesteatoma. Materials and methods: 26 consecutive patients reporting for routine follow-up CT after mastoidectomy were included in the study if there was loss of middle ear aeration on the CT examination. MR images were evaluated for loss of aeration and signal intensity changes on diffusion-weighted sequences. Surgical results were compared with imaging findings. Results: Interpretation of the MR images was consistent with the loss of aeration detected on CT for all 26 patients. Of the 26 patients examined, 14 were not evaluated as recurrent cholesteatoma, which was verified with surgery (NPV: 100%). Twelve patients were diagnosed as recurrent cholesteatoma, and 11 were surgically confirmed as recurrent cholesteatoma (PPV: 91.7%). Four of these 11 patients had a loss-of-aeration area larger than the high signal intensity area on DWI, which was surgically confirmed as granulation tissue or fibrosis accompanying recurrent cholesteatoma. Conclusion: Diffusion-weighted MR for suspected recurrent cholesteatoma is a valuable tool to cut costs and prevent unnecessary second-look surgeries. It has the potential to become the MR sequence of choice to differentiate recurrent cholesteatoma from other causes of loss of aeration in patients with mastoidectomy.
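    The reported NPV and PPV follow directly from the study's 2x2 outcome counts (14 true negatives, 0 false negatives, 11 true positives, 1 false positive); a minimal check:

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive values from a 2x2 confusion table."""
    ppv = tp / (tp + fp)  # fraction of positive calls that are correct
    npv = tn / (tn + fn)  # fraction of negative calls that are correct
    return ppv, npv

# Counts as reported in the abstract
ppv, npv = predictive_values(tp=11, fp=1, tn=14, fn=0)
# PPV = 11/12 (about 91.7%), NPV = 14/14 (100%), matching the abstract
```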

  8. Human reliability-based MC and A models for detecting insider theft

    International Nuclear Information System (INIS)

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-01-01

    Material control and accounting (MC and A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC and A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC and A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC and A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.

  9. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting motivational interviewing skills code (MISC) data and accurately assessing the reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates based on session tallies. Session-level reliability was generally higher than reliability based on utterance-level codes, suggesting that typical methods for estimating MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Steam leak detection in advance reactors via acoustics method

    International Nuclear Information System (INIS)

    Singh, Raj Kumar; Rao, A. Rama

    2011-01-01

    Highlights: → A steam leak detection system is developed to detect any leak inside the reactor vault. → The technique uses the leak noise frequency spectrum for leak detection. → Testing of the system and a method to locate the leak are also developed and discussed in the present paper. - Abstract: Prediction of LOCA (loss of coolant accident) plays a very important role in the safety of a nuclear reactor. Coolant is responsible for heat transfer from the fuel bundles. Loss of coolant is an accident situation which requires immediate shutdown of the reactor. A fall in system pressure during LOCA is the trip parameter used for initiating automatic reactor shutdown. However, in a primary heat transport system operating in a two-phase regime, detection of a small-break LOCA is not simple. Because of very slow leak rates, the pressure falls very slowly. From the reactor safety point of view, it is extremely important to find a reliable and effective alternative for detecting the slow pressure drop in case of a small-break LOCA. One such technique exploits the acoustic signal caused by small breaks. In boiling water reactors whose primary heat transport is driven by natural circulation, small-break LOCA detection is important. For prompt action following a small-break LOCA, a steam leak detection system was developed to detect any leak inside the reactor vault. The detection technique is reliable and plays a very important role in ensuring the safety of the reactor. The methodology developed for steam leak detection is discussed in the present paper, along with a method to locate the leak, based on analysis of the signal.
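    A detector based on the leak noise frequency spectrum can be sketched as a band-energy test. The band limits and threshold below are hypothetical placeholders (the abstract does not specify them), so this is an illustration of the idea rather than the paper's system:

```python
import numpy as np

def band_energy_ratio(signal, fs, band=(2000.0, 8000.0)):
    """Fraction of spectral power inside an assumed leak-noise band (Hz)."""
    power = np.abs(np.fft.rfft(signal)) ** 2          # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)  # bin frequencies
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return float(power[in_band].sum() / power.sum())

def leak_detected(signal, fs, threshold=0.5):
    """Flag a leak when most acoustic energy falls in the assumed leak band."""
    return band_energy_ratio(signal, fs) > threshold
```

    In practice, the spectrum of calibration leaks recorded during testing would inform both the band and the threshold.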

  11. Usefulness of the Monte Carlo method in reliability calculations

    International Nuclear Information System (INIS)

    Lanore, J.M.; Kalli, H.

    1977-01-01

    Three examples of reliability Monte Carlo programs developed in the LEP (Laboratory for Radiation Shielding Studies at the Nuclear Research Center at Saclay) are presented. First, an uncertainty analysis is given for a simplified spray system; a Monte Carlo program, PATREC-MC, has been written to solve the problem with the system components given in fault tree representation. The second program, MONARC 2, has been written to solve the problem of complex-system reliability by Monte Carlo simulation; here again the system (a residual heat removal system) is given in fault tree representation. Third, the Monte Carlo program MONARC was used instead of the Markov diagram to solve the simulation problem of an electric power supply including two nets and two stand-by diesels.
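    The Monte Carlo treatment of a fault tree can be illustrated with a toy model. This is not PATREC-MC or MONARC; the cut-set structure and failure probabilities below are invented for the sketch:

```python
import random

def top_event_probability(n_trials=100_000, seed=1):
    """Monte Carlo estimate of the top-event probability for a toy fault tree:
    TOP = pump AND (valve_a OR valve_b), with assumed failure probabilities."""
    p_pump, p_valve = 0.05, 0.10   # assumed per-demand failure probabilities
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        pump_fails = rng.random() < p_pump
        valve_a_fails = rng.random() < p_valve
        valve_b_fails = rng.random() < p_valve
        if pump_fails and (valve_a_fails or valve_b_fails):
            hits += 1
    return hits / n_trials
```

    For these toy numbers the analytic value is 0.05 x (1 - 0.9^2) = 0.0095, so the Monte Carlo estimate should land close to that; the same sampling scheme scales to fault trees too large for analytic evaluation.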

  12. Reliability improvement methods for sapphire fiber temperature sensors

    Science.gov (United States)

    Schietinger, C.; Adams, B.

    1991-08-01

    Mechanical, optical, electrical, and software design improvements can be brought to bear to enhance the reliability of fiber-optic sapphire-fiber temperature measurement tools in harsh environments. The optical fiber thermometry (OFT) equipment discussed is used in numerous process industries and generally involves a sapphire sensor, an optical transmission cable, and a microprocessor-based signal analyzer. OFT technology incorporating sensors for corrosive environments, hybrid sensors, and two-wavelength measurements is discussed.

  13. Reliability of recordings of subgingival calculus detected using an ultrasonic device.

    Science.gov (United States)

    Corraini, Priscila; López, Rodrigo

    2015-04-01

    To assess the intra-examiner reliability of recordings of subgingival calculus detected using an ultrasonic device, and to investigate the influence of subject-, tooth- and site-level factors on the reliability of these subgingival calculus recordings. On two occasions, within a 1-week interval, 147 adult periodontitis patients received a full-mouth clinical periodontal examination by a single trained examiner. Duplicate subgingival calculus recordings, at six sites per tooth, were obtained using an ultrasonic device for calculus detection and removal. Agreement was observed in 65% of the 22,584 duplicate subgingival calculus recordings, ranging from 45% to 83% across subjects. Using hierarchical modeling, disagreements in the duplicate subgingival calculus recordings were more likely at all sites other than the mid-buccal, and at sites harboring supragingival calculus. Disagreements were less likely at sites with probing depth (PD) ≥ 4 mm and with furcation involvement ≥ degree 2. Bleeding on probing or suppuration did not influence the reliability of subgingival calculus recordings. At the subject level, disagreements were less likely in patients presenting with the highest and lowest extent categories of the covariate subgingival calculus. The reliability of subgingival calculus recordings using the ultrasound technology is reasonable. The results of the present study suggest that the reliability of subgingival calculus recordings is not influenced by the presence of inflammation. Moreover, subgingival calculus can be more reliably detected using the ultrasound device at sites with a higher need for periodontal therapy, i.e., sites presenting with deep pockets, and premolars and molars with furcation involvement.

  14. Test-retest reliability of myofascial trigger point detection in hip and thigh areas.

    Science.gov (United States)

    Rozenfeld, E; Finestone, A S; Moran, U; Damri, E; Kalichman, L

    2017-10-01

    Myofascial trigger points (MTrPs) are a primary source of pain in patients with musculoskeletal disorders. Nevertheless, they are frequently underdiagnosed. Reliable MTrP palpation is necessary for their diagnosis and treatment. The few studies that have examined intra-tester reliability of MTrP detection in the upper body provide preliminary evidence that MTrP palpation is reliable. Reliability tests for MTrP palpation on the lower limb have not yet been performed. To evaluate inter- and intra-tester reliability of MTrP recognition in hip and thigh muscles. Reliability study. 21 patients (15 males and 6 females, mean age 21.1 years) referred to the physical therapy clinic, 10 with knee or hip pain and 11 with pain in an upper limb, low back, shin or ankle. Two experienced physical therapists performed the examinations, blinded to the subjects' identity, medical condition and results of the previous MTrP evaluation. Each subject was evaluated four times, twice by each examiner in a random order. Dichotomous findings included a palpable taut band, tenderness, referred pain, and relevance of referred pain to the patient's complaint. Based on these, a diagnosis of latent or active MTrPs was established. The evaluation was performed on both legs and included a total of 16 locations in the following muscles: rectus femoris (proximal), vastus medialis (middle and distal), vastus lateralis (middle and distal) and gluteus medius (anterior, posterior and distal). Inter- and intra-tester reliability (Cohen's kappa (κ)) values for single sites ranged from -0.25 to 0.77. Median intra-tester reliability was 0.45 and 0.46 for latent and active MTrPs, and median inter-tester reliability was 0.51 and 0.64 for latent and active MTrPs, respectively. The examination of the distal vastus medialis was most reliable for latent and active MTrPs (intra-tester k = 0.27-0.77, inter-tester k = 0.77 and intra-tester k = 0.53-0.72, inter-tester k = 0.72, correspondingly
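    The Cohen's kappa statistic behind these agreement values can be computed directly from paired dichotomous codes; a minimal implementation (illustrative, not the study's software):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codes on the same items."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: proportion of items where raters match
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal category proportions
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1.0 - p_exp)
```

    Kappa is 1 for perfect agreement, 0 for chance-level agreement, and negative (as in some sites above) when raters agree less often than chance.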

  15. Improved GLR method to instrument failure detection

    International Nuclear Information System (INIS)

    Jeong, Hak Yeoung; Chang, Soon Heung

    1985-01-01

    The generalized likelihood ratio (GLR) method performs statistical tests on the innovations sequence of a Kalman-Bucy filter state estimator for system failure detection and identification. However, the major drawback of the conventional GLR is that it must hypothesize a particular failure type in each case. In this paper, a method to overcome this drawback is proposed. The improved GLR method is applied to a PWR pressurizer and gives successful results in the detection and identification of any failure. Furthermore, some benefit in the processing time per cycle of failure detection and identification is obtained. (Author)
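    The GLR idea of scanning the filter innovations for the most likely failure onset can be sketched for the simplest scalar case: a step bias of unknown onset in a white innovations sequence. This simplification is ours, not the paper's formulation:

```python
import numpy as np

def glr_jump_test(innovations, sigma2):
    """Return the maximum GLR statistic and the most likely onset index for a
    step change in the mean of a white, zero-mean innovations sequence."""
    v = np.asarray(innovations, dtype=float)
    best_stat, best_onset = 0.0, None
    for k in range(len(v)):
        tail = v[k:]
        # Likelihood-ratio statistic for a mean jump starting at index k
        stat = tail.sum() ** 2 / (sigma2 * len(tail))
        if stat > best_stat:
            best_stat, best_onset = float(stat), k
    return best_stat, best_onset
```

    Comparing the maximized statistic against a chi-squared threshold gives the detection decision, and the maximizing index estimates when the failure began.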

  16. Developing a reliable method for signal wire attachment : [research results].

    Science.gov (United States)

    2013-03-01

    Railroad signaling systems detect trains on the track, identify track fractures, prevent derailments, and alert signal crossing stations when trains approach. These systems are vital to safe train operation; therefore, each component of this system h...

  17. GMDD: a database of GMO detection methods.

    Science.gov (United States)

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-06-04

    Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection Method Database (GMDD) collects almost all previously developed and reported GMO detection methods, grouped by different strategies (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included in this database. Furthermore, registered users can submit new detection methods and sequences to this database, and the newly submitted information will be released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.

  18. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in essence, it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was applied to the problem of thermal safety analysis of a reactor system. This analysis makes it possible to examine the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  19. A Reliable Method for the Evaluation of the Anaphylactoid Reaction Caused by Injectable Drugs

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2016-10-01

    Full Text Available Adverse reactions of injectable drugs usually occur at first administration and are closely associated with the dosage and speed of injection. This phenomenon is correlated with the anaphylactoid reaction. However, up to now, study methods based on antigen detection have still not gained wide acceptance, and single physiological indicators cannot be utilized to differentiate anaphylactoid reactions from allergic reactions and inflammatory reactions. In this study, a reliable method for the evaluation of anaphylactoid reactions caused by injectable drugs was established by using multiple physiological indicators. We used compound 48/80, ovalbumin and endotoxin as the sensitization agents to induce anaphylactoid, allergic and inflammatory reactions. Different experimental animals (guinea pig and nude rat), different modes of administration (intramuscular, intravenous and intraperitoneal injection) and different times (15 min, 30 min and 60 min) were evaluated to optimize the study protocol. The results showed that the optimal way to achieve sensitization involved treating guinea pigs with the different agents by intravenous injection for 30 min. Further, seven related humoral factors, including 5-HT, SC5b-9, Bb, C4d, IL-6, C3a and histamine, were detected by HPLC analysis and ELISA assay to determine their expression levels. The results showed that five of them, including 5-HT, SC5b-9, Bb, C4d and IL-6, displayed significant differences between anaphylactoid, allergic and inflammatory reactions, which indicated that their combination could be used to distinguish these three reactions. Then different injectable drugs were used to verify this method, and the results showed that the chosen indicators exhibited good correlation with the anaphylactoid reaction, which indicated that the established method was both practical and reliable.
Our research provides a feasible method for the diagnosis of the serious adverse reactions caused by injectable drugs which

  20. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    Directory of Open Access Journals (Sweden)

    Xuyong Chen

    2017-01-01

    Full Text Available Due to many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy, oscillating iteration process and makes the nonprobabilistic reliability index difficult to solve. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic reliability index and to narrow its range. If the range of the reliability index narrows to an acceptable accuracy, the solution is considered convergent, and the nonprobabilistic reliability index is obtained. The case study indicates that, compared with the traditional nonprobabilistic response surface method, the proposed method avoids an oscillating iteration process, makes the iteration stable and convergent, significantly reduces the number of iteration steps, and significantly improves computational efficiency and precision. Finally, a nonprobabilistic reliability evaluation process for bridges is established by evaluating the reliability of a three-span PC continuous rigid frame bridge using the proposed method, which appears simpler and more reliable when samples and parameters are lacking in the bridge nonprobabilistic reliability evaluation.

  1. GC ‘Multi-Analyte’ Detection Method

    Energy Technology Data Exchange (ETDEWEB)

    Dudar, E. [Plant Protection & Soil Conservation Service of Budapest, Budapest (Hungary)

    2009-07-15

    Elaborated methodologies for GC multi-analyte detection are presented, comprising the steps of method development, chromatographic conditions and procedures including the determination of relative retention times and summary results tables. (author)

  2. OCT4 and SOX2 are reliable markers in detecting stem cells in odontogenic lesions

    Directory of Open Access Journals (Sweden)

    Abhishek Banerjee

    2016-01-01

    Full Text Available Context (Background): Stem cells are a unique subpopulation of cells in the human body with a capacity to initiate differentiation into various cell lines. Tumor stem cells (TSCs) are a unique subpopulation of cells that possess the ability to initiate a neoplasm and sustain self-renewal. Epithelial stem cell (ESC) markers such as octamer-binding transcription factor 4 (OCT4) and sex-determining region Y (SRY)-box 2 (SOX2) are capable of identifying these stem cells, which are expressed during the early stages of tooth development. Aims: To detect the expression of the stem cell markers OCT4 and SOX2 in normal odontogenic tissues and in odontogenic cysts and tumors. Materials and Methods: Paraffin sections of follicular tissue, radicular cyst, dentigerous cyst, odontogenic keratocyst, ameloblastoma, adenomatoid odontogenic tumor, and ameloblastic carcinoma were obtained from the archives. The sections were subjected to immunohistochemical assay using mouse monoclonal antibodies to OCT4 and SOX2. Statistical Analysis: The results were evaluated by descriptive analysis. Results: The results show the presence of stem cells in the normal and lesional tissues with these stem-cell-identifying markers. SOX2 was found to be more consistent and reliable in the detection of stem cells. Conclusion: Stem cell expression is maintained through tumor transformation of the tissue, probably suggesting that there is no phenotypic change of stem cells in the progression from the normal embryonic state to the tumor component. The quantification and localization reveal interesting trends that indicate a probable role of these cells in the pathogenesis of the lesions.

  3. Emergency First Responders' Experience with Colorimetric Detection Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandra L. Fox; Keith A. Daum; Carla J. Miller; Marnie M. Cortez

    2007-10-01

    Nationwide, first responders from state and federal support teams respond to hazardous materials incidents, industrial chemical spills, and potential weapons of mass destruction (WMD) attacks. Although first responders have sophisticated chemical, biological, radiological, and explosive detectors available for assessment of the incident scene, simple colorimetric detectors have a role in response actions. The large number of colorimetric chemical detection methods available on the market can make the selection of the proper methods difficult. Although each detector has unique aspects to provide qualitative or quantitative data about the unknown chemicals present, not all detectors provide consistent, accurate, and reliable results. Included here, in a consumer-report-style format, we provide “boots on the ground” information directly from first responders about how well colorimetric chemical detection methods meet their needs in the field and how they procure these methods.

  4. Assessment and Improving Methods of Reliability Indices in Bakhtar Regional Electricity Company

    Directory of Open Access Journals (Sweden)

    Saeed Shahrezaei

    2013-04-01

    Full Text Available Reliability of a system is its ability to perform expected duties in the future, i.e., the probability of desirable operation in carrying out predetermined duties. Failure data for power system elements are the main input for reliability assessment of the network. The goal of reliability assessment is to determine characteristic parameters from system history data. These parameters help to recognize weak points of the system. In other words, the goal of reliability assessment is to improve operation and to decrease failures and power outages. This paper assesses the reliability indices of Bakhtar Regional Electricity Company up to 1393, along with improvement methods and their effects on the reliability indices of this network. DIgSILENT PowerFactory software is employed for simulation. Simulation results show the positive effect of the improvement methods on the reliability indices of Bakhtar Regional Electricity Company.
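    Distribution-network reliability indices of the kind assessed here are computed from outage records. A minimal sketch of two standard indices, SAIFI and SAIDI, follows; the numbers in the example are invented, and the paper's actual indices and data are not reproduced:

```python
def saifi(outages, total_customers):
    """System Average Interruption Frequency Index:
    total customer interruptions per customer served."""
    return sum(customers for customers, _ in outages) / total_customers

def saidi(outages, total_customers):
    """System Average Interruption Duration Index:
    total customer interruption hours per customer served."""
    return sum(customers * hours for customers, hours in outages) / total_customers
```

    Here outages is a list of (customers_affected, duration_hours) pairs; e.g. two outages affecting 1000 and 500 of 10,000 customers, lasting 2 h and 1 h, give SAIFI 0.15 interruptions per customer and SAIDI 0.25 h per customer. Improvement measures are judged by how much they lower these indices.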

  5. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    Science.gov (United States)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

    Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and in the conduct of clinical studies. Several reliability studies have been conducted in Western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction to reliability estimates and different study designs and statistical analyses is given for future studies in Ayurveda. PMID:23930037

  6. Larvas output and influence of human factor in reliability of meat inspection by the method of artificial digestion

    OpenAIRE

    Đorđević Vesna; Savić Marko; Vasilev Saša; Đorđević Milovan

    2013-01-01

    On the basis of analyses of the factors that allowed infected meat to reach the food chain, we found that the infection occurred after consumption of meat inspected by the method of artificial digestion of collective samples using a magnetic stirrer (MM). This work presents assay results which show how modifications of the method, at the level of final sedimentation, influence the reliability of Trichinella larvae detect...

  7. Method for assessing reliability of a network considering probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, M.

    2005-01-01

    A method for assessment of reliability of the network is developed, which uses the features of the fault tree analysis. The method is developed in a way that the increase of the network under consideration does not require significant increase of the model. The method is applied to small examples of network consisting of a small number of nodes and a small number of their connections. The results give the network reliability. They identify equipment, which is to be carefully maintained in order that the network reliability is not reduced, and equipment, which is a candidate for redundancy, as this would improve network reliability significantly. (author)

  8. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model-based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data is extracted from...... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was......, on the other hand, lighter than the single-step method....

  9. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment of the unit-system across two adjacent levels is the most important content in the multi-level reliability synthesis of complex systems. Introducing information theory into system reliability assessment, and using the additive property of information quantity and the principle of equivalence of information quantity, an entropy method of data information conversion is presented for a system consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived based on the principle of information-quantity equivalence. General models of entropy-method synthesis assessment for approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. Applications of the entropy method are discussed by way of practical examples. Compared with traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.

  10. Indian program for development of technologies relevant to reliable, non-intrusive, concealed-contraband detection

    International Nuclear Information System (INIS)

    Auluck, S.K.H.

    2007-01-01

    Generating capability for reliable, non-intrusive detection of concealed-contraband, particularly, organic contraband like explosives and narcotics, has become a national priority. This capability spans a spectrum of technologies. If a technology mission addressing the needs of a highly sophisticated technology like PFNA is set up, the capabilities acquired would be adequate to meet the requirements of many other sets of technologies. This forms the background of the Indian program for development of technologies relevant to reliable, non-intrusive, concealed contraband detection. One of the central themes of the technology development programs would be modularization of the neutron source and detector technologies, so that common elements can be combined in different ways for meeting a variety of application requirements. (author)

  11. The Reliability, Impact, and Cost-Effectiveness of Value-Added Teacher Assessment Methods

    Science.gov (United States)

    Yeh, Stuart S.

    2012-01-01

    This article reviews evidence regarding the intertemporal reliability of teacher rankings based on value-added methods. Value-added methods exhibit low reliability, yet are broadly supported by prominent educational researchers and are increasingly being used to evaluate and fire teachers. The article then presents a cost-effectiveness analysis…

  12. Model correction factor method for reliability problems involving integrals of non-Gaussian random fields

    DEFF Research Database (Denmark)

    Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der

    2002-01-01

    The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals ...

  13. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distributions of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow problem whose objective is the minimum cost of conventional power generation is solved. Thus a reliability assessment for the distribution grid is carried out quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
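
    The Monte Carlo baseline the paper compares against samples the assumed distributions directly. For contrast with the paper's discretized, linearized formulation, a minimal Monte Carlo LOLP sketch (all system parameters below are invented for illustration, not the paper's data):

```python
import math
import random

def simulate_lolp(n_hours=100_000, seed=1):
    """Crude Monte Carlo estimate of Loss Of Load Probability (LOLP) for a
    feeder with one wind farm plus conventional generation."""
    rng = random.Random(seed)
    k, c = 2.0, 8.0                                  # Weibull shape/scale for wind speed (m/s)
    rated, v_in, v_r, v_out = 2.0, 3.0, 12.0, 25.0   # turbine: rated MW, cut-in/rated/cut-out speeds
    conventional, load = 5.0, 6.0                    # MW of dispatchable capacity and constant load
    shortfalls = 0
    for _ in range(n_hours):
        # Inverse-CDF sample of a Weibull-distributed wind speed
        v = c * (-math.log(1.0 - rng.random())) ** (1.0 / k)
        if v < v_in or v >= v_out:
            p_wind = 0.0
        elif v < v_r:
            p_wind = rated * (v - v_in) / (v_r - v_in)  # linear ramp up to rated power
        else:
            p_wind = rated
        if conventional + p_wind < load:
            shortfalls += 1
    return shortfalls / n_hours

lolp = simulate_lolp()
```

    EENS would be estimated the same way by accumulating the MW shortfall per sampled hour instead of counting shortfall hours.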

  14. A comparative study on the HW reliability assessment methods for digital I and C equipment

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hoan Sung; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Lee, G. Y. [Korea Atomic Energy Research Institute, Taejeon (Korea); Kim, M. C. [Korea Advanced Institute of Science and Technology, Taejeon (Korea); Jun, S. T. [KHNP, Taejeon (Korea)

    2002-03-01

    It is necessary to predict or evaluate the reliability of electronic equipment for the probabilistic safety analysis of digital instrumentation and control (I and C) equipment. However, most databases for reliability prediction have no data for up-to-date equipment, and the failure modes are not classified. The prediction results for a specific component show different values depending on the method and database used, and each method likewise gives different values for boards and systems. This study concerns reliability prediction of the PDC system for Wolsong NPP1 as a piece of digital I and C equipment. Various reliability prediction methods and failure databases are used in calculating the reliability, in order to compare the sensitivity and accuracy of each model and database. Many considerations for the reliability assessment of digital systems are derived from the results of this study. 14 refs., 19 figs., 15 tabs. (Author)

  15. Method of reliability allocation based on fault tree analysis and fuzzy math in nuclear power plants

    International Nuclear Information System (INIS)

    Chen Zhaobing; Deng Jian; Cao Xuewu

    2005-01-01

    Reliability allocation is a difficult multi-objective optimization problem. It can be applied not only to determine the reliability characteristics of reactor systems, subsystems and main components, but also to improve the design, operation and maintenance of nuclear plants. Fuzzy mathematics, one of the powerful tools for fuzzy optimization, and fault tree analysis, one of the effective methods of reliability analysis, are applied in this paper to the reliability allocation model to deal with, respectively, the fuzzy character of some factors and the choice of subsystems. Thus we develop a failure rate allocation model on the basis of fault tree analysis and fuzzy mathematics. For the reliability constraint factors, we choose the six most important ones according to the practical needs of the allocation. The subsystems are selected by top-level fault tree analysis so as to avoid allocating reliability to all equipment and components, including unnecessary parts. During the allocation process, some factors can be calculated or measured quantitatively, while others can only be assessed qualitatively by the expert rating method. We therefore adopt fuzzy decision and dualistic contrast to realize the reliability allocation with the help of fault tree analysis. Finally, the example of the emergency diesel generator's reliability allocation is used to illustrate the model and show that it is simple and applicable. (authors)

  16. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of a testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect, such that new faults may be introduced. In this paper, we first show how to incorporate the testing effort function and fault introduction into the FDP and then develop the FCP as a delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions about fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed.
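
    The paired-process idea (the FCP as a delayed FDP) can be sketched with the simplest possible choices: a linear testing-effort function, an exponential detection model, and a constant correction delay. This is a hedged illustration of the modelling pattern, not the paper's specific models, and all parameter values are invented:

```python
import math

def detected(t, a, b, w):
    """Mean faults detected by time t for an NHPP model with cumulative
    testing effort W(t) = w * t (a: total faults, b: detectability)."""
    return a * (1.0 - math.exp(-b * w * t)) if t > 0 else 0.0

def corrected(t, a, b, w, delay):
    """Mean faults corrected by time t, modelled as the detection process
    delayed by a constant per-fault correction time."""
    return detected(t - delay, a, b, w)

# Illustrative parameters only (not fitted to any data set)
a, b, w, delay = 100.0, 0.05, 2.0, 1.5
faults_found = detected(20.0, a, b, w)
faults_fixed = corrected(20.0, a, b, w, delay)
```

    A release policy would then trade the cost of continued testing against the expected residual faults a - corrected(t, ...).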

  17. Reliable Grid Condition Detection and Control of Single-Phase Distributed Power Generation Systems

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai

    standards addressed to the grid-connected systems will harmonize the combination of the DPGS and the classical power plants. Consequently, the major tasks of this thesis were to develop new grid condition detection techniques and intelligent control in order to allow the DPGS not only to deliver power...... to the utility grid but also to sustain it. This thesis was divided into two main parts, namely "Grid Condition Detection" and "Control of Single-Phase DPGS". In the first part, the main focus was on reliable Phase Locked Loop (PLL) techniques for monitoring the grid voltage and on grid impedance estimation...... techniques. Additionally, a new technique for detecting the islanding mode has been developed and successfully tested. In the second part, the main reported research was concentrated around adaptive current controllers based on the information provided by the grid condition detection techniques. To guarantee...

  18. Reliability of the MicroScan WalkAway PC21 panel in identifying and detecting oxacillin resistance in clinical coagulase-negative staphylococci strains.

    Science.gov (United States)

    Olendzki, A N; Barros, E M; Laport, M S; Dos Santos, K R N; Giambiagi-Demarval, M

    2014-01-01

    The purpose of this study was to determine the reliability of the MicroScan WalkAway PosCombo21 (PC21) system for the identification of coagulase-negative staphylococci (CNS) strains and the detection of oxacillin resistance. Using molecular and phenotypic methods, 196 clinical strains were evaluated. The automated system demonstrated 100 % reliability for the identification of the clinical strains Staphylococcus haemolyticus, Staphylococcus hominis and Staphylococcus cohnii; 98.03 % reliability for the identification of Staphylococcus epidermidis; 70 % reliability for the identification of Staphylococcus lugdunensis; 40 % reliability for the identification of Staphylococcus warneri; and 28.57 % reliability for the identification of Staphylococcus capitis, but no reliability for the identification of Staphylococcus auricularis, Staphylococcus simulans and Staphylococcus xylosus. We concluded that the automated system provides accurate results for the more common CNS species but often fails to accurately identify less prevalent species. For the detection of oxacillin resistance, the automated system showed 100 % specificity and 90.22 % sensitivity. Thus, the PC21 panel detects oxacillin-resistant strains, but is limited by the heteroresistance that is observed when using most phenotypic methods.

  19. Calibration Methods for Reliability-Based Design Codes

    DEFF Research Database (Denmark)

    Gayton, N.; Mohamed, A.; Sørensen, John Dalsgaard

    2004-01-01

    The calibration methods are applied to define the optimal code format according to some target safety levels. The calibration procedure can be seen as a specific optimization process where the control variables are the partial factors of the code. Different methods are available in the literature...

  20. Rapid and Reliable HPLC Method for the Determination of Vitamin ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an accurate, sensitive and reproducible high performance liquid chromatographic (HPLC) method for the quantitation of vitamin C in pharmaceutical samples. Method: The drug and the standard were eluted from Superspher RP-18 (250 mm x 4.6 mm, 10 µm particle size) at 20 °C.

  1. Evaluation and reliability of bone histological age estimation methods

    African Journals Online (AJOL)

    Human age estimation at death plays a vital role in forensic anthropology and bioarchaeology. Researchers used morphological and histological methods to estimate human age from their skeletal remains. This paper discussed different histological methods that used human long bones and ribs to determine age ...

  2. Method matters: Understanding diagnostic reliability in DSM-IV and DSM-5.

    Science.gov (United States)

    Chmielewski, Michael; Clark, Lee Anna; Bagby, R Michael; Watson, David

    2015-08-01

    Diagnostic reliability is essential for the science and practice of psychology, in part because reliability is necessary for validity. Recently, the DSM-5 field trials documented lower diagnostic reliability than past field trials and the general research literature, resulting in substantial criticism of the DSM-5 diagnostic criteria. Rather than indicating specific problems with DSM-5, however, the field trials may have revealed long-standing diagnostic issues that have been hidden due to a reliance on audio/video recordings for estimating reliability. We estimated the reliability of DSM-IV diagnoses using both the standard audio-recording method and the test-retest method used in the DSM-5 field trials, in which different clinicians conduct separate interviews. Psychiatric patients (N = 339) were diagnosed using the SCID-I/P; 218 were diagnosed a second time by an independent interviewer. Diagnostic reliability using the audio-recording method (N = 49) was "good" to "excellent" (M κ = .80) and comparable to the DSM-IV field trials estimates. Reliability using the test-retest method (N = 218) was "poor" to "fair" (M κ = .47) and similar to DSM-5 field-trials' estimates. Despite low test-retest diagnostic reliability, self-reported symptoms were highly stable. Moreover, there was no association between change in self-report and change in diagnostic status. These results demonstrate the influence of method on estimates of diagnostic reliability. (c) 2015 APA, all rights reserved.
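
    The kappa values quoted above are Cohen's kappa, which corrects observed agreement between two raters for agreement expected by chance. A minimal sketch (the diagnostic labels and counts below are invented for illustration, not the study's data):

```python
def cohens_kappa(pairs):
    """Cohen's kappa for two raters' categorical diagnoses.
    pairs: list of (rater1_label, rater2_label) tuples."""
    n = len(pairs)
    labels = {a for a, _ in pairs} | {b for _, b in pairs}
    p_obs = sum(1 for a, b in pairs if a == b) / n          # observed agreement
    p_exp = sum(                                            # chance agreement
        (sum(1 for a, _ in pairs if a == lab) / n)
        * (sum(1 for _, b in pairs if b == lab) / n)
        for lab in labels
    )
    return (p_obs - p_exp) / (1.0 - p_exp)

# Illustrative test-retest data: 100 patients, two diagnostic labels
pairs = ([("MDD", "MDD")] * 40 + [("GAD", "GAD")] * 40
         + [("MDD", "GAD")] * 10 + [("GAD", "MDD")] * 10)
kappa = cohens_kappa(pairs)  # 0.80 observed agreement, 0.50 expected by chance
```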

  3. Methods for reliability evaluation of trust and reputation systems

    Science.gov (United States)

    Janiszewski, Marek B.

    2016-09-01

    Trust and reputation systems are a systematic approach to building security on the basis of observations of node behaviour. The exchange of nodes' opinions about other nodes is very useful for identifying nodes which act selfishly or maliciously. The idea behind trust and reputation systems gains significance from the fact that conventional security measures (based on cryptography) are often not sufficient. Trust and reputation systems can be used in various types of networks, such as WSN, MANET and P2P, and also in e-commerce applications. Trust and reputation systems bring not only benefits but can also be a threat themselves. Many attacks aimed at trust and reputation systems exist, but such attacks still have not gained enough attention from research teams. Moreover, the joint effects of many known attacks have been identified as a very interesting field of research. The lack of an acknowledged methodology for evaluating trust and reputation systems is a serious problem. This paper aims at presenting various approaches to the evaluation of such systems. This work also contains a description of a generalization of many trust and reputation systems which can be used to evaluate the reliability of such systems in the context of preventing various attacks.

  4. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    A control method based on real-time operational reliability evaluation for a space manipulator is presented for improving the success rate of the manipulator during the execution of a task. In this paper, a method for quantitative analysis of operational reliability is given for a manipulator executing a specified task; then a control model which can regulate the quantitative operational reliability is built. First, the control process is described using a state-space equation. Second, process parameters are estimated in real time using a Bayesian method. Third, the expression of the system's real-time operational reliability is deduced based on the state-space equation and the process parameters estimated with the Bayesian method. Finally, a control variable regulation strategy which considers the cost of control is given based on the theory of statistical process control. Simulations show that this method effectively improves the operational reliability of the space manipulator control system.

  5. Reliability research to nuclear power plant operators based on several methods

    International Nuclear Information System (INIS)

    Fang Xiang; Li Fu; Zhao Bingquan

    2009-01-01

    This paper draws on many international reliability research methods and summarizes over ten years of reliability research on Chinese nuclear power plant operators based on nuclear power plant simulator platforms. It shows the necessity and feasibility of research on nuclear power plant operators from many angles, including human cognitive reliability, fuzzy mathematical models and psychological research models. Applying these various methods to operator reliability research will benefit the safe operation of nuclear power plants. (authors)

  6. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance and is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on dependency, comparing the Institute Jozef Stefan human reliability analysis (IJS-HRA) and the standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by developing more detailed guidelines for human reliability analysis, with many practical examples for all steps of the process of evaluating human performance.

  7. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance and is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on dependency, comparing the Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and the Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by developing more detailed guidelines for human reliability analysis, with many practical examples for all steps of the process of evaluating human performance. (author)

  8. A novel visual saliency detection method for infrared video sequences

    Science.gov (United States)

    Wang, Xin; Zhang, Yuzhen; Ning, Chen

    2017-12-01

    Infrared video applications such as target detection and recognition, moving target tracking, and so forth can benefit greatly from visual saliency detection, which is essentially a method to automatically localize the "important" content in videos. In this paper, a novel visual saliency detection method for infrared video sequences is proposed. Specifically, for infrared video saliency detection, both spatial saliency and temporal saliency are considered. For spatial saliency, we adopt a mutual consistency-guided spatial cues combination-based method to capture regions with obvious luminance contrast and contour features. For temporal saliency, a multi-frame symmetric difference approach is proposed to discriminate salient moving regions of interest from background motion. Then, the spatial and temporal saliency are combined to compute the spatiotemporal saliency using an adaptive fusion strategy. Besides, to highlight the spatiotemporal salient regions uniformly, a multi-scale fusion approach is embedded into the spatiotemporal saliency model. Finally, a Gestalt theory-inspired optimization algorithm is designed to further improve the reliability of the final saliency map. Experimental results demonstrate that our method outperforms many state-of-the-art saliency detection approaches for infrared videos under various backgrounds.
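
    The abstract does not give the exact formulation of its multi-frame symmetric difference, but a common form takes, per pixel, the smaller of the differences against the previous and next frames, which suppresses one-sided changes such as uncovered background. A sketch on toy 1-D "frames":

```python
def symmetric_difference_saliency(prev_frame, cur_frame, next_frame):
    """Temporal saliency by symmetric frame differencing: a pixel scores
    high only if it differs from BOTH the previous and the next frame."""
    return [
        min(abs(c - p), abs(c - n))
        for p, c, n in zip(prev_frame, cur_frame, next_frame)
    ]

# A bright object (value 9) moving one pixel per frame over a dark background
prev_f = [0, 9, 0, 0, 0]
cur_f  = [0, 0, 9, 0, 0]
next_f = [0, 0, 0, 9, 0]
sal = symmetric_difference_saliency(prev_f, cur_f, next_f)  # peaks at the current position
```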

  9. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    Science.gov (United States)

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information available during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method is composed of the AMSAA model and test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conform to expectations. Conclusion. It is suggested that this method should be further researched and popularized.
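
    The AMSAA model at the core of the method is the Crow power-law NHPP, whose maximum-likelihood fit for time-terminated test data has a closed form. A minimal sketch with invented failure times (the paper's test-data conversion step is not reproduced here):

```python
import math

def amsaa_fit(failure_times, T):
    """Maximum-likelihood fit of the Crow/AMSAA reliability-growth model
    (failure intensity lam * beta * t**(beta - 1)) to time-terminated test
    data observed on [0, T]. beta < 1 indicates reliability growth."""
    n = len(failure_times)
    beta = n / sum(math.log(T / t) for t in failure_times)
    lam = n / T ** beta
    return beta, lam

# Illustrative failure times (hours); the widening gaps suggest growth
times = [2.5, 8.0, 17.0, 35.0, 70.0, 120.0, 200.0]
beta, lam = amsaa_fit(times, T=250.0)
mtbf_now = 1.0 / (lam * beta * 250.0 ** (beta - 1.0))  # current instantaneous MTBF
```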

  10. Is sequential cranial ultrasound reliable for detection of white matter injury in very preterm infants?

    International Nuclear Information System (INIS)

    Leijser, Lara M.; Steggerda, Sylke J.; Walther, Frans J.; Wezel-Meijler, Gerda van; Bruine, Francisca T. de; Grond, Jeroen van der

    2010-01-01

    Cranial ultrasound (cUS) may not be reliable for detection of diffuse white matter (WM) injury. Our aim was to assess in very preterm infants the reliability of a classification system for WM injury on sequential cUS throughout the neonatal period, using magnetic resonance imaging (MRI) as reference standard. In 110 very preterm infants (gestational age <32 weeks), serial cUS during admission (median 8, range 4-22) and again around term equivalent age (TEA) and a single MRI around TEA were performed. cUS during admission were assessed for presence of WM changes, and contemporaneous cUS and MRI around TEA additionally for abnormality of lateral ventricles. Sequential cUS (from birth up to TEA) and MRI were classified as normal/mildly abnormal, moderately abnormal, or severely abnormal, based on a combination of findings of the WM and lateral ventricles. Predictive values of the cUS classification were calculated. Sequential cUS were classified as normal/mildly abnormal, moderately abnormal, and severely abnormal in, respectively, 22%, 65%, and 13% of infants and MRI in, respectively, 30%, 52%, and 18%. The positive predictive value of the cUS classification for the MRI classification was high for severely abnormal WM (0.79) but lower for normal/mildly abnormal (0.67) and moderately abnormal (0.64) WM. Sequential cUS during the neonatal period detects severely abnormal WM in very preterm infants but is less reliable for mildly and moderately abnormal WM. MRI around TEA seems needed to reliably detect WM injury in very preterm infants. (orig.)
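
    The predictive values reported above come from standard 2x2 contingency arithmetic; a minimal sketch (the counts below are invented for illustration, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 screening metrics. The positive predictive value (PPV)
    quoted in the abstract is tp / (tp + fp): of the infants classified
    abnormal on cUS, the fraction also classified abnormal on MRI."""
    return {
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Illustrative counts for a cUS classification vs the MRI reference standard
m = diagnostic_metrics(tp=40, fp=10, fn=5, tn=45)
```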

  11. Reliable method for fission source convergence of Monte Carlo criticality calculation with Wielandt's method

    International Nuclear Information System (INIS)

    Yamamoto, Toshihiro; Miyoshi, Yoshinori

    2004-01-01

    A new algorithm for Monte Carlo criticality calculations implementing Wielandt's method, one of the acceleration techniques for deterministic source iteration methods, is developed, and the algorithm can be successfully implemented into the MCNP code. In this algorithm, some of the fission neutrons emitted during random walk processes are tracked within the current cycle, and thus the fission source distribution used in the next cycle spreads more widely. Applying this method intensifies the neutron interaction effect even in a loosely-coupled array, where conventional Monte Carlo criticality methods have difficulties, and a converged fission source distribution can be obtained with fewer cycles. Computing time spent per cycle, however, increases because of the tracking of fission neutrons within the current cycle, which eventually results in an increase of total computing time up to convergence. In addition, statistical fluctuations of the fission source distribution in a cycle are worsened by applying Wielandt's method to Monte Carlo criticality calculations. However, since fission source convergence is attained with fewer source iterations, a reliable determination of convergence can easily be made even in a system with slow convergence. This acceleration method is expected to contribute to the prevention of incorrect Monte Carlo criticality calculations. (author)
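
    In the deterministic setting, Wielandt's method is shifted inverse iteration; the Monte Carlo algorithm above mimics its effect on the fission source. A minimal matrix-level sketch of the deterministic idea (the 2x2 matrix is a toy stand-in for a fission operator):

```python
def solve(M, b):
    """Solve M x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] for row in M]
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def wielandt_eig(A, sigma, iters=25):
    """Shifted inverse (Wielandt) iteration: converges to the eigenvalue of
    A nearest the shift sigma, faster as sigma approaches the target --
    the same trade-off the abstract describes for its Monte Carlo analogue."""
    n = len(A)
    x = [1.0] * n
    for _ in range(iters):
        shifted = [[A[i][j] - (sigma if i == j else 0.0) for j in range(n)]
                   for i in range(n)]
        y = solve(shifted, x)
        norm = max(abs(v) for v in y)
        x = [v / norm for v in y]
    # Rayleigh quotient as the eigenvalue estimate
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    return sum(Ax[i] * x[i] for i in range(n)) / sum(v * v for v in x)

A = [[2.0, 1.0], [1.0, 2.0]]   # eigenvalues 3 and 1
k_eff = wielandt_eig(A, sigma=3.5)
```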

  12. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    For the reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models, which depend on probability-statistics analysis of large samples of data. The vibration signals of the mechanical equipment were analyzed by means of the second generation wavelet packet (SGWP). We take the relative energy in each frequency band of the decomposed signal, which equals a percentage of the whole signal energy, as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is obtained by transforming the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembled-quality reliability of a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.
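
    The core computation, relative band energies treated as probabilities and turned into a normalized entropy, is easy to sketch. The entropy-to-reliability mapping below is one plausible choice, since the abstract does not give the exact transform:

```python
import math

def normalized_band_entropy(band_energies):
    """Normalized information entropy of the energy distribution over
    frequency bands; a wavelet-packet decomposition would supply the band
    energies, but any non-negative values work. Returns a value in [0, 1]."""
    total = sum(band_energies)
    probs = [e / total for e in band_energies if e > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(band_energies))

# Energy spread evenly over 8 bands -> maximal uncertainty;
# all energy in one band -> zero uncertainty.
h_uniform = normalized_band_entropy([1.0] * 8)
h_peaked = normalized_band_entropy([1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
reliability_degree = 1.0 - h_uniform  # one plausible mapping (an assumption, not the paper's formula)
```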

  13. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
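
    The Markovian half of such a hybrid is, in its simplest form, the classical two-state (up/down) availability model, which has an exact closed-form solution. A minimal sketch with illustrative rates (the paper's actual state space would be larger):

```python
import math

def point_availability(lam, mu, t):
    """Two-state Markov (up/down) model with constant failure rate lam and
    repair rate mu: exact point availability A(t) and its steady-state
    limit mu / (lam + mu)."""
    s = lam + mu
    a_inf = mu / s
    a_t = a_inf + (lam / s) * math.exp(-s * t)
    return a_t, a_inf

# Illustrative AGV rates: one failure per 1000 h, one repair per 10 h
a_t, a_inf = point_availability(lam=0.001, mu=0.1, t=10.0)
```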

  14. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  15. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods to improve the reliability of a system, but mutual coupling of multiple factors is often involved in the design. In this study, the Direct Search Method is introduced into the optimal redundancy configuration for design optimization, in which reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable, and applicable to the redundancy configuration and optimization of various designs upon appropriate modification. The method has good practical value.

  16. Application of reliability analysis methods to the comparison of two safety circuits

    International Nuclear Information System (INIS)

    Signoret, J.-P.

    1975-01-01

    Two circuits of different design, intended to perform the "Low Pressure Safety Injection" function in PWR reactors, are analyzed using reliability methods. The reliability analysis of these circuits allows the fault trees to be established and the failure probability to be derived. The dependence of these results on testing and maintenance is emphasized, as well as the critical paths. The large number of results obtained may allow a well-informed choice, taking into account the reliability required for this type of circuit. [fr

  17. The Language Teaching Methods Scale: Reliability and Validity Studies

    Science.gov (United States)

    Okmen, Burcu; Kilic, Abdurrahman

    2016-01-01

    The aim of this research is to develop a scale to determine the language teaching methods used by English teachers. The research sample consisted of 300 English teachers who taught at Duzce University and in primary schools, secondary schools and high schools in the Provincial Management of National Education in the city of Duzce in 2013-2014…

  18. A method to determine validity and reliability of activity sensors

    NARCIS (Netherlands)

    Boerema, Simone Theresa; Hermens, Hermanus J.

    2013-01-01

METHOD: Four sensors were securely fastened to a mechanical oscillator (Vibration Exciter, type 4809, Brüel & Kjær) and moved at various frequencies (6.67 Hz; 13.45 Hz; 19.88 Hz) within the range of human physical activity. For each of the three sensor axes, the sensors were simultaneously moved for

  19. Reliability and Validity of the Research Methods Skills Assessment

    Science.gov (United States)

    Smith, Tamarah; Smith, Samantha

    2018-01-01

    The Research Methods Skills Assessment (RMSA) was created to measure psychology majors' statistics knowledge and skills. The American Psychological Association's Guidelines for the Undergraduate Major in Psychology (APA, 2007, 2013) served as a framework for development. Results from a Rasch analysis with data from n = 330 undergraduates showed…

  20. A Bayesian method for detecting stellar flares

    Science.gov (United States)

    Pitkin, M.; Williams, D.; Fletcher, L.; Grant, S. D. T.

    2014-12-01

We present a Bayesian-odds-ratio-based algorithm for detecting stellar flares in light-curve data. We assume flares are described by a model in which there is a rapid rise with a half-Gaussian profile, followed by an exponential decay. Our signal model also contains a polynomial background model required to fit underlying light-curve variations in the data, which could otherwise partially mimic a flare. We characterize the false alarm probability and efficiency of this method under the assumption that any unmodelled noise in the data is Gaussian, and compare it with a simpler thresholding method based on that used in Walkowicz et al. We find our method has a significant increase in detection efficiency for low signal-to-noise ratio (S/N) flares. For a conservative false alarm probability our method can detect 95 per cent of flares with S/N less than 20, as compared to S/N of 25 for the simpler method. We also test how well the assumption of Gaussian noise holds by applying the method to a selection of `quiet' Kepler stars. As an example we have applied our method to a selection of stars in Kepler Quarter 1 data. The method finds 687 flaring stars with a total of 1873 flares after vetoes have been applied. For these flares we have made preliminary characterizations of their durations and S/N.
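The flare model described above (half-Gaussian rise, exponential decay, polynomial background) can be sketched as follows; the parameterization and all parameter values are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch of the flare signal model: a half-Gaussian rise up to the peak time
# t0, an exponential decay after it, on top of a polynomial background.
# All parameter names and values are assumed for illustration.
import math

def flare_model(t, t0=5.0, amp=1.0, tau_rise=0.5, tau_decay=2.0, bg=(0.1, 0.01)):
    """Flare flux at time t: half-Gaussian rise, exponential decay."""
    background = sum(c * t ** k for k, c in enumerate(bg))  # polynomial background
    if t <= t0:
        signal = amp * math.exp(-0.5 * ((t - t0) / tau_rise) ** 2)
    else:
        signal = amp * math.exp(-(t - t0) / tau_decay)
    return background + signal

curve = [flare_model(0.1 * i) for i in range(200)]
peak_index = max(range(len(curve)), key=curve.__getitem__)
print(round(0.1 * peak_index, 1))  # peak lands at t0 = 5.0
```

In the paper's framework such a model would enter a Bayesian odds ratio against a background-only model; here it only generates the template light curve.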

  1. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

A wide array of instruments is available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing the reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data were extracted by the primary reviewer. The results were synthesized qualitatively using a level-of-evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. Reliability, validity, and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in the retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest level of evidence for reliability exists in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity supports the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement.

  2. Radioisotope method potentialities in machine reliability and durability enhancement

    International Nuclear Information System (INIS)

    Postnikov, V.I.

    1975-01-01

The development of a surface activation method is reviewed with regard to wear of machine parts. Examples demonstrating the highly promising aspects and practical application of the method are cited. The use of high-sensitivity instruments and variation of the activation depth from 10 μm to 0.5 mm make it possible to perform investigations at a sensitivity of 0.05 μm and to estimate the linear wear of machines. Standard diagrams are presented for measuring the wear of different machine parts by means of surface activation. Investigations performed at several Soviet technological institutes afford a set of dependences which characterize the distribution of radioactive isotopes in depth under different conditions of activation of diverse metals and alloys, and permit study of the wear of any metal

  3. TESTING METHODS FOR MECHANICALLY IMPROVED SOILS: RELIABILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Ana Petkovšek

    2017-10-01

The possibility of in-situ mechanical improvement for reducing the liquefaction potential of silty sands was investigated using three different techniques: Vibratory Roller Compaction, Rapid Impact Compaction (RIC) and Soil Mixing. Material properties at all test sites were investigated before and after improvement with laboratory and in-situ tests (CPT, SDMT, DPSH B, static and dynamic load plate tests, geohydraulic tests). Correlation between the results obtained by the different test methods gave inconclusive answers.

  4. A two-step method for fast and reliable EUV mask metrology

    Science.gov (United States)

    Helfenstein, Patrick; Mochi, Iacopo; Rajendran, Rajeev; Yoshitake, Shusuke; Ekinci, Yasin

    2017-03-01

One of the major obstacles towards the implementation of extreme ultraviolet lithography for upcoming technology nodes in the semiconductor industry remains the realization of a fast and reliable method for detecting patterned mask defects. We are developing a reflective EUV mask-scanning lensless imaging tool (RESCAN), installed at the Swiss Light Source synchrotron at the Paul Scherrer Institut. Our system is based on a two-step defect inspection method. In the first step, a low-resolution defect map is generated by die-to-die comparison of the diffraction patterns from areas with programmed defects to those from areas that are known to be defect-free on our test sample. In a later stage, a die-to-database comparison will be implemented, in which the measured diffraction patterns will be compared to those calculated directly from the mask layout. This Scattering Scanning Contrast Microscopy technique operates purely in the Fourier domain without the need to obtain the aerial image and, given a sufficient signal-to-noise ratio, defects are found in a fast and reliable way, albeit with a location accuracy limited by the spot size of the incident illumination. Having thus identified rough locations for the defects, a fine scan is carried out in the vicinity of these locations. Since our source delivers coherent illumination, we can use an iterative phase-retrieval method to reconstruct the aerial image of the scanned area with, in principle, diffraction-limited resolution without the need of an objective lens. Here, we will focus on the aerial image reconstruction technique and give a few examples to illustrate the capability of the method.

  6. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lateral lumbar radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis angle was measured with the AutoCAD software and flexible ruler methods. The study comprised two parts: intratester and intertester evaluations of reliability, and assessment of the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.
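For illustration, a Cobb-type angle such as the one traced in CAD software reduces to the acute angle between two digitized endplate lines. The sketch below uses hypothetical coordinates; it is not the study's actual measurement procedure.

```python
# Illustrative sketch: computing a Cobb-type lordosis angle from two digitized
# endplate lines (as one might trace in CAD software). Coordinates are
# hypothetical; each line is given by two (x, y) points.
import math

def line_angle(p1, p2):
    """Angle of the line through p1 and p2, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def cobb_angle(upper_line, lower_line):
    """Acute angle between the two endplate lines."""
    diff = abs(line_angle(*upper_line) - line_angle(*lower_line)) % 180.0
    return min(diff, 180.0 - diff)

# Hypothetical endplate traces (e.g. superior endplate of L1, superior of S1)
upper = ((0.0, 0.0), (10.0, 2.0))    # ~11.3 degrees
lower = ((0.0, 5.0), (10.0, -3.0))   # ~-38.7 degrees
print(round(cobb_angle(upper, lower), 1))  # → 50.0
```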

  7. Using DOProC method in reliability assessment of steel elements exposed to fatigue

    Directory of Open Access Journals (Sweden)

    Krejsa Martin

    2017-01-01

Fatigue crack damage depends on the number of stress range cycles. This is a time factor in the course of reliability for the entire designed service life. Three sizes are important for characterizing the propagation of fatigue cracks: initial size, detectable size and acceptable size. The theoretical model of fatigue crack progression can be based on linear fracture mechanics. Depending on the location of an initial crack, the crack may propagate in a structural element e.g. from the edge or from the surface. When determining the required degree of reliability, it is possible to specify the time of the first inspection of the construction which will focus on the fatigue damage. Using conditional probability and a Bayesian approach, times for subsequent inspections can be determined. For probabilistic modelling of fatigue crack progression, an original new probabilistic method was used, the Direct Optimized Probabilistic Calculation ("DOProC"), which uses a purely numerical approach, without any simulation techniques or approximation, based on optimized numerical integration.
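A minimal sketch of the linear-fracture-mechanics crack growth underlying such models, using the Paris law with assumed constants (not the values or the DOProC procedure from the paper), shows how cycle counts to the detectable and acceptable sizes can be integrated:

```python
# Sketch of linear-fracture-mechanics crack growth (Paris law),
# da/dN = C * (dK)^m with dK = Y * dS * sqrt(pi * a). The material constants,
# stress range and crack sizes below are assumed, not taken from the paper.
import math

C, m = 1e-12, 3.0         # Paris constants (assumed)
Y = 1.12                  # geometry factor for an edge crack (assumed)
delta_sigma = 100.0       # stress range per cycle [MPa] (assumed)

def cycles_to_grow(a0, a_final, steps=100_000):
    """Integrate dN = da / (C * dK^m) from crack size a0 to a_final."""
    n, da = 0.0, (a_final - a0) / steps
    a = a0
    for _ in range(steps):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        n += da / (C * dK ** m)
        a += da
    return n

n_detect = cycles_to_grow(0.001, 0.005)   # initial -> detectable size [m]
n_accept = cycles_to_grow(0.001, 0.020)   # initial -> acceptable size [m]
print(n_detect < n_accept)  # detectable size is reached first
```

The gap between the two cycle counts is what leaves room for scheduling inspections between the detectable and acceptable sizes.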

  8. Novel methods for detecting buried explosive devices

    Energy Technology Data Exchange (ETDEWEB)

    Kercel, S.W.; Burlage, R.S.; Patek, D.R.; Smith, C.M. [Oak Ridge National Lab., TN (United States); Hibbs, A.D.; Rayner, T.J. [Quantum Magnetics, Inc., San Diego, CA (United States)

    1997-04-01

    Oak Ridge National Laboratory (ORNL) and Quantum Magnetics, Inc. (QM) are exploring novel landmine detection technologies. Technologies considered here include bioreporter bacteria, swept acoustic resonance, nuclear quadrupole resonance (NQR), and semiotic data fusion. Bioreporter bacteria look promising for third-world humanitarian applications; they are inexpensive, and deployment does not require high-tech methods. Swept acoustic resonance may be a useful adjunct to magnetometers in humanitarian demining. For military demining, NQR is a promising method for detecting explosive substances; of 50,000 substances that have been tested, none has an NQR signature that can be mistaken for RDX or TNT. For both military and commercial demining, sensor fusion entails two daunting tasks, identifying fusible features in both present-day and emerging technologies, and devising a fusion algorithm that runs in real-time on cheap hardware. Preliminary research in these areas is encouraging. A bioreporter bacterium for TNT detection is under development. Investigation has just started in swept acoustic resonance as an approach to a cheap mine detector for humanitarian use. Real-time wavelet processing appears to be a key to extending NQR bomb detection into mine detection, including TNT-based mines. Recent discoveries in semiotics may be the breakthrough that will lead to a robust fused detection scheme.

  9. Botulinum Neurotoxin Detection Methods for Public Health Response and Surveillance

    Directory of Open Access Journals (Sweden)

    Nagarajan Thirunavukkarasu

    2018-06-01

A botulism outbreak due to consumption of food contaminated with botulinum neurotoxins (BoNTs) is a public health emergency. The threat of bioterrorism through deliberate distribution in food sources and/or aerosolization of BoNTs raises global public health and security concerns due to the potential for high mortality and morbidity. Rapid and reliable detection methods are necessary to support clinical diagnosis and surveillance for identifying the source of contamination, performing epidemiological analysis of the outbreak, and preventing and responding to botulism outbreaks. This review considers the applicability of various BoNT detection methods and examines their fitness-for-purpose in safeguarding public health and security goals.

  10. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    OpenAIRE

    Chen, Xuyong; Chen, Qian; Bian, Xiaoya; Fan, Jianping

    2017-01-01

Due to many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method is a lengthy and oscillating iteration process that makes the nonprobabilistic reliability index difficult to solve for. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve for the upper and lower limits of the nonprobabilistic ...

  11. System reliability with correlated components: Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, A.C.W.M.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing
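The effect of component correlation on system failure probability can be illustrated with a small Monte Carlo sketch (not the Equivalent Planes method itself): two standard-normal safety margins with assumed reliability indices and correlation, combined in a series system.

```python
# Illustrative sketch: Monte Carlo estimate of the failure probability of a
# two-component series system whose (standard-normal) safety margins are
# correlated. Reliability indices and correlation are assumed values.
import math, random

random.seed(42)
beta1, beta2, rho = 2.0, 2.5, 0.6   # component reliability indices, correlation

def simulate(n=200_000):
    fails = 0
    for _ in range(n):
        # Correlated standard normals via the Cholesky factor of [[1, rho], [rho, 1]]
        u1, u2 = random.gauss(0, 1), random.gauss(0, 1)
        z1 = u1
        z2 = rho * u1 + math.sqrt(1 - rho ** 2) * u2
        # Series system: fails if either margin falls below its -beta threshold
        if z1 < -beta1 or z2 < -beta2:
            fails += 1
    return fails / n

pf = simulate()
p1 = 0.5 * math.erfc(beta1 / math.sqrt(2))  # marginal failure prob, component 1
p2 = 0.5 * math.erfc(beta2 / math.sqrt(2))
# Positive correlation pulls pf below the independence bound p1 + p2 - p1*p2
print(pf < p1 + p2)
```

Monte Carlo works here because the system is tiny; the multi-fold integrals mentioned above are exactly what make this approach too costly for large correlated systems.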

  13. Fuel rod failure detection method and system

    International Nuclear Information System (INIS)

    Assmann, H.; Janson, W.; Stehle, H.; Wahode, P.

    1975-01-01

The inventor claims a method for the detection of a defective fuel rod cladding tube, or of water that has leaked into the cladding tube of a fuel rod, in the fuel assembly of a pressurized-water reactor. The fuel assembly is not disassembled but examined as a whole. In the examination, the cladding tube is heated near one of its two end plugs, e.g. with an attached high-frequency inductor. The water contained in the cladding tube evaporates, and steam bubbles or condensate are detected by the ultrasonic pulse-echo method. It is also possible to measure the delay of the temperature rise at the end plug, or to determine the cooling energy required to keep the end plug temperature stable, and thus to detect water ingress. (DG/AK) [de]

  14. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

Background: In this paper, we present and validate a method to automatically measure the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second-Hand Smoke (SHS) on endothelial cell migration, but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results: The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Comparison with expert manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion: Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.

  15. A method for detecting hydrophobic patches on proteins

    NARCIS (Netherlands)

    Lijnzaad, P.; Berendsen, H.J.C.; Argos, P.

    1996-01-01

A method for the detection of hydrophobic patches on the surfaces of protein tertiary structures is presented. It delineates explicit contiguous pieces of surface of arbitrary size and shape that consist solely of carbon and sulphur atoms, using a dot representation of the solvent-accessible surface,

  16. Radioimmunoassay method for detection of gonorrhea antibodies

    International Nuclear Information System (INIS)

    1975-01-01

    A novel radioimmunoassay for the detection of gonorrhea antibodies in serum is described. A radionuclide is bound to gonorrhea antigens produced by a growth culture. In the presence of gonorrhea antibodies in the serum, an antigen-antibody conjugate is formed, the concentration of which can be measured with conventional radiometric methods. The radioimmunoassay is highly specific

  17. GMDD: a database of GMO detection methods

    NARCIS (Netherlands)

    Dong, W.; Yang, L.; Shen, K.; Kim, B.; Kleter, G.A.; Marvin, H.J.P.; Guo, R.; Liang, W.; Zhang, D.

    2008-01-01

Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization globally, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been

  18. Reliability and Minimum Detectable Change of Temporal-Spatial, Kinematic, and Dynamic Stability Measures during Perturbed Gait.

    Directory of Open Access Journals (Sweden)

    Christopher A Rábago

Temporal-spatial, kinematic variability, and dynamic stability measures collected during perturbation-based assessment paradigms are often used to identify dysfunction associated with gait instability. However, it remains unclear which measures are most reliable for detecting and tracking responses to perturbations. This study systematically determined the between-session reliability and minimum detectable change values of temporal-spatial, kinematic variability, and dynamic stability measures during three types of perturbed gait. Twenty young healthy adults completed two identical testing sessions two weeks apart, each comprising an unperturbed and three perturbed (cognitive, physical, and visual) walking conditions in a virtual reality environment. Within each session, perturbation responses were compared to unperturbed walking using paired t-tests. Between-session reliability and minimum detectable change values were also calculated for each measure and condition. All temporal-spatial, kinematic variability and dynamic stability measures demonstrated fair to excellent between-session reliability. Minimum detectable change values, normalized to mean values, ranged from 1% to 50%. Step width mean and variability measures demonstrated the greatest response to perturbations, with excellent between-session reliability and low minimum detectable change values. Orbital stability measures demonstrated specificity to perturbation direction and sensitivity, with excellent between-session reliability and low minimum detectable change values. We observed substantially greater between-session reliability and lower minimum detectable change values for local stability measures than previously described, which may be the result of averaging across trials within a session and using velocity versus acceleration data for reconstruction of state spaces. Across all perturbation types, temporal-spatial, orbital and local measures were the most reliable measures with the
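Minimum detectable change values of the kind reported above are conventionally derived from the between-session ICC and the between-subject standard deviation. A minimal sketch with assumed numbers:

```python
# Minimal sketch of the conventional minimum-detectable-change computation:
# SEM = SD * sqrt(1 - ICC), MDC95 = 1.96 * sqrt(2) * SEM.
# The SD and ICC values below are assumed for illustration.
import math

def mdc95(sd, icc):
    sem = sd * math.sqrt(1.0 - icc)          # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem       # 95% minimum detectable change

# e.g. step width: between-subject SD of 2.0 cm, ICC of 0.90 (assumed)
print(round(mdc95(2.0, 0.90), 2))  # → 1.75 (cm)
```

Normalizing MDC95 by the group mean gives the percentage values (1% to 50%) quoted in the abstract.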

  19. A Reliable Method for Rhythm Analysis during Cardiopulmonary Resuscitation

    Directory of Open Access Journals (Sweden)

    U. Ayala

    2014-01-01

Interruptions in cardiopulmonary resuscitation (CPR) compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because, although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA) designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies.

  20. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...
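As a small illustration of a Bayesian net-activity estimate of the kind described (not the authors' specific model), gross counts can be treated as Poisson with a known background rate and a flat prior on the net rate, with the posterior evaluated on a grid; all numbers are assumed.

```python
# Sketch of a Bayesian net-activity estimate: gross counts are Poisson with
# rate (s + b), the background rate b is known, and a flat prior is placed on
# the net (source) rate s >= 0. Posterior evaluated on a grid.
# Counts, live time and background rate below are assumed values.
import math

gross_counts, t_live, b_rate = 25, 10.0, 1.2   # counts, seconds, counts/s

def log_likelihood(s):
    lam = (s + b_rate) * t_live                # expected gross counts
    return gross_counts * math.log(lam) - lam  # Poisson log-likelihood (no constant)

# Grid posterior for the net rate s under a flat prior on s >= 0
grid = [i * 0.01 for i in range(1, 501)]       # s from 0.01 to 5.0 counts/s
logpost = [log_likelihood(s) for s in grid]
mx = max(logpost)                              # subtract max for numerical stability
weights = [math.exp(lp - mx) for lp in logpost]
total = sum(weights)
post_mean = sum(s * w for s, w in zip(grid, weights)) / total
print(round(post_mean, 2))
```

The full posterior density, not just the mean, is what carries the "remaining uncertainty in pictorial form" that the abstract highlights.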

  1. Metagenomic Detection Methods in Biopreparedness Outbreak Scenarios

    DEFF Research Database (Denmark)

    Karlsson, Oskar Erik; Hansen, Trine; Knutsson, Rickard

    2013-01-01

    In the field of diagnostic microbiology, rapid molecular methods are critically important for detecting pathogens. With rapid and accurate detection, preventive measures can be put in place early, thereby preventing loss of life and further spread of a disease. From a preparedness perspective...... of a clinical sample, creating a metagenome, in a single week of laboratory work. As new technologies emerge, their dissemination and capacity building must be facilitated, and criteria for use, as well as guidelines on how to report results, must be established. This article focuses on the use of metagenomics...

  2. Detecting long-term growth trends using tree rings: a critical evaluation of methods.

    Science.gov (United States)

    Peters, Richard L; Groenendijk, Peter; Vlam, Mart; Zuidema, Pieter A

    2015-05-01

Tree-ring analysis is often used to assess long-term trends in tree growth. A variety of growth-trend detection methods (GDMs) exist to disentangle age/size trends in growth from long-term growth changes. However, these detrending methods strongly differ in approach, with possible implications for their output. Here, we critically evaluate the consistency, sensitivity, reliability and accuracy of the four most widely used GDMs: conservative detrending (CD) applies mathematical functions to correct for decreasing ring widths with age; basal area correction (BAC) transforms diameter into basal area growth; regional curve standardization (RCS) detrends individual tree-ring series using average age/size trends; and size class isolation (SCI) calculates growth trends within separate size classes. First, we evaluated whether these GDMs produce consistent results when applied to an empirical tree-ring data set of Melia azedarach, a tropical tree species from Thailand. Three GDMs yielded similar results - a growth decline over time - but the widely used CD method did not detect any change. Second, we assessed the sensitivity (probability of correct growth-trend detection), reliability (100% minus probability of detecting false trends) and accuracy (whether the strength of imposed trends is correctly detected) of these GDMs by applying them to simulated growth trajectories with different imposed trends: no trend, strong trends (-6% and +6% change per decade) and weak trends (-2%, +2%). All methods except CD showed high sensitivity, reliability and accuracy in detecting strong imposed trends. However, these were considerably lower in the weak or no-trend scenarios. BAC showed good sensitivity and accuracy, but low reliability, indicating uncertainty of trend detection using this method. Our study reveals that the choice of GDM influences the results of growth-trend studies. We recommend applying multiple methods when analysing trends and encourage performing sensitivity and reliability
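Basal area correction, one of the GDMs above, can be sketched in a few lines: radial ring widths are converted to basal area increments, removing the purely geometric decline of ring width on a growing stem. The ring-width series is made up for illustration.

```python
# Illustrative sketch of basal area correction (BAC): annual radial ring
# widths are converted to basal area increments. The ring-width series is a
# made-up example, not data from the study.
import math

ring_widths_cm = [0.40, 0.38, 0.36, 0.35, 0.33, 0.32, 0.30, 0.29]  # assumed

def basal_area_increments(widths):
    """Per-year basal area growth (cm^2) from cumulative radial ring widths."""
    increments, radius = [], 0.0
    for w in widths:
        new_radius = radius + w
        increments.append(math.pi * (new_radius ** 2 - radius ** 2))
        radius = new_radius
    return increments

bai = basal_area_increments(ring_widths_cm)
# Ring width declines with age, yet basal area increment rises:
print(bai[-1] > bai[0])
```

This geometric inflation of late-life increments is also why BAC can report spurious positive trends, consistent with the low reliability noted above.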

  3. Chromogenic in situ hybridization is a reliable assay for detection of ALK rearrangements in adenocarcinomas of the lung.

    Science.gov (United States)

    Schildhaus, Hans-Ulrich; Deml, Karl-Friedrich; Schmitz, Katja; Meiboom, Maren; Binot, Elke; Hauke, Sven; Merkelbach-Bruse, Sabine; Büttner, Reinhard

    2013-11-01

Reliable detection of anaplastic lymphoma kinase (ALK) rearrangements is a prerequisite for personalized treatment of lung cancer patients, as ALK rearrangements represent a predictive biomarker for therapy with specific tyrosine kinase inhibitors. Currently, fluorescent in situ hybridization (FISH) is considered to be the standard method for assessing formalin-fixed and paraffin-embedded tissue for ALK inversions and translocations. However, FISH requires specialized equipment, the signals fade rapidly, and it is difficult to assess overall morphology and tumor heterogeneity. Chromogenic in situ hybridization (CISH) has been successfully introduced as an alternative test for the detection of several genetic aberrations. This study validates a newly developed ALK CISH assay by comparing FISH and CISH signal patterns in lung cancer samples with and without ALK rearrangements. One hundred adenocarcinomas of the lung were included in this study, among them 17 with known ALK rearrangement. FISH and CISH were carried out and evaluated according to the manufacturers' recommendations. For both assays, tumors were considered positive if ≥15% of tumor cells showed either isolated 3' signals or break-apart patterns or a combination of both. A subset of tumors was exemplarily examined by using a novel EML4 (echinoderm microtubule-associated protein-like 4) CISH probe. Red, green and fusion CISH signals were clear-cut and the different signal patterns were easily recognized. The percentage of aberrant tumor cells was statistically highly correlated between FISH and CISH. On the basis of 86 samples that were evaluable by ALK CISH, we found a 100% sensitivity and 100% specificity of this assay. Furthermore, EML4 rearrangements could be recognized by CISH. CISH is a highly reliable, sensitive and specific method for the detection of ALK gene rearrangements in pulmonary adenocarcinomas. Our results suggest that CISH might serve as a suitable alternative to FISH, which is the current gold

  4. Quench detection method for 2G HTS wire

    International Nuclear Information System (INIS)

    Marchevsky, M; Xie, Y-Y; Selvamanickam, V

    2010-01-01

    2G HTS conductors are increasingly used in various commercial applications and their thermal and electrical stability is an important reliability factor. Detection and prevention of quenches in 2G wire-based cables and solenoids has proven to be a difficult engineering task. This is largely due to a very slow normal zone propagation in coated conductors that leads to formation of localized hotspots while the rest of the conductor remains in the superconducting state. We propose an original method of quench and hotspot detection for 2G wires and coils that is based upon local magnetic sensing and takes advantage of 2G wire planar geometry. We demonstrate our technique experimentally and show that its sensitivity is superior to the known voltage detection scheme. A unique feature of the method is its capability to remotely detect instant degradation of the wire critical current even before a normal zone is developed within the conductor. Various modifications of the method applicable to practical device configurations are discussed.

  5. Quench detection method for 2G HTS wire

    Energy Technology Data Exchange (ETDEWEB)

    Marchevsky, M; Xie, Y-Y; Selvamanickam, V, E-mail: maxmarche@gmail.co, E-mail: yxie@superpower-inc.co [SuperPower, Inc., 450 Duane Avenue, Schenectady, NY 12304 (United States)

    2010-03-15

    2G HTS conductors are increasingly used in various commercial applications and their thermal and electrical stability is an important reliability factor. Detection and prevention of quenches in 2G wire-based cables and solenoids has proven to be a difficult engineering task. This is largely due to a very slow normal zone propagation in coated conductors that leads to formation of localized hotspots while the rest of the conductor remains in the superconducting state. We propose an original method of quench and hotspot detection for 2G wires and coils that is based upon local magnetic sensing and takes advantage of 2G wire planar geometry. We demonstrate our technique experimentally and show that its sensitivity is superior to the known voltage detection scheme. A unique feature of the method is its capability to remotely detect instant degradation of the wire critical current even before a normal zone is developed within the conductor. Various modifications of the method applicable to practical device configurations are discussed.

  6. Various imaging methods in the detection of small hepatomas

    International Nuclear Information System (INIS)

    Nakatsuka, Haruki; Kaminou, Toshio; Takemoto, Kazumasa; Takashima, Sumio; Kobayashi, Nobuyuki; Nakamura, Kenji; Onoyama, Yasuto; Kurioka, Naruto

    1985-01-01

Fifty-one patients with small hepatomas under 5 cm in diameter were studied to compare the detectability of various imaging methods. During screening, positive findings were obtained in 50% of the patients by scintigraphy, in 74% by ultrasonography and in 79% by CT. The rates of detection in retrospective analysis, after the site of the tumor had become known, were 73%, 93% and 87%, respectively. The rate of detection was 92% by celiac arteriography and 98% by selective hepatic arteriography. In the 21 patients whose tumors were under 3 cm, the rate was 32% for scintigraphy, 74% for ultrasonography and 65% for CT during screening, whereas it was 58%, 84% and 75% retrospectively; by celiac arteriography it was 85%, and by hepatic arteriography, 95%. The rate of detection of small hepatomas in screening tests differed remarkably from that in retrospective analysis. No single imaging method can reliably disclose the presence of a small hepatoma; therefore, more than one method should be used in screening. (author)
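The conclusion that more than one method should be used can be made quantitative: if the tests were (idealistically) independent, the chance that at least one detects the tumour is 1 minus the product of the individual miss rates. A sketch using the screening sensitivities quoted above for tumours under 3 cm:

```python
def combined_sensitivity(sensitivities):
    """Probability that at least one test detects the lesion,
    assuming (idealistically) independent test outcomes."""
    miss = 1.0
    for s in sensitivities:
        miss *= (1.0 - s)
    return 1.0 - miss

# Screening rates for tumours under 3 cm: scintigraphy 32%,
# ultrasonography 74%, CT 65%.
combined = combined_sensitivity([0.32, 0.74, 0.65])  # about 0.938
```

Real imaging tests are correlated (they tend to miss the same difficult lesions), so the true combined sensitivity would be lower than this independence-based upper bound.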

  7. PAUT-based defect detection method for submarine pressure hulls

    Directory of Open Access Journals (Sweden)

    Min-jae Jung

    2018-03-01

Full Text Available A submarine has a pressure hull that can withstand high hydraulic pressure and therefore requires the use of highly advanced shipbuilding technology. When producing a pressure hull, periodic inspection, repair, and maintenance are conducted to maintain its soundness. Of the maintenance methods, Non-Destructive Testing (NDT) is the most effective, because it does not damage the target but sustains its original form and function while inspecting internal and external defects. The NDT process to detect defects in the welded parts of the submarine is applied through Magnetic particle Testing (MT) to detect surface defects and through Ultrasonic Testing (UT) and Radiography Testing (RT) to detect internal defects. In comparison with RT, UT encounters difficulties in distinguishing the types of defects, can yield different results depending on the skills of the inspector, and stores no inspection record. At the same time, the use of RT gives rise to worker safety issues due to radiation exposure, and RT is also difficult to apply from the perspectives of submarine manufacturing and economic feasibility. Therefore, in this study, the Phased Array Ultrasonic Testing (PAUT) method was applied to propose an inspection method that can address the above disadvantages by designing a probe to enhance the precision of detection of hull defects and the reliability of calculations of defect size. Keywords: Submarine pressure hull, Non-destructive testing, Phased array ultrasonic testing

  8. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    International Nuclear Information System (INIS)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S.; Bond, I. A.; Allen, W.; Monard, L. A. G.; Albrow, M. D.; Fouqué, P.; Dominik, M.; Tsapras, Y.; Udalski, A.; Zellem, R.; Bos, M.; Christie, G. W.; DePoy, D. L.; Dong, Subo; Drummond, J.; Gorbikov, E.; Han, C.

    2013-01-01

We analyze MOA-2010-BLG-311, a high magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit but with only Δχ² ∼ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.
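The Δχ² ∼ 80 criterion compares the goodness of fit of the nested point-lens and two-body models. A toy sketch of that comparison (the data points, models and uncertainties are invented placeholders, not the event photometry):

```python
def chi2(observed, model, sigma):
    """Chi-squared of a model against observations with per-point errors."""
    return sum(((o - m) / s) ** 2 for o, m, s in zip(observed, model, sigma))

def delta_chi2(observed, model_a, model_b, sigma):
    """Chi-squared improvement of model_b over model_a (positive favours b)."""
    return chi2(observed, model_a, sigma) - chi2(observed, model_b, sigma)

# Invented light-curve points: model_b matches the data exactly,
# model_a is offset by one error bar at each point.
obs = [1.0, 2.0, 3.0]
sig = [0.5, 0.5, 0.5]
improvement = delta_chi2(obs, [1.5, 2.5, 3.5], [1.0, 2.0, 3.0], sig)  # 3.0
```

The paper's point is precisely that a formally large Δχ² is not sufficient when correlated systematics inflate it, which is why the detection threshold must be set empirically.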

  9. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    Energy Technology Data Exchange (ETDEWEB)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S. [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Bond, I. A. [Institute for Information and Mathematical Sciences, Massey University, Private Bag 102-904, Auckland 1330 (New Zealand); Allen, W. [Vintage Lane Observatory, Blenheim (New Zealand); Monard, L. A. G. [Bronberg Observatory, Centre for Backyard Astrophysics, Pretoria (South Africa); Albrow, M. D. [Department of Physics and Astronomy, University of Canterbury, Private Bag 4800, Christchurch 8020 (New Zealand); Fouque, P. [IRAP, CNRS, Universite de Toulouse, 14 avenue Edouard Belin, F-31400 Toulouse (France); Dominik, M. [SUPA, University of St. Andrews, School of Physics and Astronomy, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Tsapras, Y. [Las Cumbres Observatory Global Telescope Network, 6740B Cortona Drive, Goleta, CA 93117 (United States); Udalski, A. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Zellem, R. [Department of Planetary Sciences/LPL, University of Arizona, 1629 East University Boulevard, Tucson, AZ 85721 (United States); Bos, M. [Molehill Astronomical Observatory, North Shore City, Auckland (New Zealand); Christie, G. W. [Auckland Observatory, P.O. Box 24-180, Auckland (New Zealand); DePoy, D. L. [Department of Physics, Texas A and M University, 4242 TAMU, College Station, TX 77843-4242 (United States); Dong, Subo [Institute for Advanced Study, Einstein Drive, Princeton, NJ 08540 (United States); Drummond, J. [Possum Observatory, Patutahi (New Zealand); Gorbikov, E. 
[School of Physics and Astronomy, Raymond and Beverley Sackler Faculty of Exact Sciences, Tel-Aviv University, Tel Aviv 69978 (Israel); Han, C., E-mail: liweih@astro.ucla.edu, E-mail: rzellem@lpl.arizona.edu, E-mail: tim.natusch@aut.ac.nz [Department of Physics, Chungbuk National University, 410 Seongbong-Rho, Hungduk-Gu, Chongju 371-763 (Korea, Republic of); Collaboration: muFUN Collaboration; MOA Collaboration; OGLE Collaboration; PLANET Collaboration; RoboNet Collaboration; MiNDSTEp Consortium; and others

    2013-05-20

We analyze MOA-2010-BLG-311, a high magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit but with only Δχ² ∼ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.

  10. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design-parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach for minimizing the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information into the uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
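The reliability constraint inside an RBDO loop is typically a failure probability P(g ≤ 0) for some limit state g. A minimal Monte Carlo sketch for the classic resistance-minus-load limit state (the distributions and their parameters are illustrative assumptions):

```python
import random

def failure_probability(n=100_000, seed=42):
    """Monte Carlo estimate of P(g <= 0) for the limit state
    g = R - S, with resistance R ~ N(5, 0.5) and load S ~ N(3, 0.5)."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if rng.gauss(5.0, 0.5) - rng.gauss(3.0, 0.5) <= 0.0)
    return fails / n

p_f = failure_probability()  # analytically about 2.3e-3 for these numbers
```

An RBDO solver would evaluate such a constraint (or a cheaper approximation of it) at every candidate design, which is what motivates the surrogate and Bayesian refinements the review discusses.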

  11. Reliability Analysis Of Fire System On The Industry Facility By Use Fameca Method

    International Nuclear Information System (INIS)

    Sony T, D.T.; Situmorang, Johnny; Ismu W, Puradwi; Demon H; Mulyanto, Dwijo; Kusmono, Slamet; Santa, Sigit Asmara

    2000-01-01

FAMECA is one of the analysis methods used to determine system reliability in industrial facilities. The analysis follows a procedure of identifying component functions and determining each failure mode, its severity level and the effects of the failure. The reliability value is determined from a combination of three factors: the severity level, the component failure value and the component criticality. A reliability analysis has been performed for the fire system of an industrial facility by the FAMECA method. The critical components identified include the pump, air release valve, check valve, manual test valve, isolation valve, control system, etc

  12. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

A variety of methods that include human factors have been applied to reliability analysis in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for quantitative analysis of the reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analysis of various phases of plant operation, as well as to evaluation of an enhanced man-machine interface in the central control room. (author)

  13. Detection method of internal leakage from valve using acoustic method

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi; Kitajima, Akira; Suzuki, Akio.

    1990-01-01

The objective of this study is to estimate the feasibility of the acoustic method for detecting internal leakage from valves in power plants. The experimental results suggested that acoustic monitoring of leakage is feasible. Even when the background levels are higher than the acoustic signals from leakage, the leakage can be detected by analyzing the spectrum of the residual signal obtained by subtracting the background noise from the acoustic signals. (author)

  14. A study in the reliability analysis method for nuclear power plant structures (I)

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Byung Hwan; Choi, Seong Cheol; Shin, Ho Sang; Yang, In Hwan; Kim, Yi Sung; Yu, Young; Kim, Se Hun [Seoul, Nationl Univ., Seoul (Korea, Republic of)

    1999-03-15

Nuclear power plant structures may be exposed to aggressive environmental effects that cause their strength and stiffness to decrease over their service life. Although the physics of these damage mechanisms is reasonably well understood and quantitative evaluation of their effects on time-dependent structural behavior is possible in some instances, such evaluations are generally very difficult and remain novel. The assessment of existing steel containments in nuclear power plants for continued service must provide quantitative evidence that they are able to withstand future extreme loads during a service period with an acceptable level of reliability. Rational methodologies for this reliability assessment can be developed from mechanistic models of structural deterioration, using time-dependent structural reliability analysis to take loading and strength uncertainties into account. The final goal of this study is to develop an analysis method for the reliability of containment structures. The cause and mechanism of corrosion are first clarified and a reliability assessment method is established. By introducing the equivalent normal distribution, a reliability analysis procedure that can determine the failure probabilities has been established. The influence of design variables on reliability, and the relation between reliability and service life, will be studied in the second year of this research.
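The relation between reliability and service life can be sketched with a time-dependent reliability index: if resistance and load are taken as normal variables and corrosion degrades the mean resistance linearly, then β(t) = (μ_R(t) − μ_S)/√(σ_R² + σ_S²). All numbers below are illustrative assumptions, not values from the study:

```python
import math

def beta_t(t, mu_r0=6.0, rate=0.05, sigma_r=0.6, mu_s=3.0, sigma_s=0.4):
    """Reliability index at time t (years) for normally distributed
    resistance R and load S, with mean resistance degrading linearly
    as mu_R(t) = mu_r0 - rate * t."""
    mu_r = mu_r0 - rate * t
    return (mu_r - mu_s) / math.hypot(sigma_r, sigma_s)

beta_now = beta_t(0.0)    # about 4.16 with these placeholder numbers
beta_later = beta_t(20.0)  # lower: corrosion has eaten into the margin
```

The failure probability follows as Φ(−β(t)), so a target reliability level translates directly into an acceptable service period.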

  15. A novel method for detection of apoptosis

    International Nuclear Information System (INIS)

    Zagariya, Alexander M.

    2012-01-01

There are two different Angiotensin II peptides in nature: the Human type (ANG II) and the Bovine type (ANG II*). These eight-amino-acid peptides differ only at position 5, where Valine is replaced by Isoleucine in the Bovine type, and they are present in all species studied so far. The two amino acids differ by only one carbon atom. This difference is so small that antibodies against either the Bovine or the Human form of ANG II will interact with all species, allowing a universal method for apoptosis detection. ANG II concentrations are found at substantially higher levels in apoptotic than in non-apoptotic tissues. ANG II accumulation can lead to DNA damage, mutations, carcinogenesis and cell death. We demonstrate that Bovine antiserum can be used for universal detection of apoptosis. In 2010, the worldwide market for apoptosis detection reached the $20 billion mark, and it increases significantly each year. Most commercially available methods are related to Annexin V and TUNEL. ANG II is more widely known to physicians and scientists than the markers used in previous methods. Our approach offers a novel alternative for assessing apoptotic activity with enhanced sensitivity, at lower cost and with greater ease of use.

  16. Hough transform methods used for object detection

    International Nuclear Information System (INIS)

    Qussay A Salih; Abdul Rahman Ramli; Md Mahmud Hassan Prakash

    2001-01-01

The Hough transform (HT) is a robust estimator of the parameters of multi-dimensional features in images. It is an established technique that evidences a shape by mapping image edge points into a parameter space, and it is used to isolate curves of a given shape in an image. The classical HT requires that the curve be specified in some parametric form and is hence most commonly used for the detection of regular curves; it has been generalized so that it is capable of detecting arbitrary curved shapes. The main advantage of the technique is that it is very tolerant of gaps in the actual object boundaries. Starting from the classical HT for the detection of lines, we indicate how it can be applied to the detection of arbitrary shapes. Sometimes the straight-line HT is efficient enough to detect features such as artificial curves. The HT extracts geometric shapes based on the duality between the points on a curve and their parameters, and it has been developed for extracting simple geometric shapes such as lines, circles and ellipses as well as arbitrary shapes. It provides robustness against discontinuous or missing features: points or edges are mapped into a partitioned Hough parameter space as individual votes, and peaks denote the feature of interest, represented in a non-analytic tabular form. The main drawback of the HT is its computational requirement, with exponential growth of memory space and processing time as the number of parameters used to represent a primitive increases. For this reason, most research on the HT has focused on reducing the computational burden of extracting arbitrary shapes under more general transformations. We also give an overview of detection methods for image-processing programs, which are frequently required to perform detection and particle classification in industrial settings using standard line-detection algorithms.
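The voting scheme described above can be shown concretely for straight lines, using the normal parametrisation ρ = x·cosθ + y·sinθ: each edge point votes for every (ρ, θ) cell it is consistent with, and the peak cell identifies the line. A simplified sketch with a coarse dictionary-based accumulator:

```python
import math

def hough_lines(points, rho_res=1.0, theta_steps=180):
    """Accumulate votes in (rho, theta) space for a set of edge points
    and return the (rho, theta, votes) of the best-supported line."""
    acc = {}
    for x, y in points:
        for k in range(theta_steps):
            theta = math.pi * k / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            cell = (round(rho / rho_res), k)
            acc[cell] = acc.get(cell, 0) + 1
    (rho_i, k), votes = max(acc.items(), key=lambda kv: kv[1])
    return rho_i * rho_res, math.pi * k / theta_steps, votes

# Ten collinear points on the vertical line x = 3 all vote for the
# same cell (rho = 3, theta = 0), so the peak collects ten votes.
rho, theta, votes = hough_lines([(3, y) for y in range(0, 100, 10)])
```

The dictionary keeps memory proportional to the cells actually voted for, but the cost per point still grows with the number of parameter steps, which is the computational burden the abstract describes.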

  17. Detection of food irradiation - two analytical methods

    International Nuclear Information System (INIS)

    1994-01-01

This publication summarizes the activities of the Nordic countries in the field of detection of irradiated food; the National Food Agency of Denmark coordinated the project. The two analytical methods investigated were gas-chromatographic determination of the hydrocarbon/lipid ratio in irradiated chicken meat, and a bioassay based on microelectrophoresis of DNA from single cells. A method for determination of o-tyrosine in irradiated and non-irradiated chicken meat has also been tested. The first method, based on radiolytic changes in the fatty acids contained in chicken meat, was tested and compared in the four Nordic countries. Four major hydrocarbons (C16:2, C16:3, C17:1 and C17:2) were determined, and reasonable agreement was observed between the dose level and the hydrocarbon concentrations. Results of the bioassay, in which strand breaks of DNA are demonstrated by microelectrophoresis of single cells, show a correlation between the dose levels and the pattern of DNA fragment migration. The hydrocarbon method can be applied to detect other irradiated fat-containing foods, while the DNA method can be used for some animal and some vegetable foods as well. Both methods allow the fact of food irradiation to be established beyond doubt, making them suitable for food control analysis. The detailed determination protocols are given. (EG)

  18. Research and Design of Rootkit Detection Method

    Science.gov (United States)

    Liu, Leian; Yin, Zuanxing; Shen, Yuli; Lin, Haitao; Wang, Hongjiang

Rootkits are one of the most important security issues in network communication systems, affecting the security and privacy of Internet users. Through back doors in the operating system, a hacker can use a rootkit to attack and invade other people's computers, easily capturing passwords and message traffic to and from those computers. With the development of rootkit technology, its applications have become more and more extensive and it has become increasingly difficult to detect. In addition, for various reasons such as trade secrets and the difficulty of development, information on rootkit detection technology and effective tools are still relatively scarce. In this paper, based on an in-depth analysis of rootkit detection technology, a new rootkit detection structure is designed and a new method (software), X-Anti, is proposed. Test results show that software designed on the proposed structure is much more efficient than other rootkit detection software.

  19. Detection method of a failed fuel

    International Nuclear Information System (INIS)

    Urata, Megumu; Uchida, Shunsuke; Utamura, Motoaki.

    1976-01-01

    Object: To divide a tank arrangement into a heating tank for the exclusive use of heating and a mixing tank for the exclusive use of mixing to thereby minimize the purifying amount of reactor water pumped from the interior of reactor and to considerably minimize the capacity of a purifier. Structure: In a detection method of a failed fuel comprising stopping a flow of coolant within fuel assemblies arranged in the coolant in a reactor container, sampling said coolant within the fuel assemblies, and detecting a radioactivity level of sampling liquid, the improvement of the method comprising the steps of heating a part of said coolant removed from the interior of said reactor container, mixing said heated coolant into the remainder of said removed coolant, pouring said mixed liquid into said fuel assemblies, and after a lapse of given time, sampling the liquid poured into said fuel assemblies. (Kawakami, Y.)

  20. Method for detecting a failed fuel

    International Nuclear Information System (INIS)

    Utamura, Motoaki; Urata, Megumu; Uchida, Shunsuke.

    1976-01-01

Purpose: To provide a method for the detection of failed fuel by pouring hot water, in which the pouring speed and temperature of the poured liquid are controlled to prevent leakage of the liquid. Constitution: The method comprises blocking the top of a fuel assembly arranged in the coolant to stop the flow of coolant, pouring a liquid of higher temperature than the coolant into the fuel assembly, sampling the poured liquid, and measuring the radioactivity concentration of the sampled coolant to detect a failed fuel. In doing so, the pouring speed of the liquid is controlled to about 25 l/min, and the temperature excess of the poured liquid over the coolant is kept below about 15°C. (Furukawa, Y.)

  1. System and method for anomaly detection

    Science.gov (United States)

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
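A minimal sketch of the frequency-table idea (not the patented system itself): build per-field frequency tables from training observations and score new observations by their negative log-likelihood under an independence model, so rarely seen values score as more anomalous. Field names and smoothing are illustrative assumptions:

```python
import math
from collections import Counter

def fit_counts(observations):
    """Per-field frequency tables from training observations (tuples)."""
    return [Counter(obs[i] for obs in observations)
            for i in range(len(observations[0]))]

def score(obs, tables, n):
    """Anomaly score: negative log-likelihood under an independence
    model with add-one smoothing; higher means more anomalous."""
    s = 0.0
    for value, table in zip(obs, tables):
        p = (table.get(value, 0) + 1) / (n + len(table) + 1)
        s -= math.log(p)
    return s

# Mostly (protocol, port) = ("tcp", 80) traffic, with one oddball.
train = [("tcp", 80)] * 99 + [("udp", 53)]
tables = fit_counts(train)
rare_score = score(("icmp", 9999), tables, len(train))
common_score = score(("tcp", 80), tables, len(train))
```

The counting step maps onto the frequency tables the abstract mentions; a conditional independence model would condition each field's table on the values of its parents instead of treating fields as fully independent.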

  2. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

In making a bearing procurement analysis, price and reliability must both be considered as decision criteria, since price determines the direct cost (the acquisition cost), while the reliability of the bearing determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered in a bearing procurement analysis. This paper describes a bearing evaluation method that uses total cost of ownership analysis to consider price and maintenance cost as decision criteria. Furthermore, since failure data are scarce when the bearing evaluation phase is conducted, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper one with lower reliability is preferable. This contextuality can give rise to conflict between stakeholders; thus, the planning horizon needs to be agreed by all stakeholders before a procurement decision is made.
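A sketch of the two ingredients: the basic rating life from the dynamic load rating, L10 = (C/P)^p × 10⁶ revolutions (p = 3 for ball bearings), and a simple ownership cost over a planning horizon. The prices, loads and replacement-cost model below are illustrative assumptions, not the paper's data:

```python
import math

def l10_hours(c_dyn, p_load, rpm, exponent=3.0):
    """Basic rating life in hours: L10 = (C/P)^p * 1e6 revolutions,
    converted to hours at the given shaft speed."""
    return (c_dyn / p_load) ** exponent * 1e6 / (rpm * 60.0)

def total_cost(price, life_hours, horizon_hours, swap_cost):
    """Acquisition price plus the cost of in-service replacements
    (new bearing plus labour) needed to cover the planning horizon."""
    replacements = max(0, math.ceil(horizon_hours / life_hours) - 1)
    return price + replacements * (price + swap_cost)

life = l10_hours(c_dyn=30_000, p_load=6_000, rpm=1_000)  # ~2083 h
cheap_long = total_cost(100, 2_000, 10_000, 50)   # cheap bearing, long horizon
pricey_long = total_cost(150, 5_000, 10_000, 50)  # durable bearing, long horizon
```

With these toy numbers the pricier, longer-lived bearing wins over the 10,000 h horizon but loses over a short one, which is exactly the horizon-dependent preference the abstract describes.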

  3. Method and device for detecting radiations

    International Nuclear Information System (INIS)

    Borel, J.; Goascoz, V.

    1979-01-01

    The method consists in fabricating an MOS transistor comprising a drain region and a source region separated from each other by a bulk region of opposite doping type relative to the first two regions, in delivering the radiation to be detected into the carrier-collection region of the MOS transistor, in leaving the bulk region at a floating potential and in collecting the drain-source current of the transistor

  4. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS...... that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...... competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17) and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  5. DIAGNOSTIC METHODS IN BREAST CANCER DETECTION

    Directory of Open Access Journals (Sweden)

    Kristijana Hertl

    2018-02-01

Full Text Available Background. In the world as well as in Slovenia, breast cancer is the most frequent female cancer. Due to its high incidence, it represents a serious health and economic problem. Content. Among other prognostic factors of the course of the disease, tumour size at diagnosis is an important one. The probability of axillary lymph node involvement, as well as of distant metastases, is greater in larger tumours. This is the reason that encouraged the development of various diagnostic methods for early detection of small, clinically non-palpable breast tumours. Mammography, however, remains the »golden standard« of early breast cancer detection. It is the basic diagnostic method applied in all symptomatic women over 35 years of age and in asymptomatic women over 40 years of age. Ultrasonography (US), additional projections, magnetic resonance imaging (MRI) and ductography are regarded as complementary breast imaging techniques in addition to mammography. The detected changes in the breast can be further confirmed by US-guided, MR-guided or stereotactic biopsy. If necessary, surgical biopsy and excision of a tissue sample, after wire or isotope localisation of the non-palpable lesion, can be performed. Conclusions. Each of the above diagnostic methods has advantages as well as drawbacks, and only detailed knowledge and understanding of each of them can assure the best option.

  6. Detection method of internal leakage from valve using acoustic method

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi

    1990-01-01

The purpose of this study is to estimate the suitability of the acoustic method for detecting internal leakage from valves at power plants. Experiments were carried out on the characteristics of the acoustic noise caused by a simulated leak flow. From the experimental results, the mechanism of the acoustic noise generated by the flow, the relation between acoustic intensity and leak flow velocity, and the characteristics of the acoustic frequency spectrum were clarified. The acoustic method was applied to valves on site, and the background noises were measured under abnormal plant conditions. When the background level is higher than the acoustic signal, the difference between the background noise frequency spectrum and the acoustic signal spectrum provides a very useful means of leak detection. (author)
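The spectrum-difference idea in the abstract can be sketched as a per-band comparison of a measured spectrum against a stored background spectrum (levels in dB; the 3 dB margin is an illustrative assumption):

```python
def leak_bands(signal_spectrum, background_spectrum, margin_db=3.0):
    """Indices of frequency bins where the measured level exceeds the
    stored background level by more than margin_db, suggesting that
    leak-generated noise is present in that band."""
    return [i for i, (s, b) in enumerate(zip(signal_spectrum,
                                             background_spectrum))
            if s - b > margin_db]

# Four-band example: only band 2 rises well above the background.
suspect = leak_bands([10.0, 12.0, 20.0, 11.0], [10.0, 11.0, 12.0, 11.0])
```

In a real deployment the background spectrum would be recorded per valve and plant condition, since the abstract notes that background levels change in abnormal plant states.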

  7. Analytical detection methods for irradiated foods

    International Nuclear Information System (INIS)

    1991-03-01

The present publication is a review of the scientific literature on the analytical identification of foods treated with ionizing radiation and the quantitative determination of the absorbed dose of radiation. Because of the extremely low level of chemical changes resulting from irradiation, or because of the lack of specificity to irradiation of any chemical changes, only a few methods for quantitative determination of absorbed dose have shown promise so far. On the other hand, the present review has identified several possible methods which could be used, following further research and testing, for the identification of irradiated foods. An IAEA Co-ordinated Research Programme on Analytical Detection Methods for Irradiation Treatment of Food ('ADMIT'), established in 1990, is currently investigating many of the methods cited in the present document. Refs and tab

  8. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements contributing to the stability of the tunnel structure are identified in order to capture the various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces an approach by which decision-makers can find the overall reliability of a tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to the field, and much effort has been made to use it in tunneling while investigating the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating tunnel support performance is therefore the main idea of this research. Decomposition approaches are used to produce the system block diagram and to determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Assuming a linear correlation between safety factors and reliability parameters, the isolated reliabilities are determined for the different structural components of the tunnel support system. In order to determine the individual safety factors, finite element modeling is employed for the different structural subsystems and the results of numerical analyses are obtained in
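The decomposition of a support system into a block diagram reduces, in the simplest cases, to series and parallel combinations of component reliabilities (a generic sketch of that decomposition, not the paper's specific model):

```python
def series_reliability(component_reliabilities):
    """Series blocks: every component must survive, so R = prod(R_i)."""
    r = 1.0
    for ri in component_reliabilities:
        r *= ri
    return r

def parallel_reliability(component_reliabilities):
    """Parallel (redundant) blocks: the system fails only if all
    components fail, so R = 1 - prod(1 - R_i)."""
    q = 1.0
    for ri in component_reliabilities:
        q *= (1.0 - ri)
    return 1.0 - q

# Two redundant rock bolts feeding a series chain with the shotcrete
# lining (illustrative component reliabilities).
bolts = parallel_reliability([0.9, 0.9])          # 0.99
system = series_reliability([bolts, 0.95])        # ~0.94
```

Nested calls of these two functions evaluate any block diagram built from series/parallel stages, which is the step that turns component safety factors into an overall system reliability.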

  9. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate response surface function without cross terms, to identify the highest order of each random variable and to determine the significant variables connected with point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored in solving RBDO problems. Additionally, a sampling based reliability sensitivity analysis method is employed to reduce the computational effort further when design variables are distributional parameters of input random variables. The proposed methodology is applied on two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.
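The core ideas above, Gauss-Hermite supporting points plus a polynomial response surface used as a cheap surrogate for reliability analysis, can be sketched in one dimension (without the cross terms and the nested RBDO loop of the paper; the limit state g is an invented example):

```python
import numpy as np

def surrogate_failure_prob(g, order=4, n_mc=200_000, seed=1):
    """Fit a 1-D polynomial response surface to the limit state g(x)
    at Gauss-Hermite points of a standard normal variable, then
    estimate P(g <= 0) by Monte Carlo on the cheap surrogate."""
    # Probabilists' Gauss-Hermite nodes, suited to N(0, 1) inputs.
    nodes, _ = np.polynomial.hermite_e.hermegauss(order + 1)
    coeffs = np.polynomial.polynomial.polyfit(
        nodes, [g(x) for x in nodes], order)
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_mc)
    return float(np.mean(np.polynomial.polynomial.polyval(x, coeffs) <= 0.0))

# Linear limit state g = 3 - x: exact P_f = P(x >= 3), about 1.35e-3.
p_f = surrogate_failure_prob(lambda x: 3.0 - x)
```

Only order+1 evaluations of the (expensive) limit state are needed; the improved HORSM of the paper extends this with per-variable order identification and cross terms between significant variables.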

  10. Waterborne Pathogens: Detection Methods and Challenges

    Directory of Open Access Journals (Sweden)

    Flor Yazmín Ramírez-Castillo

    2015-05-01

    Full Text Available Waterborne pathogens and related diseases are a major public health concern worldwide, not only because of the morbidity and mortality they cause, but also because of the high cost of their prevention and treatment. These diseases are directly related to environmental deterioration and pollution. Despite continued efforts to maintain water safety, waterborne outbreaks are still reported globally. Proper assessment of pathogens in water and water quality monitoring are key factors for decision-making regarding water distribution systems’ infrastructure, the choice of the best water treatment, and the prevention of waterborne outbreaks. Powerful, sensitive and reproducible diagnostic tools are being developed to monitor pathogen contamination in water and to detect not only cultivable pathogens but also viable but non-culturable microorganisms, as well as pathogens in biofilms. Quantitative microbial risk assessment (QMRA) is a helpful tool for evaluating scenarios for pathogen contamination that involve surveillance, detection methods, analysis and decision-making. This review aims to present a research outlook on waterborne outbreaks that have occurred in recent years. It also focuses on the main molecular techniques for detection of waterborne pathogens and the use of the QMRA approach to protect public health.
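    QMRA calculations of this kind often rest on a single-hit dose-response model; the sketch below uses the standard exponential form with invented exposure numbers (the intake, concentration and infectivity parameter are assumptions, not values from the review):

```python
import math

# Minimal QMRA-style sketch: single-hit exponential dose-response model,
# P_inf = 1 - exp(-r * dose), chained into an annual risk estimate.
def infection_risk(dose_organisms, r):
    """Probability of infection for an ingested dose (organisms)."""
    return 1.0 - math.exp(-r * dose_organisms)

# Hypothetical scenario: 2 L/day intake, 0.1 organisms/L after treatment,
# pathogen infectivity parameter r = 0.005 (assumed).
daily_dose = 2.0 * 0.1
p_daily = infection_risk(daily_dose, r=0.005)
p_annual = 1.0 - (1.0 - p_daily) ** 365   # independent daily exposures
print(f"daily risk {p_daily:.2e}, annual risk {p_annual:.2e}")
```

The annual figure is what gets compared against a tolerable-risk benchmark when choosing between treatment options.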

  11. Doppler method leak detection for LMFBR steam generators. Pt. 3. Investigation of detection sensitivity and method

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi; Kinoshita, Izumi

    2001-01-01

    To prevent the expansion of tube damage and to maintain structural integrity in the steam generators (SGs) of a fast breeder reactor (FBR), it is necessary to detect precisely and immediately any leakage of water from the heat transfer tubes. Therefore, the Doppler method was developed. Previous studies have revealed that, in the SG full-sector model that simulates actual SGs, the Doppler method can detect bubbles of 0.4 l/s within a few seconds. However, considering the dissolution rate of hydrogen generated by a sodium-water reaction even from a small water leak, it is necessary to detect smaller leakages of water from the heat transfer tubes. The detection sensitivity of the Doppler method and the influence of background noise were experimentally investigated. In-water experiments were performed using the SG model. The results show that the Doppler method can detect bubbles of 0.01 l/s (equivalent to a water leak rate of about 0.01 g/s) within a few seconds and that the background noise has little effect on water leak detection performance. The Doppler method thus has great potential for the detection of water leakage in SGs. (author)

  12. Novel Methods of Hydrogen Leak Detection

    International Nuclear Information System (INIS)

    Pushpinder S Puri

    2006-01-01

    With the advent of fuel cell technology and the drive for clean fuel, hydrogen gas is emerging as a leading candidate for the fuel of choice. For hydrogen to become a consumer fuel for automotive and domestic power generation, safety is paramount. It is, therefore, desired to have a method and system for hydrogen leak detection using an odorant which can incorporate a uniform concentration of odorant in the hydrogen gas, when odorants are mixed in the hydrogen storage or delivery means. It is also desired to develop methods where the odorant is not added to the bulk hydrogen, keeping it free of the odorization additives. When odorants are not added to the hydrogen gas in the storage or delivery means, methods must be developed to incorporate odorant in the leaking gas so that leaks can be detected by smell. Further, when odorants are not added to the stored hydrogen, it may also be desirable to observe leaks by sight through discoloration of the surface of the storage or transportation vessels. A series of novel solutions are proposed which address the issues raised above. These solutions are divided into three categories as follows: 1. Methods incorporating an odorant in the path of the hydrogen leak, as opposed to adding it to the hydrogen gas. 2. Methods where odorants are generated in-situ by chemical reaction with the leaking hydrogen. 3. Methods of dispensing and storing odorants in high-pressure hydrogen gas which release odorants to the gas at uniform and predetermined rates. Use of one or more of the methods described here in conjunction with appropriate engineering solutions will assure the ultimate safety of hydrogen use as a commercial fuel. (authors)

  13. Sensing Methods for Detecting Analog Television Signals

    Science.gov (United States)

    Rahman, Mohammad Azizur; Song, Chunyi; Harada, Hiroshi

    This paper introduces a unified method of spectrum sensing for all existing analog television (TV) signals, including NTSC, PAL and SECAM. We propose a correlation based method (CBM) with a single reference signal for sensing any analog TV signal, and in addition an improved energy detection method. The CBM approach has been implemented in a hardware prototype specially designed for participation in the Singapore TV white space (WS) test trial conducted by the Infocomm Development Authority (IDA) of the Singapore government. Analytical and simulation results of the CBM method are presented in the paper, as well as hardware testing results for sensing various analog TV signals. Both AWGN and fading channels are considered. It is shown that the theoretical results closely match those from simulations. Sensing performance of the hardware prototype is also presented in a fading environment by using a fading simulator. We present the performance of the proposed techniques in terms of probability of false alarm, probability of detection, sensing time, etc., together with a comparative study of the various techniques.
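    The two sensing ideas can be illustrated with a toy simulation (an assumed signal model, not the authors' prototype): at low SNR, a correlation statistic against a known reference separates signal from noise far better than a raw energy statistic does.

```python
import numpy as np

# Sketch of the two sensing statistics: an energy detector and a
# correlation-based detector matched against a known reference waveform.
rng = np.random.default_rng(1)
n = 4096
ref = np.cos(2 * np.pi * 0.123 * np.arange(n))     # assumed reference tone

def energy_stat(x):
    return np.mean(np.abs(x) ** 2)

def correlation_stat(x, ref):
    return np.abs(np.dot(x, ref)) / np.linalg.norm(ref)

noise = rng.normal(0, 1, n)
signal = 0.3 * ref + noise                         # SNR well below 0 dB

# Energy barely moves at this SNR; the correlation statistic jumps.
print("energy:     ", energy_stat(noise), energy_stat(signal))
print("correlation:", correlation_stat(noise, ref),
      correlation_stat(signal, ref))
```

In a real detector, each statistic is compared to a threshold calibrated for a target probability of false alarm.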

  14. A Novel Reliability Enhanced Handoff Method in Future Wireless Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    Wang YuPeng

    2016-01-01

    Full Text Available As demand increases, future networks will follow the trends of network variety and service flexibility, which require heterogeneous network deployment and reliable communication methods. In practice, most communication failures happen due to bad radio link quality; high-speed users in particular suffer from radio link failure, which causes communication interruption and forces radio link recovery. To make communication more reliable, especially for high-mobility users, we propose a novel communication handoff mechanism to reduce the occurrence of service interruption. Based on computer simulation, we find that service reliability is greatly improved.

  15. Reliability analysis for thermal cutting method based non-explosive separation device

    International Nuclear Information System (INIS)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu

    2016-01-01

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn and unwind, holder separation failure, ball separation failure, and pin release failure. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.

  16. Reliability analysis for thermal cutting method based non-explosive separation device

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu [Korea Aerospace University, Goyang (Korea, Republic of)

    2016-12-15

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn and unwind, holder separation failure, ball separation failure, and pin release failure. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.
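    The arithmetic behind such an FTA-style estimate is a series/parallel combination of failure-mode reliabilities; the sketch below uses assumed per-mode values (not the paper's data) with five redundant Ni-Cr coils:

```python
# Series/parallel bookkeeping behind an FTA-style reliability estimate.
# All per-mode reliabilities below are assumed for illustration.
def series_reliability(rels):
    """All elements must work (single-point failure modes)."""
    r = 1.0
    for ri in rels:
        r *= ri
    return r

def parallel_reliability(rel, n):
    """n redundant elements; the stage fails only if all n fail."""
    return 1.0 - (1.0 - rel) ** n

# Hypothetical per-mode reliabilities
power_supply = 0.99999
wire_coil    = 0.998          # one Ni-Cr coil burning and unwinding
holder       = 0.999995
balls        = 0.999995
pin          = 0.999995

# Five redundant Ni-Cr coils push that stage's reliability close to 1.
wire_stage = parallel_reliability(wire_coil, 5)
system = series_reliability([power_supply, wire_stage, holder, balls, pin])
print(f"system reliability ~ {system:.6f}")
```

The redundancy explains why the wire, nominally the least reliable component, contributes almost nothing to the system failure probability.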

  17. Detection of food irradiation with luminescence methods

    International Nuclear Information System (INIS)

    Anderle, H.

    1997-06-01

    Food irradiation is applied as a method for the preservation of foods, the prevention of food spoilage and the inhibition of food-borne pathogens. Doses exceeding 10 kGy (10 kJ/kg) are not recommended by the WHO. Differing legislation requires methods for the detection and the dosimetry of irradiated foods. Among the physical methods based on radiation-induced changes in inorganic, non-hygroscopic crystalline solids are thermoluminescence (TL), photostimulated luminescence (PSL) and lyoluminescence (LL) measurement. The luminescence methods were tested on natural minerals. Pure quartz, feldspars, calcite, aragonite and dolomite of known origin were irradiated, read out and analyzed to determine the influence of luminescence activators and deactivators. Carbonate minerals show an orange-red TL easily detectable by blue-sensitive photomultiplier tubes. TL-inactive carbonate samples may be identified by a lyoluminescence method using the reaction of trapped irradiation-generated charge carriers with the solvent during crystal-lattice breakup. The finely ground mineral is dissolved in an alkaline complexing agent/chemiluminescence sensitizer/chemiluminescence catalyst (EDTA/luminol/hemin) reagent mixture. The TL and PSL of quartz are too weak to contribute a significant part of the corresponding signals in polymineral dust. Alkali and soda feldspars show intense TL and PSL. The temperature maxima in the TL glow curves allow a clear distinction. PSL does not give this additional information; it suffers from bleaching by ambient light and requires light protection. Grain disinfested with low irradiation doses (500 Gy) may not be identified by either TL or PSL measurement. The natural TL of feldspar particles may overlap with the irradiation-induced TL of other minerals. As a routine method, irradiated spices are identified with TL measurement. The dust particles have to be enriched by heavy-liquid flotation and centrifugation. The PSL method allows a clear

  18. An attempt to use FMEA method for an approximate reliability assessment of machinery

    Directory of Open Access Journals (Sweden)

    Przystupa Krzysztof

    2017-01-01

    Full Text Available The paper presents a modified FMEA (Failure Mode and Effect Analysis) method to assess the reliability of the components that make up a type 2145 MAX Impactol™ Driver wrench (Ingersoll Rand Company). The case concerns reliability analysis under conditions where full service data are not known. The aim of the study is to determine the weakest element in the design of the tool.

  19. Development of DNA elution method to detect irradiated foodstuff

    International Nuclear Information System (INIS)

    Copin, M.P.; Bourgeois, C.M.

    1991-01-01

    The aim of the work is to develop a reliable method to detect whether a fresh or frozen foodstuff has been irradiated. The DNA molecule is one of the targets of ionizing radiation, and the induction of three major classes of lesion has been shown: double-strand breaks, single-strand breaks and base damage. Among the different techniques used to observe and quantify strand breaks, elution techniques are very interesting. The proposed method consists of filtration of the DNA at atmospheric pressure under non-denaturing conditions. The amount of DNA retained on the filter is measured by microfluorometry after suitable labelling. A difference in the amount of DNA retained on a 2 μm filter from a lysed muscular tissue sample is observed between a frozen Norway lobster that has been irradiated and one that has not. 7 refs

  20. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    International Nuclear Information System (INIS)

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method. - Highlights: • An indirect probability model (IPM) is proposed for mechanism reliability evaluation. • The dynamic equation, degradation function and Kaplan–Meier estimator are used. • Then the simplified form of indirect probability model is proposed. • The experimental results agree well with the predicted results.
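    The Kaplan–Meier estimator that the IPM combines with the dynamic equation can be sketched generically; the run times and censoring flags below are invented for illustration:

```python
import numpy as np

# Minimal Kaplan-Meier estimator sketch: event/censoring times with flags
# (1 = failure observed, 0 = censored) -> stepwise survival estimate.
def kaplan_meier(times, observed):
    order = np.argsort(times, kind="stable")
    times, observed = np.asarray(times)[order], np.asarray(observed)[order]
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, d in zip(times, observed):
        if d:                                  # failure observed at t
            surv *= (at_risk - 1) / at_risk
        curve.append((t, surv))
        at_risk -= 1                           # leaves the risk set
    return curve

# Example: 6 mechanism test runs, two censored (still working at test end)
curve = kaplan_meier([5, 8, 8, 12, 15, 20], [1, 1, 0, 1, 0, 1])
for t, s in curve:
    print(t, round(s, 3))
```

Censored runs lower the at-risk count without stepping the survival curve down, which is exactly what lets incomplete test campaigns still yield a reliability estimate.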

  1. Method for assessing software reliability of the document management system using the RFID technology

    Directory of Open Access Journals (Sweden)

    Kiedrowicz Maciej

    2016-01-01

    Full Text Available The deliberations presented in this study refer to the method for assessing software reliability of the document management system using the RFID technology. A method for determining the reliability structure of the discussed software, understood as the index vector for assessing the reliability of its components, is proposed. The model of the analyzed software is the control transfer graph, in which the probability of activating individual components during the system's operation results from the so-called operational profile, which characterizes the actual working environment. The reliability structure is established as a result of the solution of a specific mathematical software task. Knowledge of the reliability structure of the software makes it possible to properly plan the time and financial expenses necessary to build software that meets the reliability requirements. The application of the presented method is illustrated by a numerical example corresponding to the software of the RFID document management system.
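    A control-transfer-graph model of this kind is in the spirit of Markov user-oriented software reliability models; the sketch below (with assumed component reliabilities and transfer probabilities, not the paper's numbers) solves for the expected reliability of a three-component system driven by an operational profile:

```python
import numpy as np

# Sketch of a control-transfer-graph reliability estimate. Component i
# works with reliability R[i]; P[i, j] is the probability that control
# transfers from component i to j (from the operational profile), and
# the remaining row mass is the probability of correct termination.
R = np.array([0.999, 0.995, 0.998])          # assumed reliabilities

P = np.array([[0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0],
              [0.2, 0.0, 0.0]])

# Reliability from state i satisfies x_i = R_i * (t_i + sum_j P_ij x_j),
# i.e. (I - diag(R) @ P) x = R * t; x[0] is the system reliability when
# execution starts at component 0.
t = 1.0 - P.sum(axis=1)                      # termination probabilities
M = np.diag(R) @ P
x = np.linalg.solve(np.eye(3) - M, R * t)
print(f"estimated software reliability: {x[0]:.4f}")
```

The linear system replaces an infinite sum over all execution paths weighted by the operational profile.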

  2. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis based finite element modeling in engineering practice.

  3. Designing a reliable leak bio-detection system for natural gas pipelines

    International Nuclear Information System (INIS)

    Batzias, F.A.; Siontorou, C.G.; Spanidis, P.-M.P.

    2011-01-01

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece.

  4. Designing a reliable leak bio-detection system for natural gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Batzias, F.A., E-mail: fbatzi@unipi.gr [Univ. Piraeus, Dept. Industrial Management and Technology, Karaoli and Dimitriou 80, 18534 Piraeus (Greece); Siontorou, C.G., E-mail: csiontor@unipi.gr [Univ. Piraeus, Dept. Industrial Management and Technology, Karaoli and Dimitriou 80, 18534 Piraeus (Greece); Spanidis, P.-M.P., E-mail: pspani@asprofos.gr [Asprofos Engineering S.A, El. Venizelos 284, 17675 Kallithea (Greece)

    2011-02-15

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece.

  5. Designing a reliable leak bio-detection system for natural gas pipelines.

    Science.gov (United States)

    Batzias, F A; Siontorou, C G; Spanidis, P-M P

    2011-02-15

    Monitoring of natural gas (NG) pipelines is an important task for economical/safety operation, loss prevention and environmental protection. Timely and reliable leak detection of gas pipeline, therefore, plays a key role in the overall integrity management for the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, the research on new detector systems is still thriving. Biosensors are worldwide considered as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained by the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge based approach and relies on Fuzzy Multicriteria Analysis (FMCA), for selecting the best biosensor design that suits both, the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.

  6. Delamination detection using methods of computational intelligence

    Science.gov (United States)

    Ihesiulor, Obinna K.; Shankar, Krishna; Zhang, Zhifang; Ray, Tapabrata

    2012-11-01

    A reliable delamination prediction scheme is indispensable in order to prevent potential risks of catastrophic failures in composite structures. The existence of delaminations changes the vibration characteristics of composite laminates, and hence such indicators can be used to quantify the health characteristics of laminates. An approach for online health monitoring of in-service composite laminates is presented in this paper that relies on methods based on computational intelligence. Typical changes in the observed vibration characteristics (i.e. changes in natural frequencies) are considered as inputs to identify the existence, location and magnitude of delaminations. The performance of the proposed approach is demonstrated using numerical models of composite laminates. Since this identification problem essentially involves the solution of an optimization problem, the use of finite element (FE) methods as the underlying analysis tool turns out to be computationally expensive. A surrogate assisted optimization approach is hence introduced to contain the computational time within affordable limits. An artificial neural network (ANN) model with Bayesian regularization is used as the underlying approximation scheme, while an improved rate of convergence is achieved using a memetic algorithm. However, building ANN surrogate models usually requires large training datasets; K-means clustering is effectively employed to reduce the size of the datasets. ANN is also used via inverse modeling to determine the position, size and location of delaminations from changes in measured natural frequencies. The results clearly highlight the efficiency and the robustness of the approach.
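    A greatly simplified sketch of the inverse-identification idea follows, with synthetic frequency-shift data and a nearest-neighbour lookup standing in for the paper's ANN; the linear forward model and all numbers are assumptions for illustration only:

```python
import numpy as np

# Inverse identification sketch: recover a delamination parameter from
# measured natural-frequency shifts by matching against simulated cases.
rng = np.random.default_rng(3)

# Hypothetical training set: delamination size (% area) -> shifts (%) of
# the first three natural frequencies, from some assumed forward model.
sizes = np.linspace(0.0, 20.0, 41)
shifts = np.column_stack([0.30 * sizes, 0.12 * sizes, 0.05 * sizes])

def identify(measured_shift):
    """Return the training size whose shift vector is closest."""
    d = np.linalg.norm(shifts - measured_shift, axis=1)
    return sizes[np.argmin(d)]

# Noisy 'measurement' from a 10% delamination
true = np.array([3.0, 1.2, 0.5])
measured = true + rng.normal(0.0, 0.05, 3)
print(f"identified delamination size: {identify(measured):.1f}%")
```

An ANN trained on the same table would replace the lookup with a smooth interpolating map, which is what makes the approach usable between simulated cases.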

  7. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    DEFF Research Database (Denmark)

    Petersen, Bent; Petersen, Thomas Nordahl; Andersen, Pernille

    2009-01-01

    The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability...... comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0.79 and 0.74 are obtained using our and the compared method, respectively. This tendency is true for any selected subset.

  8. Diagnostic reliability of 3.0-T MRI for detecting osseous abnormalities of the temporomandibular joint.

    Science.gov (United States)

    Sawada, Kunihiko; Amemiya, Toshihiko; Hirai, Shigenori; Hayashi, Yusuke; Suzuki, Toshihiro; Honda, Masahiko; Sisounthone, Johnny; Matsumoto, Kunihito; Honda, Kazuya

    2018-01-01

    We compared the diagnostic reliability of 3.0-T magnetic resonance imaging (MRI) for detection of osseous abnormalities of the temporomandibular joint (TMJ) with that of the gold standard, cone-beam computed tomography (CBCT). Fifty-six TMJs were imaged with CBCT and MRI, and images of condyles and fossae were independently assessed for the presence of osseous abnormalities. The accuracy, sensitivity, and specificity of 3.0-T MRI were 0.88, 1.0, and 0.73, respectively, in condyle evaluation and 0.91, 0.75, and 0.95 in fossa evaluation. The McNemar test showed no significant difference (P > 0.05) between MRI and CBCT in the evaluation of osseous abnormalities in condyles and fossae. The present results indicate that 3.0-T MRI is equal to CBCT in the diagnostic evaluation of osseous abnormalities of the mandibular condyle.
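    The reported condyle figures can be reproduced from a 2x2 contingency table; the counts below are one reconstruction consistent with the reported values for n = 56, not data taken from the paper:

```python
# Accuracy, sensitivity and specificity from a 2x2 table of MRI findings
# against the CBCT reference standard (counts are a reconstruction
# consistent with the reported condyle figures, not published data).
tp, fn, fp, tn = 30, 0, 7, 19

accuracy    = (tp + tn) / (tp + fn + fp + tn)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"accuracy {accuracy:.2f}, sensitivity {sensitivity:.2f}, "
      f"specificity {specificity:.2f}")
```

With these counts the three metrics come out at 0.88, 1.00 and 0.73, matching the condyle results quoted in the abstract.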

  9. Bayesian Methods for Radiation Detection and Dosimetry

    International Nuclear Information System (INIS)

    Peter G. Groer

    2002-01-01

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed compartmental activities. From the estimated probability densities of the model parameters we were able to derive the densities for compartmental activities for a two compartment catenary model at different times. We also calculated the average activities and their standard deviation for a simple two compartment model
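    The net-activity estimation described above can be sketched with a conjugate Poisson-Gamma model; the counts, counting times and prior are illustrative assumptions, not the authors' data or exact derivation:

```python
import numpy as np

# Bayesian net-activity sketch: gross and background counts are Poisson;
# sampling the two rate posteriors yields a full probability density for
# the net counting rate (the graphs mentioned in the abstract).
rng = np.random.default_rng(2)

gross_counts, t_gross = 120, 60.0      # assumed sample measurement (60 s)
bkg_counts,   t_bkg   = 300, 300.0     # assumed background measurement

# Gamma posteriors for the rates under a Jeffreys-style Gamma(0.5) prior
gross_rate = rng.gamma(gross_counts + 0.5, 1.0 / t_gross, 100_000)
bkg_rate   = rng.gamma(bkg_counts + 0.5, 1.0 / t_bkg, 100_000)
net_rate   = gross_rate - bkg_rate     # counts per second

print(f"net rate: mean {net_rate.mean():.3f} cps, "
      f"95% interval [{np.quantile(net_rate, 0.025):.3f}, "
      f"{np.quantile(net_rate, 0.975):.3f}]")
```

Unlike a point estimate with error bars, the sampled density directly shows how much posterior mass, if any, sits below zero net activity.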

  10. Apparatus and method for detecting explosives

    International Nuclear Information System (INIS)

    Griffith, B.

    1976-01-01

    An apparatus is described for use in situations such as airports to detect explosives hidden in containers (e.g. suitcases). The method involves the evaluation of the quantities of oxygen and nitrogen within the container by neutron activation analysis and the determination of whether these quantities exceed predetermined limits. The equipment includes a small sub-critical low-powered reactor for thermal (0.01 to 0.10 eV) neutron production, a radium-beryllium primary source, a deuterium-tritium reactor as a high energy (> 1.06 MeV) neutron source, and Geiger counter detector arrays. (UK)

  11. A method to evaluate performance reliability of individual subjects in laboratory research applied to work settings.

    Science.gov (United States)

    1978-10-01

    This report presents a method that may be used to evaluate the reliability of performance of individual subjects, particularly in applied laboratory research. The method is based on analysis of variance of a tasks-by-subjects data matrix, with all sc...

  12. Novel Methods of Hydrogen Leak Detection

    International Nuclear Information System (INIS)

    Pushpinder S Puri

    2006-01-01

    For hydrogen to become a consumer fuel for automotive and domestic power generation, safety is paramount. Today's hydrogen systems are built with inherent safety measures and multiple levels of protection. However, human senses, in particular the sense of smell, are considered the ultimate safeguard against leaks. Since hydrogen is an odorless gas, the use of odorants to detect leaks, as is done in the case of natural gas, is an obvious solution. The odorants required for hydrogen used in fuel cells must meet a unique requirement: almost all commercial odorants used in gas leak detection contain sulfur, which poisons the catalysts used in hydrogen fuel cells, most specifically the PEM (polymer electrolyte membrane, or proton exchange membrane) fuel cells. A possible solution to this problem is to use non-sulfur-containing odorants. Chemical compounds based on mixtures of acrylic acid and nitrogen compounds have been adopted to achieve sulfur-free odorization of a gas. It is, therefore, desired to have a method and system for hydrogen leak detection using an odorant which can incorporate a uniform concentration of odorant in the hydrogen gas, when odorants are mixed in the hydrogen storage or delivery means. It is also desired to develop methods where the odorant is not added to the bulk hydrogen, keeping it free of the odorization additives. A series of novel solutions are proposed which address the issues raised above. These solutions are divided into three categories as follows: 1. Methods incorporating an odorant in the path of the hydrogen leak, as opposed to adding it to the hydrogen gas. 2. Methods where odorants are generated in-situ by chemical reaction with the leaking hydrogen. 3. Methods of dispensing and storing odorants in high-pressure hydrogen gas which release odorants to the gas at uniform and predetermined rates. Use of one or more of the methods described here in conjunction with appropriate engineering

  13. Photogrammetry: an accurate and reliable tool to detect thoracic musculoskeletal abnormalities in preterm infants.

    Science.gov (United States)

    Davidson, Josy; dos Santos, Amelia Miyashiro N; Garcia, Kessey Maria B; Yi, Liu C; João, Priscila C; Miyoshi, Milton H; Goulart, Ana Lucia

    2012-09-01

    To analyse the accuracy and reproducibility of photogrammetry in detecting thoracic abnormalities in infants born prematurely. Cross-sectional study. The Premature Clinic at the Federal University of São Paulo. Fifty-eight infants born prematurely in their first year of life. Measurement of the manubrium/acromion/trapezius angle (degrees) and the deepest thoracic retraction (cm). Digitised photographs were analysed by two blinded physiotherapists using a computer program (SAPO; http://SAPO.incubadora.fapesp.br) to detect shoulder elevation and thoracic retraction. Physical examinations performed independently by two physiotherapists were used to assess the accuracy of the new tool. Thoracic alterations were detected in 39 (67%) and in 40 (69%) infants by Physiotherapists 1 and 2, respectively (kappa coefficient=0.80). Using a receiver operating characteristic curve, measurement of the manubrium/acromion/trapezius angle and the deepest thoracic retraction indicated accuracy of 0.79 and 0.91, respectively. For measurement of the manubrium/acromion/trapezius angle, the Bland and Altman limits of agreement were -6.22 to 7.22° [mean difference (d)=0.5] for repeated measures by one physiotherapist, and -5.29 to 5.79° (d=0.75) between two physiotherapists. For thoracic retraction, the intra-rater limits of agreement were -0.14 to 0.18cm (d=0.02) and the inter-rater limits of agreement were -0.20 to -0.17cm (d=0.02). SAPO provided an accurate and reliable tool for the detection of thoracic abnormalities in preterm infants. Copyright © 2011 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
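    The Bland and Altman limits of agreement quoted above are simply the mean paired difference plus or minus 1.96 standard deviations; the sketch below applies this to made-up repeated angle measurements (degrees), not the study's data:

```python
import numpy as np

# Bland-Altman limits of agreement on invented repeated measurements of
# an angle by two raters: d +/- 1.96 * SD of the paired differences.
rater1 = np.array([41.2, 38.5, 45.0, 40.3, 43.8, 39.9, 42.1, 44.6])
rater2 = np.array([40.8, 39.2, 44.1, 41.0, 43.0, 40.5, 41.6, 45.3])

diff = rater1 - rater2
d, sd = diff.mean(), diff.std(ddof=1)
lower, upper = d - 1.96 * sd, d + 1.96 * sd
print(f"mean difference {d:.2f}, limits of agreement "
      f"[{lower:.2f}, {upper:.2f}]")
```

If 95% of future inter-rater differences are expected to fall inside an interval this narrow, the measurement is considered reproducible for clinical use.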

  14. Reliability of the exercise ECG in detecting silent ischemia in patients with prior myocardial infarction

    International Nuclear Information System (INIS)

    Yamagishi, Takashi; Matsuda, Yasuo; Satoh, Akira

    1991-01-01

    To assess the reliability of the exercise ECG in detecting silent ischemia, ECG results were compared with those of stress-redistribution thallium-201 single-photon emission computed tomography (SPECT) in 116 patients with prior myocardial infarction and in 20 normal subjects serving as controls. The left ventricle (LV) was divided into 20 segmental images, which were scored blindly on a 5-point scale. The redistribution score was defined as the exercise thallium defect score minus that of the redistribution image, and was used as a measure of the amount of ischemic but viable myocardium. The upper limit of the normal redistribution score (=4.32) was defined as the mean + 2 standard deviations derived from the 20 normal subjects. Of the 116 patients, 47 had a redistribution score above the normal range. Twenty-five (53%) of the 47 patients showed a positive ECG response. Fourteen (20%) of the 69 patients who had a normal redistribution score showed a positive ECG response. Thus, the ECG response had a sensitivity of 53% and a specificity of 80% in detecting transient ischemia. Furthermore, the 116 patients were subdivided into 4 groups according to the presence or absence of chest pain and ECG change during exercise. Fourteen patients showed both chest pain and ECG change, and all these patients had a redistribution score above the normal range. Twenty-five patients showed ECG change without chest pain, and 11 (44%) of the 25 patients had abnormal redistribution. Three (43%) of 7 patients who showed chest pain without ECG change had an abnormal redistribution score. Of 70 patients who had neither chest pain nor ECG change, 19 (27%) had a redistribution score above the normal range. Thus, limitations exist in detecting silent ischemia by ECG in patients with a prior myocardial infarction, because the ECG response to the exercise test may have a low sensitivity and a high rate of false positive and false negative results in detecting silent ischemia. (author)
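    The reported 53% sensitivity and 80% specificity follow directly from the counts in the abstract (25 ECG-positive of 47 patients with abnormal redistribution scores; 14 ECG-positive of 69 patients with normal scores). A quick check:

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# counts from the abstract: SPECT redistribution score is the reference,
# the exercise ECG is the test under evaluation
sensitivity, specificity = sens_spec(tp=25, fn=47 - 25, fp=14, tn=69 - 14)
```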

  15. Prevent cervical cancer by screening with reliable human papillomavirus detection and genotyping

    International Nuclear Information System (INIS)

    Ge, Shichao; Gong, Bo; Cai, Xushan; Yang, Xiaoer; Gan, Xiaowei; Tong, Xinghai; Li, Haichuan; Zhu, Meijuan; Yang, Fengyun; Zhou, Hongrong; Hong, Guofan

    2012-01-01

    The incidence of cervical cancer is expected to rise sharply in China. A reliable routine human papillomavirus (HPV) detection and genotyping test to supplement the limited Papanicolaou cytology facilities is urgently needed to help identify patients with cervical precancer for preventive interventions. To this end, we evaluated a nested polymerase chain reaction (PCR) protocol for detection of HPV L1 gene DNA in cervicovaginal cells. The PCR amplicons were genotyped by direct DNA sequencing. In parallel, split samples were subjected to the Digene HC2 HPV test, which has been widely used for “cervical cancer risk” screening. Of the 1826 specimens, 1655 contained sufficient material for analysis and 657 were truly negative. PCR/DNA sequencing showed 674 infected by a single high-risk HPV, 188 by a single low-risk HPV, and 136 by multiple HPV genotypes, with up to five HPV genotypes in one specimen. In comparison, the HC2 test classified 713 specimens as infected by high-risk HPV, and 942 as negative for HPV infections. The high-risk HC2 test correctly detected 388 (57.6%) of the 674 high-risk HPV isolates in clinical specimens, mislabeled 88 (46.8%) of the 188 low-risk HPV isolates as high-risk genotypes, and classified 180 (27.4%) of the 657 “true-negative” samples as being infected by high-risk HPV. It was found to cross-react with 20 low-risk HPV genotypes. We conclude that nested PCR detection of HPV followed by short target DNA sequencing can be used for screening and genotyping to formulate a paradigm in clinical management of HPV-related disorders in a rapidly developing economy.

  16. Radiation sensitive area detection device and method

    Science.gov (United States)

    Carter, Daniel C. (Inventor); Hecht, Diana L. (Inventor); Witherow, William K. (Inventor)

    1991-01-01

    A radiation sensitive area detection device for use in conjunction with an X-ray, ultraviolet or other radiation source is provided which comprises a phosphor-containing film that stores a diffraction pattern image and releases it in response to incoming light or other electromagnetic waves. The device further comprises a light source such as a helium-neon laser and an optical fiber capable of directing light from the laser source onto the phosphor film and of channelling the fluoresced light from the phosphor film to an integrating sphere, which directs the light to a signal processing means including a light receiving means such as a photomultiplier tube. The signal processing means allows translation of the fluoresced light in order to detect the original pattern caused by the diffraction of the radiation by the original sample. The optical fiber is retained directly in front of the phosphor screen by a thin metal holder which moves up and down across the phosphor screen and which features a replaceable pinhole allowing easy adjustment of the resolution of the light projected onto the phosphor film. The device produces near real time images with high spatial resolution and without the distortion that accompanies prior art devices employing photomultiplier tubes. A method is also provided for carrying out radiation area detection using the device of the invention.

  17. Method and apparatus for a nuclear reactor for increasing reliability to scram control elements

    International Nuclear Information System (INIS)

    Bevilacqua, F.

    1976-01-01

    A description is given of a method and apparatus for increasing the reliability with which the linear drive devices of a nuclear reactor scram the control elements held in a raised position. Each of the plurality of linear drive devices includes a first type of holding means associated with the drive means of the linear drive device and a second type of holding means distinct and operatively dissimilar from the first type. The system of linear drive devices having both types of holding means is operated in such a manner that the control elements of a portion of the linear drive devices are held in a raised position only by the first holding means, and the control elements of the remaining portion are held in a raised position only by the second type of holding means. Since the two types of holding means are distinct from one another and operatively dissimilar, the probability of both systems failing to scram as a result of common mode failure is minimized. Means may be provided to positively detect disengagement of the first type of holding means and engagement of the second type of holding means for those linear drive devices operative to hold the control elements in a raised position with the second type of holding means

  18. Lagrangian based methods for coherent structure detection

    Energy Technology Data Exchange (ETDEWEB)

    Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu [Center for Nonlinear Dynamics and Department of Physics, University of Texas at Austin, Austin, Texas 78712 (United States); Peacock, Thomas, E-mail: tomp@mit.edu [Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2015-09-15

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
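    As a concrete illustration of the canonical double-gyre benchmark mentioned above, the sketch below advects a tracer through the standard time-periodic velocity field. The parameter values (A=0.1, ε=0.25, ω=2π/10) are the ones commonly used in the LCS literature; they are an assumption here, not taken from this review:

```python
import math

# Canonical unsteady double-gyre flow on the domain [0,2] x [0,1]
A, EPS, OMEGA = 0.1, 0.25, 2 * math.pi / 10

def velocity(x, y, t):
    """Velocity (u, v) derived from the double-gyre streamfunction."""
    a = EPS * math.sin(OMEGA * t)
    b = 1 - 2 * EPS * math.sin(OMEGA * t)
    f = a * x * x + b * x
    dfdx = 2 * a * x + b
    u = -math.pi * A * math.sin(math.pi * f) * math.cos(math.pi * y)
    v = math.pi * A * math.cos(math.pi * f) * math.sin(math.pi * y) * dfdx
    return u, v

def advect(x, y, t0, t1, dt=0.01):
    """RK4 integration of a tracer through the unsteady velocity field."""
    t = t0
    while t < t1 - 1e-12:
        h = min(dt, t1 - t)
        k1 = velocity(x, y, t)
        k2 = velocity(x + 0.5 * h * k1[0], y + 0.5 * h * k1[1], t + 0.5 * h)
        k3 = velocity(x + 0.5 * h * k2[0], y + 0.5 * h * k2[1], t + 0.5 * h)
        k4 = velocity(x + h * k3[0], y + h * k3[1], t + h)
        x += h * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        y += h * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
        t += h
    return x, y

xf, yf = advect(0.7, 0.4, 0.0, 10.0)
```

    Trajectory maps like this one are the raw input to all four approaches; the geometric methods, for instance, differentiate the flow map of many such tracers to locate attracting and repelling material surfaces.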

  19. Some methods for the detection of fissionable matter; Quelques methodes de detection des corps fissiles

    Energy Technology Data Exchange (ETDEWEB)

    Guery, M [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-03-01

    A number of devices and processes for detecting uranium or plutonium in industrial plants, and in particular for measuring solution concentrations, are examined here. Each method has its own field of application and its own performance, which we have tried to define by calculation and by experiment. The following topics are treated: γ absorptiometer with an americium source, detection tests by neutron multiplication, apparatus for measuring the α activity of a solution, fissionable matter detection by γ emission, and fissionable matter detection by neutron emission. (author)

  20. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    Directory of Open Access Journals (Sweden)

    Nielsen Morten

    2009-07-01

    Full Text Available Abstract Background Estimation of the reliability of specific real value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction, and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have implemented a method that predicts the relative surface accessibility of an amino acid and simultaneously predicts the reliability of each prediction, in the form of a Z-score. Results An ensemble of artificial neural networks has been trained on a set of experimentally solved protein structures to predict the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures, where reliabilities are obtained by post-processing the output. Conclusion The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability score with the individual predictions. However, our implementation of reliability scores in the form of a Z-score is shown to be the more informative measure for discriminating good predictions from bad ones in the entire range from completely buried to fully exposed amino acids. This is evident when comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. 
For this subset, values of 0
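    The comparison described in the conclusion — Pearson's correlation restricted to the top 20% of predictions ranked by reliability Z-score — can be sketched as follows; the prediction, observation and Z-score values below are invented for illustration:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def top_fraction_correlation(pred, obs, z, frac=0.2):
    """Correlation restricted to the predictions with the highest Z-scores."""
    ranked = sorted(zip(z, pred, obs), reverse=True)
    k = max(2, int(len(ranked) * frac))
    return pearson([p for _, p, _ in ranked[:k]],
                   [o for _, _, o in ranked[:k]])

# invented data: high-Z predictions track the observed exposures,
# low-Z predictions are noisy
obs  = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
pred = [0.0, 0.1, 0.2, 0.3, 0.4, 0.9, 0.2, 0.8, 0.1, 0.5]
z    = [5.0, 4.8, 4.6, 4.4, 4.2, 1.0, 0.9, 0.8, 0.7, 0.6]

r_all = pearson(pred, obs)
r_top = top_fraction_correlation(pred, obs, z)
```

    An informative reliability score is one for which r_top clearly exceeds r_all, which is the comparison the abstract reports.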

  1. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    Science.gov (United States)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems focuses on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines or analysing the effect of line faults on power grid reliability. As a result, lightning protection designed for general transmission lines is excessive, yet insufficient for key lines. To solve this problem, a method for analysing lightning strikes on transmission lines with respect to power grid reliability is given. Full-wave process theory is used to analyse lightning back striking; the leader propagation model is used to describe the shielding failure of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.

  2. Investigation of Reliabilities of Bolt Distances for Bolted Structural Steel Connections by Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Ertekin Öztekin Öztekin

    2015-12-01

    Full Text Available The distances of bolts to each other and the distances of bolts to the edges of connection plates are designed according to the minimum and maximum boundary values proposed by structural codes. In this study, the reliabilities of those distances were investigated. For this purpose, loading type, bolt type and plate thickness were taken as variable parameters. The Monte Carlo Simulation (MCS) method was used in the reliability computations performed for all combinations of those parameters. All resulting reliability index values for those distances are presented in graphics and tables, compared with the values proposed by some structural codes, and evaluated. Finally, it is emphasized that using the same bolt distances in both traditional designs and designs with higher reliability levels would be incorrect.
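    The MCS reliability computation described above amounts to sampling the random variables, counting limit-state violations, and converting the failure probability to a reliability index (the negative inverse standard normal of the failure probability). A minimal sketch; the normal resistance/load limit state below is hypothetical, not the study's bolted-connection model:

```python
import random
from statistics import NormalDist

def mc_reliability_index(g, sample, n=200_000, seed=1):
    """Estimate Pf = P[g(X) < 0] by Monte Carlo simulation and
    convert it to the reliability index beta = -Phi^-1(Pf)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if g(sample(rng)) < 0)
    pf = failures / n
    return pf, -NormalDist().inv_cdf(pf)

# hypothetical limit state: connection resistance R minus load effect S,
# both modeled as normal variables for simplicity
def sample(rng):
    r = rng.gauss(300.0, 30.0)  # resistance, kN
    s = rng.gauss(180.0, 25.0)  # load effect, kN
    return r, s

pf, beta = mc_reliability_index(lambda x: x[0] - x[1], sample)
```

    For this linear normal case the exact index is 120/sqrt(30² + 25²) ≈ 3.07, so the MCS estimate can be checked against a closed form.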

  3. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the safety and reliability of a structural system become more and more important and challenging. Therefore, structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in a structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be expressed explicitly, the response surface method is commonly used because it has a very clear train of thought and simple programming. However, the traditional response surface method fits a quadratic polynomial response surface, and its accuracy is limited because the true limit state surface is fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used in situations where the performance function cannot be expressed explicitly in structural reliability analysis. In this method, a fuzzy neural network response surface for the whole area is constructed first, and then the structural reliability is calculated by a genetic algorithm. Because all the sample points for training the network come from the whole area, the true limit state surface over the whole area can be fitted. Calculational examples and comparative analysis show that the proposed method is much better than the traditional quadratic polynomial response surface method: the amount of finite element analysis is largely reduced, the accuracy of calculation is improved, and the true limit state surface is fitted very well over the whole area. The method proposed in this paper is therefore suitable for engineering application.

  4. Nucleic acid detection system and method for detecting influenza

    Science.gov (United States)

    Cai, Hong; Song, Jian

    2015-03-17

    The invention provides a rapid, sensitive and specific nucleic acid detection system which utilizes isothermal nucleic acid amplification in combination with a lateral flow chromatographic device, or DNA dipstick, for DNA-hybridization detection. The system of the invention requires no complex instrumentation or electronic hardware, and provides a low cost nucleic acid detection system suitable for highly sensitive pathogen detection. Hybridization to single-stranded DNA amplification products using the system of the invention provides a sensitive and specific means by which assays can be multiplexed for the detection of multiple target sequences.

  5. A Bayesian reliability evaluation method with integrated accelerated degradation testing and field information

    International Nuclear Information System (INIS)

    Wang, Lizhi; Pan, Rong; Li, Xiaoyang; Jiang, Tongmin

    2013-01-01

    Accelerated degradation testing (ADT) is a common approach in reliability prediction, especially for products with high reliability. However, the laboratory conditions of ADT often differ from field conditions; thus, to predict field failure, one needs to calibrate predictions made using ADT data. In this paper a Bayesian evaluation method is proposed to integrate ADT data from the laboratory with failure data from the field. Calibration factors are introduced to account for the difference between lab and field conditions so as to predict a product's actual field reliability more accurately. The information fusion and statistical inference procedure are carried out through a Bayesian approach and Markov chain Monte Carlo methods. The proposed method is demonstrated by two examples and a sensitivity analysis of the prior distribution assumption

  6. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...

  7. Study on reliability analysis based on multilevel flow models and fault tree method

    International Nuclear Information System (INIS)

    Chen Qiang; Yang Ming

    2014-01-01

    Multilevel flow models (MFM) and the fault tree method describe system knowledge in different forms, so the two methods express an equivalent logic of system reliability under the same boundary conditions and assumptions. Based on this, and combined with the characteristics of MFM, a method for mapping MFM to fault trees was put forward, providing a way to establish fault trees rapidly and realize qualitative reliability analysis based on MFM. Taking the safety injection system of a pressurized water reactor nuclear power plant as an example, its MFM was established and its reliability was analyzed qualitatively. The analysis result shows that the logic of mapping MFM to fault trees is correct. The MFM is easily understood, created and modified. Compared with traditional fault tree analysis, the workload is greatly reduced and the modeling time is saved. (authors)

  8. Current perspectives on genetically modified crops and detection methods.

    Science.gov (United States)

    Kamle, Madhu; Kumar, Pradeep; Patra, Jayanta Kumar; Bajpai, Vivek K

    2017-07-01

    Genetically modified (GM) crops are the fastest adopted commodities in the agribiotech industry. This market penetration should provide a sustainable basis for ensuring food supply for growing global populations. The successful completion of two decades of commercial GM crop production (1996-2015) is underscored by the increasing rate of adoption of genetic engineering technology by farmers worldwide. With the advent of introduction of multiple traits stacked together in GM crops for combined herbicide tolerance, insect resistance, drought tolerance or disease resistance, the requirement of reliable and sensitive detection methods for tracing and labeling genetically modified organisms in the food/feed chain has become increasingly important. In addition, several countries have established threshold levels for GM content which trigger legally binding labeling schemes. The labeling of GM crops is mandatory in many countries (such as China, EU, Russia, Australia, New Zealand, Brazil, Israel, Saudi Arabia, Korea, Chile, Philippines, Indonesia, Thailand), whereas in Canada, Hong Kong, USA, South Africa, and Argentina voluntary labeling schemes operate. The rapid adoption of GM crops has increased controversies, and mitigating these issues pertaining to the implementation of effective regulatory measures for the detection of GM crops is essential. DNA-based detection methods have been successfully employed, while the whole genome sequencing using next-generation sequencing (NGS) technologies provides an advanced means for detecting genetically modified organisms and foods/feeds in GM crops. This review article describes the current status of GM crop commercialization and discusses the benefits and shortcomings of common and advanced detection systems for GMs in foods and animal feeds.

  9. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computer-intensive computer calculations. A computer program has been developed to implement the PFTA.
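    The bottom-event-to-top-event calculation that PFTA automates can be illustrated on a toy fault tree; the tree structure and probabilities below are invented, and exact state enumeration stands in for the paper's approximation functions and adaptive importance sampling:

```python
from itertools import product

# illustrative fault tree (not the paper's): TOP = OR(AND(A, B), C),
# with independent basic-event probabilities
P = {"A": 0.01, "B": 0.02, "C": 0.005}

def top_event(state):
    return (state["A"] and state["B"]) or state["C"]

def top_probability():
    """Exact top-event probability by enumerating basic-event states."""
    names = sorted(P)
    total = 0.0
    for bits in product([False, True], repeat=len(names)):
        state = dict(zip(names, bits))
        prob = 1.0
        for n in names:
            prob *= P[n] if state[n] else 1 - P[n]
        if top_event(state):
            total += prob
    return total

pf = top_probability()
```

    Enumeration is exponential in the number of basic events, which is precisely why PFTA replaces it with sampling for computation-intensive structural problems.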

  10. Supersonic wave detection method and supersonic detection device

    International Nuclear Information System (INIS)

    Machida, Koichi; Seto, Takehiro; Ishizaki, Hideaki; Asano, Rin-ichi.

    1996-01-01

    The present invention provides a method of and a device for detection suitable to a channel box, which is used to cover a fuel assembly of a BWR type reactor. Namely, a probe for transmitting/receiving supersonic waves scans the surface of the channel box. A data processing device determines an index showing the degree of selective orientation of the crystal direction of the channel box based on the signals received by the probe. A judging device compares the determined index with a previously determined allowable range to judge whether the channel box is satisfactory or not. The judgement is based on the following: (1) the bending of the channel box is caused by the difference in elongation of opposed surfaces, (2) the elongation due to irradiation is caused by the selective orientation of the crystal direction, and (3) the bending of the channel box can be suppressed within a predetermined range by suppressing the index, determined by the measurement of supersonic waves, which correlates with the selective orientation of the crystal direction. As a result, the performance of a channel box capable of enduring the high burnup region can be confirmed in a nondestructive manner. (I.S.)

  11. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  12. A human reliability based usability evaluation method for safety-critical software

    International Nuclear Information System (INIS)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.; Ragsdale, A.

    2006-01-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with the human reliability analysis method SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at a usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis for heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach that standardizes the priority weighting of the UEP in an effort to improve the method's reliability. (authors)

  13. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Grid computing has become relevant due to its applications in large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability is generally computation-intensive to obtain due to the complexity of the system. Moreover, conventional reliability models make some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge of the grid system structure. Moreover, the proposed method does not rely on any assumptions about link and node failure rates. The approach is based on a data-mining algorithm, K2, which discovers the grid system structure from raw historical system data and finds minimum resource spanning trees (MRST) within the grid; Bayesian networks (BN) are then used to model the MRST and estimate grid service reliability.

  14. Inter- and intra- observer reliability of risk assessment of repetitive work without an explicit method.

    Science.gov (United States)

    Eliasson, Kristina; Palm, Peter; Nyman, Teresia; Forsman, Mikael

    2017-07-01

    A common way to conduct practical risk assessments is to observe a job and report the observed long-term risks for musculoskeletal disorders. The aim of this study was to evaluate the inter- and intra-observer reliability of ergonomists' risk assessments made without the support of an explicit risk assessment method. Twenty-one experienced ergonomists assessed the risk level (low, moderate, high risk) of eight upper body regions, as well as the global risk, of 10 video-recorded work tasks. Intra-observer reliability was assessed by having nine of the ergonomists repeat the procedure at least three weeks after the first assessment. The ergonomists made their risk assessments based on their own experience and knowledge. The statistical parameters of reliability included agreement in %, kappa, linearly weighted kappa, intraclass correlation and Kendall's coefficient of concordance. The average inter-observer agreement on the global risk was 53% and the corresponding weighted kappa (Kw) was 0.32, indicating fair reliability. The intra-observer agreement was 61% and 0.41 (Kw). This study indicates that risk assessments of the upper body, made without the use of an explicit observational method, have non-acceptable reliability. It is therefore recommended to use systematic risk assessment methods to a higher degree. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
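    The linearly weighted kappa used in the study penalizes disagreements in proportion to their distance on the ordinal scale (low/moderate/high). A small sketch with hypothetical ratings from two observers:

```python
def weighted_kappa(r1, r2, categories):
    """Linearly weighted kappa for two raters on an ordinal scale."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)
    obs = [[0.0] * k for _ in range(k)]   # observed joint proportions
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1 / n
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]  # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater-2 marginals
    w = lambda i, j: abs(i - j) / (k - 1)  # linear disagreement weight
    d_obs = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    d_exp = sum(w(i, j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1 - d_obs / d_exp

cats = ["low", "moderate", "high"]
r1 = ["low", "low", "moderate", "high", "high", "moderate", "low", "high"]
r2 = ["low", "moderate", "moderate", "high", "moderate", "moderate", "low", "high"]
kw = weighted_kappa(r1, r2, cats)
```

    With the hypothetical ratings above, the adjacent-category disagreements are only half-penalized, which is why weighted kappa exceeds the raw unweighted agreement.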

  15. Damage detection in composite materials using Lamb wave methods

    Science.gov (United States)

    Kessler, Seth S.; Spearing, S. Mark; Soutis, Constantinos

    2002-04-01

    Cost-effective and reliable damage detection is critical for the utilization of composite materials. This paper presents part of an experimental and analytical survey of candidate methods for in situ damage detection of composite materials. Experimental results are presented for the application of Lamb wave techniques to quasi-isotropic graphite/epoxy test specimens containing representative damage modes, including delamination, transverse ply cracks and through-holes. Linear wave scans were performed on narrow laminated specimens and sandwich beams with various cores by monitoring the transmitted waves with piezoceramic sensors. Optimal actuator and sensor configurations were devised through experimentation, and various types of driving signal were explored. These experiments provided a procedure capable of easily and accurately determining the time of flight of a Lamb wave pulse between an actuator and sensor. Lamb wave techniques provide more information about damage presence and severity than previously tested methods (frequency response techniques), and provide the possibility of determining damage location due to their local response nature. These methods may prove suitable for structural health monitoring applications since they travel long distances and can be applied with conformable piezoelectric actuators and sensors that require little power.
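    Determining the time of flight of a Lamb wave pulse between an actuator and a sensor, as described above, is commonly done by locating the cross-correlation peak between the driving signal and the received signal. The sketch below uses a synthetic 5-cycle Hann-windowed toneburst (a typical Lamb-wave excitation) with illustrative frequency and sampling values, not the paper's measurements:

```python
import math

def cross_correlation_delay(sent, received, dt):
    """Time of flight from the lag that maximizes cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(received) - len(sent) + 1):
        val = sum(s * received[lag + i] for i, s in enumerate(sent))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag * dt

# synthetic 5-cycle Hann-windowed toneburst (illustrative values)
fc = 100e3                  # center frequency, Hz
npts = 500                  # samples in the burst
dt = 5 / (fc * npts)        # sample interval for 5 cycles over npts samples
burst = [math.sin(2 * math.pi * fc * i * dt) *
         0.5 * (1 - math.cos(2 * math.pi * i / npts)) for i in range(npts)]

# received signal: an attenuated copy arriving 250 samples later
true_lag = 250
signal = [0.0] * 2000
for i, s in enumerate(burst):
    signal[true_lag + i] += 0.6 * s

tof = cross_correlation_delay(burst, signal, dt)
```

    Dividing the known actuator-sensor distance by the time of flight then gives the group velocity of the propagating mode, the quantity these experiments calibrate.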

  16. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    Full Text Available At present, cloud computing is one of the newest trends in distributed computation, and it is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development. The optimization of the reliability and performance of a cloud services composition application, a typical stochastic optimization problem, is confronted with severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamics. The traditional reliability and performance optimization techniques, for example, the Markov model and state space analysis, have defects such as being too time-consuming, being prone to state-space explosion, and relying on the often unsatisfied assumption of component execution independence. To overcome these defects, we propose in this paper a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud services composition applications based on multi-state system theory is presented. Then a reliability and performance definition based on the universal generating function is proposed. Building on this, a fast reliability and performance optimization algorithm is presented. Finally, illustrative examples are given.

  17. Reliability-Based Topology Optimization Using Stochastic Response Surface Method with Sparse Grid Design

    Directory of Open Access Journals (Sweden)

    Qinghai Zhao

    2015-01-01

    Full Text Available A mathematical framework is developed which integrates the reliability concept into topology optimization to solve reliability-based topology optimization (RBTO) problems under uncertainty. Two typical methodologies have been presented and implemented, including the performance measure approach (PMA) and the sequential optimization and reliability assessment (SORA). To enhance the computational efficiency of reliability analysis, the stochastic response surface method (SRSM) is applied to approximate the true limit state function with respect to the normalized random variables, combined with a reasonable design of experiments generated by sparse grid design (SGD), an effective discretization technique. Uncertainties such as material properties and external loads are considered in three numerical examples: a cantilever beam, a loaded knee structure, and a heat conduction problem. Monte-Carlo simulations are also performed to verify the accuracy of the failure probabilities computed by the proposed approach. Based on the results, it is demonstrated that the application of SRSM with SGD can produce an efficient reliability analysis in RBTO, which enables a more reliable design than that obtained by deterministic topology optimization (DTO). It is also found that, at identical accuracy, SORA is superior to PMA in terms of computational efficiency.
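
    The Monte-Carlo verification step mentioned in this abstract can be sketched for a toy limit state g = R - S with normal capacity R and load S; all parameter values below are hypothetical, and the closed-form result for a normal limit state is used as a check.

```python
import random
from statistics import NormalDist

random.seed(42)

# hypothetical limit state: capacity R minus load S, both normally distributed
mu_r, sd_r = 200.0, 20.0   # capacity mean and standard deviation
mu_s, sd_s = 150.0, 15.0   # load mean and standard deviation

# crude Monte-Carlo estimate of the failure probability P(g < 0)
n = 200_000
failures = sum(
    1 for _ in range(n)
    if random.gauss(mu_r, sd_r) - random.gauss(mu_s, sd_s) < 0.0
)
pf_mc = failures / n

# closed-form check: g is normal, so Pf = Phi(-beta) with reliability index beta
beta = (mu_r - mu_s) / (sd_r**2 + sd_s**2) ** 0.5
pf_exact = NormalDist().cdf(-beta)
print(round(pf_mc, 4), round(pf_exact, 5))
```

    For these values the reliability index is beta = 2, so both estimates land near 0.023; surrogate methods like SRSM aim to reproduce such probabilities with far fewer limit-state evaluations.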

  18. Odour detection methods: olfactometry and chemical sensors.

    Science.gov (United States)

    Brattoli, Magda; de Gennaro, Gianluigi; de Pinto, Valentina; Loiotile, Annamaria Demarinis; Lovascio, Sara; Penza, Michele

    2011-01-01

    The complexity of the odours issue arises from the sensory nature of smell. From the evolutionary point of view olfaction is one of the oldest senses, allowing for seeking food, recognizing danger or communication: human olfaction is a protective sense as it allows the detection of potential illnesses or infections by taking into account the odour pleasantness/unpleasantness. Odours are mixtures of light and small molecules that, coming in contact with various human sensory systems, also at very low concentrations in the inhaled air, are able to stimulate an anatomical response: the experienced perception is the odour. Odour assessment is a key point in some industrial production processes (i.e., food, beverages, etc.) and it is acquiring steady importance in unusual technological fields (i.e., indoor air quality); this issue mainly concerns the environmental impact of various industrial activities (i.e., tanneries, refineries, slaughterhouses, distilleries, civil and industrial wastewater treatment plants, landfills and composting plants) as sources of olfactory nuisances, the top air pollution complaint. Although the human olfactory system is still regarded as the most important and effective "analytical instrument" for odour evaluation, the demand for more objective analytical methods, along with the discovery of materials with chemo-electronic properties, has boosted the development of sensor-based machine olfaction potentially imitating the biological system. This review examines the state of the art of both human and instrumental sensing currently used for the detection of odours. The olfactometric techniques employing a panel of trained experts are discussed and the strong and weak points of odour assessment through human detection are highlighted. The main features and the working principles of modern electronic noses (E-Noses) are then described, focusing on their better performances for environmental analysis. 
Odour emission monitoring carried out through

  19. Odour Detection Methods: Olfactometry and Chemical Sensors

    Directory of Open Access Journals (Sweden)

    Sara Lovascio

    2011-05-01

    Full Text Available The complexity of the odours issue arises from the sensory nature of smell. From the evolutionary point of view olfaction is one of the oldest senses, allowing for seeking food, recognizing danger or communication: human olfaction is a protective sense as it allows the detection of potential illnesses or infections by taking into account the odour pleasantness/unpleasantness. Odours are mixtures of light and small molecules that, coming in contact with various human sensory systems, also at very low concentrations in the inhaled air, are able to stimulate an anatomical response: the experienced perception is the odour. Odour assessment is a key point in some industrial production processes (i.e., food, beverages, etc.) and it is acquiring steady importance in unusual technological fields (i.e., indoor air quality); this issue mainly concerns the environmental impact of various industrial activities (i.e., tanneries, refineries, slaughterhouses, distilleries, civil and industrial wastewater treatment plants, landfills and composting plants) as sources of olfactory nuisances, the top air pollution complaint. Although the human olfactory system is still regarded as the most important and effective “analytical instrument” for odour evaluation, the demand for more objective analytical methods, along with the discovery of materials with chemo-electronic properties, has boosted the development of sensor-based machine olfaction potentially imitating the biological system. This review examines the state of the art of both human and instrumental sensing currently used for the detection of odours. The olfactometric techniques employing a panel of trained experts are discussed and the strong and weak points of odour assessment through human detection are highlighted. The main features and the working principles of modern electronic noses (E-Noses) are then described, focusing on their better performances for environmental analysis. Odour emission monitoring

  20. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

    A driver's workload tends to increase in complicated traffic environments, such as during a lane change. In such cases, rear collision warning is effective for reducing cognitive workload. On the other hand, it has been pointed out that false or missing alarms caused by sensor errors lead to a decrease in the driver's trust in the warning system, which can result in low efficiency of the system. Suppose that reliability information of the sensor is provided in real time. In this paper, we propose a new warning method to increase the driver's trust in the system, even with low sensor reliability, by utilizing the sensor reliability information. The effectiveness of the warning method is shown by driving simulator experiments.

  1. An Investment Level Decision Method to Secure Long-term Reliability

    Science.gov (United States)

    Bamba, Satoshi; Yabe, Kuniaki; Seki, Tomomichi; Shibaya, Tetsuji

    The slowdown in power demand growth and in facility replacement is causing aging and lower reliability in power facilities. This aging will be followed by a rapid increase in repair and replacement needs when many facilities reach the end of their lifetime in the future. This paper describes a method to estimate future repair and replacement costs by applying a life-cycle cost model and renewal theory to historical data. This paper also describes a method to decide on the optimum investment plan, which replaces facilities in order of cost-effectiveness by setting a replacement priority formula, and on the minimum investment level needed to maintain reliability. Estimation examples applied to substation facilities show that a reasonable and leveled future cash-out can maintain reliability by lowering the percentage of replacements caused by fatal failures.
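
    The renewal-theory estimate of future replacements described above can be sketched as a simple simulation: a fleet of facilities with assumed Weibull lifetimes (shape > 1 models wear-out aging) is advanced through a planning horizon and replacements are counted per year. All parameters are invented, and for simplicity the remaining life of each currently installed unit is not conditioned on its survival to date.

```python
import random

random.seed(1)

# hypothetical fleet and lifetime model (all values invented)
SHAPE, SCALE = 3.0, 40.0   # Weibull lifetime: shape > 1 models wear-out aging
HORIZON = 30               # planning horizon in years

def replacements_per_year(ages):
    """Count replacements in each future year for a fleet with given current ages.

    Each facility's history is simulated as a renewal process; renewals
    falling before "now" (t < 0) restart the unit but are not counted.
    """
    counts = [0] * HORIZON
    for age in ages:
        t = -age  # installation time of the current unit, relative to now
        while True:
            t += random.weibullvariate(SCALE, SHAPE)  # unit fails, replace it
            if t >= HORIZON:
                break
            if t >= 0:
                counts[int(t)] += 1
    return counts

ages = [random.uniform(0, 40) for _ in range(1000)]  # an aged fleet
counts = replacements_per_year(ages)
print(sum(counts))
```

    Summing simulated replacement costs per year, instead of counts, gives the kind of future cash-out profile the abstract refers to.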

  2. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives for the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and a photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the comparisons of the methods, and the presence of systematic bias was not observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flap viability.
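
    For reference, an interrater reliability of the kind reported above (two raters, absolute agreement) can be computed as the Shrout-Fleiss ICC(2,1) from a two-way ANOVA decomposition. The sketch below uses invented percent-viability scores from two hypothetical raters.

```python
def icc_2_1(ratings):
    """Shrout-Fleiss ICC(2,1): two-way random effects, absolute agreement."""
    n = len(ratings)        # subjects
    k = len(ratings[0])     # raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)               # between-subjects mean square
    msc = ss_cols / (k - 1)               # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))    # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# invented percent-viability scores from two hypothetical raters
ratings = [[88, 90], [64, 62], [75, 77], [95, 94], [70, 69], [82, 85]]
icc = icc_2_1(ratings)
print(round(icc, 3))
```

    Values close to 1, as in the study's 0.98-0.99 range, indicate that nearly all score variance comes from real differences between flaps rather than between raters.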

  3. SCREENING METHODS FOR THE DETECTION OF CARTELS

    Directory of Open Access Journals (Sweden)

    Mihail BUŞU

    2014-06-01

    Full Text Available During their everyday activities, economic operators conclude a multitude of agreements in tacit or written form, such as contracts or conventions. Some of these arrangements are absolutely necessary for the development of their current activities. These are agreements which, by respecting the rules of competition, are able to bring benefits to consumers and to the entire economy as a whole. On the other hand, economic operators often conclude agreements which are harmful to the economy as well as to consumers, violating the competition rules. Some examples in this respect are operators’ agreements on price fixing and on market or customer sharing. Before investigating a violation of competition rules, the relevant authorities should identify the possibility of the existence of such illegalities. Theoretical models for detecting cartels represent a proactive tool in the antitrust activity of competition authorities. The present paper furnishes a review of the methods for detecting cartels as well as part of their practical application.

  4. A Method to Detect AAC Audio Forgery

    Directory of Open Access Journals (Sweden)

    Qingzhong Liu

    2015-08-01

    Full Text Available Advanced Audio Coding (AAC), a standardized lossy compression scheme for digital audio designed to be the successor of the MP3 format, generally achieves better sound quality than MP3 at similar bit rates. Since AAC is the default or standard audio format for many devices and AAC audio files may be presented as important digital evidence, authentication of such audio files is highly needed but largely missing. In this paper, we propose a scheme to expose tampered AAC audio streams that are encoded at the same encoding bit rate. Specifically, we design a shift-recompression based method to retrieve the differential features between the re-encoded audio stream at each shift and the original audio stream, and a learning classifier is employed to recognize the different patterns of differential features of doctored forgery files and original (untouched) audio files. Experimental results show that our approach is very promising and effective for detecting same-bit-rate forgery in AAC audio streams. Our study also shows that shift-recompression based differential analysis is very effective for detection of MP3 forgery at the same bit rate.

  5. Proceeding of 35th domestic symposium on applications of structural reliability and risk assessment methods to nuclear power plants

    International Nuclear Information System (INIS)

    2005-06-01

    As the 35th domestic symposium of the Atomic Energy Research Committee of the Japan Welding Engineering Society, the symposium was held under the title 'Applications of structural reliability/risk assessment methods to nuclear energy'. Six speakers gave lectures titled 'Structural reliability and risk assessment methods', 'Risk-informed regulation of US nuclear energy and role of probabilistic risk assessment', 'Reliability and risk assessment methods in chemical plants', 'Practical structural design methods based on reliability in architectural and civil areas', 'Maintenance activities based on reliability in thermal power plants' and 'LWR maintenance strategies based on Probabilistic Fracture Mechanics'. (T. Tanaka)

  6. Detection methods for irradiated mites and insects

    International Nuclear Information System (INIS)

    Ignatowicz, S.

    1999-01-01

    Results of the study on the following tests for separation of irradiated pests from untreated ones are reported: (a) test for identification of irradiated mites (Acaridae) based on lack of fecundity of treated females; (b) test for identification of irradiated beetles based on their locomotor activity; (c) test for identification of irradiated pests based on electron spin resonance (ESR) signal derived from treated insects; (d) test for identification of irradiated pests based on changes in the midgut induced by gamma radiation; and (e) test for identification of irradiated pests based on the alterations in total proteins of treated adults. Of these detection methods, only the test based on the pathological changes induced by irradiation in the insect midgut may identify consistently either irradiated larvae or adults. This test is simple and convenient when a rapid processing technique for dehydrating and embedding the midgut is used. (author)

  7. Method of detecting a fuel element failure

    International Nuclear Information System (INIS)

    Cohen, P.

    1975-01-01

    A method is described for detecting a fuel element failure in a liquid-sodium-cooled fast breeder reactor consisting of equilibrating a sample of the coolant with a molten salt consisting of a mixture of barium iodide and strontium iodide (or other iodides) whereby a large fraction of any radioactive iodine present in the liquid sodium coolant exchanges with the iodine present in the salt; separating the molten salt and sodium; if necessary, equilibrating the molten salt with nonradioactive sodium and separating the molten salt and sodium; and monitoring the molten salt for the presence of iodine, the presence of iodine indicating that the cladding of a fuel element has failed. (U.S.)

  8. Liquid chromatography detection unit, system, and method

    Science.gov (United States)

    Derenzo, Stephen E.; Moses, William W.

    2015-10-27

    An embodiment of a liquid chromatography detection unit includes a fluid channel and a radiation detector. The radiation detector is operable to image a distribution of a radiolabeled compound as the distribution travels along the fluid channel. An embodiment of a liquid chromatography system includes an injector, a separation column, and a radiation detector. The injector is operable to inject a sample that includes a radiolabeled compound into a solvent stream. The position sensitive radiation detector is operable to image a distribution of the radiolabeled compound as the distribution travels along a fluid channel. An embodiment of a method of liquid chromatography includes injecting a sample that comprises radiolabeled compounds into a solvent. The radiolabeled compounds are then separated. A position sensitive radiation detector is employed to image distributions of the radiolabeled compounds as the radiolabeled compounds travel along a fluid channel.

  9. The effect of DLC-coating deposition method on the reliability and mechanical properties of abutment's screws.

    Science.gov (United States)

    Bordin, Dimorvan; Coelho, Paulo G; Bergamo, Edmara T P; Bonfante, Estevam A; Witek, Lukasz; Del Bel Cury, Altair A

    2018-04-10

    To characterize the mechanical properties of different coating methods for DLC (diamond-like carbon) on dental implant abutment screws, and their effect on the probability of survival (reliability). Seventy-five abutment screws were allocated into three groups according to the coating method: control (no coating); UMS - DLC applied through unbalanced magnetron sputtering; and RFPA - DLC applied through radio-frequency plasma activation (n=25/group). Twelve screws (n=4) were used to determine the hardness and Young's modulus (YM). A 3D finite element model composed of a titanium substrate, DLC layer and a counterpart was constructed. The deformation (μm) and shear stress (MPa) were calculated. The remaining screws of each group were torqued into external hexagon abutments and subjected to step-stress accelerated life-testing (SSALT) (n=21/group). The probability Weibull curves and reliability (probability of survival) were calculated considering missions of 100, 150 and 200N at 50,000 and 100,000 cycles. DLC-coated experimental groups evidenced higher hardness than the control, and Weibull shape parameters were greater than 1, indicating that fatigue contributed to failure. High reliability was depicted at a mission of 100N. At 200N a significant decrease in reliability was detected for all groups (ranging from 39% to 66%). No significant difference was observed among groups regardless of mission. Screw fracture was the chief failure mode. DLC coatings have been used to improve titanium's mechanical properties and increase the reliability of dental implant-supported restorations. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.
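
    As a hedged illustration of how probability of survival at a given mission is read off a fitted Weibull life model (as in the SSALT analysis above), the sketch below evaluates the Weibull survival function R(n) = exp(-(n/η)^β). The characteristic life η and shape β used here are invented, not the study's fitted values.

```python
import math

def weibull_reliability(n_cycles, eta, beta):
    """Probability of surviving n_cycles under a Weibull(shape=beta, scale=eta) life model."""
    return math.exp(-((n_cycles / eta) ** beta))

# hypothetical use-level Weibull parameters (illustrative only)
eta, beta = 400_000.0, 1.6   # characteristic life in cycles; beta > 1 -> wear-out/fatigue
for mission in (50_000, 100_000):
    print(mission, round(weibull_reliability(mission, eta, beta), 3))
```

    A shape parameter above 1, as reported in the abstract, means the hazard rate grows with cycling, i.e. damage accumulation rather than purely random overload drives failure.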

  10. Extended block diagram method for a multi-state system reliability assessment

    International Nuclear Information System (INIS)

    Lisnianski, Anatoly

    2007-01-01

    The presented method extends the classical reliability block diagram method to a repairable multi-state system. It is very suitable for engineering applications since the procedure is well formalized and based on the natural decomposition of the entire multi-state system (the system is represented as a collection of its elements). Until now, the classical block diagram method did not provide a reliability assessment for repairable multi-state systems. Straightforward stochastic process methods are very difficult to apply in engineering in such cases due to the 'dimension damnation': the huge number of system states. The suggested method is based on combined random processes and the universal generating function technique, and drastically reduces the number of states in the multi-state model.

  11. FISHing for bacteria in food--a promising tool for the reliable detection of pathogenic bacteria?

    Science.gov (United States)

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-04-01

    Foodborne pathogens cause millions of infections every year and are responsible for considerable economic losses worldwide. The current gold standard for the detection of bacterial pathogens in food is still the conventional cultivation following standardized and generally accepted protocols. However, these methods are time-consuming and do not provide fast information about food contaminations and thus are limited in their ability to protect consumers in time from potential microbial hazards. Fluorescence in situ hybridization (FISH) represents a rapid and highly specific technique for whole-cell detection. This review aims to summarize the current data on FISH-testing for the detection of pathogenic bacteria in different food matrices and to evaluate its suitability for the implementation in routine testing. In this context, the use of FISH in different matrices and their pretreatment will be presented, the sensitivity and specificity of FISH tests will be considered and the need for automation shall be discussed as well as the use of technological improvements to overcome current hurdles for a broad application in monitoring food safety. In addition, the overall economical feasibility will be assessed in a rough calculation of costs, and strengths and weaknesses of FISH are considered in comparison with traditional and well-established detection methods. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. [Knowledge of university students in Szeged, Hungary about reliable contraceptive methods and sexually transmitted diseases].

    Science.gov (United States)

    Devosa, Iván; Kozinszky, Zoltán; Vanya, Melinda; Szili, Károly; Fáyné Dombi, Alice; Barabás, Katalin

    2016-04-03

    Promiscuity and lack of use of reliable contraceptive methods increase the probability of sexually transmitted diseases and the risk of unwanted pregnancies, which are quite common among university students. The aim of the study was to assess the knowledge of university students about reliable contraceptive methods and sexually transmitted diseases, and to assess the effectiveness of sexual health education in secondary schools, with specific focus on education held by peers. An anonymous, self-administered questionnaire survey was carried out in a randomized sample of students at the University of Szeged (n = 472; 298 women and 174 men; average age 21 years) between 2009 and 2011. 62.1% of the respondents declared that reproductive health education lessons in high schools held by peers were a reliable and authentic source of information, 12.3% considered them a less reliable source, and 25.6% defined the school health education as an irrelevant source. Among those who considered the health education held by peers a reliable source, there were significantly more females (69.3% vs. 46.6%, p = 0.001), significantly fewer lived in cities (83.6% vs. 94.8%, p = 0.025), and significantly more respondents knew that Candida infection can be transmitted through sexual intercourse (79.5% versus 63.9%, p = 0.02), as compared to those who did not consider health education held by peers a reliable source. The majority of respondents obtained knowledge about sexual issues from the mass media. Young people who considered health education programs reliable were significantly better informed about Candida disease.

  13. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semiautomated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (ICC 0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  14. Risk-based methods for reliability investments in electric power distribution systems

    Energy Technology Data Exchange (ETDEWEB)

    Alvehag, Karin

    2011-07-01

    Society relies more and more on a continuous supply of electricity. However, while underinvestment in reliability leads to an unacceptable number of power interruptions, overinvestment results in too high costs for society. To give incentives for a socioeconomically optimal level of reliability, quality regulations have been adopted in many European countries. These quality regulations imply new financial risks for the distribution system operator (DSO), since poor reliability can reduce the allowed revenue for the DSO and compensation may have to be paid to affected customers. This thesis develops a method for evaluating the incentives for reliability investments implied by different quality regulation designs. The method can be used to investigate whether socioeconomically beneficial projects are also beneficial for a profit-maximizing DSO subject to a particular quality regulation design. To investigate which reinvestment projects are preferable for society and a DSO, risk-based methods are developed. With these methods, the probability of power interruptions and their consequences can be simulated. The consequences of interruptions for the DSO will to a large extent depend on the quality regulation. The consequences for the customers, and hence also society, will depend on factors such as the interruption duration and time of occurrence. The proposed risk-based methods consider extreme outage events in the risk assessments by incorporating the impact of severe weather, estimating the full probability distribution of the total reliability cost, and formulating a risk-averse strategy. Results from case studies performed show that quality regulation design has a significant impact on reinvestment project profitability for a DSO. In order to adequately capture the financial risk that the DSO is exposed to, detailed risk-based methods, such as the ones developed in this thesis, are needed. Furthermore, when making investment decisions, a risk

  15. Transition from Partial Factors Method to Simulation-Based Reliability Assessment in Structural Design

    Czech Academy of Sciences Publication Activity Database

    Marek, Pavel; Guštar, M.; Permaul, K.

    1999-01-01

    Roč. 14, č. 1 (1999), s. 105-118 ISSN 0266-8920 R&D Projects: GA ČR GA103/94/0562; GA ČR GV103/96/K034 Keywords : reliability * safety * failure * durability * Monte Carlo method Subject RIV: JM - Building Engineering Impact factor: 0.522, year: 1999

  16. AK-SYS: An adaptation of the AK-MCS method for system reliability

    International Nuclear Information System (INIS)

    Fauriat, W.; Gayton, N.

    2014-01-01

    A great deal of research work has been proposed over the last two decades to evaluate the probability of failure of a structure involving a very time-consuming mechanical model. Surrogate model approaches based on Kriging, such as the Efficient Global Reliability Analysis (EGRA) or the Active learning and Kriging-based Monte-Carlo Simulation (AK-MCS) methods, are very efficient and each has advantages of its own. EGRA is well suited to evaluating small probabilities, as the surrogate can be used to classify any population. AK-MCS is built in relation to a given population and requires no optimization program for the active learning procedure to be performed. It is therefore easier to implement and more likely to spend computational effort on areas with a significant probability content. When assessing system reliability, analytical approaches and first-order approximations are widely used in the literature. However, in the present paper we rather focus on sampling techniques and, considering the recent adaptation of the EGRA method for systems, a strategy is presented to adapt the AK-MCS method for system reliability. The AK-SYS method, “Active learning and Kriging-based SYStem reliability method”, is presented. Its high efficiency and accuracy are illustrated via various examples.

  17. Evaluation of the reliability of Levine method of wound swab for ...

    African Journals Online (AJOL)

    The aim of this paper is to evaluate the reliability of Levine swab in accurate identification of microorganisms present in a wound and identify the necessity for further studies in this regard. Methods: A semi structured questionnaire was administered and physical examination was performed on patients with chronic wounds ...

  18. Two different hematocrit detection methods: Different methods, different results?

    Directory of Open Access Journals (Sweden)

    Schuepbach Reto A

    2010-03-01

    Full Text Available Abstract Background Little is known about the influence of hematocrit detection methodology on transfusion triggers. Therefore, the aim of the present study was to compare two different hematocrit-assessing methods. In a total of 50 critically ill patients, hematocrit was analyzed using (1) a blood gas analyzer (ABLflex 800) and (2) the central laboratory method (ADVIA® 2120), and the results were compared. Findings Bland-Altman analysis for repeated measurements showed a good correlation with a bias of +1.39% and 2 SD of ± 3.12%. The 24% hematocrit group showed a correlation of r2 = 0.87. With a kappa of 0.56, 22.7% of the cases would have been transfused differently. In the 28% hematocrit group, with a similar correlation (r2 = 0.8) and a kappa of 0.58, 21% of the cases would have been transfused differently. Conclusions Despite a good agreement between the two methods used to determine hematocrit in clinical routine, the calculated difference of 1.4% might substantially influence transfusion triggers depending on the employed method.
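
    The Bland-Altman statistics reported above (a bias and bias ± 2 SD limits of agreement) can be computed as in the sketch below; the paired hematocrit readings are invented for illustration.

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement (bias +/- 2 SD) between two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 2 * sd, bias + 2 * sd)

# invented paired hematocrit readings (%): blood gas analyzer vs. central lab
bga = [24.1, 27.8, 31.0, 22.5, 29.4, 26.2, 33.1, 25.0]
lab = [22.9, 26.0, 29.8, 21.4, 27.9, 25.1, 31.5, 23.6]
bias, (lo, hi) = bland_altman(bga, lab)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

    A systematic bias like the +1.39% found in the study shows up directly as the mean difference; whether it matters clinically depends on how close patients sit to the transfusion threshold.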

  19. Reliability analysis based on a novel density estimation method for structures with correlations

    Directory of Open Access Journals (Sweden)

    Baoyu LI

    2017-06-01

    Full Text Available Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, as the failure probability can then be easily obtained by integration over the failure domain. However, efficiently estimating the PDF remains an open problem. The existing fractional-moment-based maximum entropy method provides a very advanced approach to PDF estimation; its main shortcoming is that it restricts the reliability analysis to structures with independent inputs. In fact, structures with correlated inputs are common in engineering. This paper therefore improves the maximum entropy method by applying the Unscented Transformation (UT) technique, a very efficient moment estimation method for models with any type of input, to compute the fractional moments of the performance function for structures with correlations. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Moreover, the number of function evaluations required in the reliability analysis, which is determined by the UT, is very small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
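
    The UT estimates moments of g(X) from a small set of deterministically chosen sigma points, which is why so few function evaluations are needed. A sketch for a two-dimensional correlated Gaussian input (the symmetric sigma-point scheme and all numerical values are illustrative assumptions; the paper's maximum-entropy PDF reconstruction from the fractional moments is not shown):

```python
import math

def cholesky2(c):
    """Lower Cholesky factor of a 2x2 covariance matrix."""
    l11 = math.sqrt(c[0][0])
    l21 = c[1][0] / l11
    l22 = math.sqrt(c[1][1] - l21 * l21)
    return [[l11, 0.0], [l21, l22]]

def ut_fractional_moment(g, mean, cov, alpha, kappa=0.0):
    """Approximate E[|g(X)|^alpha] for X ~ N(mean, cov) in 2-D with the UT."""
    n = 2
    L = cholesky2(cov)
    scale = math.sqrt(n + kappa)
    pts = [list(mean)]
    for j in range(n):               # symmetric sigma points mean +/- scaled columns of L
        col = [scale * L[0][j], scale * L[1][j]]
        pts.append([mean[0] + col[0], mean[1] + col[1]])
        pts.append([mean[0] - col[0], mean[1] - col[1]])
    w0 = kappa / (n + kappa)
    wi = 1.0 / (2 * (n + kappa))
    weights = [w0] + [wi] * (2 * n)
    return sum(w * abs(g(p)) ** alpha for w, p in zip(weights, pts))

# For a linear g with all sigma points positive and alpha = 1,
# the UT reproduces the exact mean E[X0 + X1] = 3 + 4 = 7
m1 = ut_fractional_moment(lambda p: p[0] + p[1],
                          [3.0, 4.0], [[1.0, 0.5], [0.5, 2.0]], 1.0)
```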

  20. Recent developments in optical detection methods for microchip separations

    NARCIS (Netherlands)

    Götz, S.; Karst, U.

    2007-01-01

    This paper summarizes the features and performances of optical detection systems currently applied in order to monitor separations on microchip devices. Fluorescence detection, which delivers very high sensitivity and selectivity, is still the most widely applied method of detection. Instruments

  1. Numerical simulation for cracks detection using the finite elements method

    Directory of Open Access Journals (Sweden)

    S Bennoud

    2016-09-01

    Full Text Available The means of detection must ensure control either during initial construction or during service of all parts. Non-destructive testing (NDT) gathers the most widespread methods for detecting defects in a part or assessing the integrity of a structure. In advanced industries (aeronautics, aerospace, nuclear …), assessing material damage is a key point in controlling the durability and reliability of parts and materials in service. In this context, it is necessary to quantify the damage and identify the different mechanisms responsible for its progress. It is therefore essential to characterize materials and identify the most sensitive indicators of damage in order to prevent their destruction and use them optimally. In this work, a simulation by the finite element method is carried out with the aim of calculating the electromagnetic interaction energy between the probe and the part (with and without a defect). From the calculated energy, we deduce the real and imaginary components of the impedance, which make it possible to determine the characteristic parameters of a crack in various metallic parts.

  2. Reliability of magnetic resonance imaging for the detection of hypopituitarism in children with optic nerve hypoplasia.

    Science.gov (United States)

    Ramakrishnaiah, Raghu H; Shelton, Julie B; Glasier, Charles M; Phillips, Paul H

    2014-01-01

    It is essential to identify hypopituitarism in children with optic nerve hypoplasia (ONH) because they are at risk for developmental delay, seizures, or death. The purpose of this study is to determine the reliability of neurohypophyseal abnormalities on magnetic resonance imaging (MRI) for the detection of hypopituitarism in children with ONH. Cross-sectional study. One hundred one children with clinical ONH who underwent MRI of the brain and orbits and a detailed pediatric endocrinologic evaluation. Magnetic resonance imaging studies were performed on 1.5-Tesla scanners. The imaging protocol included sagittal T1-weighted images, axial fast fluid-attenuated inversion-recovery/T2-weighted images, and diffusion-weighted images of the brain. Orbital imaging included fat-saturated axial and coronal images and high-resolution axial T2-weighted images. The MRI studies were reviewed by 2 pediatric neuroradiologists for optic nerve hypoplasia, absent or ectopic posterior pituitary, absent pituitary infundibulum, absent septum pellucidum, migration anomalies, and hemispheric injury. Medical records were reviewed for clinical examination findings and endocrinologic status. All patients underwent a clinical evaluation by a pediatric endocrinologist and a standardized panel of serologic testing that included serum insulin-like growth factor-1, insulin-like growth factor binding protein-3, prolactin, cortisol, adrenocorticotropic hormone, thyroid-stimulating hormone, and free thyroxine levels. Radiologists were masked to patients' endocrinologic status and funduscopic findings. Sensitivity and specificity of MRI findings for the detection of hypopituitarism. Neurohypophyseal abnormalities, including absent pituitary infundibulum, ectopic posterior pituitary bright spot, and absent posterior pituitary bright spot, occurred in 33 children. Magnetic resonance imaging disclosed neurohypophyseal abnormalities in 27 of the 28 children with hypopituitarism (sensitivity, 96%). A

  3. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report gives an overview of types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. Many HRA methods have been developed, particularly with regard to safety assessments for nuclear power plants. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the utilization of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of Expert System Technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.) [de

  4. A reliable method for the counting and control of single ions for single-dopant controlled devices

    International Nuclear Information System (INIS)

    Shinada, T; Kurosawa, T; Nakayama, H; Zhu, Y; Hori, M; Ohdomari, I

    2008-01-01

    By 2016, transistor device size will be just 10 nm. However, a transistor that is doped at a typical concentration of 10^18 atoms cm^-3 has only one dopant atom in the active channel region. Therefore, it can be predicted that conventional doping methods such as ion implantation and thermal diffusion will not be available ten years from now. We have been developing a single-ion implantation (SII) method that enables us to implant dopant ions one-by-one into semiconductors until the desired number is reached. Here we report a simple but reliable method to control the number of single-dopant atoms by detecting the change in drain current induced by single-ion implantation. The drain current decreases in a stepwise fashion as a result of the clusters of displaced Si atoms created by every single-ion incidence. This result indicates that the single-ion detection method we have developed is capable of detecting single-ion incidence with 100% efficiency. Our method potentially could pave the way to future single-atom devices, including a solid-state quantum computer

  5. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: a low-penetration simulation and a high-penetration simulation. The load-shedding strategy and the simulation process are described in detail for each FMEA step. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
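
    The core of such an assessment is chronological (sequential) Monte Carlo simulation of repairable components. A toy sketch for a single component with exponential failure and repair times, checked against the analytical availability mu/(lambda + mu); the rates are illustrative and the paper's FMEA and load-shedding logic are not reproduced:

```python
import random

def simulate_availability(lambda_f, mu_r, years=2000, seed=1):
    """Chronological Monte Carlo availability of one repairable component.

    lambda_f -- failure rate (per year), mu_r -- repair rate (per year),
    both exponentially distributed.
    """
    rng = random.Random(seed)
    t, down = 0.0, 0.0
    horizon = float(years)
    while t < horizon:
        t += rng.expovariate(lambda_f)     # advance to the next failure
        if t >= horizon:
            break
        ttr = rng.expovariate(mu_r)        # repair duration
        down += min(ttr, horizon - t)      # clip downtime at the horizon
        t += ttr
    return 1.0 - down / horizon

# Analytical availability is mu_r / (lambda_f + mu_r) = 99/100 = 0.99
a = simulate_availability(lambda_f=1.0, mu_r=99.0)
```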

  6. Assessment of Electronic Circuits Reliability Using Boolean Truth Table Modeling Method

    International Nuclear Information System (INIS)

    El-Shanshoury, A.I.

    2011-01-01

    This paper explores the use of the Boolean Truth Table Modeling Method (BTTM) in the analysis of qualitative data. The method is widely used in certain fields, especially electrical and electronic engineering. Our work focuses on evaluating power supply circuit reliability using the BTTM, which involves systematic attempts to falsify and identify hypotheses on the basis of truth tables constructed from qualitative data. Reliability parameters, such as the system failure rate for the power supply case study, are estimated. All possible state combinations (operating and failed states) of the major components in the circuit were listed and their effects on the overall system were studied
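
    Enumerating all component state combinations and summing the probabilities of the system-success rows is the computational core of a truth-table method. A minimal sketch (the three-component circuit and its success logic are hypothetical, not the paper's power supply):

```python
from itertools import product

def truth_table_reliability(reliabilities, system_up):
    """Exact system reliability by enumerating every component state combination.

    reliabilities -- per-component success probabilities
    system_up     -- predicate mapping a state tuple (True = working) to system success
    """
    total = 0.0
    for states in product([True, False], repeat=len(reliabilities)):
        p = 1.0
        for ok, r in zip(states, reliabilities):
            p *= r if ok else (1.0 - r)
        if system_up(states):
            total += p
    return total

# Hypothetical circuit: two redundant regulators feeding one output stage
r = truth_table_reliability([0.9, 0.9, 0.95],
                            lambda s: (s[0] or s[1]) and s[2])
# (1 - 0.1*0.1) * 0.95 = 0.9405
```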

  7. Methods for estimating the reliability of the RBMK fuel assemblies and elements

    International Nuclear Information System (INIS)

    Klemin, A.I.; Sitkarev, A.G.

    1985-01-01

    Applied non-parametric methods are described for calculating point and interval estimates of the basic set of reliability factors for the RBMK fuel assemblies and elements. The reliability factors considered include the average lifetime at a preset operating time up to unloading due to fuel burnout, as well as the average lifetime during transient reactor operation and during the steady-state fuel-reloading mode of reactor operation. The formulae obtained are included in the special standardized engineering documentation

  8. Screening, sensitivity, and uncertainty for the CREAM method of Human Reliability Analysis

    International Nuclear Information System (INIS)

    Bedford, Tim; Bayley, Clare; Revie, Matthew

    2013-01-01

    This paper reports a sensitivity analysis of the Cognitive Reliability and Error Analysis Method (CREAM) for Human Reliability Analysis. We consider three different aspects: the difference between the outputs of the Basic and Extended methods on the same HRA scenario; the variability in outputs through the choices made for common performance conditions (CPCs); and the variability in outputs through the assignment of choices for cognitive function failures (CFFs). We discuss the problem of interpreting categories when applying the method, compare its quantitative structure to that of first-generation methods, and also discuss how dependence is modelled within the approach. We show that the control mode intervals used in the Basic method are too narrow to be consistent with the Extended method. This motivates a new screening method that gives improved accuracy with respect to the Basic method, in the sense that it (on average) halves the uncertainty associated with the Basic method. We make some observations on the design of a screening method that are generally applicable in Risk Analysis. Finally, we propose a new method of combining CPC weights with nominal probabilities so that the calculated probabilities are always in range (i.e. between 0 and 1), while satisfying sensible properties that are consistent with the overall CREAM method

  9. How often should we monitor for reliable detection of atrial fibrillation recurrence? Efficiency considerations and implications for study design.

    Directory of Open Access Journals (Sweden)

    Efstratios I Charitos

    Full Text Available Although atrial fibrillation (AF) recurrence is unpredictable in terms of onset and duration, current intermittent rhythm monitoring (IRM) diagnostic modalities are short-term and discontinuous. The aim of the present study was to investigate the IRM frequency required to reliably detect various AF recurrence patterns. The rhythm histories of 647 patients (mean AF burden: 12 ± 22% of monitored time; 687 patient-years) with implantable continuous monitoring devices were reconstructed and analyzed. With the use of computationally intensive simulation, we evaluated the IRM frequency necessary to reliably detect AF recurrence of various AF phenotypes using IRMs of various durations. The IRM frequency required for reliable AF detection depends on the amount and temporal aggregation of the AF recurrence (p<0.0001). Detection of AF recurrence with >95% sensitivity required higher IRM frequencies (>12 24-hour, >6 7-day, >4 14-day, or >3 30-day IRMs per year; p<0.0001) than currently recommended. Lower IRM frequencies will under-detect AF recurrence and introduce significant bias in the evaluation of therapeutic interventions. More frequent but shorter IRMs (24-hour) are significantly more time-effective (sensitivity per monitored time) than fewer, longer IRMs (p<0.0001). Reliable AF recurrence detection requires higher IRM frequencies than currently recommended. Current IRM frequency recommendations will fail to diagnose a significant proportion of patients. Shorter-duration but more frequent IRM strategies are significantly more efficient than longer IRM durations. Unique identifier: NCT00806689.
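
    The study's simulation idea can be caricatured in a few lines: scatter AF episodes over a year per patient, then ask whether a given set of monitoring windows overlaps any episode. Every parameter below (episode counts, durations, window schedule) is a made-up illustration, not the study's reconstructed rhythm data:

```python
import random

def overlaps(window, episode):
    ws, wd = window          # window start and duration (days)
    es, ee = episode         # episode start and end (days)
    return ws < ee and es < ws + wd

def detected(windows, episodes):
    return any(overlaps(w, e) for w in windows for e in episodes)

def sensitivity(windows, n_patients=2000, seed=7):
    """Fraction of patients with recurrent AF caught by a monitoring scheme."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_patients):
        # Each simulated patient: a few short episodes over a 360-day year
        episodes = []
        for _ in range(rng.randint(1, 6)):
            start = rng.uniform(0, 359)
            episodes.append((start, start + rng.uniform(0.1, 3.0)))
        if detected(windows, episodes):
            hits += 1
    return hits / n_patients

monthly_24h = [(30.0 * k, 1.0) for k in range(12)]  # twelve 24-hour recordings
single_24h = [monthly_24h[0]]                        # one 24-hour recording
s_many, s_one = sensitivity(monthly_24h), sensitivity(single_24h)
```

Because the single window is one of the twelve, the frequent scheme can never detect fewer patients, mirroring the study's monotone dependence of sensitivity on IRM frequency.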

  10. A method of bias correction for maximal reliability with dichotomous measures.

    Science.gov (United States)

    Penev, Spiridon; Raykov, Tenko

    2010-02-01

    This paper is concerned with the reliability of weighted combinations of a given set of dichotomous measures. Maximal reliability for such measures has been discussed in the past, but the pertinent estimator exhibits a considerable bias and mean squared error for moderate sample sizes. We examine this bias, propose a procedure for bias correction, and develop a more accurate asymptotic confidence interval for the resulting estimator. In most empirically relevant cases, the bias correction and mean squared error correction can be performed simultaneously. We propose an approximate (asymptotic) confidence interval for the maximal reliability coefficient, discuss the implementation of this estimator, and investigate the mean squared error of the associated asymptotic approximation. We illustrate the proposed methods using a numerical example.

  11. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    Science.gov (United States)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

    Standalone photovoltaic (SAPV) systems are seen as a promising means of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple, reliable and exhibits good performance over its lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, to size the array and the energy storage. After arriving at the different array sizes (areas), performance curves are obtained for the optimal design of a SAPV system with a high degree of reliability in terms of autonomy at a specified value of the loss of load probability (LOLP). Based on the array-to-load ratio (ALR) and the levelized energy cost (LEC) obtained through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.

  12. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    It has been observed and shown that in some examples a sensitivity analysis of the first-order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when a stochastic variable is modelled by an asymmetrical density function. For lognormal, Gumbel and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In practical applications the behaviour is probably rather infrequent. A simple example is given as illustration, and to show that for second-order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.

  13. A fast approximation method for reliability analysis of cold-standby systems

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Amari, Suprasad V.

    2012-01-01

    Analyzing reliability of large cold-standby systems has been a complicated and time-consuming task, especially for systems with components having non-exponential time-to-failure distributions. In this paper, an approximation model, which is based on the central limit theorem, is presented for the reliability analysis of binary cold-standby systems. The proposed model can estimate the reliability of large cold-standby systems with binary-state components having arbitrary time-to-failure distributions in an efficient and easy way. The accuracy and efficiency of the proposed method are illustrated using several different types of distributions for both 1-out-of-n and k-out-of-n cold-standby systems.
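
    For a 1-out-of-n cold-standby system with perfect switching, the system lifetime is the sum of the n component lifetimes, so the central limit theorem gives R(t) ≈ 1 - Φ((t - nμ)/(σ√n)). A sketch checked against the exact Erlang result for exponential components (the specific t and n are illustrative, not from the paper):

```python
import math

def clt_standby_reliability(t, n, mean, sd):
    """CLT approximation of reliability for a 1-out-of-n cold-standby system.

    System lifetime = sum of n i.i.d. component lifetimes (perfect switching).
    """
    z = (t - n * mean) / (sd * math.sqrt(n))
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return 1.0 - phi

def erlang_reliability(t, n):
    """Exact survival of a sum of n exponential(1) lifetimes (Erlang)."""
    return math.exp(-t) * sum(t ** k / math.factorial(k) for k in range(n))

approx = clt_standby_reliability(40.0, 50, 1.0, 1.0)
exact = erlang_reliability(40.0, 50)
```

The appeal of the CLT form is that it needs only the component mean and standard deviation, so it applies to arbitrary lifetime distributions where no closed-form convolution exists.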

  14. A dynamic discretization method for reliability inference in Dynamic Bayesian Networks

    International Nuclear Information System (INIS)

    Zhu, Jiandao; Collette, Matthew

    2015-01-01

    The material and modeling parameters that drive structural reliability analysis for marine structures are subject to a significant uncertainty. This is especially true when time-dependent degradation mechanisms such as structural fatigue cracking are considered. Through inspection and monitoring, information such as crack location and size can be obtained to improve these parameters and the corresponding reliability estimates. Dynamic Bayesian Networks (DBNs) are a powerful and flexible tool to model dynamic system behavior and update reliability and uncertainty analysis with life cycle data for problems such as fatigue cracking. However, a central challenge in using DBNs is the need to discretize certain types of continuous random variables to perform network inference while still accurately tracking low-probability failure events. Most existing discretization methods focus on getting the overall shape of the distribution correct, with less emphasis on the tail region. Therefore, a novel scheme is presented specifically to estimate the likelihood of low-probability failure events. The scheme is an iterative algorithm which dynamically partitions the discretization intervals at each iteration. Through applications to two stochastic crack-growth example problems, the algorithm is shown to be robust and accurate. Comparisons are presented between the proposed approach and existing methods for the discretization problem. - Highlights: • A dynamic discretization method is developed for low-probability events in DBNs. • The method is compared to existing approaches on two crack growth problems. • The method is shown to improve on existing methods for low-probability events

  15. A Reliable Method to Measure Lip Height Using Photogrammetry in Unilateral Cleft Lip Patients.

    Science.gov (United States)

    van der Zeeuw, Frederique; Murabit, Amera; Volcano, Johnny; Torensma, Bart; Patel, Brijesh; Hay, Norman; Thorburn, Guy; Morris, Paul; Sommerlad, Brian; Gnarra, Maria; van der Horst, Chantal; Kangesu, Loshan

    2015-09-01

    There is still no reliable tool to determine the outcome of the repaired unilateral cleft lip (UCL). The aim of this study was therefore to develop an accurate, reliable tool to measure vertical lip height from photographs. The authors measured the vertical height of the cutaneous and vermilion parts of the lip in 72 anterior-posterior view photographs of 17 patients with repairs to a UCL. Points on the lip's white roll and vermilion were marked on both the cleft and the noncleft sides on each image. Two new concepts were tested. First, photographs were standardized using the horizontal (medial to lateral) eye fissure width (EFW) for calibration. Second, the authors tested the interpupillary line (IPL) and the alar base line (ABL) for their reliability as horizontal lines of reference. Measurements were taken by 2 independent researchers, at 2 different time points each. Overall, 2304 data points were obtained and analyzed. Results showed that the method was very effective in comparing the height of the lip on the cleft side with that on the noncleft side. When using the IPL, inter- and intra-rater reliability was 0.99 to 1.0; with the ABL it varied from 0.91 to 0.99, with one exception at 0.84. The IPL was easier to define because in some subjects the overhanging nasal tip obscured the alar base, and it gave more consistent measurements, possibly because the reconstructed alar base was sometimes indistinct. However, measurements from the IPL can only give the percentage difference between the left and right sides of the lip, whereas those from the ABL can also give exact measurements. Patient examples were given that show how the measurements correlate with clinical assessment. The authors propose this method of photogrammetry, with the innovative use of the IPL as a reliable horizontal plane and the use of the EFW for calibration, as a useful and reliable tool to assess the outcome of UCL repair.

  16. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information for assessing the impacts of climate change at regional and global scales. However, statistical downscaling methods must be applied to prepare climate model data for applications such as hydrologic and ecologic modelling at the watershed scale. As the reliability and the spatial and temporal resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference dataset is crucial for statistical downscaling. A growing number of gridded climate products are available for the key climate variables that are the main inputs to regional modelling systems. However, inconsistencies among these climate products, for example different combinations of climate variables, varying data domains and lengths, and accuracy that varies with the physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling efforts. Employing various observation-based daily gridded climate products available in the public domain, i.e. thin-plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis) and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of these climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products according to the elevations of the stations, discretized into several classes. Using the rank of the climate products for each elevation class, we identified the most reliable climate products based on the elevation of the target points. A web-based system

  17. Reliability and validity of the KIPPPI: an early detection tool for psychosocial problems in toddlers.

    Directory of Open Access Journals (Sweden)

    Ingrid Kruizinga

    Full Text Available BACKGROUND: The KIPPPI (Brief Instrument Psychological and Pedagogical Problem Inventory) is a Dutch questionnaire that measures psychosocial and pedagogical problems in 2-year-olds and consists of a KIPPPI Total score, Wellbeing scale, Competence scale, and Autonomy scale. This study examined the reliability, validity, screening accuracy and clinical application of the KIPPPI. METHODS: Parents of 5959 2-year-old children in the Rotterdam area, the Netherlands, were invited to participate in the study. Parents of 3164 children (53.1% of all invited parents) completed the questionnaire. The internal consistency was evaluated and, in subsamples, the test-retest reliability and concurrent validity with regard to the Child Behavior Checklist (CBCL). Discriminative validity was evaluated by comparing the scores of parents who worried about their child's upbringing with those of parents who did not. Screening accuracy of the KIPPPI was evaluated against the CBCL by calculating Receiver Operating Characteristic (ROC) curves. The clinical application was evaluated through the relation between KIPPPI scores and the clinical decisions made by child health professionals. RESULTS: Psychometric properties of the KIPPPI Total score, Wellbeing scale, Competence scale and Autonomy scale were, respectively: Cronbach's alphas: 0.88, 0.86, 0.83, 0.58; test-retest correlations: 0.80, 0.76, 0.73, 0.60. Concurrent validity was as hypothesised. The KIPPPI was able to discriminate between parents who worried about their child and parents who did not. Screening accuracy was high (>0.90) for the KIPPPI Total score and for the Wellbeing scale. The KIPPPI scale scores and the clinical decisions of the child health professionals were related (p<0.05), indicating good clinical applicability. 
CONCLUSION: The results in this large-scale study of a diverse general population sample support the reliability, validity and clinical application of the KIPPPI Total score, Wellbeing scale and Competence

  18. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen very rapid progress in developing numerical methods based on explicit control over approximation errors. It may be said that a new direction is now forming in numerical analysis, the main goal of which is to develop methods of reliable computation. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: a solver and a checker. In this book, we are chie

  19. Reliability of Using Retinal Vascular Fractal Dimension as a Biomarker in the Diabetic Retinopathy Detection.

    Science.gov (United States)

    Huang, Fan; Dashtbozorg, Behdad; Zhang, Jiong; Bekkers, Erik; Abbasi-Sureshjani, Samaneh; Berendschot, Tos T J M; Ter Haar Romeny, Bart M

    2016-01-01

    The retinal fractal dimension (FD) is a measure of vasculature branching pattern complexity. FD has been considered as a potential biomarker for the detection of several diseases like diabetes and hypertension. However, conflicting findings were found in the reported literature regarding the association between this biomarker and diseases. In this paper, we examine the stability of the FD measurement with respect to (1) different vessel annotations obtained from human observers, (2) automatic segmentation methods, (3) various regions of interest, (4) accuracy of vessel segmentation methods, and (5) different imaging modalities. Our results demonstrate that the relative errors for the measurement of FD are significant and FD varies considerably according to the image quality, modality, and the technique used for measuring it. Automated and semiautomated methods for the measurement of FD are not stable enough, which makes FD a deceptive biomarker in quantitative clinical applications.
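
    The fractal dimension discussed above is typically estimated by box counting: cover the vessel pixels with boxes of decreasing size and regress log(box count) on log(1/size). A minimal sketch on a synthetic point set (a straight line, whose dimension should come out near 1; real fundus-image pipelines involve segmentation steps not shown here):

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the fractal dimension of a 2-D point set via box counting.

    points -- iterable of (x, y) in the unit square
    sizes  -- box edge lengths; returns the slope of log(count) vs log(1/size)
    """
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(x / s), int(y / s)) for x, y in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den   # least-squares slope

# A straight line segment has fractal dimension 1
line = [(i / 10000.0, i / 10000.0) for i in range(10000)]
fd = box_counting_dimension(line, [0.1, 0.05, 0.025, 0.0125])
```

The paper's point is that `fd` is sensitive to exactly the inputs this sketch holds fixed: the segmentation producing `points`, the region of interest, and the range of box `sizes`.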

  20. Image Processing Methods Usable for Object Detection on the Chessboard

    Directory of Open Access Journals (Sweden)

    Beran Ladislav

    2016-01-01

    Full Text Available Image segmentation and object detection is a challenging problem in many research areas. Although many algorithms for image segmentation have been invented, there is no single simple algorithm for image segmentation and object detection. Our research is based on a combination of several methods for object detection. The first method suitable for image segmentation and object detection is colour detection. This method is very simple, but it struggles with varying colours: the colour of the segmented object must be determined precisely before any calculations, and in many cases it must be determined manually. An alternative simple method is based on background removal, i.e. on the difference between a reference image and the detected image. In this paper several methods suitable for object detection are described. The research is focused on coloured object detection on a chessboard. The results of this research, combined with neural networks, will be applied to a user-computer game of checkers.
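
    The background-removal method amounts to differencing the current frame against a reference image and thresholding. A toy sketch on an 8x8 grayscale "board" (pure-Python lists stand in for real image arrays; the threshold and piece values are illustrative):

```python
def background_subtract(reference, frame, threshold=30):
    """Return the bounding box of pixels that differ from the reference image.

    Images are 2-D lists of grayscale values; returns (row0, col0, row1, col1),
    or None when no pixel difference exceeds the threshold.
    """
    rows = [r for r in range(len(frame))
            if any(abs(frame[r][c] - reference[r][c]) > threshold
                   for c in range(len(frame[0])))]
    cols = [c for c in range(len(frame[0]))
            if any(abs(frame[r][c] - reference[r][c]) > threshold
                   for r in range(len(frame)))]
    if not rows:
        return None
    return rows[0], cols[0], rows[-1], cols[-1]

# Empty board vs a board with one dark piece covering rows 2-3, cols 4-5
ref = [[200] * 8 for _ in range(8)]
cur = [row[:] for row in ref]
for r in (2, 3):
    for c in (4, 5):
        cur[r][c] = 40
box = background_subtract(ref, cur)
```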

  1. A rapid reliability estimation method for directed acyclic lifeline networks with statistically dependent components

    International Nuclear Information System (INIS)

    Kang, Won-Hee; Kliese, Alyce

    2014-01-01

    Lifeline networks, such as transportation, water supply, sewers, telecommunications, and electrical and gas networks, are essential elements for the economic and societal functions of urban areas, but their components are highly susceptible to natural or man-made hazards. In this context, it is essential to provide effective pre-disaster hazard mitigation strategies and prompt post-disaster risk management efforts based on rapid system reliability assessment. This paper proposes a rapid reliability estimation method for node-pair connectivity analysis of lifeline networks especially when the network components are statistically correlated. Recursive procedures are proposed to compound all network nodes until they become a single super node representing the connectivity between the origin and destination nodes. The proposed method is applied to numerical network examples and benchmark interconnected power and water networks in Memphis, Shelby County. The connectivity analysis results show the proposed method's reasonable accuracy and remarkable efficiency as compared to the Monte Carlo simulations

  2. A Sequential Kriging reliability analysis method with characteristics of adaptive sampling regions and parallelizability

    International Nuclear Information System (INIS)

    Wen, Zhixun; Pei, Haiqing; Liu, Hai; Yue, Zhufeng

    2016-01-01

    The sequential Kriging reliability analysis (SKRA) method has been developed in recent years for nonlinear implicit response functions that are expensive to evaluate. This class of methods includes EGRA, the efficient global reliability analysis method, and AK-MCS, the active learning reliability method combining a Kriging model with Monte Carlo simulation. The purpose of this paper is to improve SKRA through adaptive sampling regions and parallelizability. The adaptive-sampling-regions strategy avoids selecting samples in regions where the probability density is so low that their accuracy has a negligible effect on the results. The size of the sampling regions is adapted according to the failure probability calculated in the last iteration. Two parallel strategies, aimed at selecting multiple sample points at a time, are introduced and compared. The improvement is verified through several challenging examples. - Highlights: • The ISKRA method improves the efficiency of SKRA. • The adaptive sampling regions strategy reduces the number of needed samples. • The two parallel strategies reduce the number of needed iterations. • The accuracy of the optimal value impacts the number of samples significantly.

  3. Reliability of an experimental method to analyse the impact point on a golf ball during putting.

    Science.gov (United States)

    Richardson, Ashley K; Mitchell, Andrew C S; Hughes, Gerwyn

    2015-06-01

    This study aimed to examine the reliability of an experimental method identifying the location of the impact point on a golf ball during putting. Forty trials were completed using a mechanical putting robot set to reproduce a putt of 3.2 m, with four different putter-ball combinations. After locating the centre of the dimple pattern (centroid), the following variables were tested: distance of the impact point from the centroid, angle of the impact point from the centroid, and distance of the impact point from the centroid derived from the X, Y coordinates. Good to excellent reliability was demonstrated in all impact variables, reflected in very strong relative (ICC = 0.98-1.00) and absolute (SEM% = 0.9-4.3%) reliability. The highest SEM% observed was 7% for the angle of the impact point from the centroid. In conclusion, the experimental method was shown to be reliable at locating the centroid of a golf ball, thereby allowing identification of the point of impact with the putter head, and is suitable for use in subsequent studies.
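
    The distance and angle of the impact point from the centroid, derived from X, Y coordinates, reduce to elementary trigonometry. A minimal sketch (the coordinate values and units are illustrative, not data from the study):

```python
import math

def impact_metrics(centroid, impact):
    """Distance and angle (degrees, counterclockwise from the +x axis)
    of the impact point relative to the dimple-pattern centroid."""
    dx = impact[0] - centroid[0]
    dy = impact[1] - centroid[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

def sem_percent(sem, mean):
    """Standard error of measurement as a percentage of the mean."""
    return 100.0 * sem / mean

# Illustrative coordinates (mm): impact 3 right and 4 up from the centroid
distance, angle = impact_metrics((0.0, 0.0), (3.0, 4.0))
pct = sem_percent(0.2, distance)
```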

  4. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. Unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
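
    The shortcomings of listwise and pairwise deletion that motivate the EM approach can be shown in a few lines: listwise deletion discards whole cases, while pairwise deletion uses a different case count for each correlation. A small sketch with hypothetical data (this mimics the idea only; it is not the SPSS macros):

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def listwise_complete(rows):
    """Listwise deletion: keep only cases with no missing values."""
    return [r for r in rows if all(v is not None for v in r)]

def pairwise_corr(rows, i, j):
    """Pairwise deletion: use every case where variables i and j are present."""
    pairs = [(r[i], r[j]) for r in rows if r[i] is not None and r[j] is not None]
    xs, ys = zip(*pairs)
    return pearson(list(xs), list(ys))

# Hypothetical data: 5 cases, 3 variables, None marks a missing value
rows = [
    (1, 2, 3),
    (2, None, 4),
    (3, 4, None),
    (4, 5, 6),
    (5, 6, 7),
]
kept = len(listwise_complete(rows))   # listwise keeps only 3 of 5 cases
r01 = pairwise_corr(rows, 0, 1)       # pairwise uses 4 of 5 cases here
```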

  5. Development of detection methods for irradiated foods

    International Nuclear Information System (INIS)

    Yang, Jae Seung; Nam, Hye Seon; Oh, Kyong Nam; Woo, Si Ho; Kim, Kyeung Eun; Yi, Sang Duk; Park, Jun Young; Kim, Kyong Su; Hwang, Keum Taek

    2000-04-01

    In 1999, we studied (1) the detection of irradiated foods by ESR spectroscopy, thermoluminescence, and viscometry for physical measurements, (2) the detection of hydrocarbons and 2-alkylcyclobutanones derived from fatty foods by GC/MS for chemical measurements, and (3) the screening and detection of irradiated foods by Comet assay and immunochemical (ELISA) techniques for biological or biochemical measurements.

  7. Towards achieving a reliable leakage detection and localization algorithm for application in water piping networks: an overview

    CSIR Research Space (South Africa)

    Adedeji, KB

    2017-09-01

    Full Text Available Leakage detection and localization in pipelines has become an important aspect of water management systems. Since monitoring leakage in large-scale water distribution networks (WDNs) is a challenging task, the need to develop a reliable and robust...

  8. Reliability of fitness tests using methods and time periods common in sport and occupational management.

    Science.gov (United States)

    Burnstein, Bryan D; Steele, Russell J; Shrier, Ian

    2011-01-01

    Fitness testing is used frequently in many areas of physical activity, but the reliability of these measurements under real-world, practical conditions is unknown. To evaluate the reliability of specific fitness tests using the methods and time periods used in the context of real-world sport and occupational management. Cohort study. Eighteen different Cirque du Soleil shows. Cirque du Soleil physical performers who completed 4 consecutive tests (6-month intervals) and were free of injury or illness at each session (n = 238 of 701 physical performers). Performers completed 6 fitness tests on each assessment date: dynamic balance, Harvard step test, handgrip, vertical jump, pull-ups, and 60-second jump test. We calculated the intraclass correlation coefficient (ICC) and limits of agreement between baseline and each time point and the ICC over all 4 time points combined. Reliability was acceptable (ICC > 0.6) over an 18-month time period for all pairwise comparisons and all time points together for the handgrip, vertical jump, and pull-up assessments. The Harvard step test and 60-second jump test had poor reliability (ICC < 0.6) between baseline and other time points. When we excluded the baseline data and calculated the ICC for the 6-month, 12-month, and 18-month time points, both the Harvard step test and 60-second jump test demonstrated acceptable reliability. Dynamic balance was unreliable in all contexts. Limits-of-agreement analysis demonstrated considerable intraindividual variability for some tests and a learning effect by administrators on others. Five of the 6 tests in this battery had acceptable reliability over an 18-month time frame, but the values for certain individuals may vary considerably from time to time for some tests. Specific tests may require a learning period for administrators.
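
    The ICC and limits-of-agreement computations used in this kind of study are standard. A minimal sketch with hypothetical test-retest data; it implements the one-way random-effects single-measure ICC, which is one of several ICC forms (the abstract does not state which form the authors used):

```python
import math

def icc_oneway(data):
    """ICC(1,1): one-way random-effects, single-measure intraclass
    correlation for n subjects each measured k times."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    means = [sum(row) / k for row in data]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(data, means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def limits_of_agreement(first, second):
    """Bland-Altman 95% limits of agreement between two sessions."""
    diffs = [b - a for a, b in zip(first, second)]
    mean = sum(diffs) / len(diffs)
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1))
    return mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical test-retest scores for five performers
scores = [(10, 12), (20, 19), (30, 31), (40, 39), (50, 52)]
icc = icc_oneway(scores)
lo, hi = limits_of_agreement([a for a, _ in scores], [b for _, b in scores])
```

    Here the ICC is near 1 because between-subject spread dwarfs within-subject noise, while the limits of agreement still expose the size of individual retest differences, the same distinction the abstract draws.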

  9. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    Directory of Open Access Journals (Sweden)

    Sukumar Biswas

    2016-01-01

    Full Text Available Reliable methods are needed to detect the presence of tobacco components in tobacco products, both to control smuggling and to classify tariff and excise duties in the tobacco industry. In this study, two sensitive and specific DNA-based methods, a quantitative real-time PCR (qPCR) assay and a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of uridine 5′-monophosphate synthase (UMPS), and their specificities and sensitivities were determined with various plant materials. Both the qPCR and LAMP methods were reliable and accurate in the rapid detection of tobacco components in practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for application in tobacco identification, particularly in special cases where the morphology or chemical composition of the tobacco has been disrupted. Therefore, combining both methods would facilitate not only the control of tobacco smuggling but also tariff and excise classification.

  10. Network reliability analysis of complex systems using a non-simulation-based method

    International Nuclear Information System (INIS)

    Kim, Youngsuk; Kang, Won-Hee

    2013-01-01

    Civil infrastructures such as transportation, water supply, sewers, telecommunications, and electrical and gas networks often establish highly complex networks, due to their multiple source and distribution nodes, complex topology, and functional interdependence between network components. To understand the reliability of such complex network systems under catastrophic events such as earthquakes, and to provide proper emergency management actions under such situations, efficient and accurate reliability analysis methods are necessary. In this paper, a non-simulation-based network reliability analysis method is developed based on the Recursive Decomposition Algorithm (RDA) for risk assessment of generic networks whose operation is defined by the connections of multiple initial and terminal node pairs. The proposed method has two separate decomposition processes for two logical functions, intersection and union, and combinations of these processes are used for the decomposition of any general system event with multiple node pairs. The proposed method is illustrated through numerical network examples with a variety of system definitions, and is applied to a benchmark gas transmission pipe network in Memphis, TN to estimate the seismic performance and functional degradation of the network under a set of earthquake scenarios.
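
    For contrast, the brute-force "non-simulation" baseline is exact enumeration of component states. A sketch on a hypothetical 4-node ring (the RDA itself is far more sophisticated; this only shows the quantity it computes and why raw 2^m enumeration does not scale):

```python
from itertools import product

def reaches(a, b, up_edges):
    """Undirected reachability over the surviving edges."""
    adj = {}
    for u, v in up_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    stack, seen = [a], {a}
    while stack:
        n = stack.pop()
        if n == b:
            return True
        for m in adj.get(n, []):
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return False

def exact_connectivity(edges, p_up, origin, dest):
    """Exact origin-destination reliability by enumerating all 2^m edge
    states; tractable only for tiny m, which is exactly why decomposition
    schemes such as RDA are needed for real networks."""
    total = 0.0
    for states in product((0, 1), repeat=len(edges)):
        prob = 1.0
        up = []
        for e, s in zip(edges, states):
            prob *= p_up if s else 1.0 - p_up
            if s:
                up.append(e)
        if reaches(origin, dest, up):
            total += prob
    return total

# Hypothetical 4-node ring, each edge available with probability 0.9
r = exact_connectivity([(0, 1), (1, 2), (2, 3), (3, 0)], 0.9, 0, 2)
```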

  11. Coupling finite elements and reliability methods - application to safety evaluation of pressurized water reactor vessels

    International Nuclear Information System (INIS)

    Pitner, P.; Venturini, V.

    1995-02-01

    When reliability studies are extended from deterministic calculations in mechanics, it is necessary to take into account the variability of input parameters, which is linked to the different sources of uncertainty. Integrals must then be calculated to evaluate the failure risk. This can be performed either by simulation methods or by approximation methods (FORM/SORM). Models in mechanics often require running calculation codes, which must then be coupled with the reliability calculations. These codes can involve long calculation times when they are invoked numerous times during simulation sequences or in complex iterative procedures. The response surface method gives an approximation of the real response from a reduced number of points for which the finite element code is run. Thus, when it is combined with FORM/SORM methods, a coupling can be carried out which gives results in a reasonable calculation time. An application of the response surface method to mechanics-reliability coupling for a mechanical model which calls a finite element code is presented. It corresponds to a probabilistic fracture mechanics study of a pressurized water reactor vessel. (authors). 5 refs., 3 figs
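
    The FORM approximation mentioned above is exact for a linear limit state with independent normal variables, which makes a compact worked example. The resistance/load numbers below are illustrative assumptions, not values from the study:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(a0, coeffs, means, sds):
    """FORM for a linear limit state g(X) = a0 + sum(ai * Xi) with
    independent normal Xi: the reliability index beta is exact here,
    and the failure probability is Phi(-beta)."""
    mean_g = a0 + sum(a * m for a, m in zip(coeffs, means))
    sd_g = math.sqrt(sum((a * s) ** 2 for a, s in zip(coeffs, sds)))
    beta = mean_g / sd_g
    return beta, norm_cdf(-beta)

# Illustrative margin g = R - S: resistance R ~ N(200, 20), load S ~ N(150, 10)
beta, pf = form_linear(0.0, [1.0, -1.0], [200.0, 150.0], [20.0, 10.0])
```

    For a nonlinear g, the same Phi(-beta) evaluation is applied after an iterative search for the design point, and a response surface stands in for the expensive finite element code.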

  12. A fracture mechanics and reliability based method to assess non-destructive testings for pressure vessels

    International Nuclear Information System (INIS)

    Kitagawa, Hideo; Hisada, Toshiaki

    1979-01-01

    Despite the large expense and labor involved, no quantitative evaluation has been made of the effects of carrying out preservice and in-service nondestructive tests for securing the soundness, safety and maintainability of pressure vessels. In particular, the problems concerning the timing and interval of in-service inspections lack a reasonable, quantitative evaluation method. In this paper, these problems are treated with an analysis method developed on the basis of reliability technology and probability theory. The growth of surface cracks in pressure vessels was estimated using the results of previous studies. The effects of nondestructive inspection on defects in pressure vessels were evaluated, and the influence of factors such as plate thickness, stress and inspection accuracy on the effectiveness of inspection was investigated, together with a method for evaluating inspections performed at unequal intervals. The reliability analysis taking in-service inspection into consideration, the evaluation of in-service inspection and other affecting factors through typical analysis examples, and a review concerning the timing of inspection are described. The method of analyzing the reliability of pressure vessels, considering the growth of defects and preservice and in-service nondestructive tests, was systematized so as to be practically usable. (Kako, I.)
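
    The trade-off this kind of analysis quantifies, crack growth versus in-service inspections of imperfect accuracy, can be sketched with a toy Monte Carlo model. The growth law, POD curve, and every parameter below are illustrative assumptions, not the paper's model:

```python
import math
import random

def prob_detect_before_critical(a0, growth, a_crit, interval, pod_ref,
                                trials=20000, seed=7):
    """Toy Monte Carlo: a crack grows as a(t) = a0 * exp(growth * t); at
    every in-service inspection (each `interval`) it is found with
    POD(a) = 1 - exp(-a / pod_ref). Returns the fraction of histories
    in which the crack is detected before reaching a_crit."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(trials):
        t = interval
        while True:
            a = a0 * math.exp(growth * t)
            if a >= a_crit:
                break                     # reached critical size undetected
            if rng.random() < 1.0 - math.exp(-a / pod_ref):
                detected += 1
                break
            t += interval
    return detected / trials

# Illustrative units: mm and years; inspect every 4 years
p = prob_detect_before_critical(a0=1.0, growth=0.2, a_crit=10.0,
                                interval=4.0, pod_ref=2.0)
```

    Shortening the interval raises the detection probability, which is exactly the inspection-timing question the paper formalizes.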

  13. Acoustic feedwater heater leak detection: Industry application of low & high frequency detection increases response and reliability

    International Nuclear Information System (INIS)

    Woyshner, W.S.; Bryson, T.; Robertson, M.O.

    1993-01-01

    The Electric Power Research Institute has sponsored research associated with acoustic Feedwater Heater Leak Detection since the early 1980s. Results indicate that this technology is economically beneficial and dependable. Recent research work has employed acoustic sensors and signal conditioning with wider frequency range response and background noise elimination techniques to provide increased accuracy and dependability. Dual frequency sensors have been applied at a few facilities to provide information on this application of dual frequency response. Sensor mounting methods and attenuation due to various mounting configurations are more conclusively understood. These are depicted and discussed in detail. The significance of trending certain plant parameters such as heat cycle flows, heater vent and drain valve position, proper relief valve operation, etc. is also addressed. Test data were collected at various facilities to monitor the effect of varying several related operational parameters. A group of FWHLD Users have been involved from the inception of the project and reports on their latest successes and failures, along with various data depicting early detection of FWHLD tube leaks, will be included. 3 refs., 12 figs., 1 tab

  14. European-American workshop: Determination of reliability and validation methods on NDE. Proceedings

    International Nuclear Information System (INIS)

    1997-01-01

    The invited papers focused on the following issues: 1. The different technical and scientific approaches to the problem of how to guarantee or demonstrate the reliability of NDE: a. Application of established prescriptive standards, b. Probabilities of Detection (POD) and False Alarm (PFA) from blind trials, c. POD and PFA from signal statistics, d. Modeling, e. ''Technical Justification''; 2. The dissimilar validation/qualification concepts used in different industries in Europe and North America: a. Nuclear Power Generation, b. Aerospace Industry, c. Offshore Industry, and d. Service Companies
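
    The POD-from-blind-trials approach (item 1b) leads to the well-known "29 of 29" arithmetic for demonstrating 90% POD at 95% confidence. A sketch of that calculation; the closed form holds only when every flaw in the trial was found:

```python
import math

def pod_lower_bound(hits, trials, confidence=0.95):
    """Lower confidence bound on POD when every flaw was found: solve
    p**trials = 1 - confidence for p (closed form for the all-hits case)."""
    assert hits == trials
    return (1.0 - confidence) ** (1.0 / trials)

def flaws_needed(pod=0.90, confidence=0.95):
    """Smallest all-hit sample size demonstrating the target POD:
    n >= ln(1 - confidence) / ln(pod)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(pod))

n = flaws_needed()          # the classic "29 of 29" sample size
lb = pod_lower_bound(n, n)  # about 0.902: 90% POD at 95% confidence
```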

  15. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    International Nuclear Information System (INIS)

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-01-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or ''discount'' methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI

  16. Comparison of sample preparation methods for reliable plutonium and neptunium urinalysis using automatic extraction chromatography

    DEFF Research Database (Denmark)

    Qiao, Jixin; Xu, Yihong; Hou, Xiaolin

    2014-01-01

    This paper describes improvement and comparison of analytical methods for simultaneous determination of trace-level plutonium and neptunium in urine samples by inductively coupled plasma mass spectrometry (ICP-MS). Four sample pre-concentration techniques, including calcium phosphate, iron......), it endows urinalysis methods with better reliability and repeatability compared with co-precipitation techniques. In view of the applicability of different pre-concentration techniques proposed previously in the literature, the main challenge behind relevant method development is pointed to be the release...

  17. Reliability of different methods used for forming of working samples in the laboratory for seed testing

    Directory of Open Access Journals (Sweden)

    Opra Branislava

    2000-01-01

    Full Text Available The testing of seed quality starts from the moment a sample is formed in a warehouse during processing or packaging of the seed. Seed sampling as the process of obtaining the working sample also covers each step undertaken during its testing in the laboratory. For the appropriate forming of a seed sample in the laboratory, the use of a seed divider is prescribed for large-seeded species (seed the size of wheat or larger) (ISTA Rules, 1999). The aim of this paper was the comparison of different methods used for obtaining working samples of maize and wheat seeds using conical, soil and centrifugal dividers. The number of seeds of added admixtures confirmed the reliability of working-sample formation. To each maize sample (1000 g), 10 seeds of each of the following admixtures were added: Zea mays L. (red pericarp), Hordeum vulgare L., Triticum aestivum L., and Glycine max (L.) Merr. Two methods were used for the formation of the maize seed working sample. To wheat samples (1000 g), 10 seeds of each of the following species were added: Avena sativa (hulled seeds), Hordeum vulgare L., Galium tricorne Stokes, and Polygonum lapathifolium L. For the formation of wheat seed working samples, four methods were used. An optimum of 9, but not fewer than 7, seeds of admixture were due to be determined in the maize seed working sample, while for wheat at least one seed of admixture was expected to be found in the working sample. The obtained results confirmed that the formation of the maize seed working samples was the most reliable when the centrifugal divider, the first method, was used (average of admixture - 9.37). Of the observed admixtures, the seed of Triticum aestivum L. was the most uniformly distributed, the first method also being used (6.93). The second method gains high average values satisfying the given criterion, but it should be used with previous homogenization of the sample being tested. The forming of wheat seed working samples is the most reliable if the
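
    Whether a divider yields representative working samples can be framed as a hypergeometric sampling question: how many of the marked admixture seeds land in the portion that becomes the working sample. The seed counts below are illustrative assumptions only (the study specifies its samples by mass, not by seed count):

```python
import random

def mean_admixture_count(total_seeds, admixture, working_size,
                         trials=2000, seed=3):
    """If a divider drew `working_size` seeds at random from `total_seeds`
    containing `admixture` marked seeds, the recovered count would be
    hypergeometric with mean working_size * admixture / total_seeds.
    Monte Carlo estimate of that mean."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        drawn = rng.sample(range(total_seeds), working_size)
        total += sum(1 for i in drawn if i < admixture)  # marked seeds are 0..admixture-1
    return total / trials

# Illustrative counts: 10 admixture seeds among 4000, 90% working sample
mean_count = mean_admixture_count(4000, 10, 3600)
```

    An unbiased divider would recover about 9 of the 10 marked seeds on average here; systematic shortfalls below that expectation would point to non-random division.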

  18. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA), and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far to analyze diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules

  19. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of neck and laryngeal muscles in muscle tension dysphonia (MTD). To review the available laryngeal palpation methods used in patients with MTD for assessment, diagnosis, or documentation of treatment outcomes. A systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and the Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. There were five main as well as miscellaneous palpation methods that differed according to the target anatomical structures, the judgment or grading system, and the tasks used. There were only a few scales available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate tension in both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence of muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  20. DEPEND-HRA-A method for consideration of dependency in human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, Marko

    2008-01-01

    Consideration of dependencies between human actions is an important issue within human reliability analysis. A method was developed which integrates the features of existing methods and experience from a full-scope plant simulator. The method is used in a real plant-specific human reliability analysis as a part of the probabilistic safety assessment of a nuclear power plant. The method distinguishes dependency for pre-initiator events from dependency for initiator and post-initiator events. It identifies dependencies based on scenarios, where consecutive human actions are modeled, and based on a list of minimal cut sets, which is obtained by running the minimal cut set analysis with high values of human error probabilities in the evaluation. A large example study, which consisted of a large number of human failure events, demonstrated the applicability of the method. Comparative analyses show that both the selection of the dependency method and the selection of dependency levels within the method largely impact the results of the probabilistic safety assessment. Even if the core damage frequency is not impacted much, the listings of important basic events in terms of risk increase and risk decrease factors may change considerably. More effort is needed on the subject to prepare the background for more detailed guidelines, which would remove subjectivity from the evaluations as much as possible
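
    Dependency levels of the kind the abstract mentions are most commonly treated with the five THERP dependence formulas (zero to complete dependence), one of the existing treatments such integrated methods build on. A sketch of those standard formulas:

```python
def conditional_hep(p, level):
    """THERP conditional human error probability for an action, given
    failure of the preceding action, at the five classic dependence
    levels (zero, low, moderate, high, complete)."""
    formulas = {
        "ZD": p,
        "LD": (1.0 + 19.0 * p) / 20.0,
        "MD": (1.0 + 6.0 * p) / 7.0,
        "HD": (1.0 + p) / 2.0,
        "CD": 1.0,
    }
    return formulas[level]
```

    For a nominal HEP of 0.01, the conditional value rises from 0.01 (ZD) to about 0.06 (LD) and 0.5 (HD), which is why the choice of dependence level can reshape the importance rankings of basic events even when the core damage frequency barely moves.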

  1. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    OBJECTIVES: To test the impact of the method of administration (MOA) on score level, reliability, and validity of scales developed in the Patient Reported Outcomes Measurement Information System (PROMIS). STUDY DESIGN AND SETTING: Two nonoverlapping parallel forms each containing eight items from......, no significant mode differences were found and all confidence intervals were within the prespecified minimal important difference of 0.2 standard deviation. Parallel-forms reliabilities were very high (ICC = 0.85-0.93). Only one across-mode ICC was significantly lower than the same-mode ICC. Tests of validity...... questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC) and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICCs), and convergent/discriminant validity. RESULTS: In difference score analyses...

  2. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  3. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    International Nuclear Information System (INIS)

    Lee, Seokje; Kim, Ingul; Jang, Moonho; Kim, Jaeki; Moon, Jungwon

    2013-01-01

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well known; specifically, it is known that ply is less reliable than metallic materials and very sensitive to the loading direction. Therefore, it is important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method is identified, and a probabilistic sensitivity analysis is conducted. As a result, we can prove the applicability of the advanced design method for the stabilizer of an underwater vehicle

  4. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seokje; Kim, Ingul [Chungnam National Univ., Daejeon (Korea, Republic of); Jang, Moonho; Kim, Jaeki; Moon, Jungwon [LIG Nex1, Yongin (Korea, Republic of)

    2013-04-15

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well known; specifically, it is known that ply is less reliable than metallic materials and very sensitive to the loading direction. Therefore, it is important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method is identified, and a probabilistic sensitivity analysis is conducted. As a result, we can prove the applicability of the advanced design method for the stabilizer of an underwater vehicle.

  5. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliability of information in consolidated financial statements of Russian companies. Recommendations are made on their adaptation and application in commercial organizations. Beneish model indicators are implemented in one of the world's largest vertically integrated steel and mining companies. Audit firms are advised to use these methods of assessing the reliability of information in the practical application of ISA.
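
    The "Banish method" named in this record appears to be the Beneish model for screening earnings manipulation; under that reading, here is a sketch of the eight-variable M-score using its published coefficients (the -2.22 cutoff is the commonly cited threshold, and the input ratio values below are illustrative, not figures from the article):

```python
def beneish_m_score(dsri, gmi, aqi, sgi, depi, sgai, tata, lvgi):
    """Eight-variable Beneish M-score; values above roughly -2.22 are
    commonly read as flagging possible earnings manipulation. Inputs
    are year-over-year index ratios computed from the statements
    (days sales in receivables, gross margin, asset quality, sales
    growth, depreciation, SG&A, total accruals to assets, leverage)."""
    return (-4.84 + 0.920 * dsri + 0.528 * gmi + 0.404 * aqi
            + 0.892 * sgi + 0.115 * depi - 0.172 * sgai
            + 4.679 * tata - 0.327 * lvgi)

# Illustrative benign firm: every index ratio at 1.0, zero total accruals
m = beneish_m_score(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0)
```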

  6. Use of simulation methods in the evaluation of reliability and availability of complex system

    International Nuclear Information System (INIS)

    Maigret, N.; Duchemin, B.; Robert, T.; Villeneuve, J.J. de; Lanore, J.M.

    1982-04-01

    After a short review of the available standard methods in the reliability field, such as Boolean algebra for fault trees and semi-regeneration theory for Markov processes, this paper shows how the BIAF code, based on a state description of a system and simulation techniques, can solve many problems. It also shows how the use of importance sampling and biasing techniques allows us to deal with the rare-event problem

  7. A new method for evaluating the availability, reliability, and maintainability whatever may be the probability law

    International Nuclear Information System (INIS)

    Doyon, L.R.; CEA Centre d'Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette

    1975-01-01

    A simple method is presented for solving by computer every system model (availability, reliability, and maintenance) with intervals between failures and durations of repairs distributed according to any probability law, and for any maintenance policy. A matrix equation is obtained using Markov diagrams. An example is given with the solution by the APAFS program (Algorithme Pour l'Analyse de la Fiabilite des Systemes) [fr
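
    For the special case of exponential failure and repair laws, the Markov-diagram matrix equation admits closed forms; the two-state model below illustrates the quantities involved (the abstract's method covers arbitrary laws, which this sketch does not):

```python
import math

def steady_state_availability(lam, mu):
    """Two-state (up/down) Markov model with constant failure rate lam
    and repair rate mu: long-run availability mu / (lam + mu)."""
    return mu / (lam + mu)

def availability_at(t, lam, mu):
    """Transient availability of the same model, starting in the up state:
    A(t) = mu/(lam+mu) + (lam/(lam+mu)) * exp(-(lam+mu) * t)."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

# Illustrative rates: one failure per 1000 h, repairs averaging 10 h
ss = steady_state_availability(0.001, 0.1)
```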

  8. Reliability design of a critical facility: An application of PRA methods

    International Nuclear Information System (INIS)

    Souza Vieira Neto, A.; Souza Borges, W. de

    1987-01-01

    Although a general agreement concerning the enforcement of reliability (probabilistic) design criteria for nuclear utilities is yet to be achieved, PRA methodology can still be used successfully as a project design and review tool, aimed at improving a system's prospective performance or minimizing expected accident consequences. In this paper, the potential of such an application of PRA methods is examined in the special case of a critical design project currently being developed in Brazil. (orig.)

  9. A new fault detection method for computer networks

    International Nuclear Information System (INIS)

    Lu, Lu; Xu, Zhengguo; Wang, Wenhai; Sun, Youxian

    2013-01-01

    Over the past few years, fault detection for computer networks has attracted extensive attention for its importance in network management. Most existing fault detection methods are based on active probing techniques, which can detect the occurrence of faults quickly and precisely. But these methods suffer from the limitation of traffic overhead, especially in large-scale networks. To relieve the traffic overhead induced by active-probing-based methods, a new fault detection method, whose key is to divide the detection process into multiple stages, is proposed in this paper. During each stage, only a small region of the network is detected by using a small set of probes. Meanwhile, it also ensures that the entire network can be covered after multiple detection stages. This method guarantees that the traffic used by probes during each detection stage is sufficiently small that the network can operate without severe disturbance from the probes. Several simulation results verify the effectiveness of the proposed method
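
    The multi-stage idea, probing only a small region per stage while covering the whole network across stages, can be sketched in a few lines (the node set, fault model, and stage size are illustrative assumptions, not the paper's algorithm):

```python
def plan_stages(nodes, probes_per_stage):
    """Split the node list into consecutive regions, one per detection
    stage, so per-stage probe traffic stays bounded while every node is
    covered after the final stage."""
    return [nodes[i:i + probes_per_stage]
            for i in range(0, len(nodes), probes_per_stage)]

def run_detection(nodes, faulty, probes_per_stage):
    """Probe one region per stage; a probe to a faulty node fails.
    Returns (stage_index, node) for every fault found."""
    findings = []
    for stage, region in enumerate(plan_stages(nodes, probes_per_stage)):
        for node in region:
            if node in faulty:
                findings.append((stage, node))
    return findings

# Hypothetical 10-node network, 4 probes per stage, nodes 3 and 8 faulty
findings = run_detection(list(range(10)), {3, 8}, 4)
```

    The trade-off is visible even in the toy version: smaller stages mean less traffic at any one time but more stages before full coverage.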

  10. Comparative study on 4 quantitative detection methods of apoptosis induced by radiation

    International Nuclear Information System (INIS)

    Yang Yepeng; Chen Guanying; Zhou Mei; Shen Qinjian; Shen Lei; Zhu Yingbao

    2004-01-01

    Objective: To reveal the capability of 4 apoptosis-detecting methods to discriminate between apoptosis and necrosis, and to show their respective advantages and shortcomings through comparison of detection results and analysis of detection mechanisms. Methods: Four methods, PI staining-flow cytometric detection (P-F method), TUNEL labeling-flow cytometric detection (T-F method), annexin V-FITC/PI vital staining-flow cytometric detection (A-F method) and Hoechst/PI vital staining-fluorescence microscopic observation (H-O method), were used to determine apoptosis and necrosis in the human breast cancer MCF-7 cell line induced by γ-rays. Hydroxycamptothecine and sodium azide were used to induce positive controls of apoptosis and necrosis, respectively. Results: All 4 methods showed a good time- and dose-dependent response to apoptosis induced by γ-rays and hydroxycamptothecine. Apoptotic cell ratios and curve slopes obtained with the P-F method were the lowest and, on the contrary, those from the T-F method were the highest among the 4 methods. With the A-F and H-O methods, two sets of data, apoptosis and necrosis, could be obtained respectively, and the data from these two methods were nearly equal. The A-F and H-O methods could distinguish necrosis induced by sodium azide from apoptosis, while the P-F and T-F methods presented a false increase of apoptosis. Conclusions: The P-F and T-F methods cannot discriminate between apoptosis and necrosis. The P-F method is less sensitive but simpler, more convenient and more economical than the T-F method. The A-F and H-O methods can distinguish necrosis from apoptosis. The A-F method is more costly but quicker and more reliable than the H-O method. The H-O method is economical and practical, and morphological changes of cells and nuclei can be observed simultaneously with it. (authors)

  11. Data collection on the unit control room simulator as a method of operator reliability analysis

    International Nuclear Information System (INIS)

    Holy, J.

    1998-01-01

    The report consists of the following chapters: (1) Probabilistic assessment of nuclear power plant operation safety and human factor reliability analysis; (2) Simulators and simulations as human reliability analysis tools; (3) DOE project for using the collection and analysis of data from the unit control room simulator in human factor reliability analysis at the Paks nuclear power plant; (4) General requirements for the organization of the simulator data collection project; (5) Full-scale simulator at the Nuclear Power Plants Research Institute in Trnava, Slovakia, used as a training means for operators of the Dukovany NPP; (6) Assessment of the feasibility of quantification of important human actions modelled within a PSA study by employing simulator data analysis; (7) Assessment of the feasibility of using the various exercise topics for the quantification of the PSA model; (8) Assessment of the feasibility of employing the simulator in the analysis of the individual factors affecting the operator's activity; and (9) Examples of application of statistical methods in the analysis of the human reliability factor. (P.A.)

  12. Decreasing inventory of a cement factory roller mill parts using reliability centered maintenance method

    Science.gov (United States)

    Witantyo; Rindiyah, Anita

    2018-03-01

    According to data from maintenance planning and control, the highest inventory value belongs to non-routine components. Maintenance components are components procured on the basis of maintenance activities. The problem arises because there is no synchronization between maintenance activities and the components they require. The Reliability Centered Maintenance method is used to overcome this problem by re-evaluating the components required by maintenance activities. The roller mill system was chosen as the case because it has the highest recorded unscheduled downtime. The components required for each maintenance activity are determined from its failure distribution, so the number of components needed can be predicted. Moreover, those components are reclassified from non-routine to routine components, so that procurement can be carried out regularly. Based on the analysis conducted, the failures occurring in almost every maintenance task are classified into scheduled on-condition tasks, scheduled discard tasks, scheduled restoration tasks and no scheduled maintenance. Of the 87 components used in maintenance activities that were evaluated, 19 were reclassified from non-routine to routine components. The reliability and demand for those components were then calculated for a one-year operation period. Based on these findings, it is suggested to replace all such components during overhaul to increase the reliability of the roller mill system. In addition, the inventory system should follow the maintenance schedule and the number of components required by maintenance activities, so that procurement cost will decrease and system reliability will increase.
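The record's idea of sizing spares from a component's failure distribution can be illustrated with a Weibull mean time between failures; the shape and scale values, the component, and the simple renewal approximation below are hypothetical, not taken from the study:

```python
import math

def weibull_mtbf(eta_hours, beta):
    # mean of a Weibull distribution: MTBF = eta * Gamma(1 + 1/beta)
    return eta_hours * math.gamma(1.0 + 1.0 / beta)

def spares_per_period(period_hours, eta_hours, beta, n_installed=1):
    # crude renewal approximation: expected failures ~ operating time / MTBF
    return n_installed * period_hours / weibull_mtbf(eta_hours, beta)

# hypothetical roller-mill wear part: eta = 4000 h, beta = 2, two installed,
# demand over one year (8760 h) of continuous operation
demand = spares_per_period(8760.0, 4000.0, 2.0, n_installed=2)
```

Rounding `demand` up gives the number of routine spares to stock for the period, which is the kind of prediction the abstract uses to move components from non-routine to routine procurement.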

  13. Reliability of Lyapunov characteristic exponents computed by the two-particle method

    Science.gov (United States)

    Mei, Lijie; Huang, Li

    2018-03-01

    For highly complex problems, such as the post-Newtonian formulation of compact binaries, the two-particle method may be a better, or even the only, choice for computing the Lyapunov characteristic exponent (LCE). This method avoids the complex calculation of variational equations required by the variational method. However, the two-particle method sometimes provides spurious estimates of LCEs. In this paper, we first analyze the equivalence of the definition of the LCE between the variational and two-particle methods for Hamiltonian systems. Then, we develop a criterion to determine the reliability of LCEs computed by the two-particle method by considering the magnitude of the initial tangent (or separation) vector ξ0 (or δ0), the renormalization time interval τ, the machine precision ε, and the global truncation error ɛT. The reliable Lyapunov characteristic indicators estimated by the two-particle method form a V-shaped region, which is restricted by δ0, ε, and ɛT. Finally, numerical experiments with the Hénon-Heiles system, spinning compact binaries, and the post-Newtonian circular restricted three-body problem strongly support the theoretical results.
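A minimal one-dimensional illustration of the two-particle method, using the logistic map rather than the paper's relativistic systems: a shadow trajectory offset by d0 is evolved alongside the reference trajectory, and the separation is renormalized back to d0 after every step (renormalization interval τ = 1 iteration). The map and all parameter values are illustrative assumptions.

```python
import math

def two_particle_lce(f, x0, d0=1e-9, steps=20000, transient=200):
    """Benettin-style two-particle estimate of the largest Lyapunov
    exponent: accumulate log(d1/d0) over the steps, renormalizing
    the separation to d0 each time."""
    x, y = x0, x0 + d0
    for _ in range(transient):            # discard the transient
        x, y = f(x), f(y)
        y = x + d0
    total = 0.0
    for _ in range(steps):
        x, y = f(x), f(y)
        d1 = abs(y - x) or d0             # guard against exact collision
        total += math.log(d1 / d0)
        y = x + d0                        # renormalize the separation
    return total / steps

logistic = lambda x: 4.0 * x * (1.0 - x)
lce = two_particle_lce(logistic, 0.3)     # analytic value is ln 2 ≈ 0.693
```

The paper's caveat shows up directly in this sketch: if d0 is chosen near the machine precision ε, `abs(y - x)` is dominated by round-off and the estimate becomes spurious.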

  14. Reliable Detection and Smart Deletion of Malassez Counting Chamber Grid in Microscopic White Light Images for Microbiological Applications.

    Science.gov (United States)

    Denimal, Emmanuel; Marin, Ambroise; Guyot, Stéphane; Journaux, Ludovic; Molin, Paul

    2015-08-01

    In biology, hemocytometers such as Malassez slides are widely used and are effective tools for counting cells manually. In a previous work, a robust algorithm was developed for grid extraction in Malassez slide images. This algorithm was evaluated on a set of 135 images and grids were accurately detected in most cases, but there remained failures for the most difficult images. In this work, we present an optimization of this algorithm that allows for 100% grid detection and a 25% improvement in grid positioning accuracy. These improvements make the algorithm fully reliable for grid detection. This optimization also allows complete erasing of the grid without altering the cells, which eases their segmentation.

  15. A dynamic particle filter-support vector regression method for reliability prediction

    International Nuclear Information System (INIS)

    Wei, Zhao; Tao, Tao; ZhuoShu, Ding; Zio, Enrico

    2013-01-01

    Support vector regression (SVR) has been applied to time series prediction, and some works have demonstrated the feasibility of its use to forecast system reliability. For accurate reliability forecasting, the selection of SVR's parameters is important. Existing research on SVR parameter selection divides the example dataset into training and test subsets and tunes the parameters on the training data. However, such fixed parameters can lead to poor prediction capability if the data of the test subset differ significantly from those of the training subset. In contrast, the novel method proposed in this paper uses particle filtering to estimate the SVR model parameters according to the whole measurement sequence up to the last observation instance. By treating the SVR training model as the observation equation of a particle filter, our method allows the SVR model parameters to be updated dynamically when a new observation arrives. Because the parameters adapt to dynamic data patterns, the new PF–SVR method has superior prediction performance over standard SVR. Four application results show that PF–SVR is more robust than SVR to a decrease in the number of training data and to the change of initial SVR parameter values. Also, even if there are trends in the test data different from those in the training data, the method can capture the changes, correct the SVR parameters and obtain good predictions. -- Highlights: •A dynamic PF–SVR method is proposed to predict the system reliability. •The method can adjust the SVR parameters according to the change of data. •The method is robust to the size of training data and initial parameter values. •Some cases based on both artificial and real data are studied. •PF–SVR shows superior prediction performance over standard SVR
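The core particle-filter idea, treating a trained model as the observation equation and letting the filter track its parameters online, can be sketched with a deliberately simplified stand-in: a drifting linear coefficient instead of the SVR hyperparameters. Everything below (the observation model, noise levels, particle count) is an illustrative assumption, not the paper's PF–SVR implementation.

```python
import math, random

random.seed(1)

def particle_filter_track(data, n_particles=500, jitter=0.05, noise_sd=0.1):
    """Bootstrap particle filter tracking a slowly drifting parameter a_t
    in the observation model y = a_t * u + noise (a stand-in for the
    SVR training model used as observation equation in PF-SVR)."""
    particles = [random.uniform(-2.0, 2.0) for _ in range(n_particles)]
    estimates = []
    for u, y in data:
        # propagate: random-walk drift of the parameter
        particles = [a + random.gauss(0.0, jitter) for a in particles]
        # weight each particle by the likelihood of the new observation
        weights = [math.exp(-0.5 * ((y - a * u) / noise_sd) ** 2)
                   for a in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(a * w for a, w in zip(particles, weights)))
        # multinomial resampling keeps the particle cloud concentrated
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

# synthetic data: the true parameter drifts from 1.0 toward 2.0
data = [(1.0, (1.0 + t / 100.0) + random.gauss(0.0, 0.1)) for t in range(100)]
est = particle_filter_track(data)
```

As in the abstract, the point is that the parameter estimate follows the drift in the incoming data instead of staying fixed at whatever the training subset suggested.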

  16. Reliable allele detection using SNP-based PCR primers containing Locked Nucleic Acid: application in genetic mapping

    Directory of Open Access Journals (Sweden)

    Trognitz Friederike

    2007-02-01

    Full Text Available Abstract Background The diploid Solanum caripense, a wild relative of potato and tomato, possesses valuable resistance to potato late blight, and we are interested in the genetic basis of this resistance. Due to extremely low levels of genetic variation within the S. caripense genome, it proved impossible to generate a dense genetic map and to assign individual Solanum chromosomes through the use of conventional chromosome-specific SSR, RFLP, AFLP, or gene- or locus-specific markers. The ease of detection of DNA polymorphisms depends on both the frequency and the form of sequence variation. The narrow genetic background of close relatives and inbreds complicates the detection of persisting, reduced polymorphism and is a challenge to the development of reliable molecular markers. Nonetheless, monomorphic DNA fragments, which are not directly usable as conventional markers, can contain considerable variation at the level of single nucleotide polymorphisms (SNPs). This can be used for the design of allele-specific molecular markers. The reproducible detection of allele-specific markers based on SNPs has been a technical challenge. Results We present a fast and cost-effective protocol for the detection of allele-specific SNPs by applying Sequence Polymorphism-Derived (SPD) markers. These markers proved highly efficient for fingerprinting of individuals possessing a homogeneous genetic background. SPD markers are obtained from within non-informative, conventional molecular marker fragments that are screened for SNPs to design allele-specific PCR primers. The method makes use of primers containing a single, 3'-terminal Locked Nucleic Acid (LNA) base. We demonstrate the applicability of the technique by successful genetic mapping, in S. caripense, of allele-specific SNP markers derived from monomorphic Conserved Ortholog Set II (COSII) markers mapped to Solanum chromosomes. By using SPD markers it was possible for the first time to map the S. caripense alleles

  17. The reliability, accuracy and minimal detectable difference of a multi-segment kinematic model of the foot-shoe complex.

    Science.gov (United States)

    Bishop, Chris; Paul, Gunther; Thewlis, Dominic

    2013-04-01

    Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when shoes are worn. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sensitive for describing the kinematics of the foot-shoe complex and lower leg during walking gait. To achieve this, a new marker set was established, consisting of 25 markers applied to the shoe and skin surface, which informed a four-segment kinematic model of the foot-shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe proved good to excellent (ICC=0.75-0.98), indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC=0.68-0.99) than for the inexperienced rater (ICC=0.38-0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint: tibiocalcaneal joint--MDD90=2.17-9.36°, tarsometatarsal joint--MDD90=1.03-9.29° and the metatarsophalangeal joint--MDD90=1.75-9.12°. These thresholds are specific to the description of shod motion, and can be used in future research designed to compare different footwear. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Reliability, validity and minimal detectable change of the Mini-BESTest in Greek participants with chronic stroke.

    Science.gov (United States)

    Lampropoulou, Sofia I; Billis, Evdokia; Gedikoglou, Ingrid A; Michailidou, Christina; Nowicky, Alexander V; Skrinou, Dimitra; Michailidi, Fotini; Chandrinou, Danae; Meligkoni, Margarita

    2018-02-23

    This study aimed to investigate the psychometric characteristics of reliability, validity and ability to detect change of a newly developed balance assessment tool, the Mini-BESTest, in Greek patients with stroke. A prospective, observational design study with test-retest measures was conducted. A convenience sample of 21 Greek patients with chronic stroke (14 male, 7 female; age 63 ± 16 years) was recruited. Two independent examiners administered the scale for the inter-rater reliability, twice within 10 days for the test-retest reliability. Bland-Altman analysis for repeated measures assessed the absolute reliability, and the Standard Error of Measurement (SEM) and the Minimum Detectable Change at the 95% confidence level (MDC 95% ) were established. The Greek Mini-BESTest (Mini-BESTest GR ) was correlated with the Greek Berg Balance Scale (BBS GR ) for assessing the concurrent validity, and with the Timed Up and Go (TUG), the Functional Reach Test (FRT) and the Greek Falls Efficacy Scale-International (FES-I GR ) for the convergent validity. The Mini-BESTest GR demonstrated excellent inter-rater reliability (ICC (95%CI) = 0.997 (0.995-0.999), SEM = 0.46), with the scores of the two raters within the limits of agreement (mean dif = -0.143 ± 0.727, p > 0.05), and excellent test-retest reliability (ICC (95%CI) = 0.966 (0.926-0.988), SEM = 1.53). Additionally, the Mini-BESTest GR yielded very strong to moderate correlations with the BBS GR (r = 0.924) and with the TUG, FRT and FES-I GR . This reliability and the equally good validity of the Mini-BESTest GR strongly support its utility in Greek people with chronic stroke. Its ability to identify clinically meaningful changes and falls risk needs further investigation.
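The SEM and MDC95% reported in records like this one are conventionally derived from the reliability coefficient: SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM. A small sketch with hypothetical numbers (the SD below is not taken from the study):

```python
import math

def sem(sd, icc):
    # standard error of measurement from a reliability coefficient
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem_value):
    # minimal detectable change at the 95% confidence level
    return 1.96 * math.sqrt(2.0) * sem_value

# illustrative numbers: between-subject SD = 4.2 points, ICC = 0.966
s = sem(4.2, 0.966)
change = mdc95(s)
```

Score differences smaller than `change` are indistinguishable from measurement noise at the 95% level, which is why MDC values accompany reliability studies of clinical scales.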

  19. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With the ever-increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in past decades, its application to large-scale industrial processes is limited because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort and wide application scope. For the performance evaluation of FD methods, a new index is also developed. Contents A New Index for Performance Evaluation of FD Methods CCA-based FD Method for the Monitoring of Stationary Processes Projection-based FD Method for the Monitoring of Dynamic Processes Benchmark Study and Real-Time Implementat...

  20. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated that the test-retest reliabilities of the 2 definitions were similar. Using larger stimulus increments (4 mA) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than the Staircase NFR definition, whereas smaller stimulus increments (2 mA) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. Because this brief method to assess NFR threshold is reliable and valid, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.
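The single ascending series can be sketched as a simple loop; the simulated participant and the reflex criterion below are hypothetical. Note how the larger step size yields a higher threshold estimate, matching the caution in the abstract about comparing thresholds across different increments:

```python
def ascending_threshold(responds, start_ma=0.0, step_ma=2.0, max_ma=50.0):
    """Single ascending series: raise the stimulus by step_ma until the
    reflex criterion is met; that intensity is taken as the threshold."""
    intensity = start_ma
    while intensity <= max_ma:
        if responds(intensity):
            return intensity
        intensity += step_ma
    return None  # no reflex observed up to max_ma

# simulated participant whose true threshold is 17 mA
th2 = ascending_threshold(lambda i: i >= 17.0, step_ma=2.0)  # 2-mA steps
th4 = ascending_threshold(lambda i: i >= 17.0, step_ma=4.0)  # 4-mA steps
```

With 2-mA steps the series first exceeds the true threshold at 18 mA; with 4-mA steps, at 20 mA, so the coarser series overestimates by more.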

  1. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method

  2. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method

  3. Matrix-based system reliability method and applications to bridge networks

    International Nuclear Information System (INIS)

    Kang, W.-H.; Song Junho; Gardoni, Paolo

    2008-01-01

    Using a matrix-based system reliability (MSR) method, one can estimate the probabilities of complex system events by simple matrix calculations. Unlike existing system reliability methods, whose complexity depends strongly on that of the system event, the MSR method describes any general system event in a simple matrix form and therefore provides a more convenient way of handling the system event and estimating its probability. Even in the case where one has incomplete information on the component probabilities and/or their statistical dependence, the matrix-based framework enables us to estimate the narrowest bounds on the system failure probability by linear programming. This paper presents the MSR method and applies it to a transportation network consisting of bridge structures. The seismic failure probabilities of the bridges are estimated by use of predictive fragility curves developed by a Bayesian methodology based on experimental data and existing deterministic models of the seismic capacity and demand. Using the MSR method, the probability of disconnection between each city/county and a critical facility is estimated. The probability mass function of the number of failed bridges is computed as well. In order to quantify the relative importance of bridges, the MSR method is used to compute the conditional probabilities of bridge failures given that there is at least one city disconnected from the critical facility. The bounds on the probability of disconnection are also obtained for cases with incomplete information
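For statistically independent components, the MSR idea reduces to a dot product between an event vector c (one 0/1 entry per component state, flagging states that belong to the system event) and a probability vector p (the probability of each state). The sketch below uses a hypothetical two-route bridge network, not the paper's transportation network, and an explicit loop stands in for the matrix product:

```python
from itertools import product

def msr_probability(comp_fail_probs, system_fails):
    """Matrix-based system reliability for independent components:
    enumerate all 2^n component states; P(system event) = c . p,
    where c flags the states in the event and p holds their probabilities."""
    n = len(comp_fail_probs)
    prob = 0.0
    for state in product([0, 1], repeat=n):       # 1 = component failed
        p = 1.0
        for failed, q in zip(state, comp_fail_probs):
            p *= q if failed else (1.0 - q)       # entry of the p vector
        c = 1 if system_fails(state) else 0       # entry of the c vector
        prob += c * p
    return prob

# toy network: two parallel routes, each a series of two bridges;
# the system event is disconnection (both routes cut)
q = [0.1, 0.2, 0.1, 0.2]
route_a_cut = lambda s: s[0] or s[1]
route_b_cut = lambda s: s[2] or s[3]
p_disconnect = msr_probability(q, lambda s: route_a_cut(s) and route_b_cut(s))
```

Here each route is cut with probability 1 − 0.9 × 0.8 = 0.28, so disconnection has probability 0.28² = 0.0784, which the state enumeration reproduces exactly.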

  4. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  5. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  6. Radiation detection device and a radiation detection method

    International Nuclear Information System (INIS)

    Blum, A.

    1975-01-01

    A radiation detection device is described including at least one scintillator in the path of radiation emissions from a distributed radiation source; a plurality of photodetectors for viewing each scintillator; a signal processing means, a storage means, and a data processing means that are interconnected with one another and connected to said photodetectors; and display means connected to the data processing means to locate a plurality of radiation sources in said distributed radiation source and to provide an image of the distributed radiation sources. The storage means includes radiation emission response data and location data from a plurality of known locations for use by the data processing means to derive a more accurate image by comparison of radiation responses from known locations with radiation responses from unknown locations. (auth)

  7. Onset of nuclear boiling in forced convection (Method of detection)

    International Nuclear Information System (INIS)

    Rachedi, M.

    1986-01-01

    Local onset of boiling in any pressurized-water cooling system, such as a PWR, can indicate a potentially dangerous mismatch between the heat produced and the cooling capability. Its consequences can lead to serious accident conditions, and a reliable technique to detect such a phenomenon is therefore particularly needed. Most techniques used up to now rely on local measurements and therefore usually assume prior knowledge of the actual hot or boiling spot. The method proposed here, based on externally mounted accelerometers, is sensitive to the global behaviour of the mechanical structure and is therefore not bound to any exact localization of the sensors. The vibrations produced in the mechanical structure of the heated assembly are measured by accelerometers placed on easily accessible external surfaces. The onset of boiling, with the growth and condensation of bubbles on the heated wall, induces a resonance in the structure and an excitation at its particular eigenfrequencies. Distinctive peaks are clearly observed in the spectral density function calculated from the accelerometer signal as soon as bubbles are produced. The technique is shown to be very sensitive even in the earliest phase of boiling and quite independent of sensor position. A complete hydrodynamic analysis of the experimental channels has been performed in order to assess the validity of the method both in steady conditions and during rapid power transients
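The detection principle, a peak appearing in the spectral density of the accelerometer signal at a structural eigenfrequency once bubbles form, can be sketched with a naive DFT on simulated traces. The signal model and bin numbers below are illustrative assumptions, not data from the experiment:

```python
import cmath, math

def dft_magnitudes(signal):
    """Naive DFT (standard library only): magnitude of each frequency bin."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(signal)))
            for k in range(n // 2)]

# simulated accelerometer traces: a background structural mode at bin 3,
# plus a resonance excited at bin 12 once bubbles grow and collapse
n = 128
quiet = [math.sin(2 * math.pi * 3 * t / n) for t in range(n)]
boiling = [q + 2.0 * math.sin(2 * math.pi * 12 * t / n)
           for t, q in enumerate(quiet)]

mags_quiet = dft_magnitudes(quiet)
mags_boiling = dft_magnitudes(boiling)
peak_quiet = max(range(1, n // 2), key=mags_quiet.__getitem__)
peak_boiling = max(range(1, n // 2), key=mags_boiling.__getitem__)
```

Monitoring which bin dominates the spectrum is the essence of the method: the dominant peak jumps to the excited eigenfrequency as soon as boiling starts, regardless of where on the structure the sensor sits.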

  8. ImageJ: A Free, Easy, and Reliable Method to Measure Leg Ulcers Using Digital Pictures.

    Science.gov (United States)

    Aragón-Sánchez, Javier; Quintana-Marrero, Yurena; Aragón-Hernández, Cristina; Hernández-Herero, María José

    2017-12-01

    Wound measurement to document the healing course of chronic leg ulcers has an important role in the management of these patients. Digital cameras in smartphones are readily available and easy to use, and taking pictures of wounds is becoming routine in specialized departments. Analyzing digital pictures with appropriate software provides clinicians a quick, clean, and easy-to-use tool for measuring wound area. A set of 25 digital pictures of plain foot and leg ulcers was the basis of this study. Photographs were taken with the iPhone 6S (Apple Inc, Cupertino, CA), which has a 12-megapixel camera, using the flash and placing a ruler next to the wound in parallel with the healthy skin. The digital photographs were visualized with ImageJ 1.45s freeware (National Institutes of Health, Rockville, MD; http://imagej.net/ImageJ ). Wound area measurement was carried out by 4 raters: head of the department, wound care nurse, physician, and medical student. We assessed intra- and interrater reliability using the intraclass correlation coefficient. To determine intraobserver reliability, 2 of the raters repeated the measurement of the set 1 week after the first reading. The interrater model displayed an intraclass correlation coefficient of 0.99 with a 95% confidence interval of 0.999 to 1.000, showing excellent reliability. The intrarater model of both examiners also showed excellent reliability. In conclusion, analyzing digital images of leg ulcers with ImageJ estimates wound area with excellent reliability. This method provides a free, rapid, and accurate way to measure wounds and could routinely be used to document wound healing in daily clinical practice.
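The ImageJ workflow described here amounts to two steps that are easy to restate in code: set the image scale from the ruler in the photograph, then convert the traced wound's pixel count into physical area. All numbers below are hypothetical:

```python
def calibrate_mm_per_px(ruler_px, ruler_mm):
    # set the image scale from the ruler placed next to the wound
    return ruler_mm / ruler_px

def wound_area_cm2(wound_pixels, mm_per_px):
    # traced-region area: pixel count times the area of one pixel
    area_mm2 = wound_pixels * mm_per_px ** 2
    return area_mm2 / 100.0   # 100 mm^2 per cm^2

scale = calibrate_mm_per_px(ruler_px=400, ruler_mm=50)      # 0.125 mm/px
area = wound_area_cm2(wound_pixels=32000, mm_per_px=scale)  # area in cm^2
```

This mirrors ImageJ's Set Scale followed by Measure on a traced region; taking the photograph with the ruler in the wound plane, as the study does, is what makes the calibration valid.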

  9. Reliability and Validity of 3 Methods of Assessing Orthopedic Resident Skill in Shoulder Surgery.

    Science.gov (United States)

    Bernard, Johnathan A; Dattilo, Jonathan R; Srikumaran, Uma; Zikria, Bashir A; Jain, Amit; LaPorte, Dawn M

    Traditional measures for evaluating resident surgical technical skills (e.g., case logs) assess operative volume but not level of surgical proficiency. Our goal was to compare the reliability and validity of 3 tools for measuring surgical skill among orthopedic residents performing 3 open surgical approaches to the shoulder. A total of 23 residents at different stages of their surgical training were tested for technical skill pertaining to 3 shoulder surgical approaches using the following measures: Objective Structured Assessment of Technical Skills (OSATS) checklists, the Global Rating Scale (GRS), and a final pass/fail assessment determined by 3 upper extremity surgeons. Adverse events were recorded. The Cronbach α coefficient was used to assess the reliability of the OSATS checklists and GRS scores. Interrater reliability was calculated with intraclass correlation coefficients. Correlations among OSATS checklist scores, GRS scores, and the pass/fail assessment were calculated with Spearman ρ. Validity of the OSATS checklists was determined using analysis of variance with postgraduate year (PGY) as a between-subjects factor. Significance was set at p < 0.05 for comparisons across the 3 shoulder approaches. Checklist scores showed superior interrater reliability compared with GRS and subjective pass/fail measurements. GRS scores were positively correlated across training years. The incidence of adverse events was significantly higher among PGY-1 and PGY-2 residents compared with more experienced residents. OSATS checklists are a valid and reliable assessment of technical skills across 3 surgical shoulder approaches. However, checklist scores do not measure quality of technique. Documenting adverse events is necessary to assess quality of technique and ultimate pass/fail status. Multiple methods of assessing surgical skill should be considered when evaluating orthopedic resident surgical performance. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol; Jeong, Dae Hong, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Chemistry Education, Seoul National University, Seoul 151-742 (Korea, Republic of); Kang, Homan [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Yoon-Sik, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); School of Chemical and Biological Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Ho-Young, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Nuclear Medicine, Seoul National University Bundang Hospital, Seongnam 463-707 (Korea, Republic of)

    2015-05-15

    Surface-enhanced Raman scattering techniques have been widely used for bioanalysis due to their high sensitivity and multiplexing capacity. However, the point-scanning method using a micro-Raman system, the most common method in the literature, has the disadvantage of an extremely long measurement time for on-chip immunoassays, which combine a large chip area on the scale of about 1 mm with a confocal beam spot of ca. 1 μm. Alternative methods, such as a sampled spot scan with high confocality and a large-area scan with an enlarged field of view and low confocality, have been used to keep the measurement time practical. In this study, we analyzed the two methods with respect to signal-to-noise ratio and sampling-induced signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we propose a methodology for fast and reliable quantitative measurement of the whole chip area. As a proof of concept, the proposed method uses a raster scan covering the full 100 μm × 100 μm region while accumulating the signal on the CCD detector into a single spectrum per frame. A single 10 s scan over the 100 μm × 100 μm area yielded much higher sensitivity than sampled spot scanning measurements and none of the signal fluctuations associated with sampled spot scanning. This readout method can serve as one of the key technologies that will bring quantitative multiplexed detection and analysis into practice.
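    The measurement-time problem that motivates the work can be checked with rough arithmetic; the dwell time and dimensions below are assumptions for illustration, not the paper's values:

```python
# Point-by-point confocal mapping of a ~1 mm chip with a ~1 um spot
chip_side = 1e-3    # chip side length (m), assumed
spot_size = 1e-6    # confocal spot diameter (m), assumed
dwell_time = 0.1    # seconds per sampling point, assumed

points = (chip_side / spot_size) ** 2        # ~1e6 sampling points
full_map_hours = points * dwell_time / 3600  # full point-by-point map

# versus a single accumulated scan of a 100 um x 100 um region
single_scan_seconds = 10.0
```

Even with an optimistic 0.1 s dwell time, an exhaustive point scan of the whole chip takes on the order of a day, which is why sampled or accumulated readout strategies are needed.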

  11. Simple, reliable, and nondestructive method for the measurement of vacuum pressure without specialized equipment.

    Science.gov (United States)

    Yuan, Jin-Peng; Ji, Zhong-Hua; Zhao, Yan-Ting; Chang, Xue-Fang; Xiao, Lian-Tuan; Jia, Suo-Tang

    2013-09-01

    We present a simple, reliable, and nondestructive method for the measurement of vacuum pressure in a magneto-optical trap. The vacuum pressure is verified to be proportional to the collision rate constant between cold atoms and the background gas, with a coefficient k that can be calculated by means of the simple ideal gas law. The rate constant for loss due to collisions with all background gases can be derived from the total collision loss rate using a series of loading curves of cold atoms recorded under different trapping laser intensities. The presented method is also applicable to other cold-atom systems and meets the miniaturization requirement of commercial applications.
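    The pressure-from-loss-rate relation described above can be sketched as follows; the arrangement of the formula and all numerical values here are illustrative assumptions, not the paper's calibration:

```python
KB = 1.380649e-23  # Boltzmann constant, J/K

def pressure_from_loss_rate(gamma, sigma_v, temperature=295.0):
    """Estimate background-gas pressure (Pa) from the trap loss rate.

    gamma: collisional loss rate of trapped atoms (1/s)
    sigma_v: velocity-averaged loss cross-section times mean speed (m^3/s),
             a species- and gas-dependent constant assumed to be known
    """
    n = gamma / sigma_v          # background number density (1/m^3)
    return n * KB * temperature  # ideal gas law: P = n * kB * T

# Illustrative numbers: gamma = 0.5 1/s, sigma_v = 4e-15 m^3/s
pressure = pressure_from_loss_rate(0.5, 4e-15)
```

With these assumed numbers the estimate lands in the 1e-7 Pa range, i.e. the regime where magneto-optical traps typically operate.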

  12. Features of an advanced human reliability analysis method, AGAPE-ET

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun

    2005-01-01

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It can deal with diagnosis failures and errors of commission (EOC), which are not normally treated in conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework called misdiagnosis tree analysis (MDTA) and a taxonomy of misdiagnosis causes with appropriate quantification schemes are provided. For the identification of EOC events arising from misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided.

  13. Features of an advanced human reliability analysis method, AGAPE-ET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun [Korea Atomic Energy Research Institute, Taejeon (Korea, Republic of)

    2005-11-15

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It can deal with diagnosis failures and errors of commission (EOC), which are not normally treated in conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework called misdiagnosis tree analysis (MDTA) and a taxonomy of misdiagnosis causes with appropriate quantification schemes are provided. For the identification of EOC events arising from misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided.

  14. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    DEFF Research Database (Denmark)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna

    2013-01-01

    machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods...
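    Of the four evaluated approaches, local maxima search is the simplest to sketch; a minimal thresholded version (an illustration, not one of the implementations compared in the paper):

```python
import numpy as np

def local_maxima_peaks(signal, threshold=0.0):
    """Indices i where signal[i] exceeds both neighbours and the threshold."""
    s = np.asarray(signal, dtype=float)
    interior = s[1:-1]
    is_peak = (interior > s[:-2]) & (interior > s[2:]) & (interior > threshold)
    return np.flatnonzero(is_peak) + 1  # shift back to original indexing

# Toy 1-D spectrum with two clear peaks above the threshold
sig = np.array([0.1, 0.2, 1.5, 0.3, 0.2, 2.0, 0.4, 0.1])
peaks = local_maxima_peaks(sig, threshold=1.0)
```

Real IMS data are 2-D (retention time vs. drift time) and noisy, which is why the paper compares this baseline against watershed, region-merging, and model-based alternatives.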

  15. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples, as well as a software analysis method... AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  16. A rapid and reliable determination of doxycycline hyclate by HPLC with UV detection in pharmaceutical samples

    Directory of Open Access Journals (Sweden)

    SNEZANA S. MITIC

    2008-06-01

    Full Text Available An accurate, sensitive and reproducible high performance liquid chromatographic (HPLC) method for the quantification of doxycycline hyclate in pharmaceutical samples has been developed and validated. The drug and the standard were eluted from a Lichrosorb RP-8 column (250 mm × 4.6 mm, 10 μm particle size) at 20 °C with a mobile phase consisting of methanol, acetonitrile and 0.010 M aqueous solution of oxalic acid (2:3:5, v/v/v). The flow rate was 1.25 ml min-1. A UV detector set at 350 nm was used to monitor the effluent. Each analysis required no longer than 4 min. The limits of detection and quantification were 1.15 and 3.84 μg ml-1, respectively. Recoveries for different concentrations ranged from 99.58 to 101.93%.
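    Limits of detection and quantification like those quoted are conventionally derived from the calibration line (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the slope); a sketch with hypothetical calibration data, not the paper's measurements:

```python
import numpy as np

def lod_loq(conc, response):
    """ICH-style limits from a linear calibration:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = np.asarray(response) - (slope * np.asarray(conc) + intercept)
    sigma = residuals.std(ddof=2)  # ddof=2: two fitted parameters
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration points (concentration in ug/ml vs. peak area)
conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
area = np.array([52.0, 101.0, 205.0, 398.0, 803.0])
lod, loq = lod_loq(conc, area)
```

By construction LOQ/LOD = 10/3.3 ≈ 3, close to the 3.84/1.15 ratio reported in the abstract.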

  17. Cervical vertebral maturation method and mandibular growth peak: a longitudinal study of diagnostic reliability.

    Science.gov (United States)

    Perinetti, Giuseppe; Primozic, Jasmina; Sharma, Bhavna; Cioffi, Iacopo; Contardo, Luca

    2018-03-28

    The capability of the cervical vertebral maturation (CVM) method to identify the mandibular growth peak on an individual basis remains undetermined. The diagnostic reliability of the six-stage CVM method in the identification of the mandibular growth peak was thus investigated. From the files of the Oregon and Burlington Growth Studies (data obtained between the early 1950s and mid-1970s), 50 subjects (26 females, 24 males) with at least seven annual lateral cephalograms taken from 9 to 16 years of age were identified. Cervical vertebral maturation was assessed according to the CVM code staging system, and mandibular growth was defined as annual increments in the Co-Gn distance. A diagnostic reliability analysis was carried out to establish the capability of the circumpubertal CVM stages 2, 3, and 4 to identify the imminent mandibular growth peak. Variable durations of each of the CVM stages 2, 3, and 4 were seen. The overall diagnostic accuracy values for the CVM stages 2, 3, and 4 were 0.70, 0.76, and 0.77, respectively. These low values appeared to be due to false positive cases, and to secular trends in conjunction with the use of a discrete staging system. In most of the Burlington Growth Study sample, the lateral head film at age 15 was missing. None of the CVM stages 2, 3, and 4 reached satisfactory diagnostic reliability in the identification of the imminent mandibular growth peak.
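    Diagnostic accuracy values like those cited come from 2x2 classification counts; a minimal sketch with hypothetical counts chosen to give a similar overall accuracy:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, specificity from a 2x2 diagnostic table."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return accuracy, sensitivity, specificity

# Hypothetical counts for one CVM stage vs. the observed growth peak:
# 30 true positives, 18 false positives, 40 true negatives, 12 false negatives
acc, sens, spec = diagnostic_metrics(tp=30, fp=18, tn=40, fn=12)
```

As in the study, an inflated false positive count is what drags overall accuracy down even when sensitivity looks acceptable.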

  18. Larvas output and influence of human factor in reliability of meat inspection by the method of artificial digestion

    Directory of Open Access Journals (Sweden)

    Đorđević Vesna

    2013-01-01

    Full Text Available On the basis of analyses of the factors that allowed infected meat to reach the food chain, we found that infection occurred after consumption of meat inspected by the method of artificial digestion of collective samples using a magnetic stirrer (MM). In this work we present assay results showing how modifications of the method at the level of final sedimentation influence the reliability of Trichinella larvae detection in infected meat samples. It is shown that the use of inadequate laboratory containers for collecting larvae in final sedimentation, and changes in the volume of digestive liquid drawn off during the staining of preparations, can significantly influence inspection results. Larva detection errors ranged from 4 to 80% in the experimental groups presented, relative to the control group of samples inspected using the MM method carried out fully according to European Commission Regulation No 2075/2005, in which no errors in larva counts per sample were found. We consider that the results of this work will contribute to improved control of the method's performance, and especially of the critical points, during the inspection of meat samples for Trichinella larvae in Serbia.

  19. Investigation of reliability of EC method for inspection of VVER steam generator tubes

    International Nuclear Information System (INIS)

    Corak, Z.

    2004-01-01

    Complete and accurate non-destructive examination (NDE) data provide the basis for performing mitigating actions and corrective repairs. It is important that detection and characterization of flaws are done properly at an early stage. The EPRI document PWR Steam Generator Examination Guidelines recommends an approach intended to: ensure accurate assessment of steam generator tube integrity; extend the reliable, cost-effective operating life of the steam generators; and maximize the availability of the unit. The Steam Generator Eddy Current Data Analysis Performance Demonstration represents the culmination of an intense two-year industry effort to develop a performance demonstration program for eddy current testing (ECT) of steam generator tubing. It is referred to as the Industry Database (IDB) and provides a capability for individual organizations to implement SG ECT performance demonstration programs in accordance with the requirements specified in Appendices G and H of the ISI Guidelines. Appendix G of the EPRI document PWR Steam Generator Examination Guidelines specifies personnel training and qualification requirements for NDE personnel who analyze NDE data for PWR steam generator tubing. Its purpose is to ensure a continuing uniform knowledge base and skill level for data analysis. The European methodology document is intended to provide a general framework for the development of qualifications for the inspection of specific components, to ensure they are developed in a consistent way throughout Europe while still allowing qualification to be tailored in detail to meet different national requirements. The European methodology document does not give a detailed description of how the inspection of a specific component should be qualified. A recommended practice is a document produced by ENIQ to support the production of detailed qualification procedures by individual countries. VVER SG tubes are inspected by the EC method.

  20. Method of detecting a failed fuel

    International Nuclear Information System (INIS)

    Utamura, Motoaki; Urata, Megumi; Uchida, Shunsuke.

    1976-01-01

    Object: To improve the detection accuracy for a failed fuel by eliminating the coolant temperature distribution in a fuel assembly. Structure: A failed fuel is detected from the content of nuclear fission products in the coolant by shutting off the upper portion of a fuel assembly immersed in the coolant and sampling the coolant within the fuel assembly. The temperature distribution in the fuel assembly is eliminated by injecting, at the time of sampling, coolant at a higher temperature than the coolant inside and outside the fuel assembly, thereby replacing the existing coolant in the fuel assembly with the higher-temperature coolant. The failed fuel is detected from the content of the fission products in the coolant by sampling the higher-temperature coolant of the fuel assembly after a certain time has passed. (Moriyama, K.)

  1. Machine Maintenance Scheduling with Reliability Engineering Method and Maintenance Value Stream Mapping

    Science.gov (United States)

    Sembiring, N.; Nasution, A. H.

    2018-02-01

    Corrective maintenance, i.e. replacing or repairing a machine component after the machine breaks down, is routinely performed in manufacturing companies. It forces the production process to stop: production time decreases while the maintenance team replaces or repairs the damaged machine component. This paper proposes a preventive maintenance schedule for a critical component of a critical machine in a crude palm oil and kernel company in order to increase maintenance efficiency. Reliability engineering and Maintenance Value Stream Mapping are used as a method and a tool to analyze the reliability of the component and to reduce waste in the process by segregating value-added and non-value-added activities.
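    A preventive schedule of this kind typically rests on a fitted lifetime model; a sketch assuming a Weibull distribution with illustrative parameters (not values from the paper):

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta): probability a component survives to age t."""
    return math.exp(-((t / eta) ** beta))

# Assumed wear-out behaviour: shape beta > 1, characteristic life 1000 h
beta, eta = 2.0, 1000.0
r_500 = weibull_reliability(500.0, beta, eta)  # survival probability at 500 h

# Preventive replacement interval that keeps reliability at or above 90%
t_90 = eta * (-math.log(0.9)) ** (1.0 / beta)
```

Replacing the component every t_90 hours (here roughly a third of the characteristic life) keeps the chance of an in-service breakdown below 10% under the assumed model.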

  2. Pharyngeal pH alone is not reliable for the detection of pharyngeal reflux events: A study with oesophageal and pharyngeal pH-impedance monitoring

    Science.gov (United States)

    Desjardin, Marie; Roman, Sabine; des Varannes, Stanislas Bruley; Gourcerol, Guillaume; Coffin, Benoit; Ropert, Alain; Mion, François

    2013-01-01

    Background Pharyngeal pH probes and pH-impedance catheters have been developed for the diagnosis of laryngo-pharyngeal reflux. Objective To determine the reliability of pharyngeal pH alone for the detection of pharyngeal reflux events. Methods 24-h pH-impedance recordings performed in 45 healthy subjects with a bifurcated probe for detection of pharyngeal and oesophageal reflux events were reviewed. Pharyngeal pH drops to below 4 and 5 were analysed for the simultaneous occurrence of pharyngeal reflux, gastro-oesophageal reflux, and swallows, according to impedance patterns. Results Only 7.0% of pharyngeal pH drops to below 5 identified with impedance corresponded to pharyngeal reflux, while 92.6% were related to swallows and 10.2 and 13.3% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Of pharyngeal pH drops to below 4, 13.2% were related to pharyngeal reflux, 87.5% were related to swallows, and 18.1 and 21.5% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Conclusions This study demonstrates that pharyngeal pH alone is not reliable for the detection of pharyngeal reflux and that adding distal oesophageal pH analysis is not helpful. The only reliable analysis should take into account impedance patterns demonstrating the presence of a pharyngeal reflux event preceded by a distal and proximal reflux event within the oesophagus. PMID:24917995

  3. A review of the evolution of human reliability analysis methods at nuclear industry

    International Nuclear Information System (INIS)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R.

    2017-01-01

    This paper reviews research on the application of human reliability analysis (HRA) methods in the nuclear industry and their evolution over the years. Human reliability analysis is one of the elements used in Probabilistic Safety Analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, covering errors of omission and errors of commission. Although HRA may be applied in many areas, the focus of this paper is to review the applicability of HRA methods over the years in the nuclear industry, especially in Nuclear Power Plants (NPPs). An electronic search of the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers from the first generation of HRA methods up to those published in March 2017. A total of 94 papers were retrieved by the initial search; 13 were selected for full review and data extraction after applying inclusion and exclusion criteria and evaluating quality and suitability with respect to applicability in the nuclear industry. The results indicate that first-generation methods are more widely used in practice than second-generation methods. This is because the first generation concentrates on quantification, in terms of success or failure of human actions, which makes these methods useful for the quantitative risk assessment of PSA. Although the second generation considers context and errors of commission in human error prediction, these methods are not widely used in practice in the nuclear industry for PSA. (author)

  4. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors and 2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance are represented by several context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically degrade or improve human performance, thus respectively increasing or decreasing the likelihood of human error. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors. In performing HRA, the effect of PSFs, which may increase or decrease human error, should be investigated. So far, however, the effects of PSFs have been estimated by expert judgment. Accordingly, in order to estimate the effects of PSFs objectively, a quantitative framework for estimating PSFs using PSF profiles is introduced in this paper.

  5. A review of the evolution of human reliability analysis methods at nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R., E-mail: lecionoliveira@gmail.com, E-mail: luquetti@ien.gov.br, E-mail: paulov@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    This paper reviews the status of researches on the application of human reliability analysis methods at nuclear industry and its evolution along the years. Human reliability analysis (HRA) is one of the elements used in Probabilistic Safety Analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, such as errors of omission and errors of commission. Although HRA may be used at lots of areas, the focus of this paper is to review the applicability of HRA methods along the years at nuclear industry, especially in Nuclear Power Plants (NPP). An electronic search on CAPES Portal of Journals (A bibliographic database) was performed. This literature review covers original papers published since the first generation of HRA methods until the ones published on March 2017. A total of 94 papers were retrieved by the initial search and 13 were selected to be fully reviewed and for data extraction after the application of inclusion and exclusion criteria, quality and suitability evaluation according to applicability at nuclear industry. Results point out that the methods from first generation are more used in practice than methods from second generation. This occurs because it is more concentrated towards quantification, in terms of success or failure of human action what make them useful for quantitative risk assessment to PSA. Although the second generation considers context and error of commission in human error prediction, they are not wider used in practice at nuclear industry to PSA. (author)

  6. Method to detect steam generator tube leakage

    International Nuclear Information System (INIS)

    Watabe, Kiyomi

    1994-01-01

    It is important for plant operation to detect minor leakages from steam generator tubes at an early stage; leakage detection has therefore been performed using a condenser air ejector gas monitor, a steam generator blowdown monitor, etc. In this study, highly sensitive main steam line monitors have been developed in order to identify leakages in the steam generator more quickly and accurately. The performance of the monitors was verified, and a demonstration test at an actual plant was conducted in view of their intended application to plants. (author)

  7. Reliability considerations of NDT by probability of detection (POD). Determination using ultrasound phased array. Results from a project in frame of the German nuclear safety research program

    International Nuclear Information System (INIS)

    Kurz, Jochen H.; Dugan, Sandra; Juengert, Anne

    2013-01-01

    Reliable assessment procedures are an important aspect of maintenance concepts. Non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters. NDT methods are able to acquire information in all of these areas. However, it has to be considered that the level of detail of this information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts, e.g. the damage tolerance approach in aerospace, already include reliability criteria for the NDT methods applied in maintenance plans. NDT is also an essential part of the construction and maintenance of nuclear power plants. In Germany, the type and extent of inspection are specified in Safety Standards of the Nuclear Safety Standards Commission (KTA). Only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in the form of performance demonstrations of the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. Knowledge of the probability of detection (POD) curves of specific flaws under specific testing conditions is often not available. This paper shows the results of a research project designed for POD determination of ultrasound phased array inspections of real and artificial cracks. A further objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning are discussed, and results of the ultrasound inspections are presented.

  8. METHODS OF IMPROVING THE RELIABILITY OF THE CONTROL SYSTEM TRACTION POWER SUPPLY OF ELECTRIC TRANSPORT BASED ON AN EXPERT INFORMATION

    Directory of Open Access Journals (Sweden)

    O. O. Matusevych

    2009-03-01

    Full Text Available The author proposes numerous methods for solving a multi-criterion task: increasing the reliability of a control system on the basis of expert information. Information that allows a well-founded choice of the method of increasing reliability for an electric transport control system is considered.

  9. Reliability Analysis of a Composite Wind Turbine Blade Section Using the Model Correction Factor Method: Numerical Study and Validation

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian

    2013-01-01

    by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...

  10. A comparison of moving object detection methods for real-time moving object detection

    Science.gov (United States)

    Roshan, Aditya; Zhang, Yun

    2014-06-01

    Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification and face detection to military surveillance. Many methods have been developed across the globe for moving object detection, but it is very difficult to find one which works in all situations and with different types of videos. The purpose of this paper is to evaluate existing moving object detection methods which can be implemented in software on a desktop or laptop for real-time object detection. There are several moving object detection methods noted in the literature, but few of them are suitable for real-time moving object detection, and most are further limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, the Gaussian mixture model, and wavelet-based and optical-flow-based methods. The work is based on an evaluation of these four methods using two different sets of cameras and two different scenes. The methods have been implemented in MATLAB and the results are compared based on completeness of detected objects, noise, sensitivity to light changes, processing time, etc. After comparison, it is observed that the optical-flow-based method took the least processing time and successfully detected the boundaries of moving objects, which also implies that it can be implemented for real-time moving object detection.
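    As a concrete reference point, the background subtraction technique evaluated above can be sketched with a running-average background in a few lines (an illustration, not the paper's MATLAB implementation):

```python
import numpy as np

def detect_motion(frames, alpha=0.1, threshold=25.0):
    """Running-average background subtraction.

    Returns a boolean mask of moving pixels for each frame after the first;
    alpha controls how quickly the background adapts to scene changes."""
    background = frames[0].astype(float)
    masks = []
    for frame in frames[1:]:
        diff = np.abs(frame.astype(float) - background)
        masks.append(diff > threshold)
        background = (1 - alpha) * background + alpha * frame  # slow update
    return masks

# Tiny synthetic example: an 8x8 scene where a bright 2x2 "object" appears
scene = np.zeros((8, 8), dtype=np.uint8)
moved = scene.copy()
moved[3:5, 3:5] = 200
masks = detect_motion([scene, moved])
n_moving = masks[0].sum()  # the 4 pixels of the object are flagged
```

The same loop applied per video frame is what makes this family cheap but sensitive to lighting changes, which is why the paper also evaluates Gaussian mixture and optical-flow methods.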

  11. An Entropy-Based Network Anomaly Detection Method

    Directory of Open Access Journals (Sweden)

    Przemysław Bereziński

    2015-04-01

    Full Text Available Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection: the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection and system health monitoring, but this article focuses on its application in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable for detecting modern botnet-like malware based on anomalous patterns in the network. This aim is achieved by realization of the following points: (i) preparation of a concept for an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, and (iv) evaluation of the method.
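    The core of an entropy-based detector is comparing the Shannon entropy of a traffic feature, such as destination ports, against its normal profile; a minimal sketch with synthetic traffic (not the article's method or dataset):

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of values."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Normal traffic: destination ports concentrated on a few services
normal_ports = [80] * 50 + [443] * 40 + [22] * 10
# Scan-like traffic: one source sweeping many distinct ports
scan_ports = list(range(1000, 1100))

h_normal = shannon_entropy(normal_ports)  # low: a few dominant ports
h_scan = shannon_entropy(scan_ports)      # near log2(100), maximal spread
```

A detector would flag windows whose entropy deviates strongly from the learned baseline: port scans push port entropy up, while a DDoS on one service pushes it down.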

  12. Mammographic casting-type calcification associated with small screen-detected invasive breast cancers: is this a reliable prognostic indicator?

    International Nuclear Information System (INIS)

    Peacock, C.; Given-Wilson, R.M.; Duffy, S.W.

    2004-01-01

    AIM: The aim of the present study was to establish whether mammographic casting-type calcification associated with small screen-detected invasive breast cancers is a reliable prognostic indicator. METHODS AND MATERIALS: We retrospectively identified 50 consecutive women diagnosed with an invasive cancer less than 15 mm who showed associated casting calcification on their screening mammograms. Controls were identified that showed no microcalcification and were matched for tumour size, histological type and lymph node status. A minimum of 5 years follow-up was obtained, noting recurrence and outcome. Conditional and unconditional logistic regression, depending on the outcome variable, were used to analyse the data, taking the matched design into account in both cases. Where small numbers prohibited the use of logistic regression, Fisher's exact test was used. RESULTS: Five deaths from breast cancer occurred out of the 50 cases, of which three were lymph node positive, two were lymph node negative and none were grade 3. None of the 78 control cases died from breast cancer. The difference in breast cancer death rates was significant by Fisher's exact test (p=0.02). Risk of recurrence was also significantly increased in the casting cases (OR=3.55, 95% CI 1.02-12.33, p=0.046). CONCLUSION: Although the overall outcome for small screen-detected breast cancers is good, our study suggests that casting calcification is a poorer prognostic factor. The advantage of a mammographic feature as an independent prognostic indicator lies in early identification of high-risk patients, allowing optimization of management
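    Fisher's exact test used in the study can be computed from the hypergeometric distribution; the sketch below uses the study's 2x2 mortality table (5/50 casting cases vs. 0/78 controls) but is one-sided, so it need not reproduce the quoted p=0.02, which may come from a two-sided formulation:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    probability of observing >= a events in row 1 under the
    hypergeometric null of no association."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# Breast cancer deaths: 5 of 50 casting cases, 0 of 78 matched controls
p_value = fisher_exact_one_sided(5, 45, 0, 78)
```

Python's arbitrary-precision integers make the binomial coefficients exact, so no special statistics library is needed for a 2x2 table.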

  13. Detection methods for centrifugal microfluidic platforms

    DEFF Research Database (Denmark)

    Burger, Robert; Amato, Letizia; Boisen, Anja

    2016-01-01

    Centrifugal microfluidics has attracted much interest from academia as well as industry, since it potentially offers solutions for affordable, user-friendly and portable biosensing. A wide range of so-called fluidic unit operations, e.g. mixing, metering, liquid routing, and particle separation, has been developed...... Detection methods for the centrifugal microfluidics platform are reviewed, covering optical as well as mechanical and electrical detection principles.

  14. Molecular Methods for Detection of Antimicrobial Resistance

    DEFF Research Database (Denmark)

    Anjum, Muna F.; Zankari, Ea; Hasman, Henrik

    2017-01-01

    The increase in bacteria harboring antimicrobial resistance (AMR) is a global problem because there is a paucity of antibiotics available to treat multidrug-resistant bacterial infections in humans and animals. Detection of AMR present in bacteria that may pose a threat to veterinary and public...

  15. Steam generator leak detection using acoustic method

    International Nuclear Information System (INIS)

    Goluchko, V.V.; Sokolov, B.M.; Bulanov, A.N.

    1982-05-01

    The main requirements to be met by a device for leak detection in sodium-water steam generators are determined. The capabilities of instrumentation designed according to these requirements have been tested using a model of a 550 kW steam generator [fr]

  16. Ultrasound Imaging Methods for Breast Cancer Detection

    NARCIS (Netherlands)

    Ozmen, N.

    2014-01-01

    The main focus of this thesis is on modeling acoustic wavefield propagation and implementing imaging algorithms for breast cancer detection using ultrasound. As a starting point, we use an integral equation formulation, which can be used to solve both the forward and inverse problems. This thesis

  17. The reliability and internal consistency of one-shot and flicker change detection for measuring individual differences in visual working memory capacity.

    Science.gov (United States)

    Pailian, Hrag; Halberda, Justin

    2015-04-01

    We investigated the psychometric properties of the one-shot change detection task for estimating visual working memory (VWM) storage capacity-and also introduced and tested an alternative flicker change detection task for estimating these limits. In three experiments, we found that the one-shot whole-display task returns estimates of VWM storage capacity (K) that are unreliable across set sizes-suggesting that the whole-display task is measuring different things at different set sizes. In two additional experiments, we found that the one-shot single-probe variant shows improvements in the reliability and consistency of K estimates. In another additional experiment, we found that a one-shot whole-display-with-click task (requiring target localization) also showed improvements in reliability and consistency. The latter results suggest that the one-shot task can return reliable and consistent estimates of VWM storage capacity (K), and they highlight the possibility that the requirement to localize the changed target is what engenders this enhancement. Through a final series of four experiments, we introduced and tested an alternative flicker change detection method that also requires the observer to localize the changing target and that generates, from response times, an estimate of VWM storage capacity (K). We found that estimates of K from the flicker task correlated with estimates from the traditional one-shot task and also had high reliability and consistency. We highlight the flicker method's ability to estimate executive functions as well as VWM storage capacity, and discuss the potential for measuring multiple abilities with the one-shot and flicker tasks.
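    The storage-capacity estimates (K) discussed above are conventionally derived from hit and false-alarm rates. As a hedged illustration, the snippet below uses the standard Cowan and Pashler estimators for single-probe and whole-display tasks respectively; these are textbook formulas, not equations quoted from this paper, and the rates are hypothetical.

```python
def cowan_k(hit_rate, fa_rate, set_size):
    """Cowan's K for single-probe change detection: K = N * (H - FA)."""
    return set_size * (hit_rate - fa_rate)

def pashler_k(hit_rate, fa_rate, set_size):
    """Pashler's K for whole-display change detection: K = N * (H - FA) / (1 - FA)."""
    return set_size * (hit_rate - fa_rate) / (1.0 - fa_rate)

# With a hypothetical 85% hit rate and 15% false-alarm rate at set size 4:
print(round(cowan_k(0.85, 0.15, 4), 2))    # 2.8
print(round(pashler_k(0.85, 0.15, 4), 2))  # 3.29
```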

  18. Detection of HBsAg and Anti HBc on donors of a blood bank by IRMA and ELISA methods

    International Nuclear Information System (INIS)

    Freire Martinez, D.Y.

    1985-10-01

    A comparative evaluation of two methods, immunoradiometric assay (IRMA) and enzyme immunoassay (ELISA), for detecting HBsAg and anti-HBc was carried out to determine which is the more advantageous and reliable. The study covered 300 donors at the Hospital San Juan de Dios blood bank. Compared with the reference method (IRMA), ELISA showed a sensitivity of 91.67%. Anti-HBc detection by IRMA is more reliable than HBsAg detection by either IRMA or ELISA for determining carrier status.
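    Sensitivity relative to a reference method, as reported above, is a simple ratio. A minimal sketch (the counts below are hypothetical, chosen only to reproduce the quoted 91.67% figure, and are not the study's data):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of reference-positive samples that the test also flags positive."""
    return true_pos / (true_pos + false_neg)

# e.g. if the reference method (IRMA) flagged 24 samples positive and the
# compared test (ELISA) detected 22 of those:
print(round(100 * sensitivity(22, 2), 2))  # 91.67
```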

  19. Can the comet assay be used reliably to detect nanoparticle-induced genotoxicity?

    Science.gov (United States)

    Karlsson, Hanna L; Di Bucchianico, Sebastiano; Collins, Andrew R; Dusinska, Maria

    2015-03-01

    The comet assay is a sensitive method to detect DNA strand breaks as well as oxidatively damaged DNA at the level of single cells. Today the assay is commonly used in nano-genotoxicology. In this review we critically discuss possible interactions between nanoparticles (NPs) and the comet assay. Concerns for such interactions have arisen from the occasional observation of NPs in the "comet head", which implies that NPs may be present while the assay is being performed. This could give rise to false positive or false negative results, depending on the type of comet assay endpoint and NP. For most NPs, an interaction that substantially impacts the comet assay results is unlikely. For photocatalytically active NPs such as TiO2, on the other hand, exposure to light containing UV can lead to increased DNA damage. Samples should therefore not be exposed to such light. By comparing studies in which both the comet assay and the micronucleus assay have been used, a good consistency between the assays was found in general (69%); consistency was even higher when excluding studies on TiO2 NPs (81%). The strong consistency between the comet and micronucleus assays for a range of different NPs-even though the two tests measure different endpoints-implies that both can be trusted in assessing the genotoxicity of NPs, and that both could be useful in a standard battery of test methods. © 2014 Wiley Periodicals, Inc.

  20. Reliability of the input admittance of bowed-string instruments measured by the hammer method.

    Science.gov (United States)

    Zhang, Ailin; Woodhouse, Jim

    2014-12-01

    The input admittance at the bridge, measured by hammer testing, is often regarded as the most useful and convenient measurement of the vibrational behavior of a bowed-string instrument. However, this method has been questioned, due especially to differences between human bowing and hammer impact. The goal of the research presented here is to investigate the reliability and accuracy of this classic hammer method. Experimental studies were carried out on cellos, with three different driving conditions and three different boundary conditions. Results suggest that there is nothing fundamentally different about the hammer method compared to other kinds of excitation. The third series of experiments offers an opportunity to explore the difference between the input admittance measured from one bridge corner to the other and that of single strings. The classic measurement is found to give a reasonable approximation to that of all four strings. Some possible differences between the hammer method and normal bowing, and implications of the acoustical results, are also discussed.
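    The quantity discussed above, input admittance, is the frequency-domain ratio of velocity response to driving force, Y(f) = V(f)/F(f). The sketch below is a hedged toy version of that ratio (not the authors' processing chain); a naive DFT keeps it stdlib-only, whereas real hammer tests would use windowed FFTs of measured signals.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2); fine for a toy example)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def input_admittance(force, velocity):
    """Per-bin admittance V(f)/F(f); bins where the force spectrum vanishes are skipped."""
    f_spec, v_spec = dft(force), dft(velocity)
    return {k: v / f for k, (f, v) in enumerate(zip(f_spec, v_spec))
            if abs(f) > 1e-9}

# Toy check: if the structure responded with exactly half the force waveform,
# the admittance would be 0.5 at every frequency bin.
force = [1.0, 0.0, 0.0, 0.0]      # unit impulse -> flat force spectrum
velocity = [0.5, 0.0, 0.0, 0.0]
adm = input_admittance(force, velocity)
print(all(abs(y - 0.5) < 1e-9 for y in adm.values()))  # True
```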

  1. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    Science.gov (United States)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Subsequently, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.

  2. Seismic Azimuthal Anisotropy of the Lower Paleozoic Shales in Northern Poland: can we reliably detect it?

    Science.gov (United States)

    Cyz, Marta; Malinowski, Michał

    2017-04-01

    Analysis of azimuthal anisotropy is an important aspect of characterizing the Lower Paleozoic shale play in northern Poland, since it can be used to map pre-existing fracture networks or help in optimal placement of the horizontal wells. Previous studies employed the Velocity versus Azimuth (VVAz) method and found that this anisotropy is weak - on the order of 1-2% - and only locally, close to major fault zones, higher (ca. 7%). This is consistent with the recent re-interpretation of the cross-dipole sonic data, which indicates average shear wave anisotropy of 1%. The problem with the VVAz method is that it requires a well-defined analysis interval, which should be at least 100 ms thick. In our case, the target intervals are thin: the upper reservoir (Lower Silurian Jantar formation) is 15 m thick and the lower reservoir (Upper Ordovician Sasino formation) is 25 m thick. Therefore, we prefer to use the Amplitude versus Azimuth (AVAz) method, which can be applied on a single horizon (e.g. the base of the reservoir). However, the AVAz method depends critically on the quality of the seismic data and the preservation of amplitudes during processing. On top of the above issues, physical properties of the Lower Paleozoic shales from Poland seem to be unfavourable for detecting azimuthal anisotropy. For example, for both target formations, the parameter g=(Vs/Vp)^2 is close to 0.32, which implies that the anisotropy expressed by the anisotropic gradient in the dry (i.e. gas-filled fractures) case is close to zero. In the case of e.g. the Bakken Shale, g is much higher (0.38-0.4), leading to a detectable anisotropic signature even in the dry case. Modelling of the synthetic AVAz response performed using available well data suggested that the anisotropic gradient in the wet (fluid-filled) case should be detectable even in the case of weak anisotropy (1-2%). This scenario is consistent with the observation that the studied area is located in the liquid

  3. Study on Feasibility of Applying Function Approximation Moment Method to Achieve Reliability-Based Design Optimization

    International Nuclear Information System (INIS)

    Huh, Jae Sung; Kwak, Byung Man

    2011-01-01

    Robust optimization and reliability-based design optimization are among the methodologies employed to take the uncertainties of a system into account at the design stage. For applying such methodologies to industrial problems, accurate and efficient methods for estimating statistical moments and failure probability are required; further, the results of sensitivity analysis, which provides the search direction during the optimization process, should also be accurate. The aim of this study is to incorporate the function approximation moment method into the sensitivity analysis formulation, which is expressed in integral form, to verify the accuracy of the sensitivity results, and to solve a typical reliability-based design optimization problem. These results are compared with those of other moment methods, and the feasibility of the function approximation moment method is verified. The integral-form sensitivity formulation is efficient for evaluating sensitivity because no additional function calculations are needed once the failure probability or statistical moments have been calculated.

  4. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint

    Directory of Open Access Journals (Sweden)

    Ang Gong

    2015-12-01

    Full Text Available For Global Navigation Satellite System (GNSS) single frequency, single epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to rigorously reconstruct the objective function. Then, the searching strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space to ensure that the correct ambiguity candidates lie within it, allowing the search to be carried out directly by the least-squares ambiguity decorrelation (LAMBDA) method. Some of the vector candidates are further eliminated by a derived approximate inequality, which accelerates the searching process. Experimental results show that, compared to the traditional method with only a baseline length constraint, this new method can use a priori three-dimensional baseline knowledge to fix the ambiguity reliably and achieve a high success rate. Experimental tests also verify that it is not very sensitive to baseline vector error and performs robustly when the angular error is not great.

  5. Prediction method of long-term reliability in improving residual stresses by means of surface finishing

    International Nuclear Information System (INIS)

    Sera, Takehiko; Hirano, Shinro; Chigusa, Naoki; Okano, Shigetaka; Saida, Kazuyoshi; Mochizuki, Masahito; Nishimoto, Kazutoshi

    2012-01-01

    Surface finishing methods, such as Water Jet Peening (WJP), have been applied to welds in some major components of nuclear power plants as a countermeasure to Primary Water Stress Corrosion Cracking (PWSCC). In addition, the surface finishing method of buffing treatment is being standardized, and buffing has thus also been recognized as a well-established method of improving stress. The long-term stability of peening techniques has been confirmed by accelerated testing. However, the stress improvement achieved by surface treatment is limited to thin layers, and the effect of the complicated residual stress distribution in the weld metal beneath the surface is not strictly taken into account for long-term stability. This paper therefore describes accelerated tests which confirmed that the long-term stability of the layer subjected to buffing treatment was equal to that subjected to WJP. The long-term reliability of the very thin stress-improved layer was also confirmed through a trial evaluation by thermal elastic-plastic creep analysis, even when the effect of the complicated residual stress distribution in the weld metal was excessively taken into account. Based on the above findings, an approach is proposed for constructing a prediction method for the long-term reliability of stress improvement by surface finishing. (author)

  6. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was documented twice at less than 50%. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was proved by kappa values (0.53-0.86). With regard to validity, moderate agreement was reported between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that many other growth indicators should be taken into consideration in evaluating adolescent skeletal maturation.
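    The weighted kappa statistic used above penalizes disagreements by how far apart the ordinal stages are. A hedged stdlib sketch of linearly weighted Cohen's kappa (the ratings below are hypothetical, not the study's data):

```python
def weighted_kappa(ratings_a, ratings_b, categories):
    """Linearly weighted Cohen's kappa for two raters over ordinal categories."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(ratings_a)
    obs = [[0.0] * k for _ in range(k)]            # joint rating proportions
    for a, b in zip(ratings_a, ratings_b):
        obs[idx[a]][idx[b]] += 1.0 / n
    pa = [sum(row) for row in obs]                 # rater A marginals
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater B marginals
    w = lambda i, j: abs(i - j) / (k - 1)          # linear disagreement weight
    observed = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Hypothetical example: the same 8 films staged twice on a 6-stage scale.
a = [1, 2, 3, 4, 5, 6, 3, 2]
b = [1, 2, 4, 4, 5, 6, 3, 3]
print(round(weighted_kappa(a, b, [1, 2, 3, 4, 5, 6]), 2))
```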

  7. Reliability of Doppler and stethoscope methods of determining systolic blood pressures: considerations for calculating an ankle-brachial index.

    Science.gov (United States)

    Chesbro, Steven B; Asongwed, Elmira T; Brown, Jamesha; John, Emmanuel B

    2011-01-01

    The purposes of this study were to: (1) identify the interrater and intrarater reliability of systolic blood pressures using a stethoscope and Doppler to determine an ankle-brachial index (ABI), and (2) to determine the correlation between the 2 methods. Peripheral arterial disease (PAD) affects approximately 8 to 12 million people in the United States, and nearly half of those with this disease are asymptomatic. Early detection and prompt treatment of PAD will improve health outcomes. It is important that clinicians perform tests that determine the presence of PAD. Two individual raters trained in ABI procedure measured the systolic blood pressures of 20 individuals' upper and lower extremities. Standard ABI measurement protocols were observed. Raters individually recorded the systolic blood pressures of each extremity using a stethoscope and a Doppler, for a total of 640 independent measures. Interrater reliability of Doppler measurements to determine SBP at the ankle was very strong (intraclass correlation coefficient [ICC], 0.93-0.99) compared to moderate to strong reliability using a stethoscope (ICC, 0.64-0.87). Agreement between the 2 devices to determine SBP was moderate to very weak (ICC, 0.13-0.61). Comparisons of the use of Doppler and stethoscope to determine ABI showed weak to very weak intrarater correlation (ICC, 0.17-0.35). Linear regression analysis of the 2 methods to determine ABI showed positive but weak to very weak correlations (r2 = .013, P = .184). A Doppler ultrasound is recommended over a stethoscope for accuracy in systolic pressure readings for ABI measurements.
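    The ankle-brachial index computed in the protocol above is simple arithmetic. A hedged sketch of the standard calculation (the pressures are hypothetical, not study data): for each leg, the higher of the two ankle systolic pressures is divided by the higher of the two brachial systolic pressures.

```python
def ankle_brachial_index(dorsalis_pedis, posterior_tibial,
                         brachial_left, brachial_right):
    """ABI = higher ankle SBP of the leg / higher of the two brachial SBPs."""
    return max(dorsalis_pedis, posterior_tibial) / max(brachial_left, brachial_right)

abi = ankle_brachial_index(110, 106, brachial_left=124, brachial_right=120)
print(round(abi, 2))  # 0.89 -- values below ~0.9 are commonly read as suggesting PAD
```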

  8. Anomaly-based Network Intrusion Detection Methods

    Directory of Open Access Journals (Sweden)

    Pavel Nevlud

    2013-01-01

    Full Text Available The article deals with the detection of network anomalies. Network anomalies include everything that differs markedly from normal operation. Machine learning systems were used for the detection of anomalies. Machine learning can be considered a support for, or a limited type of, artificial intelligence. A machine learning system usually starts with some knowledge and a corresponding knowledge organization so that it can interpret, analyse, and test the knowledge acquired. Several machine learning techniques are available; we tested decision tree learning and Bayesian networks. The open-source data-mining framework WEKA was the tool we used for testing the classification, clustering, and association algorithms and for visualization of our results. WEKA is a collection of machine learning algorithms for data mining tasks.

  9. Standardized Methods for Detection of Poliovirus Antibodies.

    Science.gov (United States)

    Weldon, William C; Oberste, M Steven; Pallansch, Mark A

    2016-01-01

    Testing for neutralizing antibodies against polioviruses has been an established gold standard for assessing individual protection from disease, population immunity, vaccine efficacy studies, and other vaccine clinical trials. Detecting poliovirus specific IgM and IgA in sera and mucosal specimens has been proposed for evaluating the status of population mucosal immunity. More recently, there has been a renewed interest in using dried blood spot cards as a medium for sample collection to enhance surveillance of poliovirus immunity. Here, we describe the modified poliovirus microneutralization assay, poliovirus capture IgM and IgA ELISA assays, and dried blood spot polio serology procedures for the detection of antibodies against poliovirus serotypes 1, 2, and 3.

  10. Methods and systems for detection of radionuclides

    Science.gov (United States)

    Coates, Jr., John T.; DeVol, Timothy A.

    2010-05-25

    Disclosed are materials and systems useful in determining the existence of radionuclides in an aqueous sample. The materials provide the dual function of both extraction and scintillation to the systems. The systems can be both portable and simple to use, and as such can beneficially be utilized to determine presence and optionally concentration of radionuclide contamination in an aqueous sample at any desired location and according to a relatively simple process without the necessity of complicated sample handling techniques. The disclosed systems include a one-step process, providing simultaneous extraction and detection capability, and a two-step process, providing a first extraction step that can be carried out in a remote field location, followed by a second detection step that can be carried out in a different location.

  11. Developing methods for detecting radioactive scrap

    International Nuclear Information System (INIS)

    Bellian, J.G.; Johnston, J.G.

    1995-01-01

    During the last 10 years, there have been major developments in the radiation detection systems used to detect shielded radioactive sources in scrap metal. The original testing required to determine the extent of the problem and the preliminary designs of the first instruments will be discussed. Systems available today will be described, listing their advantages and disadvantages. In conclusion, the newest developments and state-of-the-art equipment will also be included, describing the limits and the most appropriate locations for the systems

  12. Development of detection methods for irradiated foods

    International Nuclear Information System (INIS)

    Yang, Jae Seung; Kim, Chong Ki; Lee, Hae Jung; Kim, Kyong Su

    1999-04-01

    To identify irradiated foods, studies have been carried out with electron spin resonance (ESR) spectroscopy on bone-containing foods, such as chicken, pork, and beef. The intensity of the signal induced in bones increased linearly with irradiation doses in the range of 1.0 kGy to 5.0 kGy, and it was possible to distinguish between samples given low and high doses of irradiation. The stability of the signal over 6 weeks made the method ideal for quick and easy identification of irradiated meats. The analysis of DNA damage in single cells by agarose gel electrophoresis (DNA 'comet assay') can also be used to detect irradiated food. All the samples irradiated with over 0.3 kGy could be identified as irradiated from the tail lengths of their comets. Irradiated samples showed comets with long tails, and the tail length of the comets increased with the dose, while unirradiated samples showed no or very short tails. Based on these results, the DNA 'comet assay' might be applied to the detection of irradiated grains as a simple, low-cost and rapid screening test. When fats are irradiated, hydrocarbons containing one or two fewer carbon atoms are formed from the parent fatty acids. The major hydrocarbons in irradiated beef, pork and chicken were 1,7-hexadecadiene and 8-heptadecene, originating from oleic acid. 1,7-Hexadecadiene was present in the highest amount in irradiated beef, pork and chicken. Eight kinds of hydrocarbons were identified from irradiated chicken, among which 1,7-hexadecadiene and 8-heptadecene were detected as major compounds. The concentration of radiation-induced hydrocarbons was relatively constant over 16 weeks.

  13. Development of detection methods for irradiated foods

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jae Seung; Kim, Chong Ki; Lee, Hae Jung [Korea Atomic Energy Research Insitiute, Taejon (Korea, Republic of); Kim, Kyong Su [Chosun University, Kwangju (Korea, Republic of)

    1999-04-01

    To identify irradiated foods, studies have been carried out with electron spin resonance (ESR) spectroscopy on bone-containing foods, such as chicken, pork, and beef. The intensity of the signal induced in bones increased linearly with irradiation doses in the range of 1.0 kGy to 5.0 kGy, and it was possible to distinguish between samples given low and high doses of irradiation. The stability of the signal over 6 weeks made the method ideal for quick and easy identification of irradiated meats. The analysis of DNA damage in single cells by agarose gel electrophoresis (DNA 'comet assay') can also be used to detect irradiated food. All the samples irradiated with over 0.3 kGy could be identified as irradiated from the tail lengths of their comets. Irradiated samples showed comets with long tails, and the tail length of the comets increased with the dose, while unirradiated samples showed no or very short tails. Based on these results, the DNA 'comet assay' might be applied to the detection of irradiated grains as a simple, low-cost and rapid screening test. When fats are irradiated, hydrocarbons containing one or two fewer carbon atoms are formed from the parent fatty acids. The major hydrocarbons in irradiated beef, pork and chicken were 1,7-hexadecadiene and 8-heptadecene, originating from oleic acid. 1,7-Hexadecadiene was present in the highest amount in irradiated beef, pork and chicken. Eight kinds of hydrocarbons were identified from irradiated chicken, among which 1,7-hexadecadiene and 8-heptadecene were detected as major compounds. The concentration of radiation-induced hydrocarbons was relatively constant over 16 weeks.
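    The linear dose-response of the ESR signal described in the two records above can be sketched as an ordinary least-squares line. The intensity values below are hypothetical, chosen only to illustrate the linear model; they are not measured data from these studies.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

doses = [1.0, 2.0, 3.0, 4.0, 5.0]      # kGy, matching the reported range
signal = [2.1, 4.0, 6.2, 7.9, 10.1]    # hypothetical ESR intensities (a.u.)
m, b = linear_fit(doses, signal)
print(round(m, 2))  # 1.99 -- signal grows roughly linearly with dose
```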

  14. Blind Methods for Detecting Image Fakery

    Czech Academy of Sciences Publication Activity Database

    Mahdian, Babak; Saic, Stanislav

    2010-01-01

    Roč. 25, č. 4 (2010), s. 18-24 ISSN 0885-8985 R&D Projects: GA ČR GA102/08/0470 Institutional research plan: CEZ:AV0Z10750506 Keywords : Image forensics * Image Fakery * Forgery detection * Authentication Subject RIV: BD - Theory of Information Impact factor: 0.179, year: 2010 http://library.utia.cas.cz/separaty/2010/ZOI/saic-0343316.pdf

  15. Accounting for Model Uncertainties Using Reliability Methods - Application to Carbon Dioxide Geologic Sequestration System. Final Report

    International Nuclear Information System (INIS)

    Mok, Chin Man; Doughty, Christine; Zhang, Keni; Pruess, Karsten; Kiureghian, Armen; Zhang, Miao; Kaback, Dawn

    2010-01-01

    A new computer code, CALRELTOUGH, which uses reliability methods to incorporate parameter sensitivity and uncertainty analysis into subsurface flow and transport models, was developed by Geomatrix Consultants, Inc. in collaboration with Lawrence Berkeley National Laboratory and the University of California at Berkeley. The CALREL reliability code was developed at the University of California at Berkeley for geotechnical applications, and the TOUGH family of codes was developed at Lawrence Berkeley National Laboratory for subsurface flow and transport applications. The integration of the two codes provides a new approach to dealing with uncertainties in flow and transport modeling of the subsurface, such as uncertainties associated with hydrogeological parameters, boundary conditions, and initial conditions, using data from site characterization and monitoring for conditioning. The new code enables computation of the reliability of a system and the components that make up the system, instead of calculating the complete probability distributions of model predictions at all locations at all times. The new CALRELTOUGH code has tremendous potential to advance subsurface understanding for a variety of applications including subsurface energy storage, nuclear waste disposal, carbon sequestration, extraction of natural resources, and environmental remediation. The new code was tested on a carbon sequestration problem as part of the Phase I project. Phase II was not awarded.

  16. HPLC ‘Multi-Analyte’ Detection Method

    Energy Technology Data Exchange (ETDEWEB)

    Dudar, E. [Plant Protection & Soil Conservation Service of Budapest, Budapest (Hungary)

    2009-07-15

    The application of multi-analyte methods for pesticides carrying chromophoric structures by HPLC is described. Details are given on the materials and methods used. Recorded UV spectra of active substances are presented for allowing the verification of purity and the confirmation of substances eluting from the HPLC column. (author)

  17. A reliable method for reconstituting thymectomized, lethally irradiated guinea pigs with bone marrow cells

    International Nuclear Information System (INIS)

    Terata, N.; Tanio, Y.; Zbar, B.

    1984-01-01

    The authors developed a reliable method for reconstituting thymectomized, lethally irradiated guinea pigs. Injection of 2.5-10 x 10^7 syngeneic bone marrow cells into adult thymectomized, lethally irradiated guinea pigs produced survival of 46-100% of treated animals. Gentamycin sulfate (5 mg/kg of body weight) for 10 days was required for optimal results, and acidified drinking water (pH 2.5) also appeared to be required. Thymectomized, lethally irradiated, bone marrow-reconstituted ('B') guinea pigs had an impaired ability to develop delayed cutaneous hypersensitivity to mycobacterial antigens and cutaneous basophil hypersensitivity to keyhole limpet hemocyanin; proliferative responses to phytohemagglutinin were also impaired. (Auth.)

  18. Radiologic identification of disaster victims: A simple and reliable method using CT of the paranasal sinuses

    International Nuclear Information System (INIS)

    Ruder, Thomas D.; Kraehenbuehl, Markus; Gotsmy, Walther F.; Mathier, Sandra; Ebert, Lars C.; Thali, Michael J.; Hatch, Gary M.

    2012-01-01

    Objective: To assess the reliability of radiologic identification using visual comparison of ante mortem and post mortem paranasal sinus computed tomography (CT). Subjects and methods: The study was approved by the responsible justice department and university ethics committee. Four blinded readers with varying radiological experience separately compared 100 post mortem head CTs to 25 ante mortem head CTs with the goal of identifying as many matching pairs as possible (out of 23 possible matches). Sensitivity, specificity, and positive and negative predictive values were calculated for all readers. The chi-square test was applied to establish whether there was a significant difference in sensitivity between radiologists and non-radiologists. Results: For all readers, sensitivity was 83.7%, specificity was 100.0%, negative predictive value (NPV) was 95.4%, positive predictive value (PPV) was 100.0%, and accuracy was 96.3%. For radiologists, sensitivity was 97.8%, NPV was 99.4%, and accuracy was 99.5%. For non-radiologists, average sensitivity was 69.6%, NPV was 91.7%, and accuracy was 93.0%. Radiologists achieved a significantly higher sensitivity (p < 0.01) than non-radiologists. Conclusions: Visual comparison of ante mortem and post mortem CT of the head is a robust and reliable method for identifying unknown decedents, particularly in regard to positive matches. The sensitivity and NPV of the method depend on the reader's experience.
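    The diagnostic metrics reported above all derive from one confusion matrix. A hedged sketch of the standard definitions; the counts below are hypothetical, chosen only so the example reproduces the headline sensitivity and specificity figures, and are not the study's raw data.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard metrics from true/false positive and negative counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts: 77 of 92 true matches found, no false matches declared.
m = diagnostic_metrics(tp=77, fp=0, tn=308, fn=15)
print(round(100 * m["sensitivity"], 1))  # 83.7
print(round(100 * m["specificity"], 1))  # 100.0
```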

  19. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org).

  20. Accuracy and Reliability of the Klales et al. (2012) Morphoscopic Pelvic Sexing Method.

    Science.gov (United States)

    Lesciotto, Kate M; Doershuk, Lily J

    2018-01-01

    Klales et al. (2012) devised an ordinal scoring system for the morphoscopic pelvic traits described by Phenice (1969) and used for sex estimation of skeletal remains. The aim of this study was to test the accuracy and reliability of the Klales method using a large sample from the Hamann-Todd collection (n = 279). Two observers were blinded to sex, ancestry, and age and used the Klales et al. method to estimate the sex of each individual. Sex was correctly estimated for females with over 95% accuracy; however, the male allocation accuracy was approximately 50%. Weighted Cohen's kappa and intraclass correlation coefficient analysis for evaluating intra- and interobserver error showed moderate to substantial agreement for all traits. Although each trait can be reliably scored using the Klales method, low accuracy rates and high sex bias indicate better trait descriptions and visual guides are necessary to more accurately reflect the range of morphological variation. © 2017 American Academy of Forensic Sciences.
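
    The interobserver statistic used above, weighted Cohen's kappa, penalizes disagreements on an ordinal trait scale in proportion to their distance. A minimal linear-weight implementation (the example scores below are hypothetical, not the study's data; it assumes the raters use more than one category) might look like:

```python
def linear_weighted_kappa(rater1, rater2, n_categories):
    """Cohen's kappa with linear weights for ordinal scores 0..n_categories-1."""
    n = len(rater1)
    obs = [[0.0] * n_categories for _ in range(n_categories)]
    for a, b in zip(rater1, rater2):
        obs[a][b] += 1.0 / n                      # observed proportion matrix
    row = [sum(obs[i]) for i in range(n_categories)]
    col = [sum(obs[i][j] for i in range(n_categories)) for j in range(n_categories)]
    disagree_obs = disagree_exp = 0.0
    for i in range(n_categories):
        for j in range(n_categories):
            w = abs(i - j) / (n_categories - 1)   # linear disagreement weight
            disagree_obs += w * obs[i][j]
            disagree_exp += w * row[i] * col[j]   # chance-expected disagreement
    return 1.0 - disagree_obs / disagree_exp
```

    Kappa is 1 for perfect agreement, 0 for chance-level agreement, and negative when raters systematically disagree, which is why "moderate to substantial agreement" corresponds to mid-range positive values.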

  1. The Global Optimal Algorithm of Reliable Path Finding Problem Based on Backtracking Method

    Directory of Open Access Journals (Sweden)

    Liang Shen

    2017-01-01

    Full Text Available There is a growing interest in finding a global optimal path in transportation networks, particularly when the network suffers from unexpected disturbance. This paper studies the problem of finding a global optimal path that guarantees a given probability of arriving on time in a network with uncertainty, in which the travel time is stochastic instead of deterministic. Traditional path finding methods based on least expected travel time cannot capture the network user’s risk-taking behaviors in path finding. To overcome this limitation, reliable path finding algorithms have been proposed, but convergence to the global optimum is seldom addressed in the literature. This paper integrates the K-shortest path algorithm into the backtracking method to propose a new path finding algorithm under uncertainty. The global optimum of the proposed method can be guaranteed. Numerical examples are conducted to demonstrate the correctness and efficiency of the proposed algorithm.
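
    The notion of path reliability used above, the probability of arriving within a time budget, can be sketched for the simple case of independent, normally distributed link travel times. This is an illustrative model only, not the paper's algorithm (which couples a K-shortest-path search with backtracking to guarantee the global optimum):

```python
import math

def on_time_probability(path_links, time_budget):
    """P(arrive within budget) for a path whose links are independent
    (mean, variance) Normal travel times; the path total is then Normal."""
    mean = sum(m for m, v in path_links)
    var = sum(v for m, v in path_links)
    z = (time_budget - mean) / math.sqrt(var)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # Normal CDF

def most_reliable_path(candidate_paths, time_budget):
    """Pick the candidate (e.g. from a K-shortest-path enumeration) that
    maximizes the on-time arrival probability."""
    return max(candidate_paths, key=lambda p: on_time_probability(p, time_budget))
```

    Note how two paths with equal expected travel time can differ in reliability: with a budget above the mean, the lower-variance path wins, which is exactly the risk-taking behavior least-expected-time methods miss.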

  2. Response and reliability analysis of nonlinear uncertain dynamical structures by the probability density evolution method

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri

    2016-01-01

    The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics...... of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the dimensional curse of traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov......–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these need to be relatively low. In order to handle this problem an approach is suggested, which...

  3. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
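
    The information-entropy part of such an evaluation is commonly done with the entropy-weight method, in which indexes whose values vary more across candidate schemes carry more information and receive larger weights. This is a generic sketch of that standard technique, with a hypothetical index matrix; the paper's exact matrices and score formulas are not reproduced here:

```python
import math

def entropy_weights(index_matrix):
    """Entropy-weight method: rows are candidate schemes, columns are
    evaluation indexes (all values positive)."""
    n_schemes = len(index_matrix)
    divergences = []
    for j in range(len(index_matrix[0])):
        column = [row[j] for row in index_matrix]
        total = sum(column)
        entropy = 0.0
        for x in column:
            p = x / total
            if p > 0:
                entropy -= p * math.log(p)
        entropy /= math.log(n_schemes)            # normalize to [0, 1]
        divergences.append(1.0 - entropy)         # information content
    total_div = sum(divergences)
    return [d / total_div for d in divergences]
```

    An index that takes the same value for every scheme has maximum entropy and therefore zero weight: it cannot help discriminate between schemes.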

  4. Calculation of noninformative prior of reliability parameter and initiating event frequency with Jeffreys method

    International Nuclear Information System (INIS)

    He Jie; Zhang Binbin

    2013-01-01

    In the probabilistic safety assessment (PSA) of nuclear power plants, there are few historical records on some initiating event frequencies or component failures in industry. In order to determine the noninformative priors of such reliability parameters and initiating event frequencies, the Jeffreys method in Bayesian statistics was employed. The mathematical mechanism of the Jeffreys prior and the simplified constrained noninformative distribution (SCNID) were elaborated in this paper. The Jeffreys noninformative formulas and the credible intervals of the Gamma-Poisson and Beta-Binomial models were introduced. As an example, the small break loss-of-coolant accident (SLOCA) was employed to show the application of the Jeffreys prior in determining an initiating event frequency. The result shows that the Jeffreys method is an effective method for noninformative prior calculation. (authors)
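
    The Jeffreys update described above has a simple closed form for the two models mentioned: for a Poisson event count x observed over exposure time T, the posterior is Gamma(x + 0.5, T); for x failures in n demands, it is Beta(x + 0.5, n - x + 0.5). A minimal sketch:

```python
def jeffreys_poisson_posterior(events, exposure_time):
    """Jeffreys noninformative prior for a Poisson rate; after observing
    `events` in `exposure_time`, the posterior is
    Gamma(events + 0.5, exposure_time).  Returns (shape, rate, mean)."""
    shape = events + 0.5
    rate = exposure_time
    return shape, rate, shape / rate

def jeffreys_binomial_posterior(failures, demands):
    """Jeffreys prior for a failure probability is Beta(0.5, 0.5); after
    `failures` in `demands`, the posterior is Beta(f + 0.5, n - f + 0.5)."""
    a = failures + 0.5
    b = demands - failures + 0.5
    return a, b, a / (a + b)
```

    For example, 2 small-break LOCA-like events in 100 reactor-years (hypothetical numbers) give a posterior mean frequency of 2.5/100 = 0.025 per year; credible intervals then come from the corresponding Gamma or Beta quantiles.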

  5. A summary of methods of predicting reliability life of nuclear equipment with small samples

    International Nuclear Information System (INIS)

    Liao Weixian

    2000-03-01

    Some nuclear equipment is manufactured in small batches, e.g., 1-3 sets. Its service life may be very difficult to determine experimentally for economic and technical reasons. A method combining theoretical analysis with material tests to predict the life of equipment is put forward, based on the fact that equipment consists of parts or elements which are made of different materials. The whole life of an equipment part consists of the crack forming life (i.e., the fatigue life or the damage accumulation life) and the crack extension life. Methods of predicting machine life have been systematically summarized, with the emphasis on those which use theoretical analysis to substitute for large scale prototype experiments. Meanwhile, methods and steps of predicting reliability life have been described, taking into consideration the randomness of various variables and parameters in engineering. Finally, the latest advances and trends in machine life prediction are discussed

  6. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.

  7. Reliability Quantification Method for Safety Critical Software Based on a Finite Test Set

    International Nuclear Information System (INIS)

    Shin, Sung Min; Kim, Hee Eun; Kang, Hyun Gook; Lee, Seung Jun

    2014-01-01

    Software inside a digitalized system plays a very important role because it may cause irreversible consequences and affect the whole system as a common cause failure. However, test-based reliability quantification methods for some safety critical software have limitations caused by difficulties in developing input sets in the form of a trajectory, which is a series of successive values of variables. To address these limitations, this study proposed another method which conducts the test using combinations of single values of variables. To substitute for the trajectory form of input using combinations of variables, the possible range of each variable should be identified. For this purpose, the assigned range of each variable, logical relations between variables, plant dynamics under certain situations, and the characteristics of obtaining information from the digital device are considered. The feasibility of the proposed method was confirmed through an application to the Reactor Protection System (RPS) software trip logic
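
    The core idea, replacing trajectory inputs with combinations of single variable values, amounts to enumerating the Cartesian product of the identified variable ranges. A minimal sketch (the variable names and ranges below are hypothetical, not the RPS trip-logic variables):

```python
import itertools

def combination_test_cases(variable_ranges):
    """Enumerate test inputs as the Cartesian product of single values,
    instead of full input trajectories.  `variable_ranges` maps each
    variable name to the list of values identified as possible for it."""
    names = list(variable_ranges)
    for values in itertools.product(*(variable_ranges[n] for n in names)):
        yield dict(zip(names, values))

# Hypothetical trip-logic variables and their identified ranges:
cases = list(combination_test_cases({"pressure": ["low", "high"],
                                     "temperature": ["cold", "warm", "hot"]}))
```

    The case count is the product of the range sizes, which is why the paper spends its effort on narrowing each variable's possible range before generating combinations.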

  8. Numerical methods for reliability and safety assessment: multiscale and multiphysics systems

    CERN Document Server

    Hami, Abdelkhalak

    2015-01-01

    This book offers unique insight on structural safety and reliability by combining computational methods that address multiphysics problems, involving multiple equations describing different physical phenomena, and multiscale problems, involving discrete sub-problems that together describe important aspects of a system at multiple scales. The book examines a range of engineering domains and problems using dynamic analysis, nonlinear methods, error estimation, finite element analysis, and other computational techniques. This book also: introduces novel numerical methods; illustrates new practical applications; examines recent engineering applications; presents up-to-date theoretical results; and offers perspective relevant to a wide audience, including teaching faculty/graduate students, researchers, and practicing engineers

  9. Approximation of the Monte Carlo Sampling Method for Reliability Analysis of Structures

    Directory of Open Access Journals (Sweden)

    Mahdi Shadab Far

    2016-01-01

    Full Text Available Structural load types, on the one hand, and structural capacity to withstand these loads, on the other hand, are of a probabilistic nature as they cannot be calculated and presented in a fully deterministic way. As such, the past few decades have witnessed the development of numerous probabilistic approaches towards the analysis and design of structures. Among the conventional methods used to assess structural reliability, the Monte Carlo sampling method has proved to be very convenient and efficient. However, it does suffer from certain disadvantages, the biggest one being the requirement of a very large number of samples to handle small probabilities, leading to a high computational cost. In this paper, a simple algorithm was proposed to estimate low failure probabilities using a small number of samples in conjunction with the Monte Carlo method. This revised approach was then presented in a step-by-step flowchart, for the purpose of easy programming and implementation.
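
    The brute-force estimator the paper builds on draws samples of capacity and demand and counts limit-state violations. A minimal sketch with hypothetical normal distributions (the paper's contribution is precisely reducing the sample count this crude version needs for small probabilities):

```python
import random

def mc_failure_probability(n_samples, seed=12345):
    """Crude Monte Carlo estimate of P(resistance < load) for a single
    limit state with hypothetical Normal capacity and demand."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        resistance = rng.gauss(10.0, 1.0)   # capacity ~ N(10, 1), hypothetical
        load = rng.gauss(7.0, 1.0)          # demand  ~ N(7, 1), hypothetical
        if resistance < load:
            failures += 1
    return failures / n_samples
```

    For these distributions the exact failure probability is about 0.017; as the target probability shrinks, the relative error of this estimator grows unless the sample count grows inversely, which is the computational cost the paper's revised approach targets.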

  10. Distance Measurement Methods for Improved Insider Threat Detection

    Directory of Open Access Journals (Sweden)

    Owen Lo

    2018-01-01

    Full Text Available Insider threats are a considerable problem within cyber security, and it is often difficult to detect these threats using signature detection. Increasingly, machine learning can provide a solution, but these methods often fail to take into account changes in the behaviour of users. This work builds on a published method of detecting insider threats, applies a hidden Markov method to a CERT data set (CERT r4.2, and analyses a number of distance vector methods (Damerau–Levenshtein distance, cosine distance, and Jaccard distance in order to detect changes of behaviour, which are shown to have success in determining different insider threats.
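
    Two of the distance measures named above are straightforward to state over symbolic action sequences: Jaccard distance compares the sets of actions two sequences contain, while cosine distance compares their action-frequency vectors. A minimal sketch (the action labels are hypothetical):

```python
from collections import Counter
import math

def jaccard_distance(seq_a, seq_b):
    """1 - |A intersect B| / |A union B| over the sets of distinct actions."""
    sa, sb = set(seq_a), set(seq_b)
    return 1.0 - len(sa & sb) / len(sa | sb)

def cosine_distance(seq_a, seq_b):
    """1 - cosine similarity of the action-frequency vectors."""
    ca, cb = Counter(seq_a), Counter(seq_b)
    dot = sum(ca[k] * cb[k] for k in ca)
    norm_a = math.sqrt(sum(v * v for v in ca.values()))
    norm_b = math.sqrt(sum(v * v for v in cb.values()))
    return 1.0 - dot / (norm_a * norm_b)
```

    Comparing a user's recent action sequence against their own history with such distances gives a per-user drift score; a sudden jump in distance is the behaviour change the detector looks for.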

  11. Detection of irradiated meats by hydrocarbon method

    International Nuclear Information System (INIS)

    Goto, Michiko; Miyakawa, Hiroyuki; Fujinuma, Kenji; Ozawa, Hideki

    2005-01-01

    Meats, for example, lamb, razorback, wild duck and turkey, were irradiated with gamma rays, and the amounts of hydrocarbons formed from fatty acids were measured. Since C20:0 was found in wild duck and turkey, C1-18:1 was recommended as the internal standard. Good correlation was found between the amount of hydrocarbons and the dose of gamma irradiation. This study shows that such hydrocarbons induced by the radiation procedure as C1,7-16:2, C8-17:1, C1-14:1, and C15:0 may make it possible to detect irradiated lamb, razorback, wild duck and turkey. (author)

  12. Using graph models for evaluating the reliability of in-core monitoring systems by the method of imitating simulation

    International Nuclear Information System (INIS)

    Golovanov, M.N.; Zyuzin, N.N.; Levin, G.L.; Chesnokov, A.N.

    1987-01-01

    An approach for estimating the reliability factors of complex reserved systems at early stages of development using the method of imitating simulation is considered. Different types of models, with their merits and drawbacks, are given. The features of in-core monitoring systems and the advisability of applying graph models and elements of graph theory for estimating the reliability of such systems are shown. The results of an investigation of the reliability factors of the reactor monitoring, control and core local protection subsystem are presented

  13. A fast method for calculating reliable event supports in tree reconciliations via Pareto optimality.

    Science.gov (United States)

    To, Thu-Hien; Jacox, Edwin; Ranwez, Vincent; Scornavacca, Celine

    2015-11-14

    Given a gene and a species tree, reconciliation methods attempt to retrieve the macro-evolutionary events that best explain the discrepancies between the two tree topologies. The DTL parsimonious approach searches for a most parsimonious reconciliation between a gene tree and a (dated) species tree, considering four possible macro-evolutionary events (speciation, duplication, transfer, and loss) with specific costs. Unfortunately, many events are erroneously predicted due to errors in the input trees, inappropriate input cost values or because of the existence of several equally parsimonious scenarios. It is thus crucial to provide a measure of the reliability for predicted events. It has been recently proposed that the reliability of an event can be estimated via its frequency in the set of most parsimonious reconciliations obtained using a variety of reasonable input cost vectors. To compute such a support, a straightforward but time-consuming approach is to generate the costs slightly departing from the original ones, independently compute the set of all most parsimonious reconciliations for each vector, and combine these sets a posteriori. Another proposed approach uses Pareto-optimality to partition cost values into regions which induce reconciliations with the same number of DTL events. The support of an event is then defined as its frequency in the set of regions. However, often, the number of regions is not large enough to provide reliable supports. We present here a method to compute efficiently event supports via a polynomial-sized graph, which can represent all reconciliations for several different costs. Moreover, two methods are proposed to take into account alternative input costs: either explicitly providing an input cost range or allowing a tolerance for the over cost of a reconciliation. Our methods are faster than the region based method, substantially faster than the sampling-costs approach, and have a higher event-prediction accuracy on

  14. Validity and reliability of the Thai version of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU)

    Directory of Open Access Journals (Sweden)

    Pipanmekaporn T

    2014-05-01

    Full Text Available Tanyong Pipanmekaporn,1 Nahathai Wongpakaran,2 Sirirat Mueankwan,3 Piyawat Dendumrongkul,2 Kaweesak Chittawatanarat,3 Nantiya Khongpheng,3 Nongnut Duangsoy3; 1Department of Anesthesiology, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 2Department of Psychiatry, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 3Division of Surgical Critical Care and Trauma, Department of Surgery, Chiang Mai University Hospital, Chiang Mai, Thailand. Purpose: The purpose of this study was to determine the validity and reliability of the Thai version of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU, when compared to the diagnoses made by delirium experts. Patients and methods: This was a cross-sectional study conducted in both surgical intensive care and subintensive care units in Thailand between February and June 2011. Seventy patients aged 60 years or older who had been admitted to the units were enrolled into the study within the first 48 hours of admission. Each patient was randomly assessed as to whether they had delirium by a nurse using the Thai version of the CAM-ICU algorithm (Thai CAM-ICU) or by a delirium expert using the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision. Results: The prevalence of delirium was found to be 18.6% (n=13) by the delirium experts. The sensitivity of the Thai CAM-ICU’s algorithms was found to be 92.3% (95% confidence interval [CI] = 64.0%-99.8%), while the specificity was 94.7% (95% CI = 85.4%-98.9%). The instrument displayed good interrater reliability (Cohen’s κ=0.81; 95% CI = 0.64-0.99). The time taken to complete the Thai CAM-ICU was 1 minute (interquartile range, 1-2 minutes). Conclusion: The Thai CAM-ICU demonstrated good validity, reliability, and ease of use when diagnosing delirium in a surgical intensive care unit setting. The use of this diagnostic tool should be encouraged for daily, routine use, so as to promote the early detection

  15. Anomaly Detection in Gas Turbine Fuel Systems Using a Sequential Symbolic Method

    Directory of Open Access Journals (Sweden)

    Fei Li

    2017-05-01

    Full Text Available Anomaly detection plays a significant role in helping gas turbines run reliably and economically. Considering collective anomalous data and both the sensitivity and robustness of the anomaly detection model, a sequential symbolic anomaly detection method is proposed and applied to the gas turbine fuel system. A structural Finite State Machine is used to evaluate the posterior probabilities of observing symbolic sequences and the most probable state sequences they may occupy. Hence an estimation-based model and a decoding-based model are used to identify anomalies in two different ways. Experimental results indicate that both models have ideal overall performance, but the estimation-based model has strong robustness, whereas the decoding-based model has strong accuracy, particularly in a certain range of sequence lengths. Therefore, the proposed method can complement existing symbolic dynamic analysis-based anomaly detection methods well, especially in the gas turbine domain.
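
    A simple stand-in for the estimation-based idea is a first-order Markov chain trained on normal symbolic sequences, flagging sequences whose per-transition log-likelihood is low. This is an illustrative sketch under that simplification, not the paper's structural Finite State Machine:

```python
from collections import defaultdict
import math

def train_transition_model(normal_sequences, alphabet, smoothing=0.5):
    """Estimate first-order transition probabilities from normal behaviour,
    with additive smoothing so unseen transitions keep nonzero probability."""
    counts = defaultdict(lambda: defaultdict(float))
    for seq in normal_sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1.0
    model = {}
    for a in alphabet:
        total = sum(counts[a].values()) + smoothing * len(alphabet)
        model[a] = {b: (counts[a][b] + smoothing) / total for b in alphabet}
    return model

def avg_log_likelihood(seq, model):
    """Per-transition log-likelihood; low values flag anomalous sequences."""
    n_transitions = max(len(seq) - 1, 1)
    return sum(math.log(model[a][b]) for a, b in zip(seq, seq[1:])) / n_transitions
```

    Normalizing by the number of transitions keeps the score comparable across sequence lengths, which matters given the length-dependent behaviour the paper reports.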

  16. Rootkits. Methods of detecting and removing

    International Nuclear Information System (INIS)

    Lagutina, A.M.; Bogdanovich, A.A.; Ivanov, M.A.

    2012-01-01

    The problems connected with the threat of infection of computer systems by rootkits have been examined, and methods for providing protection against this type of malicious software have been analyzed

  17. Knowledge representation methods for early failure detection

    International Nuclear Information System (INIS)

    Scherer, K.P.; Stiller, P.

    1990-01-01

    To supervise technical processes like nuclear power plants, it is very important to detect failure modes at an early stage. At the nuclear research center at Karlsruhe an expert system is being developed, embedded in a network of autonomous computers which are used for intelligent preprocessing. Events, process data and actual parameter values are stored in slots of special frames in the knowledge base of the expert system. Both rule-based and fact-based knowledge representations are employed to generate cause-consequence chains of failure states. Through on-line surveillance of the reactor process, the slots of the frames are dynamically updated. Immediately after the evaluation, the inference engine starts in the special domain experts (triggered by metarules from a manager) and detects the corresponding failure or anomaly state. By matching the members of the chain and consulting a catalogue of instructions and messages telling the operator what to do, future failure states can be estimated and their propagation can be prevented. That amounts to qualitative failure prediction based on the cause-consequence chains in the static part of the knowledge base. Also, a time series of physical data can be used to predict the future process state analytically and to continue such a theoretical propagation by matching the cause-consequence chain

  18. Marine Biotoxins: Occurrence, Toxicity, and Detection Methods

    Science.gov (United States)

    Asakawa, M.

    2017-04-01

    This review summarizes the role of marine organisms as vectors of marine biotoxins, and discusses the need for surveillance to protect public health and ensure the quality of seafood. I. Paralytic shellfish poison (PSP) and PSP-bearing organisms - PSP is produced by toxic dinoflagellate species belonging to the genera Alexandrium, Gymnodinium, and Pyrodinium. Traditionally, PSP monitoring programs have only considered filter-feeding molluscs that concentrate these toxic algae; however, increasing attention is now being paid to higher-order predators that carry PSP, such as carnivorous gastropods and crustaceans. II. Tetrodotoxin (TTX) and TTX-bearing organisms - TTX is the most common natural marine toxin that causes food poisonings in Japan, and poses a serious public health risk. TTX was long believed to be present only in pufferfish. However, TTX was detected in the eggs of the California newt Taricha torosa in 1964, and since then it has been detected in a wide variety of species belonging to several different phyla. In this study, the main toxic components in the highly toxic ribbon worm Cephalothrix simula and the greater blue-ringed octopus Hapalochlaena lunulata from Japan were purified and analysed.

  19. Polarization sensitive optical coherence tomography detection method

    International Nuclear Information System (INIS)

    Colston, B W; DaSilva, L B; Everett, M J; Featherstone, J D B; Fried, D; Ragadio, J N; Sathyam, U S.

    1999-01-01

    This study demonstrates the potential of polarization sensitive optical coherence tomography (PS-OCT) for non-invasive in vivo detection and characterization of early, incipient caries lesions. PS-OCT generates cross-sectional images of biological tissue while measuring the effect of the tissue on the polarization state of incident light. Clear discrimination between regions of normal and demineralized enamel is first shown in PS-OCT images of bovine enamel blocks containing well-characterized artificial lesions. High-resolution, cross-sectional images of extracted human teeth are then generated that clearly discriminate between the normal and carious regions on both the smooth and occlusal surfaces. Regions of the teeth that appeared to be demineralized in the PS-OCT images were verified using histological thin sections examined under polarized light microscopy. The PS-OCT system discriminates between normal and carious regions by measuring the polarization state of the back-scattered 1310 nm light, which is affected by the state of demineralization of the enamel. Demineralization of enamel increases the scattering coefficient, thus depolarizing the incident light. This study shows that PS-OCT has great potential for the detection, characterization, and monitoring of incipient caries lesions

  20. Thermal History Devices, Systems For Thermal History Detection, And Methods For Thermal History Detection

    KAUST Repository

    Caraveo Frescas, Jesus Alfonso; Alshareef, Husam N.

    2015-01-01

    Embodiments of the present disclosure include nanowire field-effect transistors, systems for temperature history detection, methods for thermal history detection, a matrix of field effect transistors, and the like.

  1. Thermal History Devices, Systems For Thermal History Detection, And Methods For Thermal History Detection

    KAUST Repository

    Caraveo Frescas, Jesus Alfonso

    2015-05-28

    Embodiments of the present disclosure include nanowire field-effect transistors, systems for temperature history detection, methods for thermal history detection, a matrix of field effect transistors, and the like.

  2. Development and Establishment of Detection Method of Irradiated Foods

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Lee, Ju Woon; Kim, Dong Ho; Jo, Cheo Run; Kim, Jang Ho; Kim, Kyong Su

    2004-12-01

    The present project was related to the development and establishment of detection techniques for the safety management of gamma-irradiated food, and was particularly conducted for the establishment of a standard detection method for gamma-irradiated dried spices and raw materials, dried meat and fish powder for processed foods, bean paste powder, red pepper paste powder, soy sauce powder, and starch for flavoring ingredients described in sections 3, 6, and 7 of the Korean Food Standard. Since the approval of gamma-irradiated food items will be expanded due to the international trend regarding gamma-irradiated food, it was concluded that establishing detailed detection methods for each food group is not efficient for the enactment and enforcement of related regulations. For this reason, in order to establish the standard detection method, a detection system for gamma-irradiated food suitable for domestic operation was studied using comparative analysis of domestic and foreign research data classified by items and methods, with the European Standard as a reference. According to the comparative analyses of domestic and foreign research data and regulations on the detection of gamma-irradiated food, it was concluded to be desirable that the optimal detection method should be decided after principal detection tests such as physical, chemical, and biological detection methods are established as standard methods, and that specific descriptions such as the pre-treatment of raw materials, test methods, and the evaluation of results should be prescribed separately

  3. Development and Establishment of Detection Method of Irradiated Foods

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Myung Woo; Lee, Ju Woon; Kim, Dong Ho; Jo, Cheo Run; Kim, Jang Ho; Kim, Kyong Su

    2004-12-15

    The present project was related to the development and establishment of detection techniques for the safety management of gamma-irradiated food, and was particularly conducted for the establishment of a standard detection method for gamma-irradiated dried spices and raw materials, dried meat and fish powder for processed foods, bean paste powder, red pepper paste powder, soy sauce powder, and starch for flavoring ingredients described in sections 3, 6, and 7 of the Korean Food Standard. Since the approval of gamma-irradiated food items will be expanded due to the international trend regarding gamma-irradiated food, it was concluded that establishing detailed detection methods for each food group is not efficient for the enactment and enforcement of related regulations. For this reason, in order to establish the standard detection method, a detection system for gamma-irradiated food suitable for domestic operation was studied using comparative analysis of domestic and foreign research data classified by items and methods, with the European Standard as a reference. According to the comparative analyses of domestic and foreign research data and regulations on the detection of gamma-irradiated food, it was concluded to be desirable that the optimal detection method should be decided after principal detection tests such as physical, chemical, and biological detection methods are established as standard methods, and that specific descriptions such as the pre-treatment of raw materials, test methods, and the evaluation of results should be prescribed separately.

  4. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    Science.gov (United States)

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
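
    The most common way such values are computed from calibration data, and one of the calculation methods the comparison above draws on, is the ICH Q2(R1)-style formula LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of blank responses and S is the calibration slope:

```python
def lod_loq(sd_blank, slope):
    """ICH Q2(R1)-style estimates from a calibration line:
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is the standard
    deviation of blank responses and S is the calibration slope."""
    lod = 3.3 * sd_blank / slope
    loq = 10.0 * sd_blank / slope
    return lod, loq
```

    Because σ can also be taken from the residuals or the intercept of the regression, different choices yield different LOD/LOQ values for the same data, which is exactly the inconsistency with LLOQ that the study highlights.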

  5. Implementation of the SFRA method as a valuable tool for detection of power transformer active part deformation

    Directory of Open Access Journals (Sweden)

    Milić Saša D.

    2014-01-01

    Full Text Available The paper presents the SFRA (Sweep Frequency Response Analysis method for analyzing the frequency response of transformer windings in order to identify potential defects in the geometry of the core and windings. The most frequent problems recognized by SFRA are: core shift, shorted or open windings, unwanted contact between core and ground, etc. A comparative analysis of this method with conventional methods is carried out on an in situ transformer under real, harsh industrial conditions. The benefits of the SFRA method are high reliability and repeatability of the measurements. The method belongs to the non-invasive category. Due to its high reliability and repeatability of measurement, it is very suitable for detecting changes in the geometry of the coil and the core during prophylactic field testing, or after transporting the transformer.

  6. Procedures and methods that increase reliability and reproducibility of the transplanted kidney perfusion index

    International Nuclear Information System (INIS)

    Smokvina, A.

    1994-01-01

    At different times following surgery and during various complications, 119 studies were performed on 57 patients; in many patients the studies were repeated several times. Twenty-three studies were performed in as many patients, in whom normal function of the transplanted kidney was established by other diagnostic methods and retrospective analysis. The perfusion index results obtained by the method of Hilson et al. from 1978 were compared with those obtained by my own modified method, which for calculating the index also takes into account: the time difference in the appearance of the initial portions of the artery and kidney curves; the positioning of the region of interest over the distal part of the aorta; bolus injection into an arteriovenous shunt of the forearm of small volumes of Tc-99m labelled agents with high specific activity; fast data collection at 0.5 seconds per frame; and a standard for normalization of numerical data. The reliability of the two methods, tested by a simulated time shift of the peak of the arterial curves, shows that the percentage deviation from the mean index value in the unmodified method is 2-5 times greater than in the modified method. The normal value of the perfusion index applying the modified method is 91-171. (author)

  7. Development of detection methods for irradiated foods - Detection method for radiolytic products of irradiated foods

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyong Su; Kim, Sun Min; Park, Eun Ryong; Lee, Hae Jung; Kim, Eun Ah; Jo, Jung Ok [Chosun University, Kwangju (Korea)

    1999-04-01

    Meats (beef, pork, chicken) and nuts (sesame, perilla, black sesame, peanut) were irradiated with /sup 60/Co gamma rays. The process for detecting radiation-induced hydrocarbons and 2-alkylcyclobutanones includes extraction of fat from the meat and nuts, separation of hydrocarbons and 2-alkylcyclobutanones on a florisil column, and identification by GC/MS. Concentrations of the hydrocarbons and 2-alkylcyclobutanones produced tended to increase linearly with irradiation dose in beef, pork and chicken, while the concentrations of individual radiation-induced hydrocarbons differed at the same dose level. In meat, hydrocarbons and 2-alkylcyclobutanones derived from oleic acid were found in large amounts. The concentrations of radiation-induced hydrocarbons remained relatively constant over 16 weeks. In nuts, hydrocarbons derived from oleic and linoleic acid were the major compounds, whereas the results for perilla were similar to those for meat. Radiation-induced hydrocarbons increased linearly with irradiation dose and were clearly detectable at 0.5 kGy and above. 44 refs., 30 figs., 14 tabs. (Author)
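A linear dose-response like the one reported can be verified with an ordinary least-squares fit of marker concentration against absorbed dose; the slope then estimates the marker yield per kGy. A minimal sketch (the data values in the test are hypothetical, not from the study):

```python
def fit_line(doses, concentrations):
    """Ordinary least-squares fit of concentration = a + b * dose.

    Returns (a, b): intercept and slope. A near-zero intercept and a
    positive slope are consistent with a radiation-induced marker that
    increases linearly with dose.
    """
    n = len(doses)
    mx = sum(doses) / n
    my = sum(concentrations) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(doses, concentrations))
         / sum((x - mx) ** 2 for x in doses))
    a = my - b * mx
    return a, b
```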

  8. A method of failed fuel detection

    International Nuclear Information System (INIS)

    Uchida, Shunsuke; Utamura, Motoaki; Urata, Megumu.

    1976-01-01

    Object: To keep the coolant fed to a fuel assembly below the temperature of the existing coolant, so as to detect a failed fuel with high accuracy without using a heater. Structure: Coolant from a coolant pool at the upper part of the reactor container is fed by a coolant feed system into a fuel assembly through a cap, filling and displacing the existing coolant while forming a boundary layer between the two. The fed coolant is heated by the fuel rods so that its temperature is low at the upper part and high at the lower part. The lower coolant then moves upward by the agitating action, and fission products leaking through a failure opening at the lower part of the fuel assembly are easily extracted by the sampling system. (Yoshino, Y.)

  9. Distributed gas detection system and method

    Science.gov (United States)

    Challener, William Albert; Palit, Sabarni; Karp, Jason Harris; Kasten, Ansas Matthias; Choudhury, Niloy

    2017-11-21

    A distributed gas detection system includes one or more hollow core fibers disposed in different locations, one or more solid core fibers optically coupled with the one or more hollow core fibers and configured to receive light of one or more wavelengths from a light source, and an interrogator device configured to receive at least some of the light propagating through the one or more solid core fibers and the one or more hollow core fibers. The interrogator device is configured to identify a location of a presence of a gas-of-interest by examining absorption of at least one of the wavelengths of the light in at least one of the hollow core fibers.
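Locating the gas reduces, in essence, to comparing the transmitted intensity through each hollow-core segment against the no-gas baseline via the Beer-Lambert law. The sketch below makes strong simplifying assumptions (a single wavelength, known absorptivity and path length, ideal coupling); the function names are illustrative, not the patent's terminology.

```python
import math

def gas_concentration(I0, I, absorptivity, path_length):
    """Beer-Lambert inversion: concentration from transmitted intensity.

    absorbance A = ln(I0 / I) = absorptivity * concentration * path_length
    """
    absorbance = math.log(I0 / I)
    return absorbance / (absorptivity * path_length)

def locate_gas(intensities, I0, threshold=0.05):
    """Return indices of hollow-core segments whose absorbance exceeds a
    threshold, i.e. candidate locations of the gas of interest."""
    return [i for i, I in enumerate(intensities)
            if math.log(I0 / I) > threshold]
```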

  10. A survey on the human reliability analysis methods for the design of Korean next generation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Lee, J. W.; Park, J. C.; Kwack, H. Y.; Lee, K. Y.; Park, J. K.; Kim, I. S.; Jung, K. W.

    2000-03-01

    Enhanced features achieved by applying recent domestic technologies may characterize the safety and efficiency of the KNGR (Korea Next Generation Reactor). The human-engineered interface and control room environment are expected to benefit the human aspects of the KNGR design. However, since human reliability analysis methods have not been brought up to date since THERP/SHARP, it is hard to assess the potential for human error arising from both the positive and negative effects of the design changes in the KNGR. This is a state-of-the-art report on the human reliability analysis methods potentially available for application to the KNGR design. We surveyed all technical aspects of existing HRA methods and compared them in order to derive the requirements for assessing human error potential within the KNGR design. We categorized the more than 10 methods into first and second generations, following the suggestion of Dr. Hollnagel. THERP was revisited in detail, and ATHEANA, proposed by the US NRC for advanced designs, and CREAM, proposed by Dr. Hollnagel, were reviewed and compared. We conclude that the key requirements include enhancements in the early steps of human error identification and in the quantification steps, with consideration of error shaping factors extending beyond PSFs (performance shaping factors). Utilizing the steps and approaches of ATHEANA and CREAM will help attain an appropriate HRA method for the KNGR; however, the steps and data from THERP will be retained for continuity with previous PSA activities in the KNGR design.

  11. A survey on the human reliability analysis methods for the design of Korean next generation reactor

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Lee, J. W.; Park, J. C.; Kwack, H. Y.; Lee, K. Y.; Park, J. K.; Kim, I. S.; Jung, K. W.

    2000-03-01

    Enhanced features achieved by applying recent domestic technologies may characterize the safety and efficiency of the KNGR (Korea Next Generation Reactor). The human-engineered interface and control room environment are expected to benefit the human aspects of the KNGR design. However, since human reliability analysis methods have not been brought up to date since THERP/SHARP, it is hard to assess the potential for human error arising from both the positive and negative effects of the design changes in the KNGR. This is a state-of-the-art report on the human reliability analysis methods potentially available for application to the KNGR design. We surveyed all technical aspects of existing HRA methods and compared them in order to derive the requirements for assessing human error potential within the KNGR design. We categorized the more than 10 methods into first and second generations, following the suggestion of Dr. Hollnagel. THERP was revisited in detail, and ATHEANA, proposed by the US NRC for advanced designs, and CREAM, proposed by Dr. Hollnagel, were reviewed and compared. We conclude that the key requirements include enhancements in the early steps of human error identification and in the quantification steps, with consideration of error shaping factors extending beyond PSFs (performance shaping factors). Utilizing the steps and approaches of ATHEANA and CREAM will help attain an appropriate HRA method for the KNGR; however, the steps and data from THERP will be retained for continuity with previous PSA activities in the KNGR design.

  12. Validity and reliability of the session-RPE method for quantifying training load in karate athletes.

    Science.gov (United States)

    Tabben, M; Tourny, C; Haddad, M; Chaabane, H; Chamari, K; Coquart, J B

    2015-04-24

    To test the construct validity and reliability of the session rating of perceived exertion (sRPE) method by examining the relationship between RPE and physiological parameters (heart rate: HR; blood lactate concentration: [La-]) and the correlations between sRPE and two HR-based methods for quantifying internal training load (Banister's method and Edwards's method) during a karate training camp. Eighteen elite karate athletes, ten men (age: 24.2 ± 2.3 y, body mass: 71.2 ± 9.0 kg, body fat: 8.2 ± 1.3%, height: 178 ± 7 cm) and eight women (age: 22.6 ± 1.2 y, body mass: 59.8 ± 8.4 kg, body fat: 20.2 ± 4.4%, height: 169 ± 4 cm), were included in the study. During the training camp, subjects participated in eight karate training sessions covering three training modes (4 tactical-technical, 2 technical-development, and 2 randori training), during which RPE, HR, and [La-] were recorded. Significant correlations were found between RPE and the physiological parameters (percentage of maximal HR: r = 0.75, 95% CI = 0.64-0.86; [La-]: r = 0.62, 95% CI = 0.49-0.75) and between sRPE and both HR-based methods for quantifying training load (r = 0.65-0.95). The sRPE method also showed good reliability at the same intensity across training sessions (Cronbach's α = 0.81, 95% CI = 0.61-0.92). This study demonstrates that the sRPE method is valid for quantifying internal training load and intensity in karate.
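The three load measures compared in the study have simple, widely used forms: sRPE multiplies the session RPE by its duration, Edwards' TRIMP weights time spent in five %HRmax zones by factors 1-5, and Banister's TRIMP weights duration by an exponential of the fractional heart-rate reserve. The sketch below uses the commonly cited constants; verify them against the original sources before relying on the numbers.

```python
import math

def srpe_load(rpe, duration_min):
    """Session-RPE internal load: CR-10 RPE x session duration (min),
    in arbitrary units (AU)."""
    return rpe * duration_min

def edwards_trimp(minutes_in_zone):
    """Edwards' TRIMP: minutes in each of five %HRmax zones
    (50-60, 60-70, 70-80, 80-90, 90-100), weighted 1 through 5."""
    return sum(w * t for w, t in enumerate(minutes_in_zone, start=1))

def banister_trimp(duration_min, hr_rest, hr_ex, hr_max, male=True):
    """Banister's TRIMP: duration weighted by fractional HR reserve and
    a sex-specific exponential factor (commonly cited constants)."""
    dhr = (hr_ex - hr_rest) / (hr_max - hr_rest)  # fractional HR reserve
    b = 1.92 if male else 1.67
    return duration_min * dhr * 0.64 * math.exp(b * dhr)
```

For example, a 60 min session rated 7 on the CR-10 scale gives an sRPE load of 420 AU, which the study correlates against the two HR-based measures.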

  13. Novel Methods to Enhance Precision and Reliability in Muscle Synergy Identification during Walking

    Science.gov (United States)

    Kim, Yushin; Bulea, Thomas C.; Damiano, Diane L.

    2016-01-01

    Muscle synergies are hypothesized to reflect modular control of muscle groups via descending commands sent through multiple neural pathways. Recently, the number of synergies has been reported as a functionally relevant indicator of motor control complexity in individuals with neurological movement disorders. Yet the number of synergies extracted during a given activity, e.g., gait, varies within and across studies, even for unimpaired individuals. With no standardized method for its precise determination, this variability remains unexplained, making comparisons across studies and cohorts difficult. Here, we utilize k-means clustering and intra-class and between-level correlation coefficients to precisely discriminate reliable from unreliable synergies. Electromyography (EMG) was recorded bilaterally from eight leg muscles during treadmill walking at self-selected speed. Muscle synergies were extracted from 20 consecutive gait cycles using non-negative matrix factorization. We demonstrate that the number of synergies is highly dependent on the threshold chosen for the variance accounted for by the reconstructed EMG. Beyond the use of a threshold, our method utilized a quantitative metric to reliably identify four or five synergies underpinning walking in unimpaired adults, and revealed synergies with poor reproducibility that should not be considered true synergies. We show that robust and unreliable synergies emerge similarly, emphasizing the need for careful analysis in those with pathology. PMID:27695403
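The threshold dependence noted in the abstract is easy to see with the usual variance-accounted-for (VAF) criterion: the chosen cutoff directly determines the reported number of synergies. A minimal sketch (the uncentered sum-of-squares form is one common VAF convention, and the example VAF values in the test are hypothetical):

```python
def vaf(original, reconstructed):
    """Variance accounted for: 1 - SSE/SST over an EMG matrix given as a
    list of rows (muscles x time), using uncentered total sum of squares."""
    sse = sum((o - r) ** 2
              for row_o, row_r in zip(original, reconstructed)
              for o, r in zip(row_o, row_r))
    sst = sum(o ** 2 for row in original for o in row)
    return 1.0 - sse / sst

def n_synergies(vaf_by_k, threshold=0.90):
    """Smallest number of synergies k whose reconstruction VAF meets the
    threshold -- illustrating how the answer depends on that choice."""
    for k, v in sorted(vaf_by_k.items()):
        if v >= threshold:
            return k
    return max(vaf_by_k)
```

With VAF values of 0.70, 0.85, 0.92, 0.96 for k = 1..4, a 0.90 cutoff yields three synergies but a 0.95 cutoff yields four, which is precisely the ambiguity the paper's clustering-based metric is designed to resolve.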

  14. Identification of a practical and reliable method for the evaluation of litter moisture in turkey production.

    Science.gov (United States)

    Vinco, L J; Giacomelli, S; Campana, L; Chiari, M; Vitale, N; Lombardi, G; Veldkamp, T; Hocking, P M

    2018-02-01

    1. An experiment was conducted to compare 5 different methods for the evaluation of litter moisture. 2. For litter collection and assessment, 55 farms were selected; one shed from each farm was inspected and 9 points were identified within each shed. 3. For each device used for the evaluation of litter moisture, the mean and standard deviation of wetness measures per collection point were assessed. 4. The reliability and overall consistency between the 5 instruments used to measure wetness were high (α = 0.72). 5. Measurements at three of the 9 collection points were sufficient to provide a reliable assessment of litter moisture throughout the shed. 6. Given the direct correlation between litter moisture and footpad lesions, litter moisture measurement can be used as a resource-based on-farm animal welfare indicator. 7. Among the 5 methods analysed, visual scoring is the simplest and most practical, and is therefore the best candidate for on-farm animal welfare assessment.
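The consistency figure reported (α = 0.72) is Cronbach's alpha computed across the 5 instruments, treated as items scored over the same collection points. A minimal sketch of that computation (the example scores in the test are hypothetical, not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items scored over the same n observations.

    items: list of k equal-length lists, one per instrument, each holding
    the wetness scores for the same collection points.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(it) for it in items)
    totals = [sum(it[j] for it in items) for j in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))
```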

  15. A Newly Developed Method for Computing Reliability Measures in a Water Supply Network

    Directory of Open Access Journals (Sweden)

    Jacek Malinowski

    2016-01-01

    Full Text Available A reliability model of a water supply network has been examined. Its main features are: (1) a topology that can be decomposed by so-called state factorization into a relatively small number of derivative networks, each having a series-parallel structure; (2) binary-state components (either operative or failed) with given flow capacities; (3) a multi-state character of the whole network and its sub-networks, where a network state is defined as the maximal flow between a source (or sources) and a sink (or sinks); (4) integer values for all capacities (component, network, and sub-network). As the network operates, its state changes due to component failures, repairs, and replacements. A newly developed method of computing the inter-state transition intensities is presented, based on state factorization and series-parallel aggregation. Analysis of these intensities shows that the failure-repair process of the considered system is an asymptotically homogeneous Markov process. It is also demonstrated how certain reliability parameters useful for network maintenance planning can be determined on the basis of the asymptotic intensities. For better understanding of the presented method, an illustrative example is given. (original abstract)
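Series-parallel aggregation of capacitated components reduces to two rules: capacities in series combine by minimum (the weakest link limits the flow), and capacities in parallel combine by sum. The sketch below applies these rules recursively to a nested description of the network; the tuple encoding is an illustrative assumption, not the paper's notation.

```python
def network_state(structure):
    """Source-to-sink capacity (network state) of a series-parallel
    network of binary-state pipes.

    structure is either an int (a pipe's capacity, 0 if failed) or a
    tuple ('s', [parts]) for series / ('p', [parts]) for parallel.
    """
    if isinstance(structure, int):
        return structure
    op, parts = structure
    vals = [network_state(p) for p in parts]
    return min(vals) if op == 's' else sum(vals)
```

For example, a capacity-5 main feeding two parallel branches of capacities 2 and 3 carries 5 units; if the capacity-2 branch fails, the network state drops to 3, illustrating the multi-state behaviour the model captures.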

  16. Uncertainty analysis methods for estimation of reliability of passive system of VHTR

    International Nuclear Information System (INIS)

    Han, S.J.

    2012-01-01

    An estimation of the reliability of passive systems for the probabilistic safety assessment (PSA) of a very high temperature reactor (VHTR) is under development in Korea. The essential approach of this estimation is to measure the uncertainty of the system performance under a specific accident condition. Uncertainty propagation through the simulation of phenomenological models (computer codes) is adopted as the typical method of estimating this uncertainty. This presentation introduces uncertainty propagation and discusses the related issues, focusing on the propagation object and its surrogates. To achieve a sufficient level of depth in the uncertainty results, the applicability of the propagation should be carefully reviewed. As an example study, the Latin hypercube sampling (LHS) method was tested as a direct propagation approach for a specific accident sequence of a VHTR. The reactor cavity cooling system (RCCS) developed by KAERI was considered for this example study; it is an air-cooled passive system that has no active components for its operation. The accident sequence is a low pressure conduction cooling (LPCC) accident, which is considered a design basis accident for the safety design of the VHTR. This sequence results from a large failure of the pressure boundary of the reactor system, such as a guillotine break of the coolant pipe lines. The presentation discusses the insights obtained (benefits and weaknesses) in applying an estimation of the reliability of a passive system
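Latin hypercube sampling stratifies each input's range into n equal intervals and draws exactly one point per interval in every dimension, giving better coverage of the input space than plain Monte Carlo for the same number of code runs. A minimal sketch on the unit hypercube (mapping the samples to physical code inputs via their distributions is left out):

```python
import random

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin hypercube sample on [0, 1)^d: each dimension's range is cut
    into n_samples equal strata, and each stratum is hit exactly once."""
    rng = rng or random.Random(0)
    # One independent random permutation of the strata per dimension.
    strata = []
    for _ in range(n_dims):
        perm = list(range(n_samples))
        rng.shuffle(perm)
        strata.append(perm)
    # Place one point uniformly inside its assigned stratum.
    return [[(strata[d][i] + rng.random()) / n_samples
             for d in range(n_dims)]
            for i in range(n_samples)]
```

Each row is then one set of uncertain inputs for a single run of the phenomenological code, and the spread of the resulting performance measure is the propagated uncertainty.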

  17. A review on exudates detection methods for diabetic retinopathy.

    Science.gov (United States)

    Joshi, Shilpa; Karule, P T

    2018-01-01

    The presence of exudates on the retina is the most characteristic symptom of diabetic retinopathy (DR). As exudates are among the early clinical signs of DR, their detection would be an essential asset to the mass screening task and an important step towards automatic grading and monitoring of the disease. Reliable identification and classification of exudates are of inherent interest in an automated diabetic retinopathy screening system. Here we review the numerous early studies used for automatic exudate detection, with the aim of providing decision support in addition to reducing the workload of an ophthalmologist. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  18. METHODS FOR DETECTING BACTERIA USING POLYMER MATERIALS

    NARCIS (Netherlands)

    Van Grinsven Bart Robert, Nicolaas; Cleij, Thomas

    2017-01-01

    A method for characterizing bacteria includes passing a liquid containing an analyte comprising a first bacteria and a second bacteria over and in contact with a polymer material on a substrate. The polymer material is formulated to bind to the first bacteria, and the first bacteria binds to the

  19. Method of Pentest Synthesis and Vulnerability Detection

    OpenAIRE

    Hahanova Irina Vitalyevna

    2012-01-01

    A structural method is proposed for penetration test generation and vulnerability simulation for the infrastructure of telecommunication hardware-software information cybernetic systems (CS). It is focused on protecting the services defined in the system specification against unauthorized access gained by penetrating through legal interfaces of component interaction that have vulnerabilities. A protection service infrastructure is created with the cybersystem and maintained during its life cycle, serv...

  20. Electronic logic to enhance switch reliability in detecting openings and closures of redundant switches

    Science.gov (United States)

    Cooper, James A.

    1986-01-01

    A logic circuit is used to enhance redundant switch reliability. Two or more switches are monitored for logical high or low output. The output of the logic circuit produces a redundant and failsafe representation of the switch outputs. When both switch outputs are high, the output is high; similarly, when both switch outputs are low, the output is low. When the output states of the two switches do not agree, the circuit resolves the conflict by memorizing the last output state in which both switches simultaneously agreed and producing the logical complement of that state. Thus, the logic circuit allows the redundant switches to be treated as if they were in parallel when the switches are open and as if they were in series when the switches are closed. A failsafe system having maximum reliability is thereby produced.
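The disagreement-resolution rule described (output the complement of the last state on which both switches agreed) can be sketched as a small state machine. The class name and the Boolean encoding of high/low are illustrative assumptions, not the patent's circuit.

```python
class RedundantSwitchLogic:
    """Sketch of the described logic: when the two switch outputs agree,
    pass that state through and remember it; when they disagree, output
    the complement of the last agreed state."""

    def __init__(self, initial=False):
        self.last_agreed = initial  # last state both switches shared

    def output(self, a, b):
        if a == b:
            self.last_agreed = a
            return a
        # Conflict: fail safe by complementing the last agreed state.
        return not self.last_agreed
```

For instance, if both switches were high and one then drops (a likely single-switch failure), the output falls to low, which is the failsafe behaviour the abstract describes.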