WorldWideScience

Sample records for reliable direct quantification

  1. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.cz [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)]

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitudes. The basic objective of this paper is to find a procedure that eliminates errors made by a PC when calculations close to an error limit are executed. The highly reliable system is represented by a directed acyclic graph composed of terminal nodes, i.e. highly reliable input elements, internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, covering both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes builds on the merits of MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for exact summation of such numbers, is designed in the paper. The summation procedure uses a special number system with base 2^32. The computational efficiency of the new computing methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to emphasize the merits of the methodology.
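
    The following minimal Python sketch illustrates the exact-summation idea described in this abstract: each summand is represented as a fixed-point integer built from base-2^32 digits, so additions carry no rounding error. It is an illustrative reconstruction under stated assumptions, not the paper's MATLAB implementation; all names and values are invented.

        # Illustrative sketch (not the paper's code): exact summation of many
        # very different non-negative numbers via fixed-point integers whose
        # fractional part consists of FRAC_DIGITS base-2^32 digits.
        from fractions import Fraction

        BASE = 2 ** 32
        FRAC_DIGITS = 8                    # 256 fractional bits; ample for this demo
        SCALE = BASE ** FRAC_DIGITS

        def to_fixed(x: float) -> int:
            # Floats are dyadic rationals, so this conversion is exact as long
            # as the value's binary exponent fits within the fractional bits.
            return int(Fraction(x) * SCALE)

        def exact_sum(values) -> Fraction:
            acc = 0                        # arbitrary-precision integer accumulator
            for v in values:
                acc += to_fixed(v)         # integer addition: no rounding error
            return Fraction(acc, SCALE)

        terms = [1.0] + [1e-18] * 10**5    # magnitudes differ by 18 orders
        print(sum(terms))                  # 1.0 -- naive float sum loses the tiny terms
        print(float(exact_sum(terms)))     # ~1.0000000000001 -- exact sum, rounded once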

  2. Direct qPCR quantification using the Quantifiler® Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effects experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio DNA quantification kit is more reliable than the Quantifiler® Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. Design and performance testing of a DNA extraction assay for sensitive and reliable quantification of acetic acid bacteria directly in red wine using real time PCR

    Directory of Open Access Journals (Sweden)

    Cédric Longin

    2016-06-01

    Although strategies exist to prevent AAB contamination, the increased interest in wines with low sulfite addition leads to greater AAB spoilage. Hence there is a real need for a rapid, specific, sensitive and reliable method for detecting these spoilage bacteria. All these requirements are met by real-time polymerase chain reaction (quantitative PCR; qPCR). Here, we compare existing methods of isolating DNA and their adaptation to a red wine matrix. Two different protocols for isolating DNA and three PCR mix compositions were tested to select the best method. The addition of insoluble polyvinylpolypyrrolidone (PVPP) at 1% (v/v) during DNA extraction succeeded in eliminating PCR inhibitors from red wine. We developed a bacterial internal control which was efficient in avoiding false negative results due to decreases in the efficiency of DNA isolation and/or amplification. The specificity, linearity, repeatability and reproducibility of the method were evaluated. A standard curve was established for the enumeration of AAB inoculated into red wines. The limit of quantification in red wine was 3.7 log AAB/mL, falling to about 2.8 log AAB/mL when the sample volume was increased from 1 mL to 10 mL. Thus the DNA extraction method developed in this paper allows sensitive and reliable AAB quantification without underestimation, thanks to the presence of an internal control. Moreover, monitoring of both the AAB population and the amount of acetic acid in ethanol medium and red wine highlighted that a minimum of about 6.0 log cells/mL of AAB is needed to significantly increase the production of acetic acid leading to spoilage.
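
    As a worked illustration of the standard-curve step mentioned above, the sketch below fits Cq against log10(AAB/mL) for calibrators and inverts the fit for an unknown. The calibrator values are invented for demonstration and are not data from the study.

        # Illustrative qPCR standard curve: fit Cq vs log10(cells/mL), then
        # invert the line to enumerate an unknown sample. Values are made up.
        import numpy as np

        log_conc = np.array([7.0, 6.0, 5.0, 4.0])      # log10(AAB/mL) of calibrators
        cq       = np.array([18.1, 21.5, 24.9, 28.4])  # measured quantification cycles

        slope, intercept = np.polyfit(log_conc, cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0        # ~1.0 means 100% per cycle
        print(f"slope={slope:.2f}, PCR efficiency={efficiency:.1%}")

        def quantify(cq_sample: float) -> float:
            """Return log10(AAB/mL) for a sample Cq via the inverted curve."""
            return (cq_sample - intercept) / slope

        print(f"sample at Cq=26.0 -> {quantify(26.0):.2f} log AAB/mL")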

  4. Review of some aspects of human reliability quantification

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.; Spurgin, A.J.; Hannaman, G.W.; Lukic, Y.D.

    1986-01-01

    An area in systems reliability considered to be weak is the characterization and quantification of the role of the operations and maintenance staff in combatting accidents. Several R and D programs are underway to improve the modeling of human interactions, and some progress has been made. This paper describes a specific aspect of human reliability analysis referred to as the modeling of cognitive processes. In particular, the basis for the so-called Human Cognitive Reliability (HCR) model is described, with a focus on its validation and on its benefits and limitations.

  5. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and a fluorescence camera (FC) in detecting plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with the FC and a digital camera in both conditions, and the area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare the different conditions of the samples and to assess inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of the area covered by disclosed plaque in the FC images presented the highest discriminatory powers. CONCLUSION: The Turesky index and FC images with disclosing present good reliability and discriminatory power in quantifying dental plaque.
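
    For readers unfamiliar with the Kappa statistic used here for inter-examiner reproducibility, the following sketch computes Cohen's kappa for two hypothetical examiners scoring the same blocks; the scores are invented, not the study's data.

        # Cohen's kappa: chance-corrected agreement between two examiners.
        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            n = len(rater_a)
            observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            pa, pb = Counter(rater_a), Counter(rater_b)
            expected = sum(pa[c] * pb[c] for c in set(pa) | set(pb)) / n**2
            return (observed - expected) / (1 - expected)

        a = [0, 1, 1, 2, 3, 2, 1, 0, 2, 3]   # hypothetical plaque scores, examiner A
        b = [0, 1, 2, 2, 3, 2, 1, 0, 1, 3]   # examiner B, same blocks
        print(f"kappa = {cohens_kappa(a, b):.2f}")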

  6. Towards the production of reliable quantitative microbiological data for risk assessment: Direct quantification of Campylobacter in naturally infected chicken fecal samples using selective culture and real-time PCR

    DEFF Research Database (Denmark)

    Garcia Clavero, Ana Belén; Vigre, Håkan; Josefsen, Mathilde Hasseldam

    2015-01-01

    ... and for the evaluation of control strategies implemented in poultry production. The aim of this study was to compare estimates of the numbers of Campylobacter spp. in naturally infected chicken fecal samples obtained using direct quantification by selective culture and by real-time PCR. Absolute quantification of Campylobacter by real-time PCR was performed using standard curves designed for two different DNA extraction methods: the Easy-DNA™ Kit from Invitrogen (Easy-DNA) and NucliSENS® MiniMAG® from bioMérieux (MiniMAG). Results indicated that the estimation of the numbers of Campylobacter present in chicken fecal samples ... Although there were differences in terms of estimates of Campylobacter numbers between the methods and samples, the differences between culture and real-time PCR were not statistically significant for most of the samples used in this study.

  7. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
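
    The calibration-free character of dPCR noted above comes from Poisson statistics on partition counts. The sketch below shows that standard calculation with illustrative (invented) droplet numbers; it is not code from the study.

        # Poisson correction underlying absolute quantification by dPCR:
        # only the positive-partition fraction and partition volume enter.
        import math

        def dpcr_concentration(n_positive: int, n_total: int,
                               partition_vol_ul: float) -> float:
            """Copies per microliter from digital PCR partition counts."""
            p = n_positive / n_total
            lam = -math.log(1.0 - p)       # mean copies per partition (Poisson)
            return lam / partition_vol_ul

        # e.g. 8000 of 20000 droplets positive at 0.85 nL per droplet
        print(f"{dpcr_concentration(8000, 20000, 0.85e-3):.0f} copies/uL")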

  8. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and to the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional as well as at a system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified.
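
    As a toy illustration of propagating failure-mode probabilities to system level through fault trees, the sketch below evaluates a two-gate tree assuming independent failure modes; the structure and probabilities are invented and are not the FASRE model itself.

        # Fault-tree propagation with independent inputs (illustrative only).
        def or_gate(*p):   # system fails if any input fails
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def and_gate(*p):  # fails only if all inputs fail (redundancy)
            q = 1.0
            for pi in p:
                q *= pi
            return q

        p_fm1, p_fm2, p_fm3 = 1e-3, 5e-4, 2e-3   # hypothetical failure modes
        p_subsystem = and_gate(p_fm1, p_fm2)     # redundant pair
        p_system = or_gate(p_subsystem, p_fm3)   # in series with a third mode
        print(f"P(system failure) = {p_system:.3e}")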

  9. Quantification of human reliability in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.; Dang, Vinh N.

    1996-01-01

    Human performance may substantially influence the reliability and safety of complex technical systems. For this reason, Human Reliability Analysis (HRA) constitutes an important part of Probabilistic Safety Assessments (PSAs) and Quantitative Risk Analyses (QRAs). The results of these studies, as well as analyses of past accidents and incidents, clearly demonstrate the importance of human interactions. The contribution of human errors to the core damage frequency (CDF), as estimated in the Swedish nuclear PSAs, is between 15 and 88%. A survey of the HRAs in the Swiss PSAs shows that the estimated human error contributions are also substantial for the Swiss nuclear power plants (49% of the CDF due to internal events in the case of Beznau and 70% in the case of Muehleberg; for the total CDF, including external events, 25% and 20%, respectively). Similar results can be extracted from the PSAs carried out for French, German, and US plants. In PSAs and QRAs, the adequate treatment of the human interactions with the system is key to understanding accident sequences and their relative importance to overall risk. The main objectives of HRA are: first, to ensure that the key human interactions are systematically identified and incorporated into the safety analysis in a traceable manner, and second, to quantify the probabilities of their success and failure. Adopting a structured and systematic approach to the assessment of human performance makes it possible to provide greater confidence that the safety and availability of human-machine systems are not unduly jeopardized by human performance problems. Section 2 discusses the different types of human interactions analysed in PSAs. More generally, the section presents how HRA fits in the overall safety analysis, that is, how the human interactions to be quantified are identified. Section 3 addresses the methods for quantification. Section 4 concludes the paper by presenting some recommendations and pointing out the limitations of the

  10. Data reliability in complex directed networks

    Science.gov (United States)

    Sanz, Joaquín; Cozzo, Emanuele; Moreno, Yamir

    2013-12-01

    The availability of data from many different sources and fields of science has made it possible to map out an increasing number of networks of contacts and interactions. However, quantifying how reliable these data are remains an open problem. From Biology to Sociology and Economics, the identification of false and missing positives has become a problem that calls for a solution. In this work we extend one of the newest, best performing models—due to Guimerá and Sales-Pardo in 2009—to directed networks. The new methodology is able to identify missing and spurious directed interactions with more precision than previous approaches, which renders it particularly useful for analyzing data reliability in systems like trophic webs, gene regulatory networks, communication patterns and several social systems. We also show, using real-world networks, how the method can be employed to help search for new interactions in an efficient way.

  11. Data reliability in complex directed networks

    International Nuclear Information System (INIS)

    Sanz, Joaquín; Cozzo, Emanuele; Moreno, Yamir

    2013-01-01

    The availability of data from many different sources and fields of science has made it possible to map out an increasing number of networks of contacts and interactions. However, quantifying how reliable these data are remains an open problem. From Biology to Sociology and Economics, the identification of false and missing positives has become a problem that calls for a solution. In this work we extend one of the newest, best performing models—due to Guimerá and Sales-Pardo in 2009—to directed networks. The new methodology is able to identify missing and spurious directed interactions with more precision than previous approaches, which renders it particularly useful for analyzing data reliability in systems like trophic webs, gene regulatory networks, communication patterns and several social systems. We also show, using real-world networks, how the method can be employed to help search for new interactions in an efficient way. (paper)

  12. Probabilistic risk assessment course documentation. Volume 5. System reliability and analysis techniques Session D - quantification

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the probabilistic quantification of accident sequences and the link between accident sequences and consequences. Other sessions in this series focus on the quantification of system reliability and the development of event trees and fault trees. This course takes the viewpoint that event tree sequences, or combinations of system failures and successes, are available, and that Boolean equations for system fault trees have been developed and are available. 93 figs., 11 tabs

  13. Digital PCR for direct quantification of viruses without DNA extraction

    OpenAIRE

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2015-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration mat...

  14. Practical reliability and uncertainty quantification in complex systems : final report.

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Matthew D.; Ringland, James T.; Marzouk, Youssef M. (Massachusetts Institute of Technology, Cambridge, MA); Boggs, Paul T.; Zurn, Rena M.; Diegert, Kathleen V. (Sandia National Laboratories, Albuquerque, NM); Pebay, Philippe Pierre; Red-Horse, John Robert (Sandia National Laboratories, Albuquerque, NM)

    2009-09-01

    The purpose of this project was to investigate the use of Bayesian methods for the estimation of the reliability of complex systems. The goals were to find methods for dealing with continuous data, rather than simple pass/fail data; to avoid assumptions of specific probability distributions, especially Gaussian (normal) distributions; to compute not only an estimate of the reliability of the system, but also a measure of the confidence in that estimate; to develop procedures to address time-dependent or aging aspects in such systems; and to use these models and results to derive optimal testing strategies. The system is assumed to be a system of systems, i.e., a system with discrete components that are themselves systems. Furthermore, the system is 'engineered' in the sense that each node is designed to do something and that we have a mathematical description of that process. In the time-dependent case, the assumption is that we have a general, nonlinear, time-dependent function describing the process. The major results of the project are described in this report. In summary, we developed a sophisticated mathematical framework based on modern probability theory and Bayesian analysis. This framework encompasses all aspects of epistemic uncertainty and easily incorporates steady-state and time-dependent systems. Based on Markov chain Monte Carlo methods, we devised a computational strategy for general probability density estimation in the steady-state case. This enabled us to compute a distribution of the reliability from which many questions, including confidence, could be addressed. We then extended this to the time domain and implemented procedures to estimate the reliability over time, including the use of the method to predict the reliability at a future time. Finally, we used certain aspects of Bayesian decision analysis to create a novel method for determining an optimal testing strategy, e.g., we can estimate the 'best' location to
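
    A minimal conjugate (Beta-Bernoulli) example of the report's central idea, namely reporting a distribution over reliability rather than a point estimate, is sketched below; it stands in for the report's MCMC machinery and uses invented pass/fail data.

        # Bayesian reliability as a distribution, not a point estimate.
        # Toy conjugate model; the report itself handles continuous data via MCMC.
        from scipy import stats

        successes, failures = 48, 2                 # hypothetical test outcomes
        posterior = stats.beta(1 + successes, 1 + failures)  # Beta(1,1) prior

        print(f"posterior mean reliability = {posterior.mean():.3f}")
        lo, hi = posterior.interval(0.90)
        print(f"90% credible interval      = ({lo:.3f}, {hi:.3f})")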

  15. Evaluation of the reliability of maize reference assays for GMO quantification.

    Science.gov (United States)

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly across the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding histories. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1 (70)) assays in a large number of varieties. The presence of the SNP is consistent with the poor PCR performance observed for this assay across the tested varieties. The data obtained show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays), as well as the tested high mobility group protein gene assay, which also form part of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming part of official quantification methods, zein and SSIIb

  16. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto; Elimelech, Menachem

    2012-01-01

    groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact

  17. Direct quantification of nickel in stainless steels by spectrophotometry

    International Nuclear Information System (INIS)

    Singh, Ritu; Raut, Vaibhavi V.; Jeyakumar, S.; Ramakumar, K.L.

    2007-01-01

    A spectrophotometric method based on the Ni-DMG complex for the quantification of nickel in steel samples without employing any prior separation is reported in the present study. The interfering ions are masked by suitable complexing agents and the method was extended to real samples after validating with BCS and Euro steel standards. (author)

  18. Reliable quantification of phthalates in environmental matrices (air, water, sludge, sediment and soil): a review.

    Science.gov (United States)

    Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad

    2015-05-15

    Because of their widespread application, phthalates, or phthalic acid esters (PAEs), are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques are suitable for the reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment, and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Accurate PAE analysis in environmental matrices is therefore a challenging task. This paper reviews the extensive literature on techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) are reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and quality control/quality assurance procedures is presented, to overcome sample contamination and problems due to matrix effects and thus avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Review of advances in human reliability analysis of errors of commission-Part 2: EOC quantification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 2 is presented in this article. Emerging HRA methods in this field are ATHEANA, MERMOS, the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the MDTA method and CREAM. The essential advanced features are on the conceptual side, especially the modeling of multiple contexts for an EOC to be quantified (ATHEANA, MERMOS and MDTA) in order to explicitly address adverse conditions. There is promising progress in providing systematic guidance to better account for cognitive demands and tendencies (GRS, CREAM) and for EOC recovery (MDTA). Problematic issues are associated with the implementation of multiple-context modeling and the assessment of context-specific error probabilities. Approaches for task or error opportunity scaling (CREAM, GRS) and the concept of reference cases (ATHEANA outlook) provide promising orientations for achieving progress towards data-based quantification. Further development work is needed and should be carried out in close connection with large-scale applications of existing approaches.

  20. Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors: Final Scientific/Technical Report

    International Nuclear Information System (INIS)

    Vierow, Karen; Aldemir, Tunc

    2009-01-01

    The project entitled 'Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors' was conducted as a DOE NERI project collaboration between Texas A&M University and The Ohio State University between March 2006 and June 2009. The overall goal of the proposed project was to develop practical approaches and tools by which dynamic reliability and risk assessment techniques can be used to augment the uncertainty quantification process in probabilistic risk assessment (PRA) methods and PRA applications for Generation IV reactors. This report is the Final Scientific/Technical Report summarizing the project.

  1. Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors: Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, Karen; Aldemir, Tunc

    2009-09-10

    The project entitled, “Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors”, was conducted as a DOE NERI project collaboration between Texas A&M University and The Ohio State University between March 2006 and June 2009. The overall goal of the proposed project was to develop practical approaches and tools by which dynamic reliability and risk assessment techniques can be used to augment the uncertainty quantification process in probabilistic risk assessment (PRA) methods and PRA applications for Generation IV reactors. This report is the Final Scientific/Technical Report summarizing the project.

  2. Reliability Quantification Method for Safety Critical Software Based on a Finite Test Set

    International Nuclear Information System (INIS)

    Shin, Sung Min; Kim, Hee Eun; Kang, Hyun Gook; Lee, Seung Jun

    2014-01-01

    Software inside a digitalized system plays a very important role, because its failure may cause irreversible consequences and affect the whole system as a common cause failure. However, test-based reliability quantification methods for safety critical software have limitations caused by the difficulty of developing input sets in the form of trajectories, i.e. series of successive values of variables. To address these limitations, this study proposes a method which conducts the test using combinations of single values of variables. To substitute combinations of variable values for the trajectory form of input, the possible range of each variable should be identified. For this purpose, the assigned range of each variable, logical relations between variables, plant dynamics under certain situations, and the characteristics of information acquisition by digital devices are considered. The feasibility of the proposed method was confirmed through an application to the Reactor Protection System (RPS) software trip logic.
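
    The sketch below illustrates the proposed test-set construction, enumerating combinations of single variable values instead of trajectories; the variables, ranges and trip logic are hypothetical stand-ins, not the actual RPS software.

        # Enumerate test cases as combinations of admissible single values.
        from itertools import product

        ranges = {
            "pressure":   [140.0, 155.0, 170.0],   # hypothetical discretized values
            "temp":       [280.0, 300.0, 320.0],
            "valve_open": [False, True],
        }

        def trip_logic(pressure, temp, valve_open):  # stand-in for software under test
            return pressure > 165.0 or (temp > 310.0 and not valve_open)

        for combo in product(*ranges.values()):
            case = dict(zip(ranges, combo))
            print(case, "->", "TRIP" if trip_logic(**case) else "ok")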

  3. Approximation of reliability of direct genomic breeding values

    Science.gov (United States)

    Two methods to efficiently approximate theoretical genomic reliabilities are presented. The first method is based on the direct inverse of the left hand side (LHS) of mixed model equations. It uses the genomic relationship matrix for a small subset of individuals with the highest genomic relationshi...

  4. Direct unavailability computation of a maintained highly reliable system

    Czech Academy of Sciences Publication Activity Database

    Briš, R.; Byczanski, Petr

    2010-01-01

    Roč. 224, č. 3 (2010), s. 159-170 ISSN 1748-0078 Grant - others:GA Mšk(CZ) MSM6198910007 Institutional research plan: CEZ:AV0Z30860518 Keywords: high reliability * availability * directed acyclic graph Subject RIV: BA - General Mathematics http://journals.pepublishing.com/content/rtp3178l17923m46/

  5. Comparison of indirect and direct quantification of esters of monochloropropanediol in vegetable oil.

    Science.gov (United States)

    Dubois, Mathieu; Tarres, Adrienne; Goldmann, Till; Empl, Anna Maria; Donaubauer, Alfred; Seefelder, Walburga

    2012-05-04

    The presence of fatty acid esters of monochloropropanediol (MEs) in food is a recent concern, raised due to the carcinogenicity of their hydrolysable moieties 2- and 3-monochloropropanediol (2- and 3-MCPD). Several indirect methods for the quantification of MEs have been developed and remain in common use today; however, significant discrepancies among the analytical results obtained challenge their reliability. The aim of the present study was therefore to test the trueness of an indirect method by comparing it to a newly developed direct method, using palm oil and palm olein as examples. The indirect method was based on ester cleavage under acidic conditions, derivatization of the liberated 2- and 3-MCPD with heptafluorobutyryl imidazole, and GC-MS determination. The direct method comprised two extraction procedures: double solid phase extraction (SPE) targeting 2- and 3-MCPD monoesters (co-extracting glycidyl esters as well), and a silica gel column targeting 2- and 3-MCPD diesters. Detection was carried out by liquid chromatography coupled to time-of-flight mass spectrometry (LC-ToF-MS). Accurate quantification of the intact compounds was assured by means of matrix-matched standard addition on extracts. Analysis of 22 palm oil and 7 palm olein samples (2- plus 3-MCPD contamination ranging from 0.3 to 8.8 μg/g) by both methods revealed no significant bias. The two methods were therefore considered comparable in terms of results; however, the indirect method was shown to require fewer analytical standards and to be less tedious, and it is furthermore applicable to all types of vegetable oils, and hence recommended for routine application. Copyright © 2012 Elsevier B.V. All rights reserved.
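
    The matrix-matched standard addition used for the direct method can be illustrated with the short sketch below, which extrapolates a spiked-response line to recover the native content; all numbers are invented for demonstration.

        # Standard addition: spike known analyte amounts into the extract,
        # fit response vs added amount, and read the native content from
        # the magnitude of the x-intercept.
        import numpy as np

        added  = np.array([0.0, 0.5, 1.0, 2.0])        # ug/g MCPD ester spiked
        signal = np.array([1.20, 1.85, 2.45, 3.75])    # response (arb. units)

        slope, intercept = np.polyfit(added, signal, 1)
        native = intercept / slope                     # x-intercept magnitude
        print(f"estimated native content = {native:.2f} ug/g")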

  6. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program

  7. Rapid quantification of free cholesterol in tears using direct insertion/electron ionization-mass spectrometry.

    Science.gov (United States)

    Wei, Xiaojia Eric; Korth, John; Brown, Simon H J; Mitchell, Todd W; Truscott, Roger J W; Blanksby, Stephen J; Willcox, Mark D P; Zhao, Zhenjun

    2013-12-09

    To establish a simple and rapid analytical method, based on direct insertion/electron ionization-mass spectrometry (DI/EI-MS), for measuring free cholesterol in tears from humans and rabbits. A stable-isotope dilution protocol employing DI/EI-MS in selected ion monitoring (SIM) mode was developed and validated. It was used to quantify the free cholesterol content in human and rabbit tear extracts. Tears were collected from adult humans (n = 15) and rabbits (n = 10) and lipids extracted. Screening full-scan (m/z 40-600) DI/EI-MS analysis of crude tear extracts showed that the diagnostic ions located in the mass range m/z 350 to 400 were derived from free cholesterol, with no contribution from cholesterol esters. DI/EI-MS data acquired using SIM were analyzed for the abundance ratios of the diagnostic ions relative to their stable isotope-labeled analogues arising from the D6-cholesterol internal standard. Standard curves of good linearity were produced, with an on-probe limit of detection of 3 ng (at 3:1 signal to noise) and a limit of quantification of 8 ng (at 10:1 signal to noise). The concentration of free cholesterol in human tears was 15 ± 6 μg/g, which was higher than in rabbit tears (10 ± 5 μg/g). A stable-isotope dilution DI/EI-SIM method for free cholesterol quantification without prior chromatographic separation was established. Using this method demonstrated that humans have higher free cholesterol levels in their tears than rabbits, in agreement with previous reports. This paper provides a rapid and reliable method to measure free cholesterol in small-volume clinical samples.
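
    The quantification step of a stable-isotope dilution protocol like the one above reduces to calibrating the analyte/internal-standard abundance ratio, as the following sketch with invented values shows; it is not the authors' processing code.

        # Isotope-dilution quantification: calibrate the ratio of analyte to
        # D6-internal-standard ion abundances against known amounts.
        import numpy as np

        amount = np.array([10.0, 50.0, 100.0, 200.0])  # cholesterol (ng), calibrators
        ratio  = np.array([0.11, 0.52, 1.05, 2.08])    # I(analyte) / I(D6 standard)

        slope, intercept = np.polyfit(amount, ratio, 1)

        def quantify(sample_ratio: float) -> float:
            """Cholesterol (ng) from a sample's analyte/IS abundance ratio."""
            return (sample_ratio - intercept) / slope

        print(f"sample with ratio 0.80 -> {quantify(0.80):.0f} ng free cholesterol")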

  8. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There is a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research with the objective of modeling and quantifying team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  9. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto

    2012-02-01

    Surface charge plays an important role in membrane-based separations of particulates, macromolecules, and dissolved ionic species. In this study, we present two experimental methods to determine the concentration of negatively charged functional groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact with the membrane functional groups via complexation and electrostatic interaction, respectively, were used as probes. The amount of associated probes was quantified using liquid scintillation counting for uranium atoms and visible light spectroscopy for the toluidine blue dye. The techniques were validated using self-assembled monolayers of alkanethiols with known amounts of charged moieties. The surface density of negatively charged functional groups of hand-cast thin-film composite polyamide membranes, as well as commercial cellulose triacetate and polyamide membranes, was quantified under various conditions. Using both techniques, we measured a negatively charged functional group density of 20-30 nm⁻² for the hand-cast thin-film composite membranes. The ionization behavior of the membrane functional groups, determined from measurements with toluidine blue at varying pH, was consistent with published data for thin-film composite polyamide membranes. Similarly, the measured charge densities on commercial membranes were in general agreement with previous investigations. The relative simplicity of the two methods makes them a useful tool for quantifying the surface charge concentration of a variety of surfaces, including separation membranes. © 2011 Elsevier B.V.
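
    The conversion behind a reported surface density can be illustrated with the arithmetic below: moles of bound probe divided by the probed area. The input values are invented and do not reproduce the paper's measurements.

        # Charged-group surface density from bound-probe amount (illustrative).
        AVOGADRO = 6.022e23

        bound_probe_mol = 4.0e-9       # mol of probe retained (hypothetical)
        membrane_area_cm2 = 1.0        # probed membrane area (hypothetical)

        area_nm2 = membrane_area_cm2 * 1e14    # 1 cm^2 = 1e14 nm^2
        density = bound_probe_mol * AVOGADRO / area_nm2
        print(f"{density:.1f} charged groups per nm^2")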

  10. Circulating levels of 3-hydroxymyristate, a direct quantification of endotoxemia in non-infected cirrhotic patients.

    Science.gov (United States)

    Weil, Delphine; Pais de Barros, Jean-Paul; Mourey, Guillaume; Laheurte, Caroline; Cypriani, Benoit; Badet, Nicolas; Delabrousse, Eric; Grandclément, Emilie; Di Martino, Vincent; Saas, Philippe; Lagrost, Laurent; Thévenot, Thierry

    2018-06-22

    The quantification of lipopolysaccharide (LPS) in biological fluids is challenging. We aimed to measure plasma LPS concentration using a new method of direct quantification of 3-hydroxymyristate (3-HM), a lipid component of LPS, and to evaluate correlations between 3-HM and markers of liver function, endothelial activation, portal hypertension and enterocyte damage. Plasma from 90 non-infected cirrhotic patients (30 Child-Pugh [CP]-A, 30 CP-B, 30 CP-C) was prospectively collected. The concentration of 3-HM was determined by high performance liquid chromatography coupled with mass spectrometry. 3-HM levels were higher in CP-C patients (CP-A/CP-B/CP-C: 68/70/103 ng/mL, p=0.005). Patients with severe acute alcoholic hepatitis (n=16; 113 vs 74 ng/mL, p=0.012), diabetic patients (n=22; 99 vs 70 ng/mL, p=0.028) and those not receiving beta-blockers (n=44; 98 vs 72 ng/mL, p=0.034) had higher levels of 3-HM. We observed a trend towards higher baseline levels of 3-HM in patients with hepatic encephalopathy (n=7; 144 vs 76 ng/mL, p=0.45) or SIRS (n=10; 106 vs 75 ng/mL, p=0.114). In multivariate analysis, high levels of 3-HM were associated with CP (OR=4.39; 95% CI=1.79-10.76) or MELD (OR=8.24; 95% CI=3.19-21.32) scores. Patients dying from liver insufficiency (n=6) during a 12-month follow-up had higher baseline levels of 3-HM (106 vs 75 ng/mL, p=0.089). In non-infected cirrhotic patients, 3-HM levels rise with impairment of liver function, heavy alcohol consumption, diabetic status and non-use of beta-blockers, and a trend towards poorer outcome is also observed. The direct mass measurement of LPS using 3-HM appears reliable for detecting transient endotoxemia and promising for managing the follow-up of cirrhotic patients. This article is protected by copyright. All rights reserved.

  11. Direct infusion-SIM as fast and robust method for absolute protein quantification in complex samples

    Directory of Open Access Journals (Sweden)

    Christina Looße

    2015-06-01

    Relative and absolute quantification of proteins in biological and clinical samples are common approaches in proteomics. Until now, targeted protein quantification has mainly been performed using a combination of HPLC-based peptide separation and selected reaction monitoring on triple quadrupole mass spectrometers. Here, we show for the first time the potential of absolute quantification using a direct infusion strategy combined with single ion monitoring (SIM) on a Q Exactive mass spectrometer. Using complex membrane fractions of Escherichia coli, we absolutely quantified the recombinantly expressed heterologous human cytochrome P450 monooxygenase 3A4 (CYP3A4), comparing direct infusion-SIM with conventional HPLC-SIM. Direct infusion-SIM deviated by only 14.7% (±4.1, s.e.m.) on average from HPLC-SIM, with a processing and analysis time of 4.5 min for a single sample (which could be further decreased to 30 s), in contrast to 65 min for the LC-MS method. In summary, our simplified workflow using direct infusion-SIM provides a fast and robust method for the quantification of proteins in complex protein mixtures.

  12. Absolute and direct microRNA quantification using DNA-gold nanoparticle probes.

    Science.gov (United States)

    Degliangeli, Federica; Kshirsagar, Prakash; Brunetti, Virgilio; Pompa, Pier Paolo; Fiammengo, Roberto

    2014-02-12

    DNA-gold nanoparticle probes are implemented in a simple strategy for direct microRNA (miRNA) quantification. Fluorescently labeled DNA-probe strands are immobilized on PEGylated gold nanoparticles (AuNPs). In the presence of target miRNA, DNA-RNA heteroduplexes are formed and become substrate for the endonuclease DSN (duplex-specific nuclease). Enzymatic hydrolysis of the DNA strands yields a fluorescence signal due to diffusion of the fluorophores away from the gold surface. We show that the molecular design of our DNA-AuNP probes, with the DNA strands immobilized on top of the PEG-based passivation layer, results in nearly unaltered enzymatic activity toward immobilized heteroduplexes compared to substrates free in solution. The assay, developed in a real-time format, allows absolute quantification of as little as 0.2 fmol of miR-203. We also show the application of the assay for direct quantification of cancer-related miR-203 and miR-21 in samples of extracted total RNA from cell cultures. The possibility of direct and absolute quantification may significantly advance the use of microRNAs as biomarkers in the clinical praxis.

  13. Direct Quantification of Methane Emissions Across the Supply Chain: Identification of Mitigation Targets

    Science.gov (United States)

    Darzi, M.; Johnson, D.; Heltzel, R.; Clark, N.

    2017-12-01

    Researchers at West Virginia University's Center for Alternative Fuels, Engines, and Emissions have recently participated in a variety of studies targeted at the direct quantification of methane emissions from across the natural gas supply chain. These studies assessed methane emissions from heavy-duty vehicles and their fuel stations; active unconventional well sites, during both development and production; natural gas compression and storage facilities; natural gas engines, both large and small, two- and four-stroke; and low-throughput equipment associated with coal bed methane wells. Engine emissions were sampled using conventional instruments such as Fourier transform infrared spectrometers and heated flame ionization detection analyzers. However, to accurately quantify a wide range of other sources beyond the tailpipe (both leaks and losses), a full flow sampling system was developed, which included an integrated cavity-enhanced absorption spectrometer. Through these direct quantification efforts and analysis, major sources of methane emissions were identified. Technological solutions and best practices exist, or could be developed, to reduce methane emissions by focusing on the "lowest-hanging fruit." For example, engine crankcases from across the supply chain should employ vent mitigation systems to reduce methane and other emissions. An overview of the direct quantification system and the results of various measurement campaigns will be presented, along with the identification of other targets for additional mitigation.

  14. FRET-based modified graphene quantum dots for direct trypsin quantification in urine

    Energy Technology Data Exchange (ETDEWEB)

    Poon, Chung-Yan; Li, Qinghua [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Zhang, Jiali; Li, Zhongping [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Research Center of Environmental Science and Engineering, School of Chemistry and Chemical Engineering, Shanxi University, Taiyuan 030006 (China); Dong, Chuan [Research Center of Environmental Science and Engineering, School of Chemistry and Chemical Engineering, Shanxi University, Taiyuan 030006 (China); Lee, Albert Wai-Ming; Chan, Wing-Hong [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Li, Hung-Wing, E-mail: hwli@hkbu.edu.hk [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong)

    2016-04-21

    A versatile nanoprobe was developed for trypsin quantification based on fluorescence resonance energy transfer (FRET). Here, a fluorescent graphene quantum dot is utilized as the donor and a well-designed coumarin derivative, CMR2, as the acceptor. Moreover, bovine serum albumin (BSA), as a protein model, serves not only as a linker for the FRET pair, but also as a fluorescence enhancer of the quantum dots and CMR2. In the presence of trypsin, the FRET system is destroyed as the BSA is digested by trypsin. Thus, the emission peak of the donor is regenerated and the ratio of the donor emission peak to the acceptor emission peak increases. By ratiometric measurement of these two emission peaks, the trypsin content can be determined. The detection limit of trypsin was found to be 0.7 μg/mL, which is 0.008-fold of the average trypsin level in the urine of acute pancreatitis patients, suggesting high potential for fast and low-cost clinical screening. - Highlights: • A FRET-based biosensor was developed for direct quantification of trypsin. • Fast and sensitive screening of pancreatic disease was facilitated. • The direct quantification of trypsin in urine samples was demonstrated.
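
    A sketch of the ratiometric readout described above follows: the donor/acceptor emission ratio is calibrated against known trypsin concentrations and then inverted for a sample. All intensities and concentrations are invented.

        # Ratiometric FRET calibration: donor/acceptor peak ratio vs trypsin.
        import numpy as np

        trypsin = np.array([0.0, 1.0, 2.0, 5.0, 10.0])      # ug/mL calibrators
        ratio   = np.array([0.30, 0.42, 0.55, 0.93, 1.55])  # I_donor / I_acceptor

        slope, intercept = np.polyfit(trypsin, ratio, 1)

        def trypsin_conc(i_donor: float, i_acceptor: float) -> float:
            """Trypsin (ug/mL) from measured donor and acceptor peak heights."""
            return (i_donor / i_acceptor - intercept) / slope

        print(f"{trypsin_conc(70.0, 100.0):.1f} ug/mL trypsin")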

  15. Direct quantification of airborne nanoparticles composition by TXRF after collection on filters

    Energy Technology Data Exchange (ETDEWEB)

    Motellier, S; Lhaute, K; Guiot, A; Golanski, L; Tardif, F [CEA Grenoble, DRT, LITEN, DTNM, Laboratory of Nanochemistry and Nanosafety, 17 Avenue des Martyrs, Cedex 9, F-38054 Grenoble (France); Geoffroy, C, E-mail: sylvie.motellier@cea.fr [Elexience, 9 rue des petits ruisseaux, BP 61, 91371 Verrieres-le-Buisson Cedex (France)

    2011-07-06

    Direct TXRF analysis of nanoparticles (NP) deposited on filters was evaluated. Standard filters spiked with known amounts of NP were produced using an atomizer which generates an aerosol from an NP-containing liquid suspension. Polycarbonate filters provided the highest fluorescence signals, and black polycarbonate filters containing chromium were further selected, Cr being used as the internal standard for elemental quantification of the filter contaminants. Calibration curves were established for various NP (TiO₂, ZnO, CeO₂, Al₂O₃) and good linearity was observed. Limits of detection were low, in the tens to hundreds of nanograms per filter, the method being less well adapted to Al₂O₃ due to the poor TXRF sensitivity for light elements. The analysis of MW-CNTs was attempted by quantification of their metal (Fe) catalyst impurities; problems encountered included CNT dispersion in liquids, quantification of the deposited quantity, and high Fe background contamination.
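
    Quantification against an internal standard in TXRF follows a simple intensity-ratio relation; the sketch below shows that relation with invented intensities and sensitivities, and is not taken from the paper.

        # TXRF quantification against the Cr internal standard: analyte mass
        # from fluorescence intensities I, relative sensitivities S, and the
        # known internal-standard mass.
        def txrf_mass(i_analyte: float, i_is: float,
                      s_analyte: float, s_is: float, m_is_ng: float) -> float:
            """Analyte mass (ng) on the filter."""
            return (i_analyte / i_is) * (s_is / s_analyte) * m_is_ng

        # e.g. Ti signal 2400 counts vs Cr 1800 counts, sensitivities 0.8 and 1.0,
        # 100 ng of Cr in the filter (all values hypothetical)
        print(f"Ti on filter: {txrf_mass(2400, 1800, 0.8, 1.0, 100.0):.0f} ng")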

  16. High level issues in reliability quantification of safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2012-01-01

    For the purpose of developing a consensus method for the reliability assessment of safety-critical digital instrumentation and control systems in nuclear power plants, several high level issues in reliability assessment of the safety-critical software based on Bayesian belief network modeling and statistical testing are discussed. Related to the Bayesian belief network modeling, the relation between the assessment approach and the sources of evidence, the relation between qualitative evidence and quantitative evidence, how to consider qualitative evidence, and the cause-consequence relation are discussed. Related to the statistical testing, the need of the consideration of context-specific software failure probabilities and the inability to perform a huge number of tests in the real world are discussed. The discussions in this paper are expected to provide a common basis for future discussions on the reliability assessment of safety-critical software. (author)

  17. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified uncertainties can be implemented in probabilistic reliability assessments.
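
    The three uncertainty measures named here are straightforward to compute; the sketch below evaluates bias, root-mean-square error and scatter index for a hypothetical set of observed versus modelled significant wave heights.

        # Bias, RMSE and scatter index for significant wave height (invented data).
        import numpy as np

        h_obs = np.array([1.2, 2.0, 2.9, 3.5, 1.8])   # measured Hs (m)
        h_mod = np.array([1.1, 2.3, 2.7, 3.9, 1.6])   # wave-model Hs (m)

        bias = np.mean(h_mod - h_obs)
        rmse = np.sqrt(np.mean((h_mod - h_obs) ** 2))
        scatter_index = rmse / np.mean(h_obs)

        print(f"bias={bias:+.2f} m, RMSE={rmse:.2f} m, SI={scatter_index:.2f}")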

  18. Reverse transcriptase real-time PCR for detection and quantification of viable Campylobacter jejuni directly from poultry faecal samples

    DEFF Research Database (Denmark)

    Bui, Thanh Xuan; Wolff, Anders; Madsen, Mogens

    2012-01-01

    Campylobacter spp. is the most common cause of bacterial diarrhoea in humans worldwide. Therefore, rapid and reliable methods for detection and quantification of this pathogen are required. In this study, we have developed a reverse transcription quantitative real-time PCR (RT-qPCR) for detection a...

  19. Identification of Reliable Reference Genes for Quantification of MicroRNAs in Serum Samples of Sulfur Mustard-Exposed Veterans.

    Science.gov (United States)

    Gharbi, Sedigheh; Shamsara, Mehdi; Khateri, Shahriar; Soroush, Mohammad Reza; Ghorbanmehr, Nassim; Tavallaei, Mahmood; Nourani, Mohammad Reza; Mowla, Seyed Javad

    2015-01-01

    In spite of accumulating information about the pathological aspects of sulfur mustard (SM), the precise mechanism responsible for its effects is not well understood. Circulating microRNAs (miRNAs) are promising biomarkers for disease diagnosis and prognosis. Accurate normalization using appropriate reference genes is a critical step in miRNA expression studies. In this study, we aimed to identify appropriate reference genes for microRNA quantification in serum samples of SM victims. In this case-control experimental study, using quantitative real-time polymerase chain reaction (qRT-PCR), we evaluated the suitability of a panel of small RNAs including SNORD38B, SNORD49A, U6, 5S rRNA, miR-423-3p, miR-191, miR-16 and miR-103 in sera of 28 SM-exposed veterans of the Iran-Iraq war (1980-1988) and 15 matched control volunteers. Different statistical algorithms including geNorm, NormFinder, BestKeeper and the comparative delta-quantification cycle (Cq) method were employed to find the least variable reference gene. miR-423-3p was identified as the most stably expressed reference gene, with miR-103 and miR-16 ranking after it. We demonstrate that non-miRNA reference genes have the least stability in serum samples and that some housekeeping miRNAs may be used as more reliable reference genes for miRNAs in serum. In addition, using the geometric mean of two reference genes could increase the reliability of the normalizers.
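
    A simplified version of the comparative delta-Cq approach mentioned above is sketched below: a candidate reference gene is ranked by the standard deviation of its pairwise Cq differences across samples. The Cq values are invented for illustration.

        # Rank candidate reference genes by mean pairwise delta-Cq variability:
        # a stable reference shows low SD of Cq differences across samples.
        import numpy as np

        cq = {  # Cq values per gene across the same four samples (hypothetical)
            "miR-423-3p": np.array([24.1, 24.3, 24.0, 24.2]),
            "miR-103":    np.array([25.0, 25.4, 24.9, 25.3]),
            "U6":         np.array([20.1, 22.0, 19.5, 21.2]),
        }

        stability = {g: np.mean([np.std(cq[g] - cq[h]) for h in cq if h != g])
                     for g in cq}
        for gene, s in sorted(stability.items(), key=lambda kv: kv[1]):
            print(f"{gene}: mean pairwise delta-Cq SD = {s:.2f}")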

  20. Optimizing total reflection X-ray fluorescence for direct trace element quantification in proteins I: Influence of sample homogeneity and reflector type

    Science.gov (United States)

    Wellenreuther, G.; Fittschen, U. E. A.; Achard, M. E. S.; Faust, A.; Kreplin, X.; Meyer-Klaucke, W.

    2008-12-01

    Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) Formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF) with special emphasis on (I) the influence of sample support and (II) the protein / buffer system used. In the first part, a model protein was studied on well established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.

  2. Reliability of smartphone-based gait measurements for quantification of physical activity/inactivity levels.

    Science.gov (United States)

    Ebara, Takeshi; Azuma, Ryohei; Shoji, Naoto; Matsukawa, Tsuyoshi; Yamada, Yasuyuki; Akiyama, Tomohiro; Kurihara, Takahiro; Yamada, Shota

    2017-11-25

    Objective measurements using built-in smartphone sensors that can measure physical activity/inactivity in daily working life have the potential to provide a new approach to assessing workers' health effects. The aim of this study was to elucidate the characteristics and reliability of built-in step counting sensors on smartphones for the development of an easy-to-use objective measurement tool that can be applied in ergonomics or epidemiological research. To evaluate the reliability of step counting sensors embedded in seven major smartphone models, the 6-minute walk test was conducted and the following analyses of sensor precision and accuracy were performed: 1) relationship between the actual step count and the step count detected by sensors, 2) reliability between smartphones of the same model, and 3) false detection rates when sitting during office work, while riding the subway, and while driving. On five of the seven models, the intraclass correlation coefficient (ICC(3,1)) showed high reliability, with a range of 0.956-0.993. The other two models, however, had ICCs of 0.443-0.504, and the relative error ratios of the sensor-detected step count to the actual step count reached ±48.7-49.4%. The level of agreement between devices of the same model was ICC(3,1): 0.992-0.998. The false detection rates differed between the sitting conditions. These results suggest the need for appropriate adjustment of sensor-measured step counts, through means such as correction or calibration with a predictive model formula, in order to obtain the highly reliable measurement results that are sought in scientific investigation.
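
    The agreement statistic used here, ICC(3,1), follows from a two-way ANOVA decomposition. The following is a minimal sketch in Python/NumPy (with hypothetical step counts for two devices of one model) of the single-measurement, consistency form:

        import numpy as np

        def icc_3_1(x):
            """ICC(3,1): two-way mixed model, consistency, single measurement.

            x: (n_subjects, k_raters) array, e.g. step counts per walk test
            recorded simultaneously by k devices of the same model.
            """
            x = np.asarray(x, dtype=float)
            n, k = x.shape
            grand = x.mean()
            ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
            ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
            ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
            ms_rows = ss_rows / (n - 1)
            ms_err = ss_err / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

        data = [[612, 608], [590, 595], [640, 638], [575, 570], [602, 604]]
        print(round(icc_3_1(data), 3))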

  3. Development of a reliable extraction and quantification method for glucosinolates in Moringa oleifera.

    Science.gov (United States)

    Förster, Nadja; Ulrichs, Christian; Schreiner, Monika; Müller, Carsten T; Mewis, Inga

    2015-01-01

    Glucosinolates are the characteristic secondary metabolites of plants in the order Brassicales. To date, the DIN 'desulfo glucosinolate' extraction method remains the common procedure for the determination and quantification of glucosinolates. However, the desulfation step in the extraction of glucosinolates from Moringa oleifera leaves resulted in complete conversion and degradation of the naturally occurring glucosinolates in this plant. Therefore, a method for the extraction of intact Moringa glucosinolates was developed, with which no conversion or degradation of the different rhamnopyranosyloxy-benzyl glucosinolates was found. Buffered eluents (0.1 M ammonium acetate) were necessary to stabilize 4-α-rhamnopyranosyloxy-benzyl glucosinolate (Rhamno-Benzyl-GS) and the acetyl-4-α-rhamnopyranosyloxy-benzyl glucosinolate isomers (Ac-Isomers-GS) during HPLC analysis. Due to the instability of intact Moringa glucosinolates at room temperature and during the purification of single glucosinolates, the influences of different storage conditions (room temperature, frozen, thawing and refreezing) and buffer conditions on glucosinolate conversion were analysed. Conversion and degradation processes were observed especially for the Ac-Isomers-GS III. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Reliable Assessment and Quantification of the Fluorescence-Labeled Antisense Oligonucleotides In Vivo

    Directory of Open Access Journals (Sweden)

    Maria Chiara Munisso

    2014-01-01

    The availability of fluorescent dyes and the advances in optical systems for in vivo imaging have stimulated an increasing interest in developing new methodologies to study and quantify the biodistribution of labeled agents. However, despite these great achievements, we face significant challenges in determining whether the observed fluorescence corresponds to the quantity of the dye in the tissues. In fact, although far-red and near-infrared light can propagate through several centimetres of tissue, it diffuses within a few millimetres as a consequence of the elastic scattering of photons. In addition, when dye-labeled oligonucleotides form stable complexes with cationic carriers, a large change in the fluorescence intensity of the dye is observed. Therefore, the measured fluorescence intensity is altered by tissue heterogeneity and by the fluctuation of dye intensity. Hence, in this study a quantification strategy for fluorescence-labeled oligonucleotides was developed to overcome these confounding effects. Our results proved that upon efficient homogenization and dilution with chaotropic agents, such as guanidinium thiocyanate, it is possible to achieve a complete fluorescence intensity recovery. Furthermore, we demonstrated that this method has the advantage of good sensitivity and reproducibility, as well as easy handling of the tissue samples.

  5. Quantification of colour Doppler activity in the wrist in patients with rheumatoid arthritis - the reliability of different methods for image selection and evaluation

    DEFF Research Database (Denmark)

    Ellegaard, K.; Torp-Pedersen, S.; Lund, H.

    2008-01-01

    Purpose: The amount of colour Doppler activity in the inflamed synovium is used to quantify inflammatory activity. The measurements may vary due to image selection, quantification method, and point in the cardiac cycle. This study investigated the test-retest reliability of ultrasound colour Doppler measurements in the wrist of patients with rheumatoid arthritis (RA) using different selection and quantification methods. Materials and Methods: 14 patients with RA had their wrist scanned twice by the same investigator with an interval of 30 minutes. The images for analysis were selected either … The best reliability was obtained when the images were selected guided by colour Doppler and the subsequent quantification was done in an area defined by anatomical structures. With this method, the intra-class coefficient ICC(2,1) was 0.95 and the within-subject SD (SW) was 0.017, indicating good reliability. In contrast, poor …

  6. Latest scientific and technological knowledge of human-reliability quantification - December 1991

    International Nuclear Information System (INIS)

    Berg, H.P.; Schott, H.

    1992-02-01

    Again and again, real incidents and accidents show that human factors may seriously affect the safety of plants. This also holds for nuclear facilities, for example. The major methods used to quantify human reliability are described. These methods are applied in the framework of German and international risk analyses. Since databases are of great importance in probabilistic safety analyses for the inherently difficult quantitative evaluation of human errors, the study also discusses the present limits of treating human misbehavior in safety analyses. (orig.) [de]

  7. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors and 2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), the simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance are represented via several context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors. In performing HRA methods, the effect of PSFs which may increase or decrease human error should be investigated. To date, however, the effects of PSFs have been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework for estimating PSFs by using PSF profiles is introduced in this paper.
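
    The abstract does not reproduce a weighting scheme, but one widely cited example of PSF quantification is SPAR-H, where a nominal human error probability (NHEP) is multiplied by PSF multipliers, with a composite correction that keeps the result a valid probability. A minimal sketch in Python (the NHEP and multiplier values are hypothetical):

        def adjusted_hep(nominal_hep, psf_multipliers):
            """SPAR-H style composite adjustment:
            HEP = NHEP * P / (NHEP * (P - 1) + 1), with P the PSF product.
            Stays bounded by 1 even for large PSF products."""
            p = 1.0
            for m in psf_multipliers:
                p *= m
            return nominal_hep * p / (nominal_hep * (p - 1.0) + 1.0)

        # Hypothetical diagnosis task: NHEP = 1e-2, degraded time and stress PSFs.
        print(adjusted_hep(1e-2, [10, 2]))  # ~0.168
        print(adjusted_hep(1e-2, [1, 1]))   # unchanged: 0.01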

  8. A reliable and validated LC-MS/MS method for the simultaneous quantification of 4 cannabinoids in 40 consumer products.

    Directory of Open Access Journals (Sweden)

    Qingfang Meng

    In the past 50 years, Cannabis sativa (C. sativa) has gone from a substance essentially prohibited worldwide to one that is gaining acceptance both culturally and legally in many countries for medicinal and recreational use. As additional jurisdictions legalize Cannabis products and the variety and complexity of these products surpass the classical dried plant material, appropriate methods for measuring the biologically active constituents are paramount to ensure safety and regulatory compliance. While there are numerous active compounds in C. sativa, the primary cannabinoids of regulatory and safety concern are (−)-Δ⁹-tetrahydrocannabinol (THC), cannabidiol (CBD), and their respective acidic forms THCA-A and CBDA. Using the US Food and Drug Administration (FDA) bioanalytical method validation guidelines, we developed a sensitive, selective, and accurate method for the simultaneous analysis of CBD, CBDA, THC, and THCA-A in oils and of THC & CBD in more complex matrices. This HPLC-MS/MS method was simple and reliable, using standard sample dilution and homogenization, an isocratic chromatographic separation, and a triple quadrupole mass spectrometer. The lower limit of quantification (LLOQ) for the analytes was 0.195 ng/mL over a 0.195-50.0 ng/mL range of quantification with a coefficient of correlation of >0.99. Average intra-day and inter-day accuracies were 94.2-112.7% and 97.2-110.9%, respectively. This method was used to quantify CBD, CBDA, THC, and THCA-A in 40 commercial hemp products representing a variety of matrices including oils, plant materials, and creams/cosmetics. All products tested met the federal regulatory restrictions on THC content in Canada. The majority of analyzed products contained low CBD levels and a low CBD:CBDA ratio, while the highest CBD:CBDA ratio observed was >1,000 (an oil-based product). Overall, the method proved amenable to the analysis of various commercial products including oils, creams, and plant material and may be diagnostically indicative of adulteration with non-hemp C. sativa, specialized hemp cultivars, or unique manufacturing methods.
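
    Figures of merit such as the r > 0.99 linearity quoted above come from an ordinary least-squares calibration over the quantification range. A minimal sketch in Python/NumPy (the area-ratio values are hypothetical; only the 0.195-50.0 ng/mL range is taken from the abstract):

        import numpy as np

        conc = np.array([0.195, 0.78, 3.125, 12.5, 50.0])           # ng/mL
        area_ratio = np.array([0.021, 0.084, 0.331, 1.345, 5.352])  # analyte/IS

        slope, intercept = np.polyfit(conc, area_ratio, 1)
        r = np.corrcoef(conc, area_ratio)[0, 1]
        print(f"r = {r:.4f} (acceptance: > 0.99)")

        # Back-calculated concentrations give per-level accuracy in percent.
        back = (area_ratio - intercept) / slope
        print(np.round(100.0 * back / conc, 1))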

  9. Quantification of the occurrence of common-mode faults in highly reliable protective systems

    International Nuclear Information System (INIS)

    Aitken, A.

    1978-10-01

    The report first covers the investigation, definition and classification of common-mode failure (CMF), based on an extensive study of the nature of CMF. A new classification of CMF is proposed, based on possible causes of failures. This is used as a basis for analysing data from reported failures of reactor safety systems and aircraft systems. Design and maintenance errors are shown to be the predominant causes of CMF. The estimated CMF rates for the highly reliable nuclear power plant automatic protection system (APS) and for the emergency core cooling system (ECCS) are 2.8 × 10⁻² CMF/sub-system-year and 3.3 × 10⁻² CMF/sub-system-year, respectively. For comparison, the data from the aircraft accident records show a CMF rate for the total flight control system (FCS) of 2.1 × 10⁻⁵ CMF/sub-system-year. The analysis has laid the groundwork for relating CMF modelling and defences.

  10. Reliability of direct sensitivity determination of blood cultures

    International Nuclear Information System (INIS)

    Noman, F.; Ahmed, A.

    2008-01-01

    The aim of this study was to evaluate the error in interpreting antimicrobial sensitivity by the direct method when compared to the standard method, and to find out whether specific antibiotic-organism combinations had more discrepancies. All blood culture samples received at the Microbiology Laboratory from 1st July 2006 to 31st August 2006 were included in the study. All samples were inoculated in the automated blood culture system BACTEC 9240, which contains enriched Soybean-Casein Digest broth with CO₂. Once positive, bottles were removed from the system and Gram staining of the positive broths was done. The susceptibility test was performed from the positive broth on MHA (Mueller-Hinton agar), with the antibiotic panel chosen according to the Gram stain result. All positive broths were also sub-cultured on blood agar, chocolate agar and, for gram-negative rods only, MacConkey agar. The next day, the zone sizes of all antibiotics were recorded using a measuring scale, and at the same time the susceptibility test was repeated from isolated colonies from the subcultures, with inoculum prepared to the McFarland 0.5 standard. Staphylococcus aureus (ATCC 29213), E. coli (ATCC 25922) and Pseudomonas aeruginosa (ATCC 27853) were included as quality control strains. Zone sizes were interpreted as sensitive (S), resistant (R) or intermediate (I) according to CLSI recommendations. The two results were compared and recorded. Out of a total of 1083 combinations, zone diameters by the standard method were either equal to or greater than the direct zone diameters (never smaller). Most of the discrepancies were in β-lactam/β-lactamase inhibitor combinations and aminoglycosides. One should be cautious while reporting these groups of antibiotics with the direct sensitivity test, as these are the major antibiotics used for life-threatening infections. In case of heavier or lighter than standard inoculum, or marginal zones, repeating with the standard method should be preferred to minimize the chance of error. (author)

  11. Switching Diarylethenes Reliably in Both Directions with Visible Light.

    Science.gov (United States)

    Fredrich, Sebastian; Göstl, Robert; Herder, Martin; Grubert, Lutz; Hecht, Stefan

    2016-01-18

    A diarylethene photoswitch was covalently connected to two small triplet sensitizer moieties in a conjugated and a nonconjugated fashion, and the photochromic performance of the resulting compounds was investigated. In comparison with the parent diarylethene (without sensitizers) and one featuring saturated linkages, the conjugated photoswitch offers superior fatigue resistance upon visible-light excitation due to effective triplet energy transfer from the biacetyl termini to the diarylethene core. Our design makes it possible to switch diarylethenes with visible light in both directions in a highly efficient and robust fashion, based on extended π-conjugation and by-product-free ring closure via the triplet manifold. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Methods of direct (non-chromatographic) quantification of body metabolites utilizing chemical ionization mass spectrometry

    International Nuclear Information System (INIS)

    Mee, J.M.L.

    1978-01-01

    For the quantitative determination of known metabolites from biological samples by direct chemical ionization mass spectrometry (CI-MS), the internal standard method using stable isotopically labelled analogs appears to be the method of choice. Where stable isotope ratio determinations cannot be applied, alternative quantification can be achieved using non-labelled external or internal standards and a calibration curve (sum of peak heights per given number of scans versus concentration). The technique of computer monitoring permits display and plotting of ion current profiles (TIC and SIC) or spectra per given number of scans or a given range of mass-to-charge ratio. Examples are given from areas of clinical application, and the quantitative data show very good agreement with conventional chromatographic measurements. (Auth.)

  13. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  14. Improving the reliability of POD curves in NDI methods using a Bayesian inversion approach for uncertainty quantification

    Science.gov (United States)

    Ben Abdessalem, A.; Jenson, F.; Calmon, P.

    2016-02-01

    This contribution provides an example of the possible advantages of adopting a Bayesian inversion approach to uncertainty quantification in nondestructive inspection methods. In such problems, the uncertainty associated with the random parameters is not always known and needs to be characterised from scattered signal measurements. The uncertainties may then be correctly propagated in order to determine a reliable probability of detection (POD) curve. To this end, we establish a general Bayesian framework based on a non-parametric maximum likelihood formulation and priors from expert knowledge. However, the presented inverse problem is time-consuming and computationally intensive. To cope with this difficulty, we replace the real model by a surrogate in order to speed up model evaluation and make the problem computationally feasible. Least-squares support vector regression is adopted as the metamodelling technique due to its robustness in dealing with non-linear problems. We illustrate the usefulness of this methodology through the inspection of a tube with an enclosed defect using an ultrasonic method.
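
    Upstream of the Bayesian treatment, a POD curve is commonly fitted to hit/miss inspection data with a logistic model in log defect size. A minimal sketch in Python/SciPy (the defect sizes and outcomes are hypothetical; this is a plain maximum-likelihood fit, not the paper's Bayesian inversion):

        import numpy as np
        from scipy.optimize import minimize

        a = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5, 2.0])  # mm
        hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

        def neg_log_lik(theta):
            b0, b1 = theta
            p = 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(a))))
            p = np.clip(p, 1e-9, 1.0 - 1e-9)
            return -np.sum(hit * np.log(p) + (1 - hit) * np.log(1.0 - p))

        b0, b1 = minimize(neg_log_lik, x0=[0.0, 1.0]).x
        # a90: the defect size detected with 90% probability under the fit.
        a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
        print(f"a90 ~= {a90:.2f} mm")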

  15. AGNES at vibrated gold microwire electrode for the direct quantification of free copper concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Domingos, Rute F., E-mail: rdomingos@ipgp.fr [Centro de Química Estrutural, Instituto Superior Técnico, Universidade de Lisboa, Torre Sul Lab 11-6.3, Av. Rovisco Pais #1, 1049-001 Lisbon (Portugal); Carreira, Sara [Centro de Química Estrutural, Instituto Superior Técnico, Universidade de Lisboa, Torre Sul Lab 11-6.3, Av. Rovisco Pais #1, 1049-001 Lisbon (Portugal); Galceran, Josep [Department of Chemistry, University of Lleida and Agrotecnio, Rovira Roure 191, 25198 Lleida (Spain); Salaün, Pascal [School of Environmental Sciences, University of Liverpool, 4 Brownlow Street, Liverpool L693 GP (United Kingdom); Pinheiro, José P. [LIEC/ENSG, UMR 7360 CNRS – Université de Lorraine, 15 Avenue du Charmois, 54500 Vandoeuvre-les-Nancy (France)

    2016-05-12

    The free metal ion concentration and the dynamic features of the metal species are recognized as key to predicting metal bioavailability and toxicity to aquatic organisms. Quantification of the former is, however, still challenging. In this paper, it is shown for the first time that the concentration of free copper (Cu²⁺) can be quantified by applying AGNES (Absence of Gradients and Nernstian Equilibrium Stripping) at a solid gold electrode. It was found that: i) the amount of deposited Cu follows a Nernstian relationship with the applied deposition potential, and ii) the stripping signal is linearly related to the free metal ion concentration. The performance of AGNES at the vibrating gold microwire electrode (VGME) was assessed for two labile systems, Cu-malonic acid and Cu-iminodiacetic acid, at an ionic strength of 0.01 M and a range of pH values from 4.0 to 6.0. The free Cu concentrations and conditional stability constants obtained by AGNES were in good agreement with stripping scanned voltammetry and with thermodynamic predictions obtained by Visual MINTEQ. This work highlights the suitability of gold electrodes for the quantification of free metal ion concentrations by AGNES. It also strongly suggests that other solid electrodes may be well suited to this task. This new application of AGNES is a first step towards a range of applications for a number of metals in speciation, toxicological and environmental studies for the direct determination of the key parameter that is the free metal ion concentration. - Highlights: • AGNES principles are valid at the vibrating gold microwire electrode (VGME). • The VGME was successfully employed to quantify free Cu concentrations by AGNES. • Stability constants of labile systems were in good agreement with predictions.

  16. Direct detection and quantification of abasic sites for in vivo studies of DNA damage and repair

    International Nuclear Information System (INIS)

    Wang Yanming; Liu Lili; Wu Chunying; Bulgar, Alina; Somoza, Eduardo; Zhu Wenxia; Gerson, Stanton L.

    2009-01-01

    The use of chemotherapeutic agents to induce cytotoxic DNA damage and programmed cell death is a key strategy in cancer treatment. However, the efficacy of DNA-targeted agents such as temozolomide is often compromised by intrinsic cellular responses such as DNA base excision repair (BER). Previous studies have shown that the BER pathway results in the formation of abasic or apurinic/apyrimidinic (AP) sites, and that blockage of AP sites leads to a significant enhancement of drug sensitivity due to the reduction of base excision repair. Since a number of chemotherapeutic agents also induce the formation of AP sites, monitoring of these sites as a clinical correlate of drug effect will provide a useful tool in the development of DNA-targeted chemotherapies aimed at blocking abasic sites from repair. Here we report an imaging technique based on positron emission tomography (PET) that allows for direct quantification of AP sites in vivo. For this purpose, positron-emitting carbon-11 has been incorporated into methoxyamine ([11C]MX), which binds covalently to AP sites with high specificity. The binding specificity of [11C]MX for AP sites was demonstrated by in vivo blocking experiments. Using [11C]MX as a radiotracer, animal PET studies have been conducted in melanoma and glioma xenografts for quantification of AP sites. Following induction of AP sites by temozolomide, both tumor models showed a significant increase of [11C]MX uptake in tumor regions in terms of radioactivity concentration as a function of time, which correlates well with conventional aldehyde reactive probe (ARP)-based bioassays for AP sites.

  17. Direct quantification of cell-free, circulating DNA from unpurified plasma.

    Science.gov (United States)

    Breitbach, Sarah; Tug, Suzan; Helmig, Susanne; Zahn, Daniela; Kubiak, Thomas; Michal, Matthias; Gori, Tommaso; Ehlert, Tobias; Beiter, Thomas; Simon, Perikles

    2014-01-01

    Cell-free DNA (cfDNA) in body tissues or fluids is extensively investigated in clinical medicine and other research fields. In this article we present a direct quantitative real-time PCR (qPCR) assay as a sensitive tool for the measurement of cfDNA from plasma without prior DNA extraction, which is known to be accompanied by a reduction in DNA yield. The primer sets were designed to amplify 90 and 222 bp multi-locus L1PA2 sequences. In the first module, cfDNA concentrations in unpurified plasma were compared to cfDNA concentrations in the eluate and the flow-through of the QIAamp DNA Blood Mini Kit and in the eluate of a phenol-chloroform isoamyl alcohol (PCI) based DNA extraction, to elucidate the DNA losses during extraction. The analyses revealed 2.79-fold higher cfDNA concentrations in unpurified plasma compared to the eluate of the QIAamp DNA Blood Mini Kit, while 36.7% of the total cfDNA was found in the flow-through. The PCI procedure only performed well on samples with high cfDNA concentrations, recovering 87.4% of the concentrations measured in plasma. The DNA integrity strongly depended on the sample treatment. Further qualitative analyses indicated differing fractions of cfDNA fragment lengths in the eluates of both extraction methods. In the second module, cfDNA concentrations in the plasma of 74 coronary heart disease patients were compared to cfDNA concentrations of 74 healthy controls, using the direct L1PA2 qPCR for cfDNA quantification. The patient collective showed significantly higher cfDNA levels (mean (SD) 20.1 (23.8) ng/ml; range 5.1-183.0 ng/ml) compared to the healthy controls (9.7 (4.2) ng/ml; range 1.6-23.7 ng/ml). With our direct qPCR, we provide a simple, economical and sensitive procedure for the quantification of cfDNA from plasma that might find broad applicability should cfDNA become an established marker in the assessment of pathophysiological conditions.
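
    The absolute concentrations in such an assay come from a log-linear standard curve relating Cq to input concentration. A minimal sketch in Python/NumPy (the standard-curve points and sample Cq values are hypothetical):

        import numpy as np

        # Hypothetical L1PA2 standard curve: Cq = intercept + slope * log10(conc)
        stds = np.array([[0.1, 31.2], [1.0, 27.9], [10.0, 24.5], [100.0, 21.1]])
        slope, intercept = np.polyfit(np.log10(stds[:, 0]), stds[:, 1], 1)
        print(f"amplification efficiency = {10 ** (-1.0 / slope) - 1.0:.1%}")

        def cq_to_conc(cq):
            return 10 ** ((cq - intercept) / slope)  # ng/mL

        # Fold difference, e.g. unpurified plasma vs. kit eluate of one sample.
        print(cq_to_conc(25.0) / cq_to_conc(26.5))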

  19. Direct protein quantification in complex sample solutions by surface-engineered nanorod probes

    KAUST Repository

    Schrittwieser, Stefan

    2017-06-30

    Detecting biomarkers from complex sample solutions is the key objective of molecular diagnostics. Being able to do so in a simple approach that does not require laborious sample preparation, sophisticated equipment and trained staff is vital for point-of-care applications. Here, we report on the specific detection of the breast cancer biomarker sHER2 directly from serum and saliva samples by a nanorod-based homogeneous biosensing approach, which is easy to operate as it only requires mixing of the samples with the nanorod probes. By careful nanorod surface engineering and homogeneous assay design, we demonstrate that the formation of a protein corona around the nanoparticles does not limit the applicability of our detection method, but on the contrary enables us to conduct in-situ reference measurements, thus further strengthening the point-of-care applicability of our method. Making use of sandwich assays on top of the nanorods, we obtain limits of detection of 110 pM and 470 pM in 10-fold diluted spiked saliva and serum samples, respectively. In conclusion, our results open up numerous applications in direct protein biomarker quantification, specifically in point-of-care settings where resources are limited and ease of use is of the essence.

  1. Ct shift: A novel and accurate real-time PCR quantification model for direct comparison of different nucleic acid sequences and its application for transposon quantifications.

    Science.gov (United States)

    Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I

    2017-01-20

    There are numerous applications of quantitative PCR in both diagnostics and basic research. As in many other techniques, the basis of quantification is that comparisons are made between different (unknown and known, or reference) specimens of the same entity. When the aim is to compare real quantities of different species in samples, one cannot escape their separate, precise absolute quantification. We have established a simple and reliable method for this purpose (the Ct shift method), which combines the absolute and the relative approaches. It requires a plasmid standard containing both sequences of the amplicons to be compared (e.g. the target of interest and the endogenous control). This standard can serve as a reference sample with equal copies of templates for both targets. Using the ΔΔCt formula, we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied to transposon gene copy measurements, as well as to comparisons of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of the results, even without the need for real reference samples, can contribute to the universality of the method and the comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
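
    The ΔΔCt arithmetic behind the Ct shift idea is compact. A minimal sketch in Python (Ct values are hypothetical; roughly 100% amplification efficiency for both amplicons is assumed, with the plasmid standard's exact 1:1 copy ratio calibrating out the assay-specific Ct offset):

        def ct_shift_ratio(ct_a_sample, ct_b_sample, ct_a_std, ct_b_std):
            """Copy-number ratio of template A to template B in a sample.

            The plasmid standard carries both amplicons at a 1:1 copy ratio,
            so its Ct difference measures the per-assay offset, which is then
            subtracted from the sample's Ct difference (the "Ct shift").
            """
            return 2.0 ** -((ct_a_sample - ct_b_sample) - (ct_a_std - ct_b_std))

        # Hypothetical: transposon copies per copy of an endogenous control.
        print(ct_shift_ratio(24.6, 26.1, 25.0, 25.2))  # ~2.5 copies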

  2. Zeta-potential data reliability of gold nanoparticle biomolecular conjugates and its application in sensitive quantification of surface absorbed protein.

    Science.gov (United States)

    Wang, Wenjie; Ding, Xiaofan; Xu, Qing; Wang, Jing; Wang, Lei; Lou, Xinhui

    2016-12-01

    Zeta potentials (ZPs) of gold nanoparticle bioconjugates (AuNP-bios) provide important information on surface charge that is critical for many applications including drug delivery, biosensing, and cell imaging. ZP measurements (ZPMs) are conducted under an alternating electric field at high frequency under laser irradiation, which may strongly affect the status of the surface coating of AuNP-bios and generate unreliable data. In this study, we systematically evaluated the ZP data reliability (ZPDR) of citrate-, thiolated single-stranded DNA-, and protein-coated AuNPs, mainly according to the consistency of ZPs in repeated ZPMs and the changes of the hydrodynamic size before and after the ZPMs. We found that the ZPDR was highly dependent on both buffer conditions and surface modifications. Overall, higher ionic strength of the buffer and lower affinity of the surface binders were associated with worse ZPDR. The ZPDR of citrate-coated AuNPs was good in water, but poor in 10 mM phosphate buffer (PB), showing a substantial decrease of the absolute ZP values after each measurement, probably due to the electric-field-facilitated adsorption of negatively charged phosphate ions on the AuNPs. Significant desorption of DNA from the AuNPs was observed in PB containing a medium concentration of NaCl, but not in PB alone. Excellent ZPDR of bovine serum albumin (BSA)-coated AuNPs was observed at high salt concentrations and low surface coverage, enabling ZPM as an ultra-sensitive tool for protein quantification on the surface of AuNPs with single-molecule resolution. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Quantification of the activity of biomolecules in microarrays obtained by direct laser transfer.

    Science.gov (United States)

    Dinca, V; Ranella, A; Farsari, M; Kafetzopoulos, D; Dinescu, M; Popescu, A; Fotakis, C

    2008-10-01

    The direct-writing technique of laser-induced forward transfer has been employed for the microarray printing of liquid solutions of the enzyme horseradish peroxidase (HRP) and the protein Titin on nitrocellulose solid surfaces. The effect of two UV laser pulse lengths, femtosecond and nanosecond, has been studied in relation to maintaining the activity of the transferred biomolecules. The quantification of the active biomolecules after transfer has been carried out using the Bradford assay, a quantitative colorimetric enzymatic assay and fluorescence techniques. Spectrophotometric measurements of the HRP and Titin activity, as well as chromogenic and fluorescence assay studies, have revealed a connection between the properties of the deposited, biologically active biomolecules, the experimental conditions and the target composition. The bioassays have shown that up to 78% of the biomolecules remained active after femtosecond laser transfer, while this value was reduced to 54% after nanosecond laser transfer. The addition of glycerol at a percentage of up to 70% in the solution to be transferred contributed to the stabilization of the microarray patterns and the increase of their resolution.

  4. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods for improving the reliability of a system, but mutual coupling of multiple factors is often involved in the design. In this study, the Direct Search Method is introduced into the optimum redundancy configuration for design optimization, in which reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of a critical aircraft system are computed. The results show that this method is convenient and workable, and, upon appropriate modifications, applicable to the redundancy configuration and optimization of various designs. The method thus has good practical value.
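
    The abstract does not specify the search variant used. As an illustration of the general idea, the sketch below (Python; the component reliabilities, costs, weights and budgets are hypothetical) runs a greedy coordinate-style direct search over integer redundancy levels of a series system of parallel subsystems, maximizing reliability under cost and weight constraints:

        def system_reliability(r, n):
            """Series system of parallel subsystems; the i-th has n[i] units."""
            rel = 1.0
            for ri, ni in zip(r, n):
                rel *= 1.0 - (1.0 - ri) ** ni
            return rel

        def direct_search(r, cost, weight, max_cost, max_weight):
            n = [1] * len(r)
            while True:
                best = None
                for i in range(len(r)):  # probe each coordinate direction
                    trial = n[:]
                    trial[i] += 1
                    c = sum(ci * ni for ci, ni in zip(cost, trial))
                    w = sum(wi * ni for wi, ni in zip(weight, trial))
                    if c <= max_cost and w <= max_weight:
                        gain = system_reliability(r, trial)
                        if best is None or gain > best[0]:
                            best = (gain, trial)
                if best is None:  # no feasible improving move remains
                    return n, system_reliability(r, n)
                n = best[1]

        # Hypothetical three-subsystem fuze.
        print(direct_search([0.90, 0.95, 0.85], [4, 6, 3], [2, 1, 2], 30, 12))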

  5. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris.

    Science.gov (United States)

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may lack reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than with other standard lab-scale cell disruption methodologies, such as bead milling or cell permeabilization. This approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, non-complex approach developed to evaluate the efficacy of a disruption procedure or equipment can be easily applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest.
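
    The abstract does not spell out how the three reporting indicators are merged into the overall KPI. One plausible construction, shown purely as a sketch (Python; the indicator names and the full-disruption reference values are hypothetical), normalizes each indicator to its value at complete disruption and averages the resulting fractions:

        def overall_kpi(indicators):
            """indicators: name -> (measured value, value at full disruption).
            Returns a score in [0, 1]; 1 means all indicators at maximum."""
            fractions = [v / full for v, full in indicators.values()]
            return sum(fractions) / len(fractions)

        print(overall_kpi({
            "absorbance_decrease": (0.81, 1.0),
            "total_protein_release": (45.2, 50.0),       # mg per g biomass
            "alk_phosphatase_release": (870.0, 1000.0),  # U/L
        }))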

  6. Reliability of Direct Behavior Ratings - Social Competence (DBR-SC) data: How many ratings are necessary?

    Science.gov (United States)

    Kilgus, Stephen P; Riley-Tillman, T Chris; Stichter, Janine P; Schoemann, Alexander M; Bellesheim, Katie

    2016-09-01

    The purpose of this investigation was to evaluate the reliability of Direct Behavior Ratings-Social Competence (DBR-SC) ratings. Participants included 60 students identified as possessing deficits in social competence, as well as their 23 classroom teachers. Teachers used DBR-SC to complete ratings of 5 student behaviors within the general education setting on a daily basis across approximately 5 months. During this time, each student was assigned to 1 of 2 intervention conditions, including the Social Competence Intervention-Adolescent (SCI-A) and a business-as-usual (BAU) intervention. Ratings were collected across 3 intervention phases, including pre-, mid-, and postintervention. Results suggested DBR-SC ratings were highly consistent across time within each student, with reliability coefficients predominantly falling in the .80 and .90 ranges. Findings further indicated such levels of reliability could be achieved with only a small number of ratings, with estimates varying between 2 and 10 data points. Group comparison analyses further suggested the reliability of DBR-SC ratings increased over time, such that student behavior became more consistent throughout the intervention period. Furthermore, analyses revealed that for 2 of the 5 DBR-SC behavior targets, the increase in reliability over time was moderated by intervention grouping, with students receiving SCI-A demonstrating greater increases in reliability relative to those in the BAU group. Limitations of the investigation as well as directions for future research are discussed herein. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. Direct liquid chromatography method for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines.

    Science.gov (United States)

    Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen

    2011-11-09

    A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L⁻¹ for hydroxytyrosol and at 0.007 and 0.024 mg L⁻¹ for tyrosol, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was used to analyze the influence of six different conditions on the analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 non-pretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.
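
    Limits of this kind are conventionally derived from the low-level calibration as LOD = 3.3 σ/S and LOQ = 10 σ/S, where σ is the residual standard deviation and S the slope (the ICH Q2(R1) convention; note the quoted LOQ/LOD pairs differ by almost exactly 10/3.3). A minimal sketch in Python/NumPy with hypothetical calibration data:

        import numpy as np

        conc = np.array([0.02, 0.05, 0.10, 0.25, 0.50])   # mg/L
        signal = np.array([1.9, 5.2, 10.4, 25.8, 51.6])   # peak area

        slope, intercept = np.polyfit(conc, signal, 1)
        sigma = (signal - (slope * conc + intercept)).std(ddof=2)

        print(f"LOD = {3.3 * sigma / slope:.3f} mg/L")
        print(f"LOQ = {10.0 * sigma / slope:.3f} mg/L")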

  8. Direct photothermal techniques for quantification of anthocyanins in sour cherry cultivars

    NARCIS (Netherlands)

    Doka, O.; Ficzek, G.; Bicanic, D.D.; Spruijt, R.B.; Luterotti, S.; Toth, M.; Buijnsters, J.G.; György Végvári, G.

    2011-01-01

    The analytical performance of the newly proposed laser-based photoacoustic spectroscopy (PAS) and of the optothermal window (OW) method for the quantification of the total anthocyanin concentration (TAC) in five sour cherry varieties is compared to that of spectrophotometry (SP). High-performance liquid …

  9. Direct quantification of creatinine in human urine by using isotope dilution extractive electrospray ionization tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Li Xue [Institute of Environmental Pollution and Health, School of Environmental and Chemical Engineering, Shanghai University, Shanghai 200444 (China); Jiangxi Key Laboratory for Mass Spectrometry and Instrumentation, Applied Chemistry Department, East China Institute of Technology, Nanchang 330013 (China); Fang Xiaowei [Jiangxi Key Laboratory for Mass Spectrometry and Instrumentation, Applied Chemistry Department, East China Institute of Technology, Nanchang 330013 (China); Yu Zhiqiang; Sheng Guoying [Guangdong Key Laboratory of Environmental Protection and Resource Utilization, State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Wu Minghong [Shanghai Applied Radiation Institute, School of Environmental and Chemical Engineering, Shanghai University, Shanghai 200444 (China); Fu Jiamo [Institute of Environmental Pollution and Health, School of Environmental and Chemical Engineering, Shanghai University, Shanghai 200444 (China); Guangdong Key Laboratory of Environmental Protection and Resource Utilization, State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Chen Huanwen, E-mail: chw8868@gmail.com [Jiangxi Key Laboratory for Mass Spectrometry and Instrumentation, Applied Chemistry Department, East China Institute of Technology, Nanchang 330013 (China)

    2012-10-20

    Highlights: ► High-throughput analysis of urinary creatinine is achieved by using ID-EESI-MS/MS. ► The urine sample is directly analyzed and no sample pre-treatment is required. ► Accurate quantification is accomplished with the isotope dilution technique. - Abstract: Urinary creatinine (CRE) is an important biomarker of renal function. Fast and accurate quantification of CRE in human urine is required by clinical research. By using isotope dilution extractive electrospray ionization tandem mass spectrometry (EESI-MS/MS), a high-throughput method for the direct and accurate quantification of urinary CRE was developed in this study. Under optimized conditions, the method detection limit was lower than 50 μg L⁻¹. Over the concentration range investigated (0.05-10 mg L⁻¹), the calibration curve was obtained with satisfactory linearity (R² = 0.9861), and the relative standard deviation (RSD) values for CRE and isotope-labeled CRE (CRE-d3) were 7.1-11.8% (n = 6) and 4.1-11.3% (n = 6), respectively. The isotope dilution EESI-MS/MS method was validated by analyzing six human urine samples, and the results were comparable with the conventional spectrophotometric method (based on the Jaffe reaction). Recoveries for individual urine samples were 85-111% and less than 0.3 min was taken for each measurement, indicating that the present isotope dilution EESI-MS/MS method is a promising strategy for the fast and accurate quantification of urinary CRE in clinical laboratories.
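
    At its core, isotope dilution converts the measured analyte-to-internal-standard area ratio into a concentration via the known spike level of the labeled analog. A minimal single-point sketch in Python (the areas and spike level are hypothetical; a relative response factor near 1 is assumed because CRE and the CRE-d3 label behave essentially identically in the source):

        def id_quant(area_analyte, area_is, conc_is, rrf=1.0):
            """Concentration from the area ratio to the isotope-labeled spike."""
            return (area_analyte / area_is) * conc_is / rrf

        # Hypothetical: CRE-d3 spiked at 2.0 mg/L into diluted urine.
        print(id_quant(area_analyte=153000, area_is=61000, conc_is=2.0))  # ~5.0 mg/L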

  11. Reliability analysis of an LCL tuned track segmented bi-directional inductive power transfer system

    DEFF Research Database (Denmark)

    Asif Iqbal, S. M.; Madawala, U. K.; Thrimawithana, D. J.

    2013-01-01

    The bi-directional inductive power transfer (BDIPT) technique is suitable for renewable-energy-based applications such as electric vehicles (EVs) and for the implementation of vehicle-to-grid (V2G) systems. Recently, more efforts have been made by researchers to improve both the efficiency and the reliability of renewable energy systems to further enhance their economic sustainability. This paper presents a comparative reliability study between a typical BDIPT system and an individually controlled segmented BDIPT system. Steady-state thermal simulation results are provided for different output power levels for a 1.5 kW BDIPT system in a MATLAB/Simulink environment. Reliability parameters such as failure rate and mean time between failures (MTBF) are compared between the two systems. A nonlinear programming (NP) model is developed for optimizing the charging schedule for a stationary EV. A case study of EV …

  12. Structural reliability calculation method based on the dual neural network and direct integration method.

    Science.gov (United States)

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty receives wide attention from engineers and scholars because it reflects the structural characteristics and the actual bearing situation. The direct integration method, which starts from the definition of reliability, is easy to understand, but mathematical difficulties remain in the calculation of the multiple integrals. Therefore, a dual neural network method for calculating multiple integrals is proposed in this paper. The dual neural network consists of two neural networks: network A is used to learn the integrand function, and network B is used to simulate the original (antiderivative) function. According to the derivative relationship between the network output and the network input, network B is derived from network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean-value first-order second-moment method demonstrate that the proposed method is an efficient and accurate reliability method for structural reliability problems.
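
    For a single limit state with independent normal variables, the multiple integral in the definition of failure probability collapses to one dimension, which gives a compact check of what any direct integration scheme must reproduce. A minimal sketch in Python/SciPy (the resistance and load parameters are hypothetical) compares direct numerical integration of Pf = P(S > R) with the closed form Φ(−β):

        import numpy as np
        from scipy import stats
        from scipy.integrate import quad

        muR, sdR, muS, sdS = 120.0, 12.0, 80.0, 16.0  # resistance R, load S

        # Pf = P(S > R) = integral of f_R(r) * (1 - F_S(r)) dr
        pf_direct, _ = quad(
            lambda r: stats.norm.pdf(r, muR, sdR) * stats.norm.sf(r, muS, sdS),
            -np.inf, np.inf,
        )

        beta = (muR - muS) / np.hypot(sdR, sdS)
        print(pf_direct, stats.norm.cdf(-beta))  # both ~0.0228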

  13. Direct quantification of fungal DNA from soil substrate using real-time PCR.

    Science.gov (United States)

    Filion, Martin; St-Arnaud, Marc; Jabaji-Hare, Suha H

    2003-04-01

    Detection and quantification of genomic DNA from two ecologically different fungi, the plant pathogen Fusarium solani f. sp. phaseoli and the arbuscular mycorrhizal fungus Glomus intraradices, were achieved from soil substrate. Specific primers targeting a 362-bp fragment from the SSU rRNA gene region of G. intraradices and a 562-bp fragment from the F. solani f. sp. phaseoli translation elongation factor 1 alpha gene were used in real-time polymerase chain reaction (PCR) assays conjugated with the fluorescent SYBR® Green I dye. Standard curves showed a linear relation (r² = 0.999) between log values of fungal genomic DNA of each species and real-time PCR threshold cycles and were quantitative over 4-5 orders of magnitude. The real-time PCR assays were applied to in vitro-produced fungal structures and to sterile and non-sterile soil substrate seeded with known propagule numbers of either fungus. Detection and genomic DNA quantification were obtained from the different treatments, while no amplicon was detected from non-seeded non-sterile soil samples, confirming the absence of cross-reactivity with the soil microflora DNA. A significant correlation was found between the amount of genomic DNA of F. solani f. sp. phaseoli or G. intraradices detected and the number of fungal propagules present in the seeded soil substrate. The DNA extraction protocol and real-time PCR quantification assay can be performed in less than 2 h and are adaptable to detect and quantify genomic DNA from other soilborne fungi.

  14. Reliable four-point flexion test and model for die-to-wafer direct bonding

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, T., E-mail: toshiyuki.tabata@cea.fr; Sanchez, L.; Fournel, F.; Moriceau, H. [Univ. Grenoble Alpes, F-38000 Grenoble, France and CEA, LETI, MINATEC Campus, F-38054 Grenoble (France)

    2015-07-07

    For many years, wafer-to-wafer (W2W) direct bonding has been highly developed, particularly in terms of bonding energy measurement and comprehension of the bonding mechanism. Nowadays, die-to-wafer (D2W) direct bonding has gained significant attention, for instance in photonics and micro-electro-mechanics, which presupposes controlled and reliable fabrication processes. Whatever the bonded materials may be, it is not obvious that bonded D2W structures have the same bonding strength as bonded W2W ones, because of possible edge effects of the dies. For that reason, a bonding energy measurement technique suitable for D2W structures is strongly required. In this paper, both D2W- and W2W-type standard SiO₂-to-SiO₂ direct bonding samples are fabricated from the same full-wafer bonding. Modifications of the four-point flexion test (4PT) technique and its application to measuring D2W direct bonding energies are reported. A comparison between the modified 4PT and double-cantilever beam techniques is then drawn, also considering possible impacts of the measurement conditions, such as water stress corrosion at the debonding interface and friction error at the loading contact points. Finally, the reliability of the modified technique and of a new model established for measuring D2W direct bonding energies is demonstrated.

  15. Accurate and reliable quantification of total microalgal fuel potential as fatty acid methyl esters by in situ transesterification

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M.L.; Quinn, Matthew; Wychen, Stefanie van; Templeton, David W.; Wolfrum, Edward J. [National Bioenergy Center, National Renewable Energy Laboratory, Golden, CO (United States)

    2012-04-15

    In the context of algal biofuels, lipids, or more precisely the aliphatic chains of the fatty acids, are perhaps the most important constituents of algal biomass. Accurate quantification of lipids and their respective fuel yield is crucial for the comparison of algal strains and growth conditions and for process monitoring. As an alternative to traditional solvent-based lipid extraction procedures, we have developed a robust whole-biomass in situ transesterification procedure for the quantification of algal lipids (as fatty acid methyl esters, FAMEs) that (a) can be carried out on a small scale (using 4-7 mg of biomass), (b) is applicable to a range of different species, (c) consists of a single-step reaction, (d) is robust over a range of different temperature and time combinations, and (e) is tolerant of at least 50% water in the biomass. Unlike gravimetric lipid quantification, which can over- or underestimate the lipid content, whole-biomass transesterification reflects the true potential fuel yield of algal biomass. We report here on a comparison of the FAME yields obtained with different catalysts and catalyst combinations, with the acid catalyst HCl providing a consistently high level of conversion of fatty acids with a precision of 1.9% relative standard deviation. We investigate the influence of reaction time, temperature, and biomass water content on the measured FAME content and profile for 4 different samples of algae (replete and deplete Chlorella vulgaris, replete Phaeodactylum tricornutum, and replete Nannochloropsis sp.). We conclude by demonstrating full mass balance closure of all fatty acids around a traditional lipid extraction process. (orig.)

  16. A direct qPCR method for residual DNA quantification in monoclonal antibody drugs produced in CHO cells.

    Science.gov (United States)

    Hussain, Musaddeq

    2015-11-10

    Chinese hamster ovary (CHO) cells are the host cells of choice for manufacturing monoclonal antibody (mAb) drugs in the biopharmaceutical industry. Host cell DNA is an impurity of such a manufacturing process and must be controlled and monitored in order to ensure drug purity and safety. A conventional method for quantification of residual host DNA in a drug requires extraction of DNA from the mAb drug substance, with subsequent quantification of the extracted DNA using real-time PCR (qPCR). Here we report a method in which the DNA extraction step prior to qPCR is eliminated. In this method, which we have named 'direct resDNA qPCR', the mAb drug substance is digested with a protease called KAPA in a 96-well PCR plate, the protease in the digest is then denatured at high temperature, qPCR reagents are added to the resultant reaction wells along with standards and controls in other wells of the same plate, and the plate is subjected to qPCR for analysis of residual host DNA in the samples. This direct resDNA qPCR method for CHO is sensitive to 5.0 fg of DNA with high precision and accuracy and has a wide linear range of determination. The method has been successfully tested with four mAb drugs, two IgG1 and two IgG4. Both the purified drug substance and a number of process intermediate samples, e.g., bioreactor harvest, Protein A column eluate and ion-exchange column eluates, were tested. This method simplifies the residual DNA quantification protocol, reduces the time of analysis, and enables increased assay sensitivity and the development of automated high-throughput methods. Copyright © 2015 Elsevier B.V. All rights reserved.
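
    For context, residual-DNA qPCR assays of this kind quantify unknowns against a standard curve of Cq versus log10(DNA input). A minimal sketch with invented standards, assuming a conventional log-linear fit:

    ```python
    # Minimal sketch of qPCR standard-curve quantification: fit Cq versus
    # log10(DNA) for the standards, then invert the fit for unknowns.
    # All values below are illustrative only.
    import numpy as np

    std_dna_fg = np.array([5.0, 50.0, 500.0, 5e3, 5e4])    # standards (fg)
    std_cq     = np.array([36.1, 32.8, 29.4, 26.1, 22.7])  # measured Cq

    slope, intercept = np.polyfit(np.log10(std_dna_fg), std_cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% efficiency

    def dna_from_cq(cq):
        """Residual DNA (fg) for a sample Cq, via the standard curve."""
        return 10 ** ((cq - intercept) / slope)

    print(f"PCR efficiency ~ {efficiency:.2%}")
    print(f"sample at Cq 30.2 -> {dna_from_cq(30.2):.1f} fg DNA")
    ```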

  17. Reliability of video-based quantification of the knee- and hip angle at foot strike during running

    DEFF Research Database (Denmark)

    Damsted, Camma; Nielsen, R.O.; Larsen, Lars Henrik

    2015-01-01

    INTRODUCTION: In clinical practice, joint kinematics during running are primarily quantified by two-dimensional (2D) video recordings and motion-analysis software. The applicability of this approach depends on the clinicians' ability to quantify kinematics in a reliable manner. The reliability...... motion analysis system prior to the recordings and conclusions should take measurement variations (3-8 degrees and 9-14 degrees for within and between day, respectively) into account. LEVEL OF EVIDENCE: 3....

  18. Reliability of video-based quantification of the knee- and hip angle at foot strike during running

    DEFF Research Database (Denmark)

    Damsted, Camma; Oestergaard Nielsen, Rasmus; Larsen, Lars Henrik

    2014-01-01

    INTRODUCTION: In clinical practice, joint kinematics during running are primarily quantified by two-dimensional (2D) video recordings and motion-analysis software. The applicability of this approach depends on the clinicians' ability to quantify kinematics in a reliable manner. The reliability...... motion analysis system prior to the recordings and conclusions should take measurement variations (3-8 degrees and 9-14 degrees for within and between day, respectively) into account....

  19. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    Science.gov (United States)

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

    In Europe, the maximum allowable concentration of tributyltin (TBT) compounds in surface water is regulated by the water framework directive (WFD) and its daughter directive, which impose a limit of 0.2 ng L(-1) in whole water (as tributyltin cation). Despite the large number of methodologies for the quantification of organotin species developed over the last two decades, standardised analytical methods at the required concentration level do not exist. TBT quantification at the picogram level requires efficient and accurate sample preparation and preconcentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at the environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiments (DOE) based on a fractional factorial plan. The DOE allowed the evaluation of 3 qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure), for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools: one of them was selected and optimised to find the best procedural conditions. The C18 phase was found to be the best stationary phase for the SPE experiments. The 4 solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. For that reason, the equation of the model conceived in this work is a useful decisional tool for the planning of experiments, because it can be applied to predict the TBT mass fraction recovery when the

  20. Validity, reliability, feasibility, acceptability and educational impact of direct observation of procedural skills (DOPS).

    Science.gov (United States)

    Naeem, Naghma

    2013-01-01

    Direct observation of procedural skills (DOPS) is a new workplace-based assessment tool. The aim of this narrative review of the literature is to summarize the available evidence on the validity, reliability, feasibility, acceptability and educational impact of DOPS. A PubMed database and Google search of the literature on DOPS published from January 2000 to January 2012 was conducted, yielding 30 articles. Thirteen articles were selected for full-text reading and review. In the reviewed literature, DOPS was found to be a useful tool for the assessment of procedural skills, but further research is required to prove its utility as a workplace-based assessment instrument.

  1. A reliable method of quantification of trace copper in beverages with and without alcohol by spectrophotometry after cloud point extraction

    Directory of Open Access Journals (Sweden)

    Ramazan Gürkan

    2013-01-01

    Full Text Available A new cloud point extraction (CPE) method was developed for the separation and preconcentration of copper(II) prior to spectrophotometric analysis. For this purpose, 1-(2,4-dimethylphenylazo)naphthalen-2-ol (Sudan II) was used as a chelating agent and the solution pH was adjusted to 10.0 with borate buffer. Polyethylene glycol tert-octylphenyl ether (Triton X-114) was used as an extracting agent in the presence of sodium dodecylsulphate (SDS). After phase separation, based on the cloud point of the mixture, the surfactant-rich phase was diluted with acetone, and the enriched analyte was spectrophotometrically determined at 537 nm. The variables affecting CPE efficiency were optimized. The calibration curve was linear within the range 0.285-20 µg L-1 with a detection limit of 0.085 µg L-1. The method was successfully applied to the quantification of copper in different beverage samples.
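
    The reported detection limit can be related to the calibration in the usual way (LOD ≈ 3 × blank standard deviation / slope). A minimal sketch with invented absorbance data, not the authors' values:

    ```python
    # Sketch of the calibration/limit-of-detection arithmetic behind figures
    # like "linear 0.285-20 ug/L, LOD 0.085 ug/L". Numbers are invented.
    import numpy as np

    conc = np.array([0.5, 2.0, 5.0, 10.0, 20.0])                # ug/L standards
    absorbance = np.array([0.021, 0.083, 0.205, 0.411, 0.823])  # at 537 nm

    m, b = np.polyfit(conc, absorbance, 1)   # slope, intercept
    blank_sd = 0.0012                         # SD of repeated blank readings

    lod = 3.0 * blank_sd / m                  # detection limit
    loq = 10.0 * blank_sd / m                 # quantification limit

    def quantify(a):
        """Copper concentration (ug/L) from a sample absorbance."""
        return (a - b) / m

    print(f"LOD = {lod:.3f} ug/L, LOQ = {loq:.3f} ug/L")
    print(f"sample A=0.150 -> {quantify(0.150):.2f} ug/L")
    ```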

  2. DNA microsatellite region for a reliable quantification of soft wheat adulteration in durum wheat-based foodstuffs by real-time PCR.

    Science.gov (United States)

    Sonnante, Gabriella; Montemurro, Cinzia; Morgese, Anita; Sabetta, Wilma; Blanco, Antonio; Pasqualone, Antonella

    2009-11-11

    Italian industrial pasta and typical durum wheat breads must be prepared exclusively from durum wheat semolina. Previously, a microsatellite sequence specific to the wheat D-genome had been chosen for the traceability of soft wheat in semolina and bread samples, using qualitative and quantitative SYBR Green-based real-time PCR experiments. In this work, we describe an improved method based on the same soft wheat genomic region by means of quantitative real-time PCR using a dual-labeled probe. Standard curves based on dilutions of 100% soft wheat flour, pasta, or bread were constructed. Durum wheat semolina, pasta, and bread samples were prepared with increasing amounts of soft wheat to verify the accuracy of the method. The results show that reliable quantifications were obtained, especially for samples containing a lower amount of soft wheat DNA, fulfilling the need to verify the labeling of pasta and typical durum wheat breads.

  3. Accuracy and reliability of 3D stereophotogrammetry: A comparison to direct anthropometry and 2D photogrammetry.

    Science.gov (United States)

    Dindaroğlu, Furkan; Kutlu, Pınar; Duran, Gökhan Serhat; Görgülü, Serkan; Aslan, Erhan

    2016-05-01

    To evaluate the accuracy of three-dimensional (3D) stereophotogrammetry by comparing it with direct anthropometry and digital photogrammetry. The reliability of 3D stereophotogrammetry was also examined. Six profile and four frontal parameters were measured directly on the faces of 80 participants. The same measurements were repeated using two-dimensional (2D) photogrammetry and 3D stereophotogrammetry (3dMDflex System, 3dMD, Atlanta, Ga) to obtain images of the subjects. Another observer made the same measurements on the images obtained with 3D stereophotogrammetry, and interobserver reproducibility was evaluated for the 3D images. Both observers remeasured the 3D images 1 month later, and intraobserver reproducibility was evaluated. Statistical analysis was conducted using the paired samples t-test, the intraclass correlation coefficient, and Bland-Altman limits of agreement. The highest mean difference was 0.30 mm between direct measurement and photogrammetry, 0.21 mm between direct measurement and 3D stereophotogrammetry, and 0.50 mm between photogrammetry and 3D stereophotogrammetry. The lowest agreement value was 0.965, for the Sn-Pro parameter between the photogrammetry and 3D stereophotogrammetry methods. Agreement between the two observers varied from 0.90 (Ch-Ch) to 0.99 (Sn-Me) in linear measurements. For intraobserver agreement, the highest difference between means was 0.33 mm for observer 1 and 1.42 mm for observer 2. Measurements obtained using 3D stereophotogrammetry indicate that it may be an accurate and reliable imaging method for use in orthodontics.
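
    Bland-Altman limits of agreement, one of the statistics used above, are straightforward to compute. A short sketch with invented paired measurements:

    ```python
    # Sketch of Bland-Altman limits of agreement for comparing two
    # measurement methods. The paired values below are invented.
    import numpy as np

    direct = np.array([31.2, 28.7, 35.1, 30.4, 29.9])   # mm, direct anthropometry
    stereo = np.array([31.0, 29.1, 34.8, 30.6, 29.5])   # mm, 3D stereophotogrammetry

    diff = stereo - direct
    bias = diff.mean()                        # mean difference (systematic)
    loa_low  = bias - 1.96 * diff.std(ddof=1)
    loa_high = bias + 1.96 * diff.std(ddof=1)

    print(f"bias = {bias:.2f} mm")
    print(f"95% limits of agreement: [{loa_low:.2f}, {loa_high:.2f}] mm")
    ```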

  4. Assessment of the human factor in the quantification of technical system reliability taking into consideration cognitive-causal aspects. Partial project 2. Modeling of the human behavior for reliability considerations. Final report

    International Nuclear Information System (INIS)

    Jennerich, Marco; Imbsweiler, Jonas; Straeter, Oliver; Arenius, Marcus

    2015-03-01

    This report presents the findings of the project on the consideration of the human factor in the quantification of the reliability of technical systems, taking into account cognitive-causal aspects in the modeling of human behavior for reliability issues (funded by the Federal Ministry of Economics and Technology; grant number 15014328). This project is part of a joint project with the University of Applied Sciences Zittau/Goerlitz for assessing the human factor in the quantification of the reliability of technical systems. The concern of the University of Applied Sciences Zittau/Goerlitz is the mathematical modeling of human reliability by means of a fuzzy set approach (grant number 1501432A). The part of the project presented here provides the necessary data basis for the evaluation of the mathematical modeling using the fuzzy set approach. At the appropriate places in this report, the interfaces and shared data bases between the two projects are outlined accordingly. HRA (Human Reliability Analysis) methods are an essential component in analyzing the reliability of socio-technical systems. Various methods have been established and are used in different areas of application. The established HRA methods were checked for congruence; in particular, the underlying models and their parameters, such as performance-influencing factors and situational influences, were investigated. The elaborated parameters were combined into a hierarchical class structure. Cross-domain incidents were studied, and the specific performance-influencing factors were worked out and integrated into a cross-domain database. The dominant (critical) situational factors and their interactions within the event data were identified using the CAHR method (Connectionism Assessment of Human Reliability). Task-dependent cognitive load profiles were defined, and within these profiles qualitative and quantitative data on the possibility of the emergence of errors were acquired. This

  5. Quantification of the reliability of personnel actions from the evaluation of actual German operational experience. Final report

    International Nuclear Information System (INIS)

    Preischl, W.; Fassmann, W.

    2013-07-01

    The results of PSA studies and their uncertainty bounds are considerably impacted by the assessment of human reliability, but the amount of available generic data is not sufficient to adequately evaluate all human actions considered in a modern PSA study. Further, the data are not sufficiently validated, and both the data and the proposed uncertainty bounds rely on expert judgement. This research project, like the preceding project /GRS 10/, validated the data recommended by the German PSA Guidelines and enlarged the amount of available data; the findings may contribute to an update of the German PSA Guidelines. In a first step, information about reportable events in German nuclear power plants involving observed human errors (event reports, expert statements, technical documents, interviews and plant walkdowns with subject matter experts from the plants) was analysed. The investigation resulted in 67 samples describing personnel activities, performance conditions, the number of observed errors and the number of action performances. In a second step, a new methodology was developed and applied in a pilot plant. The objective was to identify undoubtedly error-free safety-relevant actions, their performance conditions and frequency, and to prove and demonstrate that probabilistic data can be derived from that operational experience (OE). The application in the pilot plant resulted in 18 'error-free' samples characterizing human reliability. All available samples were evaluated by use of the method of Bayes, a commonly accepted methodology applied in order to derive probabilistic data from samples taken from operational experience. A thorough analysis of the obtained results shows that both data sources (OE reportable events, OE with undoubtedly error-free action performance) provide data of comparable quality and validity. At the end of the research project the following products are available. - Methods to select samples
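
    The "method of Bayes" step can be sketched with a Beta-Binomial model: k observed errors in n performances yield a Beta posterior for the error probability, which remains informative even for the error-free samples (k = 0). The Jeffreys prior and the numbers below are illustrative assumptions, not taken from the report:

    ```python
    # Hedged sketch of a Bayesian human-error-probability estimate: with a
    # Jeffreys Beta(0.5, 0.5) prior and k errors in n performances, the
    # posterior is Beta(k + 0.5, n - k + 0.5). Prior and numbers invented.
    from scipy.stats import beta

    def hep_posterior(k, n, a0=0.5, b0=0.5):
        a, b = k + a0, n - k + b0
        mean = a / (a + b)
        lo, hi = beta.ppf([0.05, 0.95], a, b)   # 90% credible interval
        return mean, lo, hi

    print(hep_posterior(k=3, n=67))     # sample with observed errors
    print(hep_posterior(k=0, n=150))    # "error-free" sample still informative
    ```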

  6. Ultrasensitive Direct Quantification of Nucleobase Modifications in DNA by Surface-Enhanced Raman Scattering: The Case of Cytosine.

    Science.gov (United States)

    Morla-Folch, Judit; Xie, Hai-nan; Gisbert-Quilis, Patricia; Gómez-de Pedro, Sara; Pazos-Perez, Nicolas; Alvarez-Puebla, Ramon A; Guerrini, Luca

    2015-11-09

    Recognition of chemical modifications in the canonical nucleobases of nucleic acids is of key importance, since such modified variants act as different genetic encoders, introducing variability into the biological information contained in DNA. Herein, we demonstrate the feasibility of direct SERS, in combination with chemometrics and microfluidics, for the identification and relative quantification of four different cytosine modifications in both single- and double-stranded DNA. The minute amount of DNA required per measurement, in the sub-nanogram regime, removes the necessity of pre-amplification or enrichment steps (which are also potential sources of artificial DNA damage). These findings show great potential for the development of fast, low-cost and high-throughput screening analytical devices capable of detecting known and unknown modifications in nucleic acids (DNA and RNA), opening new windows of activity in several fields such as biology, medicine and forensic sciences. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Multi Directional Repeated Sprint Is a Valid and Reliable Test for Assessment of Junior Handball Players

    Directory of Open Access Journals (Sweden)

    Amin Daneshfar

    2018-04-01

    Full Text Available The aim of the present study was to examine the validity and reliability of a 10 × (6 × 5 m) multi-directional repeated sprint ability test (RSM) in elite young team handball (TH) players. Participants were members of the Iranian national team (n = 20, age 16.4 ± 0.7 years, weight 82.5 ± 5.5 kg, height 184.8 ± 4.6 cm, body fat 15.4 ± 4.3%). The validity of RSM was tested against a 10 × (15 + 15 m) repeated sprint ability test (RSA), Yo-Yo Intermittent Recovery test Level 1 (Yo-Yo IR1), squat jump (SJ) and countermovement jump (CMJ). To test the reliability of RSM, the participants repeated the testing sessions of RSM and RSA 1 week later. Both RSA and RSM tests showed good to excellent reliability of the total time (TT), best time (BT), and weakest time (WT). The results of the correlation analysis showed significant inverse correlations between maximum aerobic capacity and TT in RSA (r = −0.57, p ≤ 0.05) and RSM (r = −0.76, p ≤ 0.01). There was also a significant inverse correlation between maximum aerobic capacity with fatigue index (FI) in the RSA test (r = −0.64, p ≤ 0.01) and in the RSM test (r = −0.53, p ≤ 0.05). BT, WT, and TT of RSA were largely-to-very largely correlated with BT (r = 0.58, p ≤ 0.01), WT (r = 0.62, p ≤ 0.01), and TT (r = 0.65, p ≤ 0.01) of RSM. BT in RSM was also correlated with FI in RSM (r = 0.88, p ≤ 0.01). In conclusion, based on the findings of the current study, the recently developed RSM test is a valid and reliable test and should be utilized for assessment of repeated sprint ability in handball players.

  8. Multi Directional Repeated Sprint Is a Valid and Reliable Test for Assessment of Junior Handball Players

    Science.gov (United States)

    Daneshfar, Amin; Gahreman, Daniel E.; Koozehchian, Majid S.; Amani Shalamzari, Sadegh; Hassanzadeh Sablouei, Mozhgan; Rosemann, Thomas; Knechtle, Beat; Nikolaidis, Pantelis T.

    2018-01-01

    The aim of the present study was to examine the validity and reliability of a 10 × (6 × 5 m) multi-directional repeated sprint ability test (RSM) in elite young team handball (TH) players. Participants were members of the Iranian national team (n = 20, age 16.4 ± 0.7 years, weight 82.5 ± 5.5 kg, height 184.8 ± 4.6 cm, body fat 15.4 ± 4.3%). The validity of RSM was tested against a 10 × (15 + 15 m) repeated sprint ability test (RSA), Yo-Yo Intermittent Recovery test Level 1 (Yo-Yo IR1), squat jump (SJ) and countermovement jump (CMJ). To test the reliability of RSM, the participants repeated the testing sessions of RSM and RSA 1 week later. Both RSA and RSM tests showed good to excellent reliability of the total time (TT), best time (BT), and weakest time (WT). The results of the correlation analysis showed significant inverse correlations between maximum aerobic capacity and TT in RSA (r = −0.57, p ≤ 0.05) and RSM (r = −0.76, p ≤ 0.01). There was also a significant inverse correlation between maximum aerobic capacity with fatigue index (FI) in RSA test (r = −0.64, p ≤ 0.01) and in RSM test (r = −0.53, p ≤ 0.05). BT, WT, and TT of RSA was largely-to-very largely correlated with BT (r = 0.58, p ≤ 0.01), WT (r = 0.62, p ≤ 0.01), and TT (r = 0.65, p ≤ 0.01) of RSM. BT in RSM was also correlated with FI in RSM (r = 0.88, p ≤ 0.01). In conclusion, based on the findings of the current study, the recently developed RSM test is a valid and reliable test and should be utilized for assessment of repeated sprint ability in handball players. PMID:29670536
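
    The summary metrics named above (TT, BT, WT, FI) reduce to simple arithmetic over the ten sprint times. A sketch with invented times, using a common percent-decrement definition of the fatigue index (the paper may define FI differently):

    ```python
    # Sketch of RSM/RSA summary metrics: total time (TT), best time (BT),
    # weakest time (WT) and a percent-decrement fatigue index (FI).
    # The sprint times below are invented.
    import numpy as np

    sprint_times = np.array([7.8, 7.9, 8.1, 8.3, 8.4, 8.6, 8.7, 8.9, 9.0, 9.2])

    tt = sprint_times.sum()                    # total time
    bt = sprint_times.min()                    # best time
    wt = sprint_times.max()                    # weakest time
    fi = 100.0 * (tt / (len(sprint_times) * bt) - 1.0)   # % decrement

    r = np.corrcoef(sprint_times, np.arange(10))[0, 1]   # drift across reps
    print(f"TT={tt:.1f}s BT={bt:.2f}s WT={wt:.2f}s FI={fi:.1f}% r={r:.2f}")
    ```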

  9. Reliable Quantification of the Potential for Equations Based on Spot Urine Samples to Estimate Population Salt Intake

    DEFF Research Database (Denmark)

    Huang, Liping; Crino, Michelle; Wu, Jason Hy

    2016-01-01

    BACKGROUND: Methods based on spot urine samples (a single sample at one time-point) have been identified as a possible alternative approach to 24-hour urine samples for determining mean population salt intake. OBJECTIVE: The aim of this study is to identify a reliable method for estimating mean...... population salt intake from spot urine samples. This will be done by comparing the performance of existing equations against one other and against estimates derived from 24-hour urine samples. The effects of factors such as ethnicity, sex, age, body mass index, antihypertensive drug use, health status...... to a standard format. Individual participant records will be compiled and a series of analyses will be completed to: (1) compare existing equations for estimating 24-hour salt intake from spot urine samples with 24-hour urine samples, and assess the degree of bias according to key demographic and clinical......

  10. First direct fluorescence polarization assay for the detection and quantification of spirolides in mussel samples

    Energy Technology Data Exchange (ETDEWEB)

    Otero, Paz; Alfonso, Amparo [Departamento de Farmacologia, Facultad de Veterinaria, Universidad de Santiago de Compostela, Campus Universitario s/n, 27002 Lugo (Spain); Alfonso, Carmen [CIFGA Laboratorio, Plaza de Santo Domingo, 1, 27001 Lugo (Spain); Araoz, Romulo; Molgo, Jordi [CNRS, Institut de Neurobiologie Alfred Fessard - FRC2118, Laboratoire de Neurobiologie et Developpement UPR3294, 1 Avenue de la Terrasse, 91198 Gif sur Yvette Cedex (France); Vieytes, Mercedes R. [Departamento de Fisiologia, Facultad de Veterinaria, Universidad de Santiago de Compostela, 27002 Lugo (Spain); Botana, Luis M., E-mail: luis.botana@usc.es [Departamento de Farmacologia, Facultad de Veterinaria, Universidad de Santiago de Compostela, Campus Universitario s/n, 27002 Lugo (Spain)

    2011-09-09

    Highlights: → A direct assay based on the binding of nAChR to spirolide toxins by FP is described. → A direct relationship between FP and 13-desMeC in the range of 10-500 nM is obtained. → FP depends on 13,19-didesMeC in a higher concentration range than 13-desMeC. → The FP assay is a sensitive method to detect and quantify 13-desMeC in mussel samples. - Abstract: In 2009, we achieved the first inhibition FP assay to detect cyclic imine toxins. In the present paper we propose a new FP assay to directly quantify spirolides. This new method results in a significant improvement in sensitivity, rapidity and accessibility. In the method design, nicotinic acetylcholine receptor from Torpedo marmorata membranes labelled with a fluorescein derivative was used. The spirolides 13-desmethyl spirolide C (13-desMeC) and 13,19-didesmethyl spirolide C (13,19-didesMeC) were extracted and purified from cultures of the dinoflagellate Alexandrium ostenfeldii. The data showed a decrease in FP as the toxin concentration increased; thus, a relationship between the FP units and the amount of spirolides present in a sample was obtained. This direct assay is a reproducible, simple and very sensitive method, with a detection limit of about 25 nM for 13-desMeC and 150 nM for 13,19-didesMeC. The procedure was used to measure spirolides in mussel samples using an extraction and clean-up protocol suitable for the FP assay. The results show that this method is able to quantify 13-desMeC in the range of 50-350 μg kg(-1) meat. Other liposoluble toxins did not interfere with the assay, proving the method specific. Moreover, the matrix does not interfere in the range of toxin concentrations involving a risk of spirolide intoxication.

  11. First direct fluorescence polarization assay for the detection and quantification of spirolides in mussel samples

    International Nuclear Information System (INIS)

    Otero, Paz; Alfonso, Amparo; Alfonso, Carmen; Araoz, Romulo; Molgo, Jordi; Vieytes, Mercedes R.; Botana, Luis M.

    2011-01-01

    Highlights: → A direct assay based on the binding of nAChR to spirolide toxins by FP is described. → A direct relationship between FP and 13-desMeC in the range of 10-500 nM is obtained. → FP depends on 13,19-didesMeC in a higher concentration range than 13-desMeC. → The FP assay is a sensitive method to detect and quantify 13-desMeC in mussel samples. - Abstract: In 2009, we achieved the first inhibition FP assay to detect cyclic imine toxins. In the present paper we propose a new FP assay to directly quantify spirolides. This new method results in a significant improvement in sensitivity, rapidity and accessibility. In the method design, nicotinic acetylcholine receptor from Torpedo marmorata membranes labelled with a fluorescein derivative was used. The spirolides 13-desmethyl spirolide C (13-desMeC) and 13,19-didesmethyl spirolide C (13,19-didesMeC) were extracted and purified from cultures of the dinoflagellate Alexandrium ostenfeldii. The data showed a decrease in FP as the toxin concentration increased; thus, a relationship between the FP units and the amount of spirolides present in a sample was obtained. This direct assay is a reproducible, simple and very sensitive method, with a detection limit of about 25 nM for 13-desMeC and 150 nM for 13,19-didesMeC. The procedure was used to measure spirolides in mussel samples using an extraction and clean-up protocol suitable for the FP assay. The results show that this method is able to quantify 13-desMeC in the range of 50-350 μg kg(-1) meat. Other liposoluble toxins did not interfere with the assay, proving the method specific. Moreover, the matrix does not interfere in the range of toxin concentrations involving a risk of spirolide intoxication.
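
    Since FP decreases monotonically with toxin concentration, quantification amounts to inverting a calibration curve. A minimal sketch with invented calibration points:

    ```python
    # Sketch of reading a toxin concentration off an FP calibration curve.
    # FP decreases as spirolide concentration increases, so the curve is
    # reversed before interpolation. All numbers are invented.
    import numpy as np

    cal_conc_nM = np.array([25, 50, 100, 250, 500])   # 13-desMeC standards
    cal_fp_mP   = np.array([180, 160, 130, 95, 70])   # FP readings (mP)

    def conc_from_fp(fp):
        # np.interp needs ascending x, so interpolate on the reversed curve
        return np.interp(fp, cal_fp_mP[::-1], cal_conc_nM[::-1])

    print(f"FP = 120 mP -> {conc_from_fp(120):.0f} nM 13-desMeC")
    ```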

  12. Sparse Pseudo Spectral Projection Methods with Directional Adaptation for Uncertainty Quantification

    KAUST Repository

    Winokur, J.

    2015-12-19

    We investigate two methods to build a polynomial approximation of a model output depending on some parameters. The two approaches are based on pseudo-spectral projection (PSP) methods on adaptively constructed sparse grids, and aim at providing a finer control of the resolution along two distinct subsets of model parameters. The control of the error along different subsets of parameters may be needed for instance in the case of a model depending on uncertain parameters and deterministic design variables. We first consider a nested approach where an independent adaptive sparse grid PSP is performed along the first set of directions only, and at each point a sparse grid is constructed adaptively in the second set of directions. We then consider the application of adaptive PSP in the space of all parameters, and introduce directional refinement criteria to provide a tighter control of the projection error along individual dimensions. Specifically, we use a Sobol decomposition of the projection surpluses to tune the sparse grid adaptation. The behavior and performance of the two approaches are compared for a simple two-dimensional test problem and for a shock-tube ignition model involving 22 uncertain parameters and 3 design parameters. The numerical experiments indicate that whereas both methods provide effective means for tuning the quality of the representation along distinct subsets of parameters, PSP in the global parameter space generally requires fewer model evaluations than the nested approach to achieve similar projection error. In addition, the global approach is better suited for generalization to more than two subsets of directions.
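
    For orientation, the basic (non-adaptive, full-grid) pseudo-spectral projection step underlying these methods can be sketched in a few lines: expansion coefficients are obtained by numerical quadrature against an orthogonal polynomial basis. This toy one-dimensional version illustrates only the projection step, not the sparse directional adaptation itself:

    ```python
    # Stripped-down illustration of pseudo-spectral projection (PSP): the
    # coefficients of a Legendre expansion of f over [-1, 1] are computed
    # by Gauss-Legendre quadrature. None of this code is the authors'.
    import numpy as np
    from numpy.polynomial.legendre import leggauss, Legendre

    def psp_coefficients(f, order, quad_pts=32):
        x, w = leggauss(quad_pts)
        coeffs = []
        for k in range(order + 1):
            Pk = Legendre.basis(k)(x)
            # <f, P_k> / <P_k, P_k>, with <P_k, P_k> = 2 / (2k + 1)
            coeffs.append((2 * k + 1) / 2.0 * np.sum(w * f(x) * Pk))
        return np.array(coeffs)

    c = psp_coefficients(np.exp, order=6)
    approx = sum(ck * Legendre.basis(k)(0.3) for k, ck in enumerate(c))
    print(approx, np.exp(0.3))   # projection error should be tiny
    ```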

  13. Quantification of the electrostatic forces involved in the directed assembly of colloidal nanoparticles by AFM nanoxerography.

    Science.gov (United States)

    Palleau, E; Sangeetha, N M; Ressier, L

    2011-08-12

    Directed assembly of 10 nm dodecanethiol stabilized silver nanoparticles in hexane and 14 nm citrate stabilized gold nanoparticles in ethanol was performed by AFM nanoxerography onto charge patterns of both polarities written into poly(methylmethacrylate) thin films. The quasi-neutral silver nanoparticles were grafted on both positive and negative charge patterns while the negatively charged gold nanoparticles were selectively deposited on positive charge patterns only. Numerical simulations were conducted to quantify the magnitude, direction and spatial range of the electrophoretic and dielectrophoretic forces exerted by the charge patterns on these two types of nanoparticles in suspension taken as models. The simulations indicate that the directed assembly of silver nanoparticles on both charge patterns is due to the predominant dielectrophoretic forces, while the selective assembly of gold nanoparticles only on positive charge patterns is due to the predominant electrophoretic forces. The study also suggests that the minimum surface potential of charge patterns required for obtaining effective nanoparticle assembly depends strongly on the charge and polarizability of the nanoparticles and also on the nature of the dispersing solvent. Attractive electrostatic forces of about 2 × 10(-2) pN in magnitude just above the charged surface appear to be sufficient to trap silver nanoparticles in hexane onto charge patterns and the value is about 2 pN for gold nanoparticles in ethanol, under the present experimental conditions. The numerical simulations used in this work to quantify the electrostatic forces operating in the directed assembly of nanoparticles from suspensions onto charge patterns can easily be extended to any kind of colloid and serve as an effective tool for a better comprehension and prediction of liquid-phase nanoxerography processes.

  14. Quantification of the electrostatic forces involved in the directed assembly of colloidal nanoparticles by AFM nanoxerography

    International Nuclear Information System (INIS)

    Palleau, E; Sangeetha, N M; Ressier, L

    2011-01-01

    Directed assembly of 10 nm dodecanethiol stabilized silver nanoparticles in hexane and 14 nm citrate stabilized gold nanoparticles in ethanol was performed by AFM nanoxerography onto charge patterns of both polarities written into poly(methylmethacrylate) thin films. The quasi-neutral silver nanoparticles were grafted on both positive and negative charge patterns while the negatively charged gold nanoparticles were selectively deposited on positive charge patterns only. Numerical simulations were conducted to quantify the magnitude, direction and spatial range of the electrophoretic and dielectrophoretic forces exerted by the charge patterns on these two types of nanoparticles in suspension taken as models. The simulations indicate that the directed assembly of silver nanoparticles on both charge patterns is due to the predominant dielectrophoretic forces, while the selective assembly of gold nanoparticles only on positive charge patterns is due to the predominant electrophoretic forces. The study also suggests that the minimum surface potential of charge patterns required for obtaining effective nanoparticle assembly depends strongly on the charge and polarizability of the nanoparticles and also on the nature of the dispersing solvent. Attractive electrostatic forces of about 2 × 10(-2) pN in magnitude just above the charged surface appear to be sufficient to trap silver nanoparticles in hexane onto charge patterns and the value is about 2 pN for gold nanoparticles in ethanol, under the present experimental conditions. The numerical simulations used in this work to quantify the electrostatic forces operating in the directed assembly of nanoparticles from suspensions onto charge patterns can easily be extended to any kind of colloid and serve as an effective tool for a better comprehension and prediction of liquid-phase nanoxerography processes.
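
    The two competing forces can be estimated to order of magnitude from textbook expressions (F_EP = qE for electrophoresis and the point-dipole dielectrophoresis formula). The input values below are rough guesses for illustration, not the paper's simulation parameters:

    ```python
    # Order-of-magnitude sketch of electrophoretic (F_EP = qE) and
    # point-dipole dielectrophoretic (F_DEP = 2*pi*r^3*eps_m*K*grad(E^2))
    # forces on a colloidal nanoparticle. All inputs are rough guesses.
    import math

    EPS0 = 8.854e-12   # vacuum permittivity (F/m)
    e = 1.602e-19      # elementary charge (C)

    def f_ep(n_charges, E):
        """Electrophoretic force (N) on a particle with n elementary charges."""
        return n_charges * e * E

    def f_dep(r, eps_m_rel, K, grad_E2):
        """Point-dipole DEP force (N): r in m, grad_E2 in V^2/m^3."""
        return 2 * math.pi * r**3 * EPS0 * eps_m_rel * K * grad_E2

    # ~14 nm charged gold particle near a charge pattern (illustrative):
    print(f_ep(10, 1e6))                # ~1.6e-12 N, i.e. ~1.6 pN
    print(f_dep(7e-9, 25, 1.0, 1e18))   # DEP contribution in ethanol
    ```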

  15. Towards tributyltin quantification in natural water at the Environmental Quality Standard level required by the Water Framework Directive.

    Science.gov (United States)

    Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola

    2016-11-01

    The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. The quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such a low concentration level that chemical analysis is still difficult, and further research is needed to improve the sensitivity, accuracy and precision of existing methodologies. Within the frame of the joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrology and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L(-1) as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L(-1) as cation). The LOQ of the methodology was 0.06 ng L(-1) and the average measurement uncertainty at the LOQ was 36%, which agrees with the WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainty, as well as the interlaboratory comparison results, are discussed in detail. Copyright © 2016 Elsevier B.V. All rights reserved.
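
    The core isotope-dilution arithmetic behind SSIDMS can be sketched in a simplified two-isotope form (a full SSIDMS calculation includes further corrections, e.g. for isotope-abundance sums and mass bias). All numbers are invented:

    ```python
    # Hedged sketch of simplified isotope-dilution quantification. R_* are
    # isotope-amount ratios (reference isotope / spike isotope) measured in
    # the spike, the unspiked sample, and the spiked blend. Values invented.

    def idms_mass_fraction(w_spike, m_spike, m_sample,
                           R_spike, R_sample, R_blend):
        """Analyte mass fraction in the sample (same units as w_spike)."""
        return (w_spike * m_spike / m_sample
                * (R_blend - R_spike) / (R_sample - R_blend))

    # e.g. an isotopically enriched TBT spike added to a water sample:
    print(idms_mass_fraction(w_spike=5.0e-9,   # g/g TBT in spike solution
                             m_spike=0.50,      # g spike added
                             m_sample=500.0,    # g sample
                             R_spike=0.05, R_sample=8.5, R_blend=1.2))
    ```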

  16. Automated identification of retinal vessels using a multiscale directional contrast quantification (MDCQ) strategy

    Energy Technology Data Exchange (ETDEWEB)

    Zhen, Yi; Zhang, Xinyuan; Wang, Ningli, E-mail: wningli@vip.163.com, E-mail: puj@upmc.edu [National Engineering Research Center for Ophthalmic Equipments, Beijing, 100730 (China); Gu, Suicheng; Meng, Xin [Imaging Research Center, Department of Radiology, University of Pittsburgh, Pittsburgh, Pennsylvania, 15213 (United States); Zheng, Bin [School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019 (United States); Pu, Jiantao, E-mail: wningli@vip.163.com, E-mail: puj@upmc.edu [Imaging Research Center, Departments of Radiology and Bioengineering, University of Pittsburgh, Pittsburgh, Pennsylvania, 15213 (United States)

    2014-09-15

    Purpose: A novel algorithm is presented to automatically identify the retinal vessels depicted in color fundus photographs. Methods: The proposed algorithm quantifies the contrast of each pixel in retinal images at multiple scales and fuses the resulting contrast images in a progressive manner by leveraging their spatial difference and continuity. The multiscale strategy deals with the variety of retinal vessels in width, intensity, resolution, and orientation; the progressive fusion combines consecutive images while avoiding a sudden fusion of image noise and/or artifacts in space. To quantitatively assess the performance of the algorithm, we tested it on three publicly available databases, namely DRIVE, STARE, and HRF. The agreement between the computer results and the manual delineations in these databases was quantified by computing their overlap in both area and length (centerline); the measures include sensitivity, specificity, and accuracy. Results: For the DRIVE database, the sensitivities in identifying vessels in area and length were around 90% and 70%, respectively, the accuracy in pixel classification was around 99%, and the precisions in terms of both area and length were around 94%. For the STARE database, the sensitivities in identifying vessels were around 90% in area and 70% in length, and the accuracy in pixel classification was around 97%. For the HRF database, the sensitivities in identifying vessels were around 92% in area and 83% in length for the healthy subgroup, around 92% in area and 75% in length for the glaucomatous subgroup, and around 91% in area and 73% in length for the diabetic retinopathy subgroup; for all three subgroups, the accuracy was around 98%. Conclusions: The experimental results demonstrate that the developed algorithm is capable of identifying retinal vessels depicted in color fundus photographs in a relatively reliable manner.
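
    The pixel-level agreement measures quoted above follow directly from comparing the binary computer output with the manual delineation. A minimal sketch on synthetic masks:

    ```python
    # Sketch of sensitivity, specificity and accuracy computed from two
    # binary masks (computer result vs manual delineation).
    import numpy as np

    def overlap_metrics(pred, truth):
        pred, truth = pred.astype(bool), truth.astype(bool)
        tp = np.sum(pred & truth)
        tn = np.sum(~pred & ~truth)
        fp = np.sum(pred & ~truth)
        fn = np.sum(~pred & truth)
        return {"sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "accuracy": (tp + tn) / pred.size}

    rng = np.random.default_rng(0)
    truth = rng.random((584, 565)) < 0.12            # ~12% vessel pixels
    pred = truth ^ (rng.random(truth.shape) < 0.02)  # noisy computer result
    print(overlap_metrics(pred, truth))
    ```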

  17. Direct Quantification of Rare Earth Elements Concentrations in Urine of Workers Manufacturing Cerium, Lanthanum Oxide Ultrafine and Nanoparticles by a Developed and Validated ICP-MS

    Science.gov (United States)

    Li, Yan; Yu, Hua; Zheng, Siqian; Miao, Yang; Yin, Shi; Li, Peng; Bian, Ying

    2016-01-01

    Rare earth elements (REEs) have undergone a steady spread in several industrial, agricultural and medical applications. With the aim of exploring a sensitive and reliable indicator for estimating exposure levels to REEs, a simple, accurate and specific ICP-MS method for the simultaneous direct quantification of 15 REEs (89Y, 139La, 140Ce, 141Pr, 146Nd, 147Sm, 153Eu, 157Gd, 159Tb, 163Dy, 165Ho, 166Er, 169Tm, 172Yb and 175Lu) in human urine has been developed and validated. The method showed good linearity for all REEs in human urine at concentrations ranging from 0.001–1.000 μg∙L−1 with r2 > 0.997. The limits of detection and quantification for this method were in the ranges of 0.009–0.010 μg∙L−1 and 0.029–0.037 μg∙L−1, respectively; the recoveries in spiked samples of the 15 REEs ranged from 93.3% to 103.0%; the relative percentage differences were less than 6.2% in duplicate samples; and the intra- and inter-day variations of the analysis were less than 1.28% and 0.85% for all REEs, respectively. The developed method was successfully applied to the determination of the 15 REEs in 31 urine samples obtained from control subjects and from workers engaged in the manufacturing of ultrafine and nanoparticles containing cerium and lanthanum oxide. The results suggested that only the urinary levels of La (1.234 ± 0.626 μg∙L−1), Ce (1.492 ± 0.995 μg∙L−1), Nd (0.014 ± 0.009 μg∙L−1) and Gd (0.023 ± 0.010 μg∙L−1) among the exposed workers were significantly higher (p < 0.05) than the levels measured in the control subjects. Of these, La and Ce were the primary components and accounted for 88% of the total REEs: lanthanum comprised 27% of the total, while Ce made up the majority of the REE content at 61%. The remaining elements only made up 1% each, with the exception of Dy, which was not detected. Compared with previously published data, the levels of urinary La and Ce in workers and the control subjects show a higher trend

  18. Direct In Situ Quantification of HO2 from a Flow Reactor.

    Science.gov (United States)

    Brumfield, Brian; Sun, Wenting; Ju, Yiguang; Wysocki, Gerard

    2013-03-21

    The first direct in situ measurements of the hydroperoxyl radical (HO2) at atmospheric pressure at the exit of a laminar flow reactor have been carried out using mid-infrared Faraday rotation spectroscopy. HO2 was generated by oxidation of dimethyl ether, a potential renewable biofuel with a simple molecular structure but rich low-temperature oxidation chemistry. Based on nonlinear fitting of the experimental data to a theoretical spectroscopic model, the sensitivity of the technique was estimated over the reactor exit temperature range of 398-673 K. Accurate in situ measurement of this species will aid in the quantitative modeling of low-temperature and high-pressure combustion kinetics.

  19. Direct quantification of rare earth element concentrations in natural waters by ICP-MS

    International Nuclear Information System (INIS)

    Lawrence, Michael G.; Greig, Alan; Collerson, Kenneth D.; Kamber, Balz S.

    2006-01-01

    A direct quadrupole ICP-MS technique has been developed for the analysis of the rare earth elements and yttrium in natural waters. The method has been validated by comparison of the results obtained for the river water reference material SLRS-4 with literature values. The detection limit of the technique was investigated by analysis of serial dilutions of SLRS-4 and revealed that single elements can be quantified at single-digit fg/g concentrations. A coherent normalised rare earth pattern was retained at concentrations two orders of magnitude below natural concentrations for SLRS-4, demonstrating the excellent inter-element accuracy and precision of the method. The technique was applied to the analysis of a diluted mid-salinity estuarine sample, which also displayed a coherent normalised rare earth element pattern, yielding the expected distinctive marine characteristics

  20. Laser-induced breakdown spectroscopy for lambda quantification in a direct-injection engine

    International Nuclear Information System (INIS)

    Buschbeck, M.; Büchler, F.; Halfmann, T.; Arndt, S.

    2012-01-01

    We apply laser-induced breakdown spectroscopy (LIBS) to determine local lambda values (i.e. the normalized air-fuel mass ratio) at the ignition location, λ_ip, in a direct-injection single-cylinder optical research engine. The technique enables us to determine variations of λ_ip for different fuel injection strategies, as well as correlations between variations in λ_ip and the combustion dynamics. In particular, we observe that fluctuations in λ_ip are not the major cause of cycle-to-cycle variations in the combustion process. Moreover, our experiments identify unsuitable lean λ_ip values as a source of misfires in lean combustion. By combining LIBS with laser-induced fluorescence (LIF), we additionally obtain information about the two-dimensional λ distribution. These results demonstrate the potential of LIBS for monitoring λ values during mixture formation in gasoline engines. - Highlights: ► Determination of λ values by means of LIBS in an optical gasoline engine. ► Evaluation of λ fluctuations for different fuel injection strategies. ► Investigation of the effect of λ upon combustion dynamics. ► Combination of LIBS and LIF to obtain two-dimensional λ distributions.
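
    For reference, lambda is defined as the actual air-fuel mass ratio normalized by the stoichiometric ratio (about 14.7 for gasoline); in LIBS practice it is inferred from emission-line intensity ratios via calibration. A sketch of the definition, with illustrative values:

    ```python
    # Sketch of the definition of lambda: the actual air-fuel mass ratio
    # normalized by the stoichiometric one (~14.7 for gasoline). The input
    # masses below are illustrative.

    AFR_STOICH_GASOLINE = 14.7

    def lambda_value(m_air, m_fuel, afr_stoich=AFR_STOICH_GASOLINE):
        return (m_air / m_fuel) / afr_stoich

    print(lambda_value(16.2, 1.0))   # ~1.10 -> lean mixture
    print(lambda_value(13.0, 1.0))   # ~0.88 -> rich mixture
    ```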

  1. In-situ visualization and order quantification of symmetric diblock copolymer directed self-assembly

    International Nuclear Information System (INIS)

    Salaün, M.; Le Gallic, M.; Picard, E.; Zelsmann, M.

    2013-01-01

    In this work, atomic force microscopy (AFM) investigations of lamellar PS-b-PMMA block copolymer layers are performed during the self-assembly process. These in-situ experiments are made on both un-patterned planar substrates and topographical substrates (graphoepitaxy experiments), at different temperatures and for different durations. Image processing software is used to produce AFM movies of the same location on the sample and to measure polymer micro-phase domain lengths versus annealing time. We observe that micro-domain formation starts after only a few minutes of heating. On planar substrates, the evolution of the micro-domain length with time (t) is in accordance with the literature, following a power law ~ t^0.29. On the other hand, in substrate channels and under the conditions used, we show that the domain length dependence follows a two-step process: initially, the system adopts a kinetic dependence similar to that on the planar substrate, but at longer times a drastically reduced time dependence is observed, due to the topographical confinement of the domains. - Highlights: ► Live atomic force microscopy of block copolymer directed self-assembly is performed. ► Values of polymer self-assembly kinetics in topographical trenches are measured. ► Opens the way to a better understanding of graphoepitaxy order nucleation and growth
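
    The growth exponent is obtained by fitting the domain length versus annealing time in log-log coordinates. A sketch with invented data chosen to reproduce an exponent near 0.29:

    ```python
    # Sketch of extracting the coarsening exponent alpha in L(t) ~ t^alpha
    # by linear regression in log-log space. Data points are invented.
    import numpy as np

    t_min = np.array([5, 10, 20, 40, 80, 160])        # annealing time (min)
    length_nm = np.array([48, 59, 71, 88, 107, 131])  # domain length (nm)

    alpha, log_prefactor = np.polyfit(np.log(t_min), np.log(length_nm), 1)
    print(f"growth exponent alpha ~ {alpha:.2f}")     # ~0.29 here
    ```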

  2. Direct observation and quantification of nanoscale spinodal decomposition in super duplex stainless steel weld metals.

    Science.gov (United States)

    Shariq, Ahmed; Hättestrand, Mats; Nilsson, Jan-Olof; Gregori, Andrea

    2009-06-01

    Three variants of super duplex stainless steel weld metals with the basic composition 29Cr-8Ni-2Mo (wt%) were investigated. The nitrogen contents of the three materials were 0.22%, 0.33% and 0.37%, respectively. Isothermal heat treatments were performed at 450 degrees C for times up to 243 h. The hardness evolution of the three materials was found to vary with the overall nitrogen concentration. Atom probe field ion microscopy (APFIM) was used to directly detect and quantify the degree of spinodal decomposition in different material conditions. 3-DAP atomic reconstructions clearly illustrate the nanoscale variation of iron-rich (alpha) and chromium-rich (alpha') phases. Longer ageing times produce a coarser microstructure with larger alpha and alpha' domains. Statistical evaluation of the APFIM data showed that phase separation was already significant after 1 h of ageing and gradually became more pronounced. Although nanoscale concentration variation was evident, no significant influence of the overall nitrogen content on the degree of spinodal decomposition was found.

  3. reliability reliability

    African Journals Online (AJOL)

    eobe

    RELIABILITY ... given by the code of practice. However, checks must ... an optimization procedure over the failure domain F corresponding ... of Concrete Members based on Utility Theory, Technical ...

  4. A direct ROI quantification method for inherent PVE correction: accuracy assessment in striatal SPECT measurements

    Energy Technology Data Exchange (ETDEWEB)

    Vanzi, Eleonora; De Cristofaro, Maria T.; Sotgia, Barbara; Mascalchi, Mario; Formiconi, Andreas R. [University of Florence, Clinical Pathophysiology, Florence (Italy); Ramat, Silvia [University of Florence, Neurological and Psychiatric Sciences, Florence (Italy)

    2007-09-15

    The clinical potential of striatal imaging with dopamine transporter (DAT) SPECT tracers is hampered by the limited capability to recover activity concentration ratios due to partial volume effects (PVE). We evaluated the accuracy of a least squares method that allows retrieval of activity in regions of interest directly from projections (LS-ROI). An Alderson striatal phantom was filled with striatal to background ratios of 6:1, 9:1 and 28:1; the striatal and background ROIs were drawn on a coregistered X-ray CT of the phantom. The activity ratios of these ROIs were derived both with the LS-ROI method and with conventional SPECT EM reconstruction (EM-SPECT). Moreover, the two methods were compared in seven patients with motor symptoms who were examined with N-3-fluoropropyl-2-β-carboxymethoxy-3-β-(4-iodophenyl) (FP-CIT) SPECT, calculating the binding potential (BP). In the phantom study, the activity ratios obtained with EM-SPECT were 3.5, 5.3 and 17.0, respectively, whereas the LS-ROI method resulted in ratios of 6.2, 9.0 and 27.3, respectively. With the LS-ROI method, the BP in the seven patients was approximately 60% higher than with EM-SPECT; a linear correlation between the LS-ROI and the EM estimates was found (r = 0.98, p = 0.03). The LS-ROI PVE correction capability is mainly due to the fact that the ill-conditioning of the LS-ROI approach is lower than that of the EM-SPECT one. The LS-ROI seems to be feasible and accurate in the examination of the dopaminergic system. This approach can be fruitful in monitoring of disease progression and in clinical trials of dopaminergic drugs. (orig.)
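
    The idea of retrieving ROI activities directly from projections can be caricatured as an ordinary least squares problem, with one column of the system matrix per ROI. The toy code below is schematic, not the authors' implementation:

    ```python
    # Toy illustration of the LS-ROI idea: solve y ~ A x for ROI activities
    # directly from projection data, where each column of A stands for the
    # projection of one ROI's indicator through the system response.
    import numpy as np

    rng = np.random.default_rng(1)
    n_proj, n_roi = 600, 3                 # projection bins; striata + background
    A = rng.random((n_proj, n_roi))        # stand-in for the system matrix
    x_true = np.array([28.0, 27.0, 1.0])   # activity concentrations (a.u.)
    y = rng.poisson(A @ x_true).astype(float)   # noisy projections

    x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(x_hat, "ratio:", x_hat[0] / x_hat[2])  # striatal-to-background
    ```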

  5. Reliable quantification of bite-force performance requires use of appropriate biting substrate and standardization of bite out-lever.

    Science.gov (United States)

    Lappin, A Kristopher; Jones, Marc E H

    2014-12-15

    Bite-force performance is an ecologically important measure of whole-organism performance that shapes dietary breadth and feeding strategies and, in some taxa, determines reproductive success. It also is a metric that is crucial to testing and evaluating biomechanical models. We reviewed nearly 100 published studies of a range of taxa that incorporate direct in vivo measurements of bite force. Problematically, methods of data collection and processing vary considerably among studies. In particular, there is little consensus on the appropriate substrate to use on the biting surface of force transducers. In addition, the bite out-lever, defined as the distance from the fulcrum (i.e. jaw joint) to the position along the jawline at which the jaws engage the transducer, is rarely taken into account. We examined the effect of bite substrate and bite out-lever on bite-force estimates in a diverse sample of lizards. Results indicate that both variables have a significant impact on the accuracy of measurements. Maximum bite force is significantly greater using leather as the biting substrate compared with a metal substrate. Less-forceful bites on metal are likely due to inhibitory feedback from mechanoreceptors that prevent damage to the feeding apparatus. Standardization of bite out-lever affected which trial produced maximum performance for a given individual. Indeed, maximum bite force is usually underestimated without standardization because it is expected to be greatest at the minimum out-lever (i.e. back of the jaws), which in studies is rarely targeted with success. We assert that future studies should use a pliable substrate, such as leather, and use appropriate standardization for bite out-lever. © 2014. Published by The Company of Biologists Ltd.
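
    Because biting is a moment balance about the jaw joint, forces measured at different out-levers can be scaled to a common reference position. A minimal sketch of that standardization, with invented values:

    ```python
    # Sketch of out-lever standardization: bites are moment-balanced, so a
    # force F measured at out-lever d corresponds to F * d / d_ref at a
    # reference jaw position d_ref. All values are invented.

    def standardize_bite_force(f_measured, out_lever_mm, ref_lever_mm):
        """Scale a measured bite force (N) to a common reference out-lever."""
        return f_measured * out_lever_mm / ref_lever_mm

    # Two trials with the transducer engaged at different jaw positions:
    print(standardize_bite_force(42.0, out_lever_mm=18.0, ref_lever_mm=12.0))
    print(standardize_bite_force(55.0, out_lever_mm=12.0, ref_lever_mm=12.0))
    ```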

  6. Direct quantification of thorium, uranium and rare earth element concentration in natural waters by ICP-MS

    International Nuclear Information System (INIS)

    Palmieri, Helena E.L.; Knupp, Eliana A.N.; Auler, Lucia M.L.A.; Gomes, Luiza M.F.; Windmoeller, Claudia C.

    2011-01-01

    Direct quantification of thorium, uranium and rare earth elements in natural water samples using inductively coupled plasma mass spectrometry (ICP-MS) was evaluated with respect to the selection of isotopes, detection limits, accuracy, precision, matrix effects for each isotope and spectral interferences. The accuracy of the method was evaluated by analysis of Spectrapure Standards (SPS-SW1, Batch 116, Norway) for the rare earth elements (REEs), thorium, uranium, scandium and yttrium. The measurements were carried out for each of the following analytical isotopes: 139La, 140Ce, 141Pr, 143Nd, 147Sm, 151Eu, 160Gd, 159Tb, 163Dy, 165Ho, 167Er, 169Tm, 174Yb, 175Lu, 45Sc, 89Y, 232Th and 238U. Recovery values found for these certified samples varied between 95 and 107%. The method was applied to the analysis of spring water samples collected from fountains spread throughout the historical towns of Ouro Preto, Mariana, Sabara and Diamantina in the state of Minas Gerais, Brazil. In the past these fountains played an essential and strategic role in supplying these towns with potable water, and to this day their water is used by both the local population and tourists, who believe in its quality. REEs were quantified at levels comparable to those found in estuarine waters, which are characterized by low REE concentrations. In two of the fountains analyzed, the concentrations of REEs were high, and thus possible health risks for humans cannot be excluded. (author)

  7. On the direct quantification of celecoxib in commercial solid drugs using the TT-PIXE and TT-PIGE techniques

    International Nuclear Information System (INIS)

    Nsouli, B.; Zahraman, K.; Bejjani, A.; Assi, S.; El-Yazbi, F.; Roumie, M.

    2006-01-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. It is usually performed using wet chemical techniques like LC-MS, UV spectrophotometry and other appropriate organic analytical methods. When an active ingredient contains specific heteroatoms (F, S, Cl, etc.), elemental IBA techniques can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation, which is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the thick-target PIXE (TT-PIXE) and TT-PIGE techniques to rapidly and accurately quantify celecoxib in commercial drugs. The experimental aspects related to the quantification validity are presented and discussed.
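
    The stoichiometric step from an elemental sulfur measurement to an active-ingredient mass is simple, since celecoxib (C17H14F3N3O2S, M ≈ 381.4 g/mol) contains exactly one sulfur atom; the sketch below assumes no other sulfur source in the tablet matrix and uses an invented input:

    ```python
    # Sketch of converting a measured elemental sulfur mass into a
    # celecoxib mass via stoichiometry: m(celecoxib) = m(S) * M(celecoxib)
    # / M(S), assuming celecoxib is the only sulfur source in the sample.

    M_CELECOXIB = 381.37   # g/mol, C17H14F3N3O2S (one S atom per molecule)
    M_SULFUR = 32.06       # g/mol

    def celecoxib_from_sulfur(m_sulfur_ug):
        """Celecoxib mass (ug) inferred from measured sulfur mass (ug)."""
        return m_sulfur_ug * M_CELECOXIB / M_SULFUR

    print(celecoxib_from_sulfur(8.4))   # -> ~100 ug celecoxib
    ```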

  8. Development of an instrument for direct ozone production rate measurements: measurement reliability and current limitations

    Science.gov (United States)

    Sklaveniti, Sofia; Locoge, Nadine; Stevens, Philip S.; Wood, Ezra; Kundu, Shuvashish; Dusanter, Sébastien

    2018-02-01

    Ground-level ozone (O3) is an important pollutant that affects both global climate change and regional air quality, with the latter linked to detrimental effects on both human health and ecosystems. Ozone is not directly emitted in the atmosphere but is formed from chemical reactions involving volatile organic compounds (VOCs), nitrogen oxides (NOx = NO + NO2) and sunlight. The photochemical nature of ozone makes the implementation of reduction strategies challenging and a good understanding of its formation chemistry is fundamental in order to develop efficient strategies of ozone reduction from mitigation measures of primary VOCs and NOx emissions. An instrument for direct measurements of ozone production rates (OPRs) was developed and deployed in the field as part of the IRRONIC (Indiana Radical, Reactivity and Ozone Production Intercomparison) field campaign. The OPR instrument is based on the principle of the previously published MOPS instrument (Measurement of Ozone Production Sensor) but using a different sampling design made of quartz flow tubes and a different Ox (O3 and NO2) conversion-detection scheme composed of an O3-to-NO2 conversion unit and a cavity attenuated phase shift spectroscopy (CAPS) NO2 monitor. Tests performed in the laboratory and in the field, together with model simulations of the radical chemistry occurring inside the flow tubes, were used to assess (i) the reliability of the measurement principle and (ii) potential biases associated with OPR measurements. This publication reports the first field measurements made using this instrument to illustrate its performance. The results showed that a photo-enhanced loss of ozone inside the sampling flow tubes disturbs the measurements. This issue needs to be solved to be able to perform accurate ambient measurements of ozone production rates with the instrument described in this study. However, an attempt was made to investigate the OPR sensitivity to NOx by adding NO inside the instrument
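
    Conceptually, such an instrument reports the difference in Ox between a sunlit and a reference flow tube divided by the residence time. A sketch of that bookkeeping with illustrative numbers (not data from IRRONIC):

    ```python
    # Sketch of the quantity an OPR instrument reports: the Ox (= O3 + NO2)
    # difference between the sample and reference flow tubes divided by the
    # mean residence time. Numbers are illustrative.

    def ozone_production_rate(ox_sample_ppb, ox_reference_ppb, residence_s):
        """OPR in ppb/h from the Ox difference between the two flow tubes."""
        return (ox_sample_ppb - ox_reference_ppb) / residence_s * 3600.0

    print(ozone_production_rate(52.4, 51.1, residence_s=180))  # ~26 ppb/h
    ```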

  9. Reliability of Direct Behavior Ratings--Social Competence (DBR-SC) Data: How Many Ratings Are Necessary?

    Science.gov (United States)

    Kilgus, Stephen P.; Riley-Tillman, T. Chris; Stichter, Janine P.; Schoemann, Alexander M.; Bellesheim, Katie

    2016-01-01

    The purpose of this investigation was to evaluate the reliability of Direct Behavior Ratings--Social Competence (DBR-SC) ratings. Participants included 60 students identified as possessing deficits in social competence, as well as their 23 classroom teachers. Teachers used DBR-SC to complete ratings of 5 student behaviors within the general…
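
    A standard psychometric answer to "how many ratings are necessary?" is the Spearman-Brown prophecy formula, sketched below with an invented single-rating reliability (this is a generic tool, not necessarily the exact analysis used in the study):

    ```python
    # Sketch of the Spearman-Brown prophecy: project the reliability of the
    # average of k ratings from a single-rating reliability r1, and invert
    # it to find the k needed for a target reliability. r1 is invented.
    import math

    def reliability_of_k(r1, k):
        return k * r1 / (1 + (k - 1) * r1)

    def ratings_needed(r1, target):
        return math.ceil(target * (1 - r1) / (r1 * (1 - target)))

    r1 = 0.45                         # hypothetical single-rating reliability
    print(reliability_of_k(r1, 5))    # reliability of a 5-rating average (~0.80)
    print(ratings_needed(r1, 0.80))   # ratings needed to reach 0.80
    ```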

  10. Direct quantification of lipopeptide biosurfactants in biological samples via HPLC and UPLC-MS requires sample modification with an organic solvent.

    Science.gov (United States)

    Biniarz, Piotr; Łukaszewicz, Marcin

    2017-06-01

    The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.

  11. A methodology for direct quantification of over-ranging length in helical computed tomography with real-time dosimetry.

    Science.gov (United States)

    Tien, Christopher J; Winslow, James F; Hintenlang, David E

    2011-01-31

    In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed as dose-length product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to DLPs of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions similar to previous publications. Over-ranging is quantified with both absolute length and DLP, which contributes about 60 mGy-cm or about 10% of the DLP for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies. Current uncertainties are less than 1%, in comparison with 5% in other methodologies. Clinical implementation can be simplified by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and postprocessing methods--which have been shown to produce over-ranging lengths
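
    The reduction from a real-time dose-rate trace to an over-ranging length is straightforward: beam-on time (samples above a noise threshold times the 10 ms sampling period) multiplied by table speed gives the irradiated length, and the excess over the planned VOI length is the over-ranging. A hedged sketch; the threshold, table speed and trace below are invented example values.

    ```python
    import numpy as np

    # Illustrative reduction of a 10 ms dosimeter trace to an over-ranging
    # length; threshold, table speed and trace are made-up example values.

    def overranging_length_cm(dose_rate, dt_s, table_speed_cm_s, planned_cm,
                              threshold=0.05):
        beam_on_s = (np.asarray(dose_rate) > threshold).sum() * dt_s
        return table_speed_cm_s * beam_on_s - planned_cm

    trace = np.concatenate([np.zeros(50), np.ones(1200), np.zeros(50)])
    print(overranging_length_cm(trace, 0.010, 3.0, 30.0))  # -> 6.0 cm
    ```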

  12. Inter-rater reliability of direct observations of the physical and psychosocial working conditions in eldercare

    DEFF Research Database (Denmark)

    Karstad, Kristina; Rugulies, Reiner; Skotte, Jørgen

    2018-01-01

    The aim of the study was to develop and evaluate the reliability of the "Danish observational study of eldercare work and musculoskeletal disorders" (DOSES) observation instrument to assess physical and psychosocial risk factors for musculoskeletal disorders (MSD) in eldercare work. During 1.5 years ... the instrument is appropriate for assessing physical and psychosocial risk factors for MSD among eldercare workers.

  14. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. Within this context, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  15. Reliability and Validity of a New Test of Change-of-Direction Speed for Field-Based Sports: the Change-of-Direction and Acceleration Test (CODAT).

    Science.gov (United States)

    Lockie, Robert G; Schultz, Adrian B; Callaghan, Samuel J; Jeffriess, Matthew D; Berry, Simon P

    2013-01-01

    Field sport coaches must use reliable and valid tests to assess change-of-direction speed in their athletes. Few tests feature linear sprinting with acute change-of-direction maneuvers. The Change-of-Direction and Acceleration Test (CODAT) was designed to assess field sport change-of-direction speed, and includes a linear 5-meter (m) sprint, 45° and 90° cuts, 3-m sprints to the left and right, and a linear 10-m sprint. This study analyzed the reliability and validity of this test, through comparisons to 20-m sprint (0-5, 0-10, 0-20 m intervals) and Illinois agility run (IAR) performance. Eighteen Australian footballers (age = 23.83 ± 7.04 yrs; height = 1.79 ± 0.06 m; mass = 85.36 ± 13.21 kg) were recruited. Following familiarization, subjects completed the 20-m sprint, CODAT, and IAR in 2 sessions, 48 hours apart. Intra-class correlation coefficients (ICC) assessed relative reliability. Absolute reliability was analyzed through paired samples t-tests (p ≤ 0.05) determining between-session differences. Typical error (TE), coefficient of variation (CV), and differences between the TE and smallest worthwhile change (SWC), also assessed absolute reliability and test usefulness. For the validity analysis, Pearson's correlations (p ≤ 0.05) analyzed between-test relationships. Results showed no between-session differences for any test (p = 0.19-0.86). CODAT time averaged ~6 s, and the ICC and CV equaled 0.84 and 3.0%, respectively. The homogeneous sample of Australian footballers meant that the CODAT's TE (0.19 s) exceeded the usual 0.2 x standard deviation (SD) SWC (0.10 s). However, the CODAT is capable of detecting moderate performance changes (SWC calculated as 0.5 x SD = 0.25 s). There was a near perfect correlation between the CODAT and IAR (r = 0.92), and very large correlations with the 20-m sprint (r = 0.75-0.76), suggesting that the CODAT was a valid change-of-direction speed test. Due to movement specificity, the CODAT has value for field sport
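
    The absolute-reliability statistics quoted above follow standard formulas: the typical error is the standard deviation of the between-session differences divided by √2, the CV expresses it relative to the grand mean, and the SWC is 0.2 (small) or 0.5 (moderate) times the between-subject SD. A sketch with invented session times:

    ```python
    import numpy as np

    # Standard test-retest statistics (typical error, CV, smallest worthwhile
    # change) as named in the abstract; the session times are invented.

    session1 = np.array([5.9, 6.1, 6.3, 5.8, 6.0, 6.2])
    session2 = np.array([6.0, 6.0, 6.4, 5.9, 6.1, 6.1])

    pooled = np.concatenate([session1, session2])
    te = (session2 - session1).std(ddof=1) / np.sqrt(2)  # typical error (s)
    cv = 100 * te / pooled.mean()                        # CV (%)
    swc_small = 0.2 * pooled.std(ddof=1)
    swc_moderate = 0.5 * pooled.std(ddof=1)
    print(te, cv, swc_small, swc_moderate)
    ```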

  16. Estimating the impact of structural directionality: How reliable are undirected connectomes?

    Directory of Open Access Journals (Sweden)

    Penelope Kale

    2018-06-01

    Full Text Available Directionality is a fundamental feature of network connections. Most structural brain networks are intrinsically directed because of the nature of chemical synapses, which comprise most neuronal connections. Because of the limitations of noninvasive imaging techniques, the directionality of connections between structurally connected regions of the human brain cannot be confirmed. Hence, connections are represented as undirected, and it is still unknown how this lack of directionality affects brain network topology. Using six directed brain networks from different species and parcellations (cat, mouse, C. elegans, and three macaque networks), we estimate the inaccuracies in network measures (degree, betweenness, clustering coefficient, path length, global efficiency, participation index, and small-worldness) associated with the removal of the directionality of connections. We employ three different methods to render directed brain networks undirected: (a) remove unidirectional connections, (b) add reciprocal connections, and (c) combine equal numbers of removed and added unidirectional connections. We quantify the extent of inaccuracy in network measures introduced through neglecting connection directionality for individual nodes and across the network. We find that the coarse division between core and peripheral nodes remains accurate for undirected networks. However, hub nodes differ considerably when directionality is neglected. Comparing the different methods to generate undirected networks from directed ones, we generally find that the addition of reciprocal connections (false positives) causes larger errors in graph-theoretic measures than the removal of the same number of directed connections (false negatives). These findings suggest that directionality plays an essential role in shaping brain networks and highlight some limitations of undirected connectomes. Most brain networks are inherently directed because of the nature of chemical synapses
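
    The three undirecting schemes are simple set operations on the edge list. A sketch using networkx follows; this is not the authors' code, and the random split in method (c) is an assumption about how equal numbers of removed and reciprocated connections are drawn.

    ```python
    import random
    import networkx as nx

    # Sketch of the three schemes for rendering a directed network undirected,
    # following the abstract's description (set logic only).

    def undirect(g: nx.DiGraph, method: str, seed: int = 0) -> nx.Graph:
        recip = [(u, v) for u, v in g.edges if g.has_edge(v, u)]
        uni = [(u, v) for u, v in g.edges if not g.has_edge(v, u)]
        h = nx.Graph()
        h.add_nodes_from(g)
        if method == "remove":        # (a) drop unidirectional connections
            h.add_edges_from(recip)
        elif method == "add":         # (b) treat every connection as reciprocal
            h.add_edges_from(g.edges)
        elif method == "combine":     # (c) remove half, reciprocate the rest
            random.seed(seed)
            half = random.sample(uni, len(uni) // 2)
            h.add_edges_from(recip + half)
        else:
            raise ValueError(method)
        return h
    ```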

  17. Reliability of heart rate mobile apps in young healthy adults: exploratory study and research directions

    Directory of Open Access Journals (Sweden)

    Maria Parpinel

    2017-06-01

    Full Text Available Background: Recently, a number of smartphone apps have appeared that allow heart rate measurement based on the photoplethysmography principle. In fact, almost every smartphone now has a camera with a flash that can be used for this. Some studies have appeared on the reliability of some of those apps, with heterogeneous results. Objectives: The present study aims at adding evidence, in particular during physical activity, by comparing 3 apps on two different platforms (iOS and Android) over a broad range of heart rates. As the gold standard, heart rate was measured with a traditional heart rate monitor. Results: The results suggest that heart rate apps might be used for measuring heart rate for fitness aims for many individuals, but further research is needed to (i) analyse the influence of smartphone features; (ii) identify personal factors hindering measurements; and (iii) verify reliability on different measurement sites.

  18. A rapid reliability estimation method for directed acyclic lifeline networks with statistically dependent components

    International Nuclear Information System (INIS)

    Kang, Won-Hee; Kliese, Alyce

    2014-01-01

    Lifeline networks, such as transportation, water supply, sewers, telecommunications, and electrical and gas networks, are essential elements for the economic and societal functions of urban areas, but their components are highly susceptible to natural or man-made hazards. In this context, it is essential to provide effective pre-disaster hazard mitigation strategies and prompt post-disaster risk management efforts based on rapid system reliability assessment. This paper proposes a rapid reliability estimation method for node-pair connectivity analysis of lifeline networks especially when the network components are statistically correlated. Recursive procedures are proposed to compound all network nodes until they become a single super node representing the connectivity between the origin and destination nodes. The proposed method is applied to numerical network examples and benchmark interconnected power and water networks in Memphis, Shelby County. The connectivity analysis results show the proposed method's reasonable accuracy and remarkable efficiency as compared to the Monte Carlo simulations
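
    For scale, the Monte Carlo simulation the proposed method is benchmarked against can be written in a few lines for the independent-failure case; a crude sketch follows (names and probabilities are invented, and the paper's recursive compounding additionally handles correlated components and avoids sampling altogether).

    ```python
    import random
    import networkx as nx

    # Crude Monte Carlo baseline for origin-destination connectivity
    # reliability with independent edge failures; everything here is a toy.

    def connectivity_reliability(g, p_fail, origin, dest, n_samples=20_000):
        hits = 0
        for _ in range(n_samples):
            h = nx.DiGraph()
            h.add_nodes_from(g)
            h.add_edges_from(e for e in g.edges if random.random() > p_fail[e])
            hits += nx.has_path(h, origin, dest)
        return hits / n_samples

    # Two parallel two-edge paths, each edge failing with probability 0.1:
    # exact answer is 1 - (1 - 0.81)^2 = 0.9639.
    g = nx.DiGraph([("O", "a"), ("a", "D"), ("O", "b"), ("b", "D")])
    print(connectivity_reliability(g, {e: 0.1 for e in g.edges}, "O", "D"))
    ```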

  19. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images. It is also to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; quantification-biasing phenomena; 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods, results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET); 3 - Synthesis: accessible efficiency, know-how, precautions, beyond the activity measurement

  20. Direct quantification of fatty acids in wet microalgal and yeast biomass via a rapid in situ fatty acid methyl ester derivatization approach.

    Science.gov (United States)

    Dong, Tao; Yu, Liang; Gao, Difeng; Yu, Xiaochen; Miao, Chao; Zheng, Yubin; Lian, Jieni; Li, Tingting; Chen, Shulin

    2015-12-01

    Accurate determination of fatty acid contents is routinely required in microalgal and yeast biofuel studies. A method of rapid in situ fatty acid methyl ester (FAME) derivatization directly from wet fresh microalgal and yeast biomass was developed in this study. This method does not require prior solvent extraction or dehydration. FAMEs were prepared with a sequential alkaline hydrolysis (15 min at 85 °C) and acidic esterification (15 min at 85 °C) process. The resulting FAMEs were extracted into n-hexane and analyzed using gas chromatography. The effects of each processing parameter (temperature, reaction time, and water content) upon the lipids quantification in the alkaline hydrolysis step were evaluated with a full factorial design. This method could tolerate water content up to 20% (v/v) in total reaction volume, which equaled up to 1.2 mL of water in biomass slurry (with 0.05-25 mg of fatty acid). There were no significant differences in FAME quantification (p>0.05) between the standard AOAC 991.39 method and the proposed wet in situ FAME preparation method. This fatty acid quantification method is applicable to fresh wet biomass of a wide range of microalgae and yeast species.

  1. Children's Physical Activity While Gardening: Development of a Valid and Reliable Direct Observation Tool.

    Science.gov (United States)

    Myers, Beth M; Wells, Nancy M

    2015-04-01

    Gardens are a promising intervention to promote physical activity (PA) and foster health. However, because of the unique characteristics of gardening, no extant tool can capture the PA, postures, and motions that take place in a garden. The Physical Activity Research and Assessment tool for Garden Observation (PARAGON) was developed to assess children's PA levels, garden tasks, postures, motions, associations, and interactions while gardening. PARAGON uses momentary time sampling in which a trained observer watches a focal child for 15 seconds and then records behavior for 15 seconds. Sixty-five children (38 girls, 27 boys) at 4 elementary schools in New York State were observed over 8 days. During the observation, children simultaneously wore Actigraph GT3X+ accelerometers. The overall interrater reliability was 88% agreement, and Ebel was .97. Percent agreement values for activity level (93%), garden tasks (93%), motions (80%), associations (95%), and interactions (91%) also met acceptable criteria. Validity was established by previously validated PA codes and by expected convergent validity with accelerometry. PARAGON is a valid and reliable observation tool for assessing children's PA in the context of gardening.
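
    Interrater figures like the 88% agreement above come from pairwise comparison of the two observers' interval codes. Below is a sketch of percent agreement plus Cohen's kappa on toy data; note that the Ebel coefficient reported in the abstract is a different, ANOVA-based statistic and is not shown here.

    ```python
    from collections import Counter

    # Interrater percent agreement and Cohen's kappa for paired interval
    # codes; the data are invented, not the PARAGON validation set.

    def agreement_and_kappa(rater_a, rater_b):
        n = len(rater_a)
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        ca, cb = Counter(rater_a), Counter(rater_b)
        p_e = sum(ca[k] * cb[k] for k in ca) / n**2   # chance agreement
        return p_o, (p_o - p_e) / (1 - p_e)

    a = ["sit", "stand", "squat", "walk", "stand", "sit"]
    b = ["sit", "stand", "squat", "stand", "stand", "sit"]
    print(agreement_and_kappa(a, b))  # ~0.83 agreement, kappa ~0.76
    ```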

  2. Improved quantification of farnesene during microbial production from Saccharomyces cerevisiae in two-liquid-phase fermentations

    DEFF Research Database (Denmark)

    Tippmann, Stefan; Nielsen, Jens; Khoomrung, Sakda

    2016-01-01

    Organic solvents are widely used in microbial fermentations to reduce gas stripping effects and capture hydrophobic or toxic compounds. Reliable quantification of biochemical products in these overlays is highly challenging and practically difficult. Here, we present a significant improvement ... carryover could be minimized. Direct quantification of farnesene in dodecane was achieved by GC-FID, whereas GC-MS proved to be an excellent technique for identification of known and unknown metabolites. GC-FID is a suitable technique for direct quantification of farnesene in complex matrices...

  3. Reliable and cost effective design of intermetallic Ni2Si nanowires and direct characterization of its mechanical properties

    OpenAIRE

    Seung Zeon Han; Joonhee Kang; Sung-Dae Kim; Si-Young Choi; Hyung Giun Kim; Jehyun Lee; Kwangho Kim; Sung Hwan Lim; Byungchan Han

    2015-01-01

    We report that a single crystal Ni2Si nanowire (NW) of intermetallic compound can be reliably designed using simple three-step processes: casting a ternary Cu-Ni-Si alloy, nucleate and growth of Ni2Si NWs as embedded in the alloy matrix via designing discontinuous precipitation (DP) of Ni2Si nanoparticles and thermal aging, and finally chemical etching to decouple the Ni2Si NWs from the alloy matrix. By direct application of uniaxial tensile tests to the Ni2Si NW we characterize its mechanical properties...

  4. Reliability of pregnancy diagnosis in sows by direct radioimmunoassay of estrone sulfate

    International Nuclear Information System (INIS)

    Kalab, P.; Hajek, J.

    1988-01-01

    Pregnancy diagnosis in sows using direct radioimmunoassay of estrone sulfate in the blood serum without sample extraction is described. It was found that for pregnancy diagnosis the period between days 22 and 30 of pregnancy can be used, since in this period the estrone sulfate concentrations in all pregnant sows markedly exceeded those of 64 non-pregnant animals. The estrone sulfate estimation cannot be used for pregnancy diagnosis before day 22 or between days 30 and 40, because the estrone sulfate concentrations in most samples collected in these periods were lower than 4 nmol/l. (author). 1 fig., 6 refs
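
    The decision rule described here reduces to a cutoff applied inside a fixed gestational window. A toy encoding follows; the 4 nmol/l cutoff and the day 22-30 window come from the abstract, everything else is invented.

    ```python
    # Toy encoding of the decision rule described in the abstract.

    def pregnancy_call(day: int, estrone_sulfate_nmol_l: float):
        if not 22 <= day <= 30:
            return None                      # assay not diagnostic outside window
        return estrone_sulfate_nmol_l > 4.0  # True -> pregnant

    print(pregnancy_call(25, 6.2), pregnancy_call(35, 6.2))  # True None
    ```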

  5. Blinded evaluation of interrater reliability of an operative competency assessment tool for direct laryngoscopy and rigid bronchoscopy.

    Science.gov (United States)

    Ishman, Stacey L; Benke, James R; Johnson, Kaalan Erik; Zur, Karen B; Jacobs, Ian N; Thorne, Marc C; Brown, David J; Lin, Sandra Y; Bhatti, Nasir; Deutsch, Ellen S

    2012-10-01

    OBJECTIVES To confirm interrater reliability using blinded evaluation of a skills-assessment instrument to assess the surgical performance of resident and fellow trainees performing pediatric direct laryngoscopy and rigid bronchoscopy in simulated models. DESIGN Prospective, paired, blinded observational validation study. SUBJECTS Paired observers from multiple institutions simultaneously evaluated residents and fellows who were performing surgery in an animal laboratory or using high-fidelity manikins. The evaluators had no previous affiliation with the residents and fellows and did not know their year of training. INTERVENTIONS One- and 2-page versions of an objective structured assessment of technical skills (OSATS) assessment instrument composed of global and task-specific surgical items were used to evaluate surgical performance. RESULTS Fifty-two evaluations were completed by 17 attending evaluators. The instrument agreement for the 2-page assessment was 71.4% when measured as a binary variable (ie, competent vs not competent) (κ = 0.38; P = .08). Evaluation as a continuous variable revealed a 42.9% percentage agreement (κ = 0.18; P = .14). The intraclass correlation was 0.53, considered substantial/good interrater reliability (69% reliable). For the 1-page instrument, agreement was 77.4% when measured as a binary variable (κ = 0.53, P = .0015). Agreement when evaluated as a continuous measure was 71.0% (κ = 0.54, P < ...) ... formative feedback on operational competency.

  6. Intra-Subject Consistency and Reliability of Response Following 2 mA Transcranial Direct Current Stimulation.

    Science.gov (United States)

    Dyke, Katherine; Kim, Soyoung; Jackson, Georgina M; Jackson, Stephen R

    Transcranial direct current stimulation (tDCS) is a popular non-invasive brain stimulation technique that has been shown to influence cortical excitability. While polarity-specific effects have often been reported, this is not always the case, and variability in both the magnitude and direction of the effects has been observed. We aimed to explore the consistency and reliability of the effects of tDCS by investigating changes in cortical excitability across multiple testing sessions in the same individuals. A within-subjects design was used to investigate the effects of anodal and cathodal tDCS applied to the motor cortex. Four experimental sessions were tested for each polarity in addition to two sham sessions. Transcranial magnetic stimulation (TMS) was used to measure cortical excitability (TMS recruitment curves). Changes in excitability were measured by comparing baseline measures and those taken immediately following 20 minutes of 2 mA stimulation or sham stimulation. Anodal tDCS significantly increased cortical excitability at a group level, whereas cathodal tDCS failed to have any significant effects. The sham condition also failed to show any significant changes. Analysis of intra-subject responses to anodal stimulation across four sessions suggests that the amount of change in excitability across sessions was only weakly associated, and was found to have poor reliability across sessions (ICC = 0.276). The effects of cathodal stimulation show even poorer reliability across sessions (ICC = 0.137). In contrast, ICC analysis for the two sessions of sham stimulation reflects a moderate level of reliability (ICC = .424). Our findings indicate that although 2 mA anodal tDCS is effective at increasing cortical excitability at group level, the effects are unreliable across repeated testing sessions within individual participants. Our results suggest that 2 mA cathodal tDCS does not significantly alter cortical excitability immediately following

  7. Validity and Reliability of 10-Hz Global Positioning System to Assess In-line Movement and Change of Direction.

    Science.gov (United States)

    Nikolaidis, Pantelis T; Clemente, Filipe M; van der Linden, Cornelis M I; Rosemann, Thomas; Knechtle, Beat

    2018-01-01

    The objectives of the present study were to examine the validity and reliability of the 10 Hz Johan GPS unit in assessing in-line movement and change of direction. The validity was tested against the criterion measure of 200 m track-and-field (track-and-field athletes, n = 8) and 20 m shuttle run endurance test (female soccer players, n = 20). Intra-unit and inter-unit reliability was tested by intra-class correlation coefficient (ICC) and coefficient of variation (CV), respectively. An analysis of variance examined differences between the GPS measurement and five laps of 200 m at 15 km/h, and a t-test examined differences between the GPS measurement and the 20 m shuttle run endurance test. The difference between the GPS measurement and 200 m distance ranged from -0.13 ± 3.94 m (95% CI -3.42; 3.17) in the first lap to 2.13 ± 2.64 m (95% CI -0.08; 4.33) in the fifth lap. A good intra-unit reliability was observed in 200 m (ICC = 0.833, 95% CI 0.535; 0.962). Inter-unit CV ranged from 1.31% (fifth lap) to 2.20% (third lap). The difference between the GPS measurement and 20 m shuttle run endurance test ranged from 0.33 ± 4.16 m (95% CI -10.01; 10.68) at 11.5 km/h to 9.00 ± 5.30 m (95% CI 6.44; 11.56) at 8.0 km/h. A moderate intra-unit reliability was shown in the second and third stage of the 20 m shuttle run endurance test (ICC = 0.718, 95% CI 0.222; 0.898) and good reliability in the fifth, sixth, seventh and eighth (ICC = 0.831, 95% CI -0.229; 0.996). Inter-unit CV ranged from 2.08% (11.5 km/h) to 3.92% (8.5 km/h). Based on these findings, it was concluded that the 10 Hz Johan system offers an affordable, valid and reliable tool for coaches and fitness trainers to monitor training and performance.

  9. Pore sub-features reproducibility in direct microscopic and Livescan images--their reliability in personal identification.

    Science.gov (United States)

    Gupta, Abhishek; Sutton, Raul

    2010-07-01

    Third-level features have been reported to have equal discriminatory power as second-level details in establishing personal identification. Pore area, as an extended-set third-level sub-feature, has been studied by minimizing possible factors that could affect pore size. The reproducibility of pore surface area has been studied using direct microscopic and 500 ppi Livescan images. Direct microscopic pore area measurements indicated that the day on which the pore area was measured had a significant impact on the measured pore area. Pore area was shown to be difficult to estimate in 500 ppi Livescan measurements owing to lack of resolution. It is not possible to reliably use pore area as an identifying feature in fingerprint examination.

  10. Improving the accuracy and reliability of MWD/magnetic-Wellbore-Directional surveying in the barents sea

    DEFF Research Database (Denmark)

    Edvardsen, I.; Nyrnes, E.; Johnsen, M. G.

    2014-01-01

    ... of nonmagnetic steel in the bottomhole assembly (BHA). To maintain azimuth uncertainty at an acceptable level in northern areas, it is crucial that wellbore-directional-surveying requirements are given high priority and considered early during well planning. During the development phase of an oil and gas field ... magnetic-reference stations. The different land and sea configuration, distant offshore oil and gas fields, higher geomagnetic latitude, and different behavior of the magnetic field require the procedures to be reassessed before being applied to the Barents Sea. To reduce drilling delays, procedures must ... be implemented to enable efficient management of magnetic disturbances. In some areas of the Barents Sea, the management requires new equipment to be developed and tested before drilling, such as seabed magnetometer stations. One simple way to reduce drillstring interference is increasing the amount...

  11. Reliability of an analysis method for measuring diaphragm excursion by means of direct visualization with videofluoroscopy.

    Science.gov (United States)

    Yi, Liu C; Nascimento, Oliver A; Jardim, José R

    2011-06-01

    The purpose of this study was to verify the reproducibility between two different observers of an analysis method for diaphragmatic displacement measurements using direct visualization with videofluoroscopy. 29 mouth-breathing children aged 5 to 12 years of both genders were analyzed. The diaphragmatic displacement evaluation was divided into three parts: videofluoroscopy with VHS recording in standing, sitting, and dorsal positions; digitalization of the images; and measurement of the distance between diaphragmatic domes during a breathing cycle using Adobe Photoshop 5.5 and Adobe Premiere PRO 6.5 software. The intraclass correlation coefficients presented excellent reproducibility in all positions, with coefficients always above 0.94. Means of the diaphragmatic dome displacement measurements obtained by the two observers were similar (P < ...) ... healthcare professionals. Copyright © 2010 SEPAR. Published by Elsevier Espana. All rights reserved.

  12. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although ever since the inception of quantum mechanics a century ago, superposition has occupied a central and pivotal place, rigorous and systematic studies of the quantification issue have attracted significant interests only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182
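
    The abstract does not reproduce the paper's figure of merit. For orientation only, one widely used quantifier from the coherence literature that such measures are typically connected to is the l1-norm of coherence; the block below is an assumed illustration, not necessarily the paper's definition.

    ```latex
    % l1-norm of coherence (Baumgratz et al.), shown for orientation only;
    % the paper's own figure of merit may differ.
    C_{\ell_1}(\rho) = \sum_{i \neq j} \lvert \rho_{ij} \rvert ,
    \qquad \rho = \sum_{i,j} \rho_{ij}\, \lvert i \rangle\langle j \rvert .
    ```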

  13. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and finally the accident sequence quantification. In the PSA, the accident sequence quantification calculates the core damage frequency and performs importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and an efficient computer code, because the computation takes a long time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform accident sequence quantification with KIRAP. (author). 6 refs
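
    The quantification step ultimately reduces minimal cut sets to a top-event frequency. Below is a toy version of that arithmetic using the common min-cut upper bound; this is the generic PSA formula, not KIRAP's implementation, and the basic events and probabilities are invented.

    ```python
    from functools import reduce

    # Toy minimal-cut-set quantification via the min-cut upper bound:
    # P(top) <= 1 - prod_i (1 - P(cut set i)).

    p = {"DG-A": 1e-2, "DG-B": 1e-2, "PUMP-1": 5e-3, "CCF-DG": 1e-4}
    cut_sets = [{"DG-A", "DG-B"}, {"CCF-DG"}, {"DG-A", "PUMP-1"}]

    def mcub(cut_sets, p):
        q = [reduce(lambda x, e: x * p[e], cs, 1.0) for cs in cut_sets]
        return 1.0 - reduce(lambda x, qi: x * (1.0 - qi), q, 1.0)

    print(mcub(cut_sets, p))  # ~2.5e-4
    ```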

  15. Direct nuclear magnetic resonance identification and quantification of geometric isomers of conjugated linoleic acid in milk lipid fraction without derivatization steps: Overcoming sensitivity and resolution barriers

    International Nuclear Information System (INIS)

    Tsiafoulis, Constantinos G.; Skarlas, Theodore; Tzamaloukas, Ouranios; Miltiadou, Despoina; Gerothanassis, Ioannis P.

    2014-01-01

    Highlights: • The first NMR quantification of four geometric 18:2 CLA isomers has been achieved. • Sensitivity and resolution NMR barriers have been overcome. • Selective suppression and a reduced 13C spectral width have been utilized. • The method is applied to the milk lipid fraction without derivatization steps. • The method is selective and sensitive, with very good analytical characteristics. - Abstract: We report the first successful direct and unequivocal identification and quantification of four minor geometric (9-cis, 11-trans) 18:2, (9-trans, 11-cis) 18:2, (9-cis, 11-cis) 18:2 and (9-trans, 11-trans) 18:2 conjugated linoleic acid (CLA) isomers in lipid fractions of lyophilized milk samples with the combined use of 1D 1H-NMR, 2D 1H-1H TOCSY and 2D 1H-13C HSQC NMR. The significant sensitivity barrier has been successfully overcome by selective suppression of the major resonances, the equilibrium magnetization of the -(CH2)n- 1H spins being over 10^4 times greater than that of the 1H spins of the conjugated bonds of the CLA isomers. The resolution has been significantly increased using a reduced 13C spectral width in the 2D 1H-13C HSQC experiment. The assignment was confirmed by spiking experiments with CLA standard compounds, and the method does not require any derivatization steps for the lipid fraction. The proposed method is selective and sensitive, and compares favorably with the GC-MS method of analysis

  16. A risk-informed approach of quantification of epistemic uncertainty for the long-term radioactive waste disposal. Improving reliability of expert judgements with an advanced elicitation procedure

    International Nuclear Information System (INIS)

    Sugiyama, Daisuke; Chida, Taiji; Fujita, Tomonari; Tsukamoto, Masaki

    2011-01-01

    A quantification methodology of epistemic uncertainty by expert judgement based on the risk-informed approach is developed to assess the inevitable uncertainty in the long-term safety assessment of radioactive waste disposal. The method proposed in this study employs techniques of logic trees, by which options of models and/or scenarios are identified, and Evidential Support Logic (ESL), by which the possibility of each option is quantified. In this report, the effect of a feedback process of discussion between experts and input of state-of-the-art knowledge in the proposed method is discussed, to estimate alteration of the distribution of expert judgements, which is one of the factors causing uncertainty. In a preliminary quantification experiment on the uncertainty of degradation of the engineered barrier materials in a tentative sub-surface disposal using the proposed methodology, experts themselves modified questions appropriately to facilitate sound judgements and to correlate them clearly with scientific evidence. The result suggests that the method effectively improves the confidence of expert judgement. Also, the degree of consensus of expert judgement was somewhat improved in some cases, since scientific knowledge and expert judgements in other fields became common understanding. It is suggested that the proposed method could facilitate consensus on uncertainty between interested persons. (author)

  17. Direct-Imaging-Based Quantification of Bacillus cereus ATCC 14579 Population Heterogeneity at a Low Incubation Temperature

    NARCIS (Netherlands)

    Besten, den H.M.W.; Garcia, D.; Moezelaar, R.; Zwietering, M.H.; Abee, T.

    2010-01-01

    Bacillus cereus ATCC 14579 was cultured in microcolonies on Anopore strips near its minimum growth temperature to directly image and quantify its population heterogeneity at an abusive refrigeration temperature. Eleven percent of the microcolonies failed to grow during low-temperature incubation,

  18. Removal of trichloroethylene DNAPL trapped in porous media using nanoscale zerovalent iron and bimetallic nanoparticles: Direct observation and quantification

    International Nuclear Information System (INIS)

    Wang, Qiliang; Jeong, Seung-Woo; Choi, Heechul

    2012-01-01

    Highlights: ► TCE DNAPL removal inside pores using NZVI or bimetals in a 2-D system was visualized. ► Presence of nitrate and humic substances decreases the TCE DNAPL removal efficiency. ► Presence of ethanol increases the TCE DNAPL removal efficiency. ► Metal catalysts enhance the TCE DNAPL removal using NZVI in a short-term reaction. ► Metal catalysts do not increase the DNAPL removal efficiency for a long-term reaction. - Abstract: Direct trichloroethylene (TCE) dense non-aqueous phase liquid (DNAPL) removal inside pore areas using nanoscale zerovalent iron (NZVI) and bimetallic nanoparticles was first investigated in a water-saturated porous glass micromodel. Effects of nitrate, aqueous ethanol co-solvent, humic substance, and elapsed time on TCE DNAPL removal using NZVI were studied by direct visualization. The removal efficiency was then quantified by directly measuring the remaining TCE DNAPL blob area using an image analyzer. As the ethanol content of the co-solvent increased, TCE DNAPL removal by NZVI also increased, implying sequential TCE DNAPL removal mechanisms: as dissolved TCE was degraded by NZVI, TCE dissolution from TCE blobs would be facilitated and the TCE blob areas would eventually be reduced. The presence of nitrate and humic substance hindered the NZVI reactivity for the TCE DNAPL removal. In contrast, the TCE DNAPL removal efficiency was enhanced using bimetallic nanoparticles in a short-term reaction by generating atomic hydrogen for catalytic hydro-dechlorination. However, all TCE DNAPL removal efficiencies reached the same level after long-term reaction using both NZVI and bimetallic nanoparticles. Direct TCE DNAPL observation clearly implied that TCE blobs persisted for a long time even though all TCE blobs were fully exposed to NZVI and bimetallic nanoparticles.
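
    The image-analysis quantification described here is a ratio of thresholded blob areas before and after treatment. A minimal sketch with synthetic images follows; the threshold and data are invented.

    ```python
    import numpy as np

    # Removal efficiency from the remaining blob area in thresholded
    # before/after micromodel images; synthetic data only.

    def removal_efficiency(img_before, img_after, threshold=0.5):
        area0 = (np.asarray(img_before) > threshold).sum()
        area1 = (np.asarray(img_after) > threshold).sum()
        return 100.0 * (1.0 - area1 / area0)

    before = np.random.default_rng(1).random((200, 200))
    after = np.where(before > 0.8, before, 0.0)  # pretend most blobs dissolved
    print(removal_efficiency(before, after))     # ~60%
    ```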

  20. Quantification of biofilm in microtiter plates: overview of testing conditions and practical recommendations for assessment of biofilm production by staphylococci.

    Science.gov (United States)

    Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip

    2007-08-01

    The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on assessment of biofilm production by staphylococci, gained both by direct experience as well as by analysis of methods for assaying biofilm production. The obtained results should simplify quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.
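
    As a worked illustration of the kind of optical-density cutoff scheme this protocol popularized, the sketch below assumes the commonly cited rule ODc = mean(negative control) + 3 SD with four production categories; consult the paper itself for the authoritative details.

    ```python
    import statistics

    # Assumed cutoff scheme: ODc = mean(negative control) + 3*SD, then
    # four biofilm-production categories (non/weak/moderate/strong).

    def biofilm_category(od: float, negative_controls: list) -> str:
        odc = (statistics.mean(negative_controls)
               + 3 * statistics.stdev(negative_controls))
        if od <= odc:
            return "non-producer"
        if od <= 2 * odc:
            return "weak"
        if od <= 4 * odc:
            return "moderate"
        return "strong"

    print(biofilm_category(0.91, [0.08, 0.09, 0.10, 0.09]))  # strong
    ```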

  1. The test-retest reliability of anatomical co-ordinate axes definition for the quantification of lower extremity kinematics during running.

    Science.gov (United States)

    Sinclair, Jonathan; Taylor, Paul John; Greenhalgh, Andrew; Edmundson, Christopher James; Brooks, Darrell; Hobbs, Sarah Jane

    2012-12-01

    Three-dimensional (3-D) kinematic analyses are used widely in both sport and clinical examinations. However, this procedure depends on reliable palpation of anatomical landmarks, and mal-positioning of markers between sessions may result in improperly defined segment co-ordinate system axes, which will produce inconsistent joint rotations. This has led some to question the efficacy of this technique. The aim of the current investigation was to assess the reliability of the anatomical frame definition when quantifying 3-D kinematics of the lower extremities during running. Ten participants completed five successful running trials at 4.0 m·s(-1) ± 5%. 3-D angular joint kinematic parameters from the hip, knee and ankle were collected using an eight-camera motion analysis system. Two static calibration trials were captured. The first (test) was conducted prior to the running trials, following which anatomical landmarks were removed. The second was obtained following completion of the running trials, when anatomical landmarks were re-positioned (retest). Paired samples t-tests were used to compare 3-D kinematic parameters quantified using the two static trials, and intraclass correlations were employed to examine the similarities between the sagittal, coronal and transverse plane waveforms. The results indicate that no significant (p>0.05) differences were found between test and retest 3-D kinematic parameters and strong (R(2)≥0.87) correlations were observed between test and retest waveforms. Based on the results obtained from this investigation, it appears that the anatomical co-ordinate axes of the lower extremities can be defined reliably, thus confirming the efficacy of studies using this technique.

  2. Reliable and cost effective design of intermetallic Ni2Si nanowires and direct characterization of its mechanical properties.

    Science.gov (United States)

    Han, Seung Zeon; Kang, Joonhee; Kim, Sung-Dae; Choi, Si-Young; Kim, Hyung Giun; Lee, Jehyun; Kim, Kwangho; Lim, Sung Hwan; Han, Byungchan

    2015-10-12

    We report that a single-crystal Ni2Si nanowire (NW) of intermetallic compound can be reliably designed using simple three-step processes: casting a ternary Cu-Ni-Si alloy; nucleation and growth of Ni2Si NWs embedded in the alloy matrix via designed discontinuous precipitation (DP) of Ni2Si nanoparticles and thermal aging; and finally chemical etching to decouple the Ni2Si NWs from the alloy matrix. By direct application of uniaxial tensile tests to the Ni2Si NW we characterize its mechanical properties, which were rarely reported in the previous literature. Using integrated studies of first-principles density functional theory (DFT) calculations, high-resolution transmission electron microscopy (HRTEM), and energy-dispersive X-ray spectroscopy (EDX), we accurately validate the experimental measurements. Our results indicate that this simple three-step method enables the design of brittle Ni2Si NWs with a high tensile strength of 3.0 GPa and an elastic modulus of 60.6 GPa. We propose that the systematic methodology pursued in this paper contributes significantly to opening innovative processes for the design of various kinds of low-dimensional nanomaterials, advancing the frontiers of nanotechnology and related industry sectors.

  3. Local Directional Probability Optimization for Quantification of Blurred Gray/White Matter Junction in Magnetic Resonance Image

    Directory of Open Access Journals (Sweden)

    Xiaoxia Qu

    2017-09-01

    Full Text Available The blurred gray/white matter junction is an important feature of focal cortical dysplasia (FCD) lesions. FCD is a main cause of epilepsy and can be detected through magnetic resonance (MR) imaging. Several earlier studies have focused on computing the gradient magnitude of the MR image and used the resulting map to model the blurred gray/white matter junction. However, gradient magnitude cannot quantify the blurred gray/white matter junction. Therefore, we proposed a novel algorithm called local directional probability optimization (LDPO) for detecting and quantifying the width of the gray/white matter boundary (GWB) within the lesional areas. The proposed LDPO method mainly consists of the following three stages: (1) introduction of a hidden Markov random field-expectation-maximization algorithm to compute the probability images of brain tissues in order to obtain the GWB region; (2) generation of local directions from gray matter (GM) to white matter (WM) passing through the GWB, considering the GWB to be an electric potential field; (3) determination of the optimal local directions for any given voxel of the GWB, based on iterative searching of the neighborhood. This was then used to measure the width of the GWB. The proposed LDPO method was tested on real MR images of patients with FCD lesions. The results indicated that the LDPO method could quantify the GWB width. On the GWB width map, the width of the blurred GWB in the lesional region was observed to be greater than that in the non-lesional regions. The proposed GWB width map produced higher F-scores in terms of detecting the blurred GWB within the FCD lesional region as compared to those of FCD feature maps, indicating a better trade-off between precision and recall.

  4. Quantification of Endogenous Cholesterol in Human Serum on Paper Using Direct Analysis in Real Time Mass Spectrometry.

    Science.gov (United States)

    Hsieh, Hua-Yi; Li, Li-Hua; Hsu, Ren-Yu; Kao, Wei-Fong; Huang, Ying-Chen; Hsu, Cheng-Chih

    2017-06-06

    Blood testing for endogenous small metabolites to determine physiological and biochemical states is routine for laboratory analysis. Here we demonstrate that by combining the commercial direct analysis in real time (DART) ion source with an ion trap mass spectrometer, native cholesterol in its free alcohol form is readily detected from a few hundred nanoliters of human serum loaded onto chromatography paper. Deuterium-labeled cholesterol was used as the internal standard to obtain the absolute quantity of the endogenous cholesterol. The amount of the cholesterol measured by this paper-loaded DART mass spectrometry (pDART-MS) is statistically comparable with that obtained by using commercially available fluorometric-enzymatic assay and liquid chromatography/mass spectrometry. Furthermore, sera from 21 participants at three different time points in an ultramarathon were collected to obtain their cholesterol levels. The test requires only very minimal sample preparation, and the concentrations of cholesterol in each sample were acquired within a minute.
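
    Quantification against the deuterium-labeled internal standard is simple ratio arithmetic, assuming a response factor near unity for the labeled analogue. A sketch with invented intensities:

    ```python
    # Internal-standard quantification as used with the deuterated cholesterol;
    # assumes a response factor of ~1 for the isotope-labeled standard.

    def analyte_conc(intensity_analyte: float, intensity_is: float,
                     conc_is: float, response_factor: float = 1.0) -> float:
        return (intensity_analyte / intensity_is) * conc_is / response_factor

    # Analyte peak 2.4x the deuterated standard spiked at 80 mg/dL -> 192 mg/dL
    print(analyte_conc(2.4e6, 1.0e6, 80.0))
    ```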

  5. Quantification of intervertebral displacement with a novel MRI-based modeling technique: Assessing measurement bias and reliability with a porcine spine model.

    Science.gov (United States)

    Mahato, Niladri K; Montuelle, Stephane; Goubeaux, Craig; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian C

    2017-05-01

    The purpose of this study was to develop a novel magnetic resonance imaging (MRI)-based modeling technique for measuring intervertebral displacements. Here, we present the measurement bias and reliability of the developmental work using a porcine spine model. Porcine lumbar vertebral segments were fitted in a custom-built apparatus placed within an externally calibrated imaging volume of an open-MRI scanner. The apparatus allowed movement of the vertebrae through pre-assigned magnitudes of sagittal and coronal translation and rotation. The induced displacements were imaged with static (T 1 ) and fast dynamic (2D HYCE S) pulse sequences. These images were imported into animation software, in which these images formed a background 'scene'. Three-dimensional models of vertebrae were created using static axial scans from the specimen and then transferred into the animation environment. In the animation environment, the user manually moved the models (rotoscoping) to perform model-to-'scene' matching to fit the models to their image silhouettes and assigned anatomical joint axes to the motion-segments. The animation protocol quantified the experimental translation and rotation displacements between the vertebral models. Accuracy of the technique was calculated as 'bias' using a linear mixed effects model, average percentage error and root mean square errors. Between-session reliability was examined by computing intra-class correlation coefficients (ICC) and the coefficient of variations (CV). For translation trials, a constant bias (β 0 ) of 0.35 (±0.11) mm was detected for the 2D HYCE S sequence (p=0.01). The model did not demonstrate significant additional bias with each mm increase in experimental translation (β 1 Displacement=0.01mm; p=0.69). Using the T 1 sequence for the same assessments did not significantly change the bias (p>0.05). ICC values for the T 1 and 2D HYCE S pulse sequences were 0.98 and 0.97, respectively. For rotation trials, a constant bias (

  6. Advantageous direct quantification of viable closely related probiotics in petit-suisse cheeses under in vitro gastrointestinal conditions by Propidium Monoazide--qPCR.

    Directory of Open Access Journals (Sweden)

    Martha Lissete Morales Villarreal

    Full Text Available Species-specific quantitative real-time PCR (qPCR), alone and combined with the use of propidium monoazide (PMA), was used along with the plate count method to evaluate the survival of the probiotic strains Lactobacillus acidophilus La-5 and Bifidobacterium animalis subsp. lactis Bb-12, and the bacteriocinogenic and potentially probiotic strain Lactobacillus sakei subsp. sakei 2a in synbiotic (F1) and probiotic (F2) petit-suisse cheeses exposed throughout shelf-life to in vitro simulated gastrointestinal tract conditions. The three strains studied showed a reduction in their viability after the 6 h assay. Bb-12 displayed the highest survival capacity, above 72.6 and 74.6% of the initial populations, respectively, by plate count and PMA-qPCR, maintaining population levels in the range of or above 6 log CFU/g. The prebiotic mix of inulin and FOS did not offer any additional protection for the strains against the simulated gastrointestinal environment. The microorganisms' populations were comparable among the three methods at the initial time of the assay, confirming the presence of mainly viable and culturable cells. However, with the intensification of the stress induced throughout the various stages of the in vitro test, the differences among the methods increased. The qPCR was not a reliable enumeration method for the quantification of intact bacterial populations mixed with large numbers of injured and dead bacteria, as confirmed by the scanning electron microscopy results. Furthermore, bacterial plate counts were much lower (P<0.05) than with the PMA-qPCR method, suggesting the accumulation of stressed or dead microorganisms unable to form colonies. The use of PMA overcame the qPCR inability to differentiate between dead and live cells. The combination of PMA and species-specific qPCR in this study allowed a quick and unequivocal way of enumeration of viable closely related species incorporated into probiotic and synbiotic petit-suisse cheeses and
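
    qPCR enumeration of this kind rests on standard-curve arithmetic: cycle quantification (Cq) values are converted to copy numbers through the calibration curve's slope and intercept. A sketch with invented calibration values, not the assay parameters used in the study:

    ```python
    # Standard-curve arithmetic behind qPCR enumeration; slope and intercept
    # are invented example values for a hypothetical species-specific assay.

    def copies_from_cq(cq: float, slope: float = -3.32, intercept: float = 38.0):
        """log10(copies) = (Cq - intercept) / slope."""
        return 10 ** ((cq - intercept) / slope)

    def efficiency(slope: float) -> float:
        return 10 ** (-1.0 / slope) - 1.0  # a slope of -3.32 -> ~100%

    print(copies_from_cq(24.7), efficiency(-3.32))  # ~1e4 copies, ~1.0
    ```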

  7. Direct Quantification of Ice Nucleation Active Bacteria in Aerosols and Precipitation: Their Potential Contribution as Ice Nuclei

    Science.gov (United States)

    Hill, T. C.; DeMott, P. J.; Garcia, E.; Moffett, B. F.; Prenni, A. J.; Kreidenweis, S. M.; Franc, G. D.

    2013-12-01

    Ice nucleation active (INA) bacteria are a potentially prodigious source of highly active (≥-12°C) atmospheric ice nuclei, especially from agricultural land. However, we know little about the conditions that promote their release (e.g., daily or seasonal cycles, precipitation, harvesting or post-harvest decay of litter) or their typical contribution to the pool of boundary layer ice nucleating particles (INP). To initiate these investigations we developed a quantitative polymerase chain reaction (qPCR) test of the ina gene, the gene that codes for the ice nucleating protein, to directly count INA bacteria in environmental samples. The qPCR test amplifies most forms of the gene and is highly sensitive, able to detect perhaps a single gene copy (i.e., a single bacterium) in DNA extracted from precipitation. Direct measurement of the INA bacteria is essential because environmental populations will be a mixture of living, viable-but-not-culturable, moribund and dead cells, all of which may retain ice nucleating proteins. Using the qPCR test on leaf washings of plants from three farms in Wyoming, Colorado and Nebraska we found INA bacteria to be abundant on crops, especially on cereals. Mid-summer populations on wheat and barley were ~10^8/g fresh weight of foliage. Broadleaf crops, such as corn, alfalfa, sugar beet and potato, supported 10^5-10^7/g. Unexpectedly, however, in the absence of a significant physical disturbance, such as harvesting, we were unable to detect the ina gene in aerosols sampled above the crops. Likewise, in fresh snow samples taken over two winters, ina genes from a range of INA bacteria were detected in about half the samples, but at abundances that equated to INA bacterial numbers accounting for only a minor proportion of INP active at -10°C. By contrast, in a hail sample from a summer thunderstorm we found 0.3 INA bacteria per INP at -10°C and ~0.5 per hail stone. Although the role of the INA bacteria as warm-temperature INP in these samples

  8. Windowed direct exponential curve resolution quantification of nuclear magnetic resonance spectroscopy with applications to amniotic fluid metabonomics

    International Nuclear Information System (INIS)

    Botros, L.L.

    2007-01-01

    This thesis presents a quantitative protocol of proton nuclear magnetic resonance (¹H NMR) that allows the determination of human amniotic fluid metabolite concentrations, which are then used in a metabonomic study to establish patient health during gestation. ¹H NMR free induction decays (FIDs) of 258 human amniotic fluid samples from a 500 MHz spectrometer are acquired. Quantitative analysis methods in both the frequency and time domain are carried out and compared. Frequency-domain analysis is accomplished by integration of the metabolite peaks before and after the inclusion of a known standard addition of alanine. Time-domain analysis is accomplished by the direct exponential curve resolution algorithm (DECRA). Both techniques are assessed by application to calibration biological solutions and a simulated data set. The DECRA method proves to be a more accurate and precise route for quantitative analysis, and is included in the developed protocol. Well-defined peaks of various components are visible in the frequency-domain ¹H NMR spectra, including lactate, alanine, acetate, citrate, choline, glycine, and glucose. All are quantified with the proposed protocol. Statistical t-tests and notched box-and-whisker plots are used to compare mean metabolite concentrations between diabetic and normal patients. Glucose, glycine, and choline are all found to correlate with gestational diabetes mellitus early in gestation. With further development, time-domain quantitative ¹H NMR has the potential to become a robust diagnostic tool for gestational health. (author)

  9. Windowed direct exponential curve resolution quantification of nuclear magnetic resonance spectroscopy with applications to amniotic fluid metabonomics

    Energy Technology Data Exchange (ETDEWEB)

    Botros, L.L

    2007-07-01

    This thesis presents a quantitative protocol of proton nuclear magnetic resonance ({sup 1}H NMR) that allows the determination of human amniotic fluid metabolite concentrations, which are then used in a metabonomic study to establish patient health during gestation. {sup 1}H NMR free induction decays (FIDs) of 258 human amniotic fluid samples from a 500 MHz spectrometer are acquired. Quantitative analysis methods in both the frequency and time domain are carried out and compared. Frequency-domain analysis is accomplished by integration of the metabolite peaks before and after the inclusion of a known standard addition of alanine. Time-domain analysis is accomplished by the direct exponential curve resolution algorithm (DECRA). Both techniques are assessed by application to calibration biological solutions and a simulated data set. The DECRA method proves to be a more accurate and precise route for quantitative analysis, and is included in the developed protocol. Well-defined peaks of various components are visible in the frequency-domain {sup 1}H NMR spectra, including lactate, alanine, acetate, citrate, choline, glycine, and glucose. All are quantified with the proposed protocol. Statistical t-tests and notched box-and-whisker plots are used to compare mean metabolite concentrations between diabetic and normal patients. Glucose, glycine, and choline are all found to correlate with gestational diabetes mellitus early in gestation. With further development, time-domain quantitative {sup 1}H NMR has the potential to become a robust diagnostic tool for gestational health. (author)

  10. Theoretical quantification of shock-timing sensitivities for direct-drive inertial confinement fusion implosions on OMEGA

    Science.gov (United States)

    Cao, D.; Boehly, T. R.; Gregor, M. C.; Polsin, D. N.; Davis, A. K.; Radha, P. B.; Regan, S. P.; Goncharov, V. N.

    2018-05-01

    Using temporally shaped laser pulses, multiple shocks can be launched in direct-drive inertial confinement fusion implosion experiments to set the shell on a desired isentrope or adiabat. The velocity of the first shock and the times at which subsequent shocks catch up to it are measured through the velocity interferometry system for any reflector diagnostic [T. R. Boehly et al., Phys. Plasmas 18, 092706 (2011)] on OMEGA [T. R. Boehly et al., Opt. Commun. 133, 495 (1997)]. Simulations reproduce these velocity and shock-merger time measurements when using laser pulses designed for setting mid-adiabat (α ˜ 3) implosions, but agreement degrades for lower-adiabat (α ˜ 1) designs. Simulation results indicate that the shock timing discrepancy is most sensitive to details of the density and temperature profiles in the coronal plasma, which influences the laser energy coupled into the target, and only marginally sensitive to the target offset and beam power imbalance. To aid in verifying the coronal profile's influence, a new technique under development to infer coronal profiles using x-ray self-emission imaging [A. K. Davis et al., Bull. Am. Phys. Soc. 61, BAPS.2016.DPP.NO8.7 (2016)] can be applied to the pulse shapes used in shock-timing experiments.

  11. Direct quantification of PM2.5 fossil and biomass carbon within the Northern Front Range Air Quality Study's domain

    International Nuclear Information System (INIS)

    Klinedinst, D.B.; Currie, L.A.

    1999-01-01

    Radiocarbon (¹⁴C) analyses of PM2.5 (particulate matter with an aerodynamic diameter of 2.5 μm or less) of both ambient and source samples from the Northern Front Range Air Quality Study (NFRAQS) in Colorado were performed. The ¹⁴C analyses were undertaken to provide direct fossil vs. modern (biomass) carbon source discrimination data for a subset of summer and winter 1996-1997 samples collected within the Denver metropolitan area. Samples were prepared for ¹⁴C accelerator mass spectrometry measurements using techniques specially developed for small samples, i.e., <100 μg C. For the days and sampling periods analyzed, the median and interquartile range of the winter blank-corrected fraction of modern carbon was 23% (16-34%) at Welby and 27% (25-37%) at Brighton. The summer samples exhibited a more mixed signature, with a median and interquartile range of 47% (9-70%). Source samples yielded ¹⁴C signatures consistent with expectation. The authors conclude that fossil-derived sources contribute substantially in both seasons and at both locations; however, the biomass carbon component dominates episodically in the summer.
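
    The apportionment underlying these results can be sketched as a two-source radiocarbon mixing calculation: fossil carbon carries no ¹⁴C (fraction of modern, fM, equal to zero), while contemporary biomass carbon has fM near one. The biomass reference value below is an assumption chosen for illustration, not the study's calibration.

```python
# Two-source apportionment from the radiocarbon "fraction of modern" (fM).
# Fossil carbon has fM = 0; contemporary biomass carbon has fM near 1
# (slightly above 1 because of 1950s-60s bomb 14C; the exact reference
# value used here is an assumption, not taken from the study).
FM_BIOMASS_REF = 1.10    # assumed fM of pure biomass carbon
FM_FOSSIL = 0.0

def biomass_fraction(fm_sample: float) -> float:
    """Fraction of total carbon attributable to biomass sources."""
    return (fm_sample - FM_FOSSIL) / (FM_BIOMASS_REF - FM_FOSSIL)

# Example: the median winter value reported for Welby (fM = 23%)
fm = 0.23
fb = biomass_fraction(fm)
print(f"biomass: {fb:.0%}, fossil: {1 - fb:.0%}")
```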

  12. Direct and simultaneous quantification of tannin mean degree of polymerization and percentage of galloylation in grape seeds using diffuse reflectance fourier transform-infrared spectroscopy.

    Science.gov (United States)

    Pappas, Christos; Kyraleou, Maria; Voskidi, Eleni; Kotseridis, Yorgos; Taranilis, Petros A; Kallithraka, Stamatina

    2015-02-01

    The mean degree of polymerization (mDP) and percentage of galloylation (%G) of tannins in grape seeds were determined directly and simultaneously using diffuse reflectance infrared Fourier transform spectroscopy and partial least squares (PLS) regression. The results were compared with those obtained using the conventional analysis, which employs phloroglucinolysis as a pretreatment followed by high performance liquid chromatography with UV and mass spectrometry detection. Infrared spectra were recorded on solid-state samples after freeze-drying. The 2nd derivative of the 1832 to 1416 and 918 to 739 cm⁻¹ spectral regions was used for the quantification of mDP, and the 2nd derivative of the 1813 to 607 cm⁻¹ spectral region for the determination of %G, both with PLS regression. The determination coefficients (R²) of mDP and %G were 0.99 and 0.98, respectively. The corresponding root-mean-square errors of calibration were 0.506 and 0.692, the root-mean-square errors of cross validation 0.811 and 0.921, and the root-mean-square errors of prediction 0.612 and 0.801. Compared with the conventional method, the proposed method is simpler, less time-consuming, and more economical, requiring smaller quantities of chemical reagents and fewer sample pretreatment steps. It could be a starting point for the design of more specific models according to the requirements of the wineries. © 2015 Institute of Food Technologists®
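
    As a sketch of the chemometric workflow described above (second-derivative preprocessing followed by PLS regression), the example below fits a PLS model to synthetic spectra; the data, Savitzky-Golay window settings, and number of latent variables are illustrative assumptions, not the paper's calibration.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in for diffuse-reflectance FT-IR spectra:
# 60 samples x 400 wavenumber points, with mDP encoded in one band.
X = rng.normal(0, 0.01, (60, 400))
mDP = rng.uniform(2, 12, 60)
wn = np.arange(400)
X += mDP[:, None] * np.exp(-((wn - 120) ** 2) / 200)[None, :] * 0.01

# Second-derivative preprocessing (Savitzky-Golay), as in the paper.
X_d2 = savgol_filter(X, window_length=11, polyorder=3, deriv=2, axis=1)

# PLS regression; the number of latent variables is a choice that a
# real application would optimize by cross-validation.
pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, X_d2, mDP, cv=10).ravel()
r2 = 1 - np.sum((mDP - pred) ** 2) / np.sum((mDP - mDP.mean()) ** 2)
rmsecv = np.sqrt(np.mean((mDP - pred) ** 2))
print(f"R2(CV) = {r2:.3f}, RMSECV = {rmsecv:.3f}")
```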

  13. HPLC/ESI-quadrupole ion trap mass spectrometry for characterization and direct quantification of amphoteric and nonionic surfactants in aqueous samples

    Science.gov (United States)

    Levine, Lanfang H.; Garland, Jay L.; Johnson, Jodie V.

    2002-01-01

    An amphoteric (cocamidopropylbetaine, CAPB) and a nonionic (alcohol polyethoxylate, AE) surfactant were characterized by electrospray ionization quadrupole ion trap mass spectrometry (ESI-MS) as to their homologue distribution and ionization/fragmentation chemistry. Quantitative methods involving reversed-phase gradient HPLC and (+)ESI-MSⁿ were developed to directly determine these surfactants in hydroponic plant growth medium that received simulated graywater. The predominant homologues, the 12-carbon-alkyl CAPB and the 9-EO AE, were monitored to represent the total amount of the respective surfactants. The methods demonstrated dynamic linear ranges of 0.5-250 ng (r² > 0.996) for CAPB and 8-560 ng (r² > 0.998) for the AE homologue mixture, corresponding to minimum quantification limits of 25 ppb CAPB and 0.4 ppm AE with 20-μL injections. This translates into an even lower limit for individual components, owing to the polydisperse nature of the surfactants. The procedure was successfully employed for the assessment of CAPB and AE biodegradation in a hydroponic plant growth system used as a graywater bioreactor.

  14. Direct injection liquid chromatography/electrospray ionization mass spectrometric horse urine analysis for the quantification and confirmation of threshold substances for doping control. II. Determination of theobromine.

    Science.gov (United States)

    Vonaparti, A; Lyris, E; Panderi, I; Koupparis, M; Georgakopoulos, C

    2009-04-01

    In equine sport, theobromine is prohibited with a threshold level of 2 μg mL⁻¹ in urine, hence doping control laboratories have to establish quantitative and qualitative methods for its determination. Two simple liquid chromatography/mass spectrometry (LC/MS) methods for the identification and quantification of theobromine were developed and validated using the same sample preparation procedure but different mass spectrometric systems: ion trap mass spectrometry (ITMS) and time-of-flight mass spectrometry (TOFMS). Particle-free diluted urine samples were directly injected into the LC/MS systems, avoiding the time-consuming extraction step. 3-Propylxanthine was used as the internal standard. The tested linear range was 0.75-15 μg mL⁻¹. Matrix effects were evaluated by analyzing calibration curves in water and in different fortified horse urine samples. A large variation in the signal of theobromine and the internal standard was observed in different matrices. To overcome matrix effects, a standard additions calibration method was applied. The relative standard deviations of intra- and inter-day analysis were lower than 8.6 and 7.2%, respectively, for the LC/ITMS method, and lower than 5.7 and 5.8%, respectively, for the LC/TOFMS method. The bias was less than 8.7% for both methods. The methods were applied to two case samples, demonstrating simplicity, accuracy and selectivity. Copyright (c) 2009 John Wiley & Sons, Ltd.
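
    The standard-additions calibration mentioned above can be illustrated with a short calculation: the sample is spiked with increasing known amounts of analyte, a line is fitted, and the unspiked concentration is read from the magnitude of the x-intercept. The numbers below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Method of standard additions for a matrix-affected sample.
added = np.array([0.0, 2.0, 4.0, 6.0])        # added theobromine, ug/mL
signal = np.array([1.20, 2.05, 2.88, 3.74])   # area ratio vs internal standard

slope, intercept = np.polyfit(added, signal, 1)

# The unspiked concentration is the magnitude of the x-intercept.
c_sample = intercept / slope
print(f"estimated concentration: {c_sample:.2f} ug/mL")
# Against the 2 ug/mL threshold, this sample (~2.8 ug/mL) would exceed it.
```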

  15. Bulk derivatization and direct injection of human cerebrospinal fluid for trace-level quantification of endogenous estrogens using trap-and-elute liquid chromatography with tandem mass spectrometry.

    Science.gov (United States)

    Fan, Hui; Papouskova, Barbora; Lemr, Karel; Wigginton, Jane G; Schug, Kevin A

    2014-08-01

    Although there are existing methods for determining estrogen in human bodily fluids including blood plasma and serum, very little information is available regarding estrogen levels in human cerebrospinal fluid (CSF), which is critical to assess in studies of neuroprotective functions and diffusion of neuroprotective estrogens across the blood-brain barrier. To address this problem, a liquid chromatography with tandem mass spectrometry method for the simultaneous quantification of four endogenous estrogens (estrone, 17α-estradiol, 17β-estradiol, and estriol) in human CSF was developed. An aliquot (300 μL) of human CSF was bulk derivatized using dansyl chloride in the sample and 10 μL was directly injected onto a restricted-access media trap column for protein removal. No off-line sample extraction or cleanup was needed. The limits of detection of estrone, 17α-estradiol, 17β-estradiol, and estriol were 17, 28, 13, and 30 pg/mL, respectively, which is in the parts-per-trillion regime. The method was then applied to human CSF collected from ischemic trauma patients. Endogenous estrogens were detected and quantified, demonstrating the effectiveness of this method. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
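
    For context, detection limits of this kind are often estimated from a calibration curve as LOD = 3.3·s/m (and LOQ = 10·s/m), with s the standard deviation of low-level replicates (or of the blank) and m the calibration slope. Whether the authors used exactly this estimator is not stated, and the numbers below are illustrative only.

```python
import numpy as np

# Calibration-based LOD/LOQ estimate (one common convention).
areas_low = np.array([102., 95., 110., 98., 105.])   # replicate peak areas
conc = np.array([0., 25., 50., 100., 200.])           # pg/mL standards
resp = np.array([4., 110., 216., 430., 858.])         # detector response

m, b = np.polyfit(conc, resp, 1)       # calibration slope and intercept
s = areas_low.std(ddof=1) / m          # noise expressed in concentration units
lod = 3.3 * s
loq = 10.0 * s
print(f"LOD = {lod:.1f} pg/mL, LOQ = {loq:.1f} pg/mL")
```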

  16. A construction of standardized near infrared hyper-spectral teeth database: a first step in the development of reliable diagnostic tool for quantification and early detection of caries

    Science.gov (United States)

    Bürmen, Miran; Usenik, Peter; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan

    2011-03-01

    Dental caries is a disease characterized by demineralization of enamel crystals leading to the penetration of bacteria into the dentin and pulp. If left untreated, the disease can lead to pain, infection and tooth loss. Early detection of enamel demineralization resulting in increased enamel porosity, commonly known as white spots, is a difficult diagnostic task. Several papers reported on near infrared (NIR) spectroscopy to be a potentially useful noninvasive spectroscopic technique for early detection of caries lesions. However, the conducted studies were mostly qualitative and did not include the critical assessment of the spectral variability of the sound and carious dental tissues and influence of the water content. Such assessment is essential for development and validation of reliable qualitative and especially quantitative diagnostic tools based on NIR spectroscopy. In order to characterize the described spectral variability, a standardized diffuse reflectance hyper-spectral database was constructed by imaging 12 extracted human teeth with natural lesions of various degrees in the spectral range from 900 to 1700 nm with spectral resolution of 10 nm. Additionally, all the teeth were imaged by digital color camera. The influence of water content on the acquired spectra was characterized by monitoring the teeth during the drying process. The images were assessed by an expert, thereby obtaining the gold standard. By analyzing the acquired spectra we were able to accurately model the spectral variability of the sound dental tissues and identify the advantages and limitations of NIR hyper-spectral imaging.

  17. On the direct characterization and quantification of active ingredients in commercial solid drugs using PIXE, PIGE and ToF-SIMS techniques

    Energy Technology Data Exchange (ETDEWEB)

    Nsouli, B; Zahraman, K; Bejjani, A; Roumie, M; Noun, M [Ion Beam Analysis Laboratory, Lebanese Atomic Energy Commission - CNRS, P.O.Box: 11-8281 Beirut (Lebanon); Younes, G [Faculty of Sciences, Department of Chemistry - Beirut Arab University, Beirut (Lebanon); Assi, S; El-Yazbi, F; Mahmoud, R [Faculty of Pharmacy, Department of Pharmaceutical Analytical Chemistry - Beirut Arab University, Beirut (Lebanon); Thomas, J P [Institut de Physique Nucleaire de Lyon - Universite Claude Bernard Lyon 1 (France)

    2010-01-15

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. It is usually performed using wet chemical techniques like HPLC, LC-MS/MS, UV spectrophotometry and other appropriate organic analytical methods. When an active ingredient contains specific heteroatoms (F, S, Cl, Br, ...), elemental IBA techniques can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the thick-target PIXE (TT-PIXE) and TT-PIGE techniques for rapid and accurate quantification of celecoxib and atorvastatin in commercial solid drugs. The experimental aspects related to the quantification validity are presented and discussed. (author)

  18. On the direct characterization and quantification of active ingredients in commercial solid drugs using PIXE, PIGE and ToF-SIMS techniques

    International Nuclear Information System (INIS)

    Nsouli, B.; Zahraman, K.; Bejjani, A.; Roumie, M.; Noun, M.; Younes, G.; Assi, S.; El-Yazbi, F.; Mahmoud, R.; Thomas, J.P.

    2010-01-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. It is usually performed using wet chemical techniques like HPLC, LC-MS/MS, UV spectrophotometry and other appropriate organic analytical methods. When an active ingredient contains specific heteroatoms (F, S, Cl, Br, ...), elemental IBA techniques can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the thick-target PIXE (TT-PIXE) and TT-PIGE techniques for rapid and accurate quantification of celecoxib and atorvastatin in commercial solid drugs. The experimental aspects related to the quantification validity are presented and discussed. (author)

  19. Rapid screening and quantification of residual pesticides and illegal adulterants in red wine by direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Guo, Tianyang; Fang, Pingping; Jiang, Juanjuan; Zhang, Feng; Yong, Wei; Liu, Jiahui; Dong, Yiyang

    2016-11-04

    A rapid method to screen and quantify multi-class analytes in red wine has been developed using direct analysis in real time (DART) coupled with triple quadrupole tandem mass spectrometry (QqQ-MS). A modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) procedure was used to increase analytical speed and reduce matrix effects, and multiple reaction monitoring (MRM) in DART-MS/MS ensured accurate analysis. A bottle of wine containing 50 pesticides and 12 adulterants (preservatives, antioxidants, sweeteners, and azo dyes) could be fully analyzed in less than 12 min. The method exhibited proper linearity (R² ≥ 0.99) in the range of 1-1000 ng/mL for pesticides and 10-5000 ng/mL for adulterants. The limits of detection (LODs) were in the 0.5-50 ng/mL range for pesticides and the 5-50 ng/mL range for adulterants, and the limits of quantification (LOQs) were in the 1-100 ng/mL range for pesticides and the 10-250 ng/mL range for adulterants. Three spiked levels for each analyte in wine were evaluated, and the recoveries were in the range of 75-120%. The results demonstrate that DART-MS/MS is a rapid and simple method that can be applied to the rapid analysis of residual pesticides and illegal adulterants in large numbers of red wine samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Comparison of the reliability of parental reporting and the direct test of the Thai Speech and Language Test.

    Science.gov (United States)

    Prathanee, Benjamas; Angsupakorn, Nipa; Pumnum, Tawitree; Seepuaham, Cholada; Jaiyong, Pechcharat

    2012-11-01

    To determine the reliability of parental or caregivers' reports and of direct testing with the Thai Speech and Language Test for Children Aged 0-4 Years Old. Five investigators assessed speech and language abilities from video in both contexts: the parental or caregivers' report form and the test form of the Thai Speech and Language Test for Children Aged 0-4 Years Old. Twenty-five normal children and 30 children with delayed development or at risk for delayed speech and language skills were assessed at age intervals of 3, 6, 9, 12, 15, 18, 24, 30, 36 and 48 months. Reliability between parental or caregivers' reporting and testing was at a moderate level (0.41-0.60). Inter-rater reliability among investigators was excellent (0.86-1.00). The parental or caregivers' report form of the Thai Speech and Language Test for Children Aged 0-4 Years Old was an indicator of success at a moderate level. Trained professionals could use both forms of this test as reliable tools at an excellent level.

  1. On the direct characterization and quantification of active ingredients in commercial solid drugs using PIXE, PIGE and TOF-SIMS techniques

    Energy Technology Data Exchange (ETDEWEB)

    Nsouli, B. [IBA laboratory, Lebanese Atomic Energy Commission (CNRS), Beirut (Lebanon)], E-mail: bnsouli@cnrs.edu.lb; Zahraman, K; Roumie, M [IBA laboratory, Lebanese Atomic Energy Commission (CNRS), Beirut (Lebanon); Yazbi, F [Faculty of Pharmacy, Department of Pharmaceutical Analytical Chemistry, Beirut Arab University, Beirut (Lebanon); Thomas, J P [Institut de Physique Nucleaire de Lyon, Universite Claude Bernard Lyon, Villeurbanne (France)

    2009-07-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. It is usually performed using wet chemical techniques like LC-MS, UV spectrophotometry and other appropriate organic analytical methods. When an active ingredient contains specific heteroatoms (F, S, Cl, . . .), elemental IBA techniques like PIXE and PIGE, using a small 1-2 MV tandem accelerator, can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the thick-target PIXE and PIGE techniques for rapid and accurate quantification of low concentrations of different fluorinated, sulfured and chlorinated active ingredients, namely celecoxib and atorvastatin, in several commercial anti-hyperlipidemic and anti-inflammatory solid drugs. The experimental aspects related to the quantification validity are presented and discussed. In addition, time-of-flight secondary ion emission using multicharged Ar ions with {approx} 10 MeV energy, delivered by a 4 MV Van de Graaff single-stage accelerator, was used for structural and chemical analysis in some cases of binary commercial drugs containing two different active ingredients. The aspects of sample preparation and the role of the excipient are highlighted and discussed. (author)

  2. Inter- and intra-rater reliability of patellofemoral kinematic and contact area quantification by fast spin echo MRI and correlation with cartilage health by quantitative T1ρ MRI.

    Science.gov (United States)

    Lau, Brian C; Thuillier, Daniel U; Pedoia, Valentina; Chen, Ellison Y; Zhang, Zhihong; Feeley, Brian T; Souza, Richard B

    2016-01-01

    Patellar maltracking is a leading cause of patellofemoral pain syndrome (PFPS). The aim of this study was to determine the inter- and intra-rater reliability of a semi-automated program for magnetic resonance imaging (MRI) based patellofemoral kinematics. Sixteen subjects (10 with PFPS [mean age 32.3; SD 5.2; eight females] and six controls without PFPS [mean age 28.6; SD 2.8; three females]) participated in the study. One set of T2-weighted, fat-saturated fast spin-echo (FSE) MRIs was acquired from each subject in full extension and 30° of knee flexion. MRI including axial T1ρ relaxation time mapping sequences was also performed on each knee. Following image acquisition, regions of interest for kinematic MRI, and patellar and trochlear cartilage, were segmented and quantified with in-house designed spline-based MATLAB semi-automated software. Intraclass correlation coefficients (ICC) of the calculated kinematic parameters were good to excellent (ICC > 0.8) for patellar flexion, rotation, tilt, and translation (anterior-posterior, medial-lateral, and superior-inferior), and for contact area translation. Only patellar tilt in the flexed position and motion from the extended to the flexed state was significantly different between PFPS and control patients (p=0.002 and p=0.006, respectively). No significant correlations were identified between patellofemoral kinematics and contact area and T1ρ relaxation times. A semi-automated, spline-based kinematic MRI technique for patellofemoral kinematic and contact area quantification is highly reproducible, with the potential to help better understand the role of patellofemoral maltracking in PFPS and other knee disorders. Level IV. Published by Elsevier B.V.

  3. Inter- and intra-rater reliability of patellofemoral kinematic and contact area quantification by fast spin echo MRI and correlation with cartilage health by quantitative T1ρ MRI☆

    Science.gov (United States)

    Lau, Brian C.; Thuillier, Daniel U.; Pedoia, Valentina; Chen, Ellison Y.; Zhang, Zhihong; Feeley, Brian T.; Souza, Richard B.

    2016-01-01

    Background Patellar maltracking is a leading cause of patellofemoral pain syndrome (PFPS). The aim of this study was to determine the inter- and intra-rater reliability of a semi-automated program for magnetic resonance imaging (MRI) based patellofemoral kinematics. Methods Sixteen subjects (10 with PFPS [mean age 32.3; SD 5.2; eight females] and six controls without PFPS [mean age 28.6; SD 2.8; three females]) participated in the study. One set of T2-weighted, fat-saturated fast spin-echo (FSE) MRIs was acquired from each subject in full extension and 30° of knee flexion. MRI including axial T1ρ relaxation time mapping sequences was also performed on each knee. Following image acquisition, regions of interest for kinematic MRI, and patellar and trochlear cartilage, were segmented and quantified with in-house designed spline-based MATLAB semi-automated software. Results Intraclass correlation coefficients (ICC) of the calculated kinematic parameters were good to excellent (ICC > 0.8) for patellar flexion, rotation, tilt, and translation (anterior-posterior, medial-lateral, and superior-inferior), and for contact area translation. Only patellar tilt in the flexed position and motion from the extended to the flexed state was significantly different between PFPS and control patients (p = 0.002 and p = 0.006, respectively). No significant correlations were identified between patellofemoral kinematics and contact area and T1ρ relaxation times. Conclusions A semi-automated, spline-based kinematic MRI technique for patellofemoral kinematic and contact area quantification is highly reproducible, with the potential to help better understand the role of patellofemoral maltracking in PFPS and other knee disorders. PMID:26746045

  4. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins by asking what reliability is, covering the origin of reliability problems and the definition and use of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions used in reliability, estimation of MTBF, processes of probability distributions, downtime, maintainability and availability, breakdown maintenance and preventive maintenance, reliability design, reliability design for prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
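
    A minimal numerical sketch, using illustrative parameter values, of the basic quantities this book introduces: the reliability function, failure rate, MTBF, and the Weibull generalization.

```python
import numpy as np

# Constant-failure-rate (CFR) relations: R(t) = exp(-lambda * t) and
# MTBF = 1 / lambda; plus the Weibull generalization R(t) = exp(-(t/eta)^beta).
lam = 1e-4            # failures per hour (illustrative)
t = 5000.0            # mission time, hours

R_exp = np.exp(-lam * t)
mtbf = 1.0 / lam

beta, eta = 1.8, 12000.0   # Weibull shape and scale (illustrative)
R_weibull = np.exp(-(t / eta) ** beta)

print(f"exponential: R({t:.0f} h) = {R_exp:.3f}, MTBF = {mtbf:.0f} h")
print(f"Weibull:     R({t:.0f} h) = {R_weibull:.3f}")
```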

  5. Direct quantification of human cytomegalovirus immediate-early and late mRNA levels in blood of lung transplant recipients by competitive nucleic acid sequence-based amplification

    NARCIS (Netherlands)

    Greijer, AE; Verschuuren, EAM; Harmsen, MC; Dekkers, CAJ; Adriaanse, HMA; The, TH; Middeldorp, JM

    The dynamics of active human cytomegalovirus (HCMV) infection was monitored by competitive nucleic acid sequence-based amplification (NASBA) assays for quantification of IE1 (UL123) and pp67 (UL65) mRNA expression levels in the blood of patients after lung transplantation. RNA was isolated from 339

  6. Paper Spray and Extraction Spray Mass Spectrometry for the Direct and Simultaneous Quantification of Eight Drugs of Abuse in Whole Blood

    NARCIS (Netherlands)

    Espy, R.D.; Teunissen, S.F.; Manicke, N.E.; Ren, Y.; Ouyang, Z.; van Asten, A.; Cooks, R.G.

    2014-01-01

    Determination of eight drugs of abuse in blood has been performed using paper spray or extraction spray mass spectrometry in under 2 min with minimal sample preparation. A method has been optimized for quantification of amphetamine, methamphetamine, 3,4-methylenedioxyamphetamine (MDA),

  7. Predicting interatrial septum rotation: is the position of the heart or the direction of the coronary sinus reliable?: Implications for interventional electrophysiologists from CT studies.

    Science.gov (United States)

    Sun, Huan; Wang, Yanjing; Zhang, Zhenming; Liu, Lin; Yang, Ping

    2015-04-01

    Determining the location of the interatrial septum (IAS) is crucial for cardiac electrophysiology procedures. Empirical methods of predicting IAS orientation depend on anatomical landmarks, including determining it from the direction of the coronary sinus (CS) and the position of the heart (e.g., vertical or transverse). However, the reliability of these methods for predicting IAS rotation warrants further study. The purpose of this study was to assess the clinical utility of the relationship between IAS orientation, CS direction, and heart position. Data from 115 patients undergoing coronary computed tomography (CT) angiography with no evidence of cardiac structural disease were collected and analyzed. Angulations describing IAS orientation, CS direction, and heart position were measured. The relationships between IAS orientation and each of the other two parameters were subsequently analyzed. The mean angulations for IAS orientation, CS direction, and heart position were 36.8 ± 7.3° (range 19.1-53.6), 37.7 ± 6.6° (range 21.3-50.1), and 37.1 ± 8.3° (range 19.2-61.0), respectively. We found a significant correlation between IAS orientation and CS direction (r = 0.928), with a linear relationship of IAS orientation = 2.01 + 1.03 × CS direction (r² = 0.86). No correlation was observed between IAS orientation and heart position (P = 0.86). In patients without structural heart disease, CS direction may be a reliable predictor of IAS orientation, and may serve as a helpful reference for clinicians during invasive electrophysiological procedures. Further study is warranted to clarify the relationship between IAS orientation and heart position. © 2015 Wiley Periodicals, Inc.

  8. Evidence that transcranial direct current stimulation (tDCS) generates little-to-no reliable neurophysiologic effect beyond MEP amplitude modulation in healthy human subjects: A systematic review.

    Science.gov (United States)

    Horvath, Jared Cooney; Forte, Jason D; Carter, Olivia

    2015-01-01

    Transcranial direct current stimulation (tDCS) is a form of neuromodulation that is increasingly being utilized to examine and modify a number of cognitive and behavioral measures. The theoretical mechanisms by which tDCS generates these changes are predicated upon a rather large neurophysiological literature. However, a robust systematic review of this neurophysiological data has not yet been undertaken. tDCS data in healthy adults (18-50) were collected for every neurophysiological outcome measure reported by at least two different research groups in the literature. When possible, data were pooled and quantitatively analyzed to assess significance. When pooling was not possible, data were qualitatively compared to assess reliability. Of the 30 neurophysiological outcome measures reported by at least two different research groups, tDCS was found to have a reliable effect on only one: MEP amplitude. Interestingly, the magnitude of this effect has been significantly decreasing over the last 14 years. Our systematic review does not support the idea that tDCS has a reliable neurophysiological effect beyond MEP amplitude modulation, though important limitations of this review (and conclusion) are discussed. This work raises questions concerning the mechanistic foundations and general efficacy of this device, the implications of which extend to the steadily increasing tDCS psychological literature. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Direct cloning from enrichment cultures, a reliable strategy for isolation of complete operons and genes from microbial consortia.

    Science.gov (United States)

    Entcheva, P; Liebl, W; Johann, A; Hartsch, T; Streit, W R

    2001-01-01

    Enrichment cultures of microbial consortia enable the diverse metabolic and catabolic activities of these populations to be studied on a molecular level and to be explored as potential sources for biotechnology processes. We have used a combined approach of enrichment culture and direct cloning to construct cosmid libraries with large (>30-kb) inserts from microbial consortia. Enrichment cultures were inoculated with samples from five environments, and high amounts of avidin were added to the cultures to favor growth of biotin-producing microbes. DNA was extracted from three of these enrichment cultures and used to construct cosmid libraries; each library consisted of between 6,000 and 35,000 clones, with an average insert size of 30 to 40 kb. The inserts contained a diverse population of genomic DNA fragments isolated from the consortia organisms. These three libraries were used to complement the Escherichia coli biotin auxotrophic strain ATCC 33767 Delta(bio-uvrB). Initial screens resulted in the isolation of seven different complementing cosmid clones, carrying biotin biosynthesis operons. Biotin biosynthesis capabilities and growth under defined conditions of four of these clones were studied. Biotin measured in the different culture supernatants ranged from 42 to 3,800 pg/ml/optical density unit. Sequencing the identified biotin synthesis genes revealed high similarities to bio operons from gram-negative bacteria. In addition, random sequencing identified other interesting open reading frames, as well as two operons, the histidine utilization operon (hut), and the cluster of genes involved in biosynthesis of molybdopterin cofactors in bacteria (moaABCDE).

  10. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options for the kind of natural gas service they need, and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas.

  11. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book covers reliability engineering: the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability estimation and testing for the exponential, normal, and Weibull distribution types; reliability sampling tests; system reliability; reliability design; and functional failure analysis by FTA.

  12. Single-leg lateral, horizontal, and vertical jump assessment: reliability, interrelationships, and ability to predict sprint and change-of-direction performance.

    Science.gov (United States)

    Meylan, Cesar; McMaster, Travis; Cronin, John; Mohammad, Nur Ikhwan; Rogers, Cailyn; Deklerk, Melissa

    2009-07-01

    The purposes of this study were to determine the reliability of unilateral vertical, horizontal, and lateral countermovement jump assessments, the interrelationship between these tests, and their usefulness as predictors of sprint (10 m) and change-of-direction (COD) performance in 80 male and female physical education students. Jump performance was assessed on a contact mat, and sprint and COD performances were assessed using timing lights. With regard to the reliability statistics, the largest coefficient of variation (CV) was observed for the vertical jump (CV = 6.7-7.2%) in both genders, whereas the sprint and COD assessments had the smallest variability (CV = 0.8-2.8%). All intraclass correlation coefficients (ICC) were greater than 0.85, except for the men's COD assessment with the alternate leg. The shared variance between the single-leg vertical, horizontal, and lateral jumps for men and women was less than 50%, indicating that the jumps are relatively independent of one another and represent different leg strength/power qualities. The ability of the jumps to predict sprint and COD performance was limited (R² < 43%). It would seem that the ability to change direction with one leg is relatively independent of a COD with the other leg, especially in the women (R² < 30%) of this study. However, if one jump assessment were selected to predict sprint and COD performance in a test battery, the single-leg horizontal countermovement jump would seem the logical choice, given the results of this study. Many of the findings in this study have interesting diagnostic and training implications for the strength and conditioning coach.

  13. Test-retest reliability of prefrontal transcranial Direct Current Stimulation (tDCS) effects on functional MRI connectivity in healthy subjects.

    Science.gov (United States)

    Wörsching, Jana; Padberg, Frank; Helbich, Konstantin; Hasan, Alkomiet; Koch, Lena; Goerigk, Stephan; Stoecklein, Sophia; Ertl-Wagner, Birgit; Keeser, Daniel

    2017-07-15

    Transcranial Direct Current Stimulation (tDCS) of the prefrontal cortex (PFC) can be used for probing functional brain connectivity and meets general interest as a novel therapeutic intervention in psychiatric and neurological disorders. Along with more extensive use, it is important to understand the interplay between neural systems and stimulation protocols, which requires basic methodological work. Here, we examined the test-retest (TRT) characteristics of tDCS-induced modulations in resting-state functional-connectivity MRI (RS fcMRI). Twenty healthy subjects received 20 minutes of either active or sham tDCS of the dorsolateral PFC (2 mA, anode over F3 and cathode over F4, international 10-20 system), preceded and followed by RS fcMRI (10 minutes each). All subjects underwent three tDCS sessions with one-week intervals in between. Effects of tDCS on RS fcMRI were determined at an individual as well as at a group level using both ROI-based and independent-component analyses (ICA). To evaluate the TRT reliability of individual active-tDCS and sham effects on RS fcMRI, voxel-wise intraclass correlation coefficients (ICC) of post-tDCS maps between testing sessions were calculated. For both approaches, results revealed low reliability of RS fcMRI after active tDCS (ICC(2,1) = -0.09 to 0.16). Reliability of RS fcMRI (baselines only) was low to moderate for ROI-derived (ICC(2,1) = 0.13 to 0.50) and low for ICA-derived connectivity (ICC(2,1) = 0.19 to 0.34). Thus, for ROI-based analyses, the distribution of voxel-wise ICC was shifted to lower TRT reliability after active, but not after sham, tDCS, for which the distribution was similar to baseline. The intra-individual variation observed here resembles the variability of tDCS effects in motor regions and may be one reason why robust group-level tDCS effects were missing in this study. The data can be used for appropriately designing large-scale studies investigating methodological issues such as sources of variability.
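
    Since both this record and the MRI studies above report ICC(2,1) values, a compact implementation of that statistic (two-way random effects, absolute agreement, single measures, after Shrout and Fleiss) may help make the metric concrete; the ratings matrix below is invented for illustration.

```python
import numpy as np

def icc_2_1(Y: np.ndarray) -> float:
    """ICC(2,1) for an n_subjects x k_sessions (or raters) matrix."""
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)
    col_means = Y.mean(axis=0)

    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols

    msr = ss_rows / (n - 1)            # between-subjects mean square
    msc = ss_cols / (k - 1)            # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))  # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Illustrative data: 5 subjects measured in 3 sessions
Y = np.array([[4.1, 4.3, 4.0],
              [2.2, 2.0, 2.4],
              [3.5, 3.8, 3.6],
              [5.0, 4.7, 5.1],
              [1.9, 2.1, 1.8]])
print(f"ICC(2,1) = {icc_2_1(Y):.2f}")
```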

  14. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
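
    The fault-tree approach described here can be sketched in a few lines: on the failure side, an OR gate corresponds to a series structure in reliability, while an AND gate models redundancy. The component values and tree structure below are illustrative assumptions, not the report's model.

```python
# System failure if (converter fails) OR (both redundant fans fail).
R_converter = 0.98    # illustrative component reliabilities
R_fan = 0.90

F_fans = (1 - R_fan) ** 2            # AND gate: both fans must fail
R_system = R_converter * (1 - F_fans)  # OR gate: series in reliability

print(f"system reliability: {R_system:.4f}")   # ~0.9702
```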

  15. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography, definition and challenges; quantification-biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement

  16. Safety and reliability assessment

    International Nuclear Information System (INIS)

    1979-01-01

    This report contains the papers delivered at the course on safety and reliability assessment held at the CSIR Conference Centre, Scientia, Pretoria. The following topics were discussed: safety standards; licensing; biological effects of radiation; what is a PWR; safety principles in the design of a nuclear reactor; radio-release analysis; quality assurance; the staffing, organisation and training for a nuclear power plant project; event trees, fault trees and probability; automatic protective systems; sources of failure-rate data; interpretation of failure data; synthesis and reliability; quantification of human error in man-machine systems; dispersion of noxious substances through the atmosphere; criticality aspects of enrichment and recovery plants; and risk and hazard analysis. Extensive examples are given, as well as case studies.

  17. Quantification of the reliability of personnel actions from the evaluation of actual German operational experience. Final report; Quantifizierung der Zuverlaessigkeit von Personalhandlungen durch Auswertung der aktuellen deutschen Betriebserfahrung. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Preischl, W.; Fassmann, W.

    2013-07-15

    The results of PSA studies, and their uncertainty bounds, are considerably impacted by the assessment of human reliability. But the amount of available generic data is not sufficient to adequately evaluate all human actions considered in a modern PSA study. Furthermore, the data, as well as the proposed uncertainty bounds, are not sufficiently validated and rely on expert judgement. This research project, like the preceding project /GRS 10/, validated data recommended by the German PSA Guidelines and enlarged the amount of available data. The findings may contribute to an update of the German PSA Guidelines. In a first step of the project, information about reportable events in German nuclear power plants with observed human errors (event reports, expert statements, technical documents, interviews and plant walkdowns with subject matter experts from the plants) was analysed. The investigation resulted in 67 samples describing personnel activities, performance conditions, the number of observed errors and the number of action performances. In a second step, a new methodology was developed and applied in a pilot plant. The objective was to identify undoubtedly error-free, safety-relevant actions, their performance conditions and frequency, as well as to prove and demonstrate that probabilistic data can be derived from that operational experience (OE). The application in the pilot plant resulted in 18 "error-free" samples characterizing human reliability. All available samples were evaluated using the method of Bayes. That commonly accepted methodology was applied in order to derive probabilistic data based on samples taken from operational experience. A thorough analysis of the obtained results shows that both data sources (OE reportable events, OE with undoubtedly error-free action performance) provide data of comparable quality and validity. At the end of the research project, the following products are available: methods to select samples

  18. Direct quantification of TiO{sub 2} nanoparticles in suspension by grazing-incidence X-ray fluorescence spectrometry: Influence of substrate pre-treatment in the deposition process

    Energy Technology Data Exchange (ETDEWEB)

    Motellier, S., E-mail: Sylvie.motellier@cea.fr [Commissariat à l' Energie Atomique et aux Energies alternatives, DRT/LITEN/DTNM/LCSN, 17 rue des martyrs, F-38054 GRENOBLE CEDEX (France); Derrough, S.; Locatelli, D. [Commissariat à l’Energie Atomique et aux Energies alternatives, DRT/NanoSafety Plateform, 17 rue des martyrs, F-38054 GRENOBLE CEDEX (France); Amdaoud, M.; Lhaute, K. [Commissariat à l' Energie Atomique et aux Energies alternatives, DRT/LITEN/DTNM/LCSN, 17 rue des martyrs, F-38054 GRENOBLE CEDEX (France)

    2013-10-01

    X-ray fluorescence at grazing incidence (GIXRF) was investigated as a method for the quantification of TiO{sub 2} nanoparticles in aqueous suspensions. One of the major advantages of this technique is the possibility to analyze the particles without pre-treatment, such as harsh acid digestion, as required by most other conventional methods. However, reliable quantitative measurements require a number of precautions. In particular, the process of depositing the sample on the flat reflecting substrate must maintain homogeneity in composition and concentration over the entire surface of the deposition residue once dried. Scanning electron microscopy showed that using an adhesive coating on the substrate significantly improves the morphology and chemical homogeneity of the residue, hence leading to better quantitative performance of the method. Linear calibration curves using internal standardization were established with ionic Ti and with two different types of TiO{sub 2} nanoparticles. Low limits of detection of 18 μg L{sup −1} and 52 μg L{sup −1} at incident angles of 0.20° and 0.75°, respectively, were obtained. It was found that the correlation factors of the calibration linear fits were particle-size dependent, which was attributed to sampling problems due to possibly incomplete dispersion of the particles in the suspensions. The measured fluorescence of the dried deposits changed within a 4-month timespan for both types of TiO{sub 2} nanoparticles, demonstrating the very peculiar behavior of these particulate samples. - Highlights: • Suspensions of TiO{sub 2} nanoparticles were quantitatively analyzed by GIXRF. • The substrate was coated with an adhesive film prior to sample deposition. • Improved spatial homogeneity of the dry spot residue was confirmed by SEM/EDX. • Linear calibration curves were obtained with ionic Cr as internal standard. • Ti limits of detection were in the 20-50 μg L{sup −1} range.
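
    The internal-standard calibration described above can be sketched numerically: the Ti fluorescence signal is ratioed to the co-deposited Cr internal standard to compensate for deposition and geometry variations, and the ratio is calibrated against known concentrations. All numbers below are invented for illustration.

```python
import numpy as np

# Internal-standard calibration: Ti signal ratioed to a constant Cr spike.
c_ti = np.array([0.1, 0.5, 1.0, 2.0, 5.0])          # mg/L Ti standards
i_ti = np.array([210., 1020., 2050., 4120., 10300.])  # Ti fluorescence counts
i_cr = np.array([1000., 990., 1010., 1005., 995.])    # Cr internal standard

ratio = i_ti / i_cr
slope, intercept = np.polyfit(c_ti, ratio, 1)

# Unknown suspension measured the same way:
r_unknown = 3100. / 1002.
c_unknown = (r_unknown - intercept) / slope
print(f"Ti concentration: {c_unknown:.2f} mg/L")
```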

  19. Pistachio (Pistacia vera L.) Detection and Quantification Using a Murine Monoclonal Antibody-Based Direct Sandwich Enzyme-Linked Immunosorbent Assay.

    Science.gov (United States)

    Liu, Changqi; Chhabra, Guneet S; Sathe, Shridhar K

    2015-10-21

    A commercially available direct sandwich enzyme-linked immunosorbent assay (ELISA) (BioFront Technologies, Tallahassee, FL, USA) using murine anti-pistachio monoclonal antibodies (mAbs) as capture and detection antibodies was evaluated. The assay was sensitive (limit of detection = 0.09 ± 0.02 ppm full-fat pistachio, linear detection range = 0.5-36 ppm, 50% maximum signal concentration = 7.9 ± 0.7 ppm) and reproducible (low intra- and inter-assay variability), and it detected pistachio in seeds subjected to autoclaving (121 °C, 15 psi, 15, 30 min), blanching (100 °C, 5, 10 min), frying (191 °C, 1 min), microwaving (500, 1000 W, 3 min), and dry roasting (140 °C, 30 min; 168 °C, 12 min). No cross-reactivity was observed in 156 food matrices, each tested at 100,000 ppm, suggesting the ELISA to be pistachio specific. The pistachio recovery ranges for spiked (10 ppm) and incurred (10-50,000 ppm) food matrices were 93.1-125.6% and 35.7-112.2%, respectively. The assay did not register any false-positive or false-negative results among the tested commercial and laboratory-prepared samples.

  20. The use of direct analysis in real time (DART) to assess the levels of inhibitors co-extracted with DNA and the associated impact in quantification and amplification.

    Science.gov (United States)

    Moreno, Lilliana I; McCord, Bruce R

    2016-10-01

    The measure of quality in DNA sample processing starts with an effective nucleic acid isolation procedure. Most problems with DNA sample typing can be attributed to low-quantity DNA and/or to the presence of inhibitors in the sample. Therefore, establishing which isolation method is best at removing potential inhibitors may help overcome some of the problems analysts encounter, by providing useful information for determining the optimal approach for any given sample. Direct analysis in real time (DART) mass spectrometry was used in this study to investigate the ability of different extraction methods to remove PCR inhibitors. The methods investigated included both liquid/liquid (phenol-chloroform) and solid-phase robotic procedures (PrepFiler™ and EZ1 chemistries). Following extraction, samples were analyzed by DART to determine the level of remaining inhibitors, and then quantified and amplified to determine the effect any remaining inhibitor had on the overall results. The data suggest that organic extraction methods result in detrimental amounts of phenol carryover, while automated methods may produce carryover of bile salts and other chemicals that preferentially bind the solid-phase matrix. Both of these effects can have a negative impact on downstream sample processing and genotyping by PCR. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  2. Quantification of the Relative Biological Effectiveness for Ion Beam Radiotherapy: Direct Experimental Comparison of Proton and Carbon Ion Beams and a Novel Approach for Treatment Planning

    International Nuclear Information System (INIS)

    Elsaesser, Thilo; Weyrather, Wilma K.; Friedrich, Thomas; Durante, Marco; Iancu, Gheorghe; Kraemer, Michael; Kragl, Gabriele; Brons, Stephan; Winter, Marcus; Weber, Klaus-Josef; Scholz, Michael

    2010-01-01

    Purpose: To present the first direct experimental in vitro comparison of the biological effectiveness of range-equivalent protons and carbon ion beams for Chinese hamster ovary cells exposed in a three-dimensional phantom using a pencil beam scanning technique and to compare the experimental data with a novel biophysical model. Methods and Materials: Cell survival was measured in the phantom after irradiation with two opposing fields, thus mimicking the typical patient treatment scenario. The novel biophysical model represents a substantial extension of the local effect model, previously used for treatment planning in carbon ion therapy for more than 400 patients, and potentially can be used to predict effectiveness of all ion species relevant for radiotherapy. A key feature of the new approach is the more sophisticated consideration of spatially correlated damage induced by ion irradiation. Results: The experimental data obtained for Chinese hamster ovary cells clearly demonstrate that higher cell killing is achieved in the target region with carbon ions as compared with protons when the effects in the entrance channel are comparable. The model predictions demonstrate agreement with these experimental data and with data obtained with helium ions under similar conditions. Good agreement is also achieved with relative biological effectiveness values reported in the literature for other cell lines for monoenergetic proton, helium, and carbon ions. Conclusion: Both the experimental data and the new modeling approach are supportive of the advantages of carbon ions as compared with protons for treatment-like field configurations. Because the model predicts the effectiveness for several ion species with similar accuracy, it represents a powerful tool for further optimization and utilization of the potential of ion beams in tumor therapy.

  3. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    Science.gov (United States)

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore, methods for the quantification of undeclared species are needed. Targeting mitochondrial DNA (e.g., the CYTB gene) for species quantification is unsuitable due to a fivefold inter-tissue variation in mtDNA content per cell, resulting in either an underestimation (-70%) or overestimation (+160%) of species DNA content. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing limits of quantification (LOQ) and detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
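
    Absolute quantification in ddPCR rests on a Poisson correction of the positive-droplet fraction; the sketch below applies the standard formula, with droplet counts and droplet volume chosen for illustration rather than taken from the paper.

```python
import numpy as np

# Poisson correction used in droplet digital PCR: with p the fraction of
# positive droplets and v the droplet volume, the target concentration is
# c = -ln(1 - p) / v.
positives, total = 8400, 20000     # illustrative droplet counts
v_droplet_ul = 0.85e-3             # ~0.85 nL per droplet, expressed in uL

p = positives / total
copies_per_ul = -np.log(1 - p) / v_droplet_ul
print(f"{copies_per_ul:.0f} copies/uL")

# A species fraction (e.g., horse F2 copies over total F2 copies of all
# declared species) then follows as a simple ratio of such concentrations.
```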

  4. Longitudinal exchange: an alternative strategy towards quantification of dynamics parameters in ZZ exchange spectroscopy

    International Nuclear Information System (INIS)

    Kloiber, Karin; Spitzer, Romana; Grutsch, Sarina; Kreutz, Christoph; Tollinger, Martin

    2011-01-01

    Longitudinal exchange experiments facilitate the quantification of the rates of interconversion between the exchanging species, along with their longitudinal relaxation rates, by analyzing the time-dependence of direct correlation and exchange cross peaks. Here we present a simple and robust alternative to this strategy, which is based on the combination of two complementary experiments, one with and one without resolving exchange cross peaks. We show that by combining the two data sets systematic errors that are caused by differential line-broadening of the exchanging species are avoided and reliable quantification of kinetic and relaxation parameters in the presence of additional conformational exchange on the ms–μs time scale is possible. The strategy is applied to a bistable DNA oligomer that displays different line-broadening in the two exchanging species.

  5. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very complex systems.

  6. Raman spectroscopy for DNA quantification in cell nucleus.

    Science.gov (United States)

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantify DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering, and avoids the problem of nonstoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum could be reliably performed using the intensity of the phosphate mode at 1096 cm⁻¹. When compared to the known DNA standards from cells of different animals, our results matched those values within an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, as well as to measure DNA content in cells with complex hydrolysis patterns, when Feulgen densitometry is inappropriate. © 2014 International Society for Advancement of Cytometry.
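
    The underlying calibration is a simple proportionality between the 1096 cm⁻¹ band intensity and DNA content, anchored to a standard of known value. A sketch with invented numbers:

    # Minimal sketch of the calibration logic (all numbers illustrative):
    # DNA content is taken proportional to the intensity of the phosphate
    # band at 1096 cm^-1, scaled by a standard of known DNA content.
    I_standard = 420.0   # band intensity of the reference nuclei (a.u.)
    C_standard = 7.0     # known DNA content of the standard (pg/nucleus)
    I_sample = 540.0     # measured band intensity of the sample (a.u.)

    C_sample = C_standard * I_sample / I_standard
    print(f"estimated DNA content: {C_sample:.2f} pg/nucleus (+/- ~10 %)")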

  7. Inter-rater reliability of direct observations of the physical and psychosocial working conditions in eldercare: An evaluation in the DOSES project

    NARCIS (Netherlands)

    Karstad, K. (Kristina); Rugulies, R. (Reiner); Skotte, J. (Jørgen); Munch, P.K. (Pernille Kold); Greiner, B.A. (Birgit A.); Burdorf, A. (Alex); Søgaard, K. (Karen); A. Holtermann (Andreas)

    2018-01-01

    The aim of the study was to develop and evaluate the reliability of the “Danish observational study of eldercare work and musculoskeletal disorders” (DOSES) observation instrument to assess physical and psychosocial risk factors for musculoskeletal disorders (MSD) in eldercare work.

  8. The reliability and accuracy of two methods for proximal caries detection and depth on directly visible proximal surfaces: an in vitro study

    DEFF Research Database (Denmark)

    Ekstrand, K R; Alloza, Alvaro Luna; Promisiero, L

    2011-01-01

    This study aimed to determine the reliability and accuracy of the ICDAS and radiographs in detecting and estimating the depth of proximal lesions on extracted teeth. The lesions were visible to the naked eye. Three trained examiners scored a total of 132 sound/carious proximal surfaces from 106 p...

  9. Interactive reliability assessment using an integrated reliability data bank

    International Nuclear Information System (INIS)

    Allan, R.N.; Whitehead, A.M.

    1986-01-01

    The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc Random Access Files is described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)

  10. Precise Quantitative Assessment of the Clinical Performances of Two High-Flux Polysulfone Hemodialyzers in Hemodialysis: Validation of a Blood-Based Simple Kinetic Model Versus Direct Dialysis Quantification.

    Science.gov (United States)

    Lim, Paik-Seong; Lin, Yuyu; Chen, Minfeng; Xu, Xiaoqi; Shi, Yun; Bowry, Sudhir; Canaud, Bernard

    2018-05-01

    Highly permeable dialysis membranes with better filter designs have contributed to improved solute removal and dialysis efficacy. However, solute membrane permeability needs to be well controlled to avoid increased loss of albumin, which is considered detrimental for dialysis patients. A novel high-flux dialyzer type (FX CorDiax; Fresenius Medical Care), incorporating an advanced polysulfone membrane modified with nano-controlled spinning technology to enhance the elimination of a broader spectrum of uremic toxins, has been released. The aim of this study was to compare in the clinical setting two dialyzer types having the same surface area, the current (FX dialyzer) and the new dialyzer generation (FX CorDiax), with respect to solute removal capacity over a broad spectrum of markers, including assessment of albumin loss based on a direct dialysis quantification method. We performed a crossover study following an A1-B-A2 design involving 10 patients. Phase A1 was 1 week of thrice-weekly bicarbonate hemodialysis with the FX dialyzer, 4 h per treatment; phase B was performed with a similar treatment regimen but with the new FX CorDiax dialyzer; and finally phase A2 was repeated with the FX dialyzer as in the former phase. Solute removal markers of interest were assessed from blood samples taken before and after treatment and from total spent dialysate collection (direct dialysis quantification), permitting a mass transfer calculation (mg/session into total spent dialysate/ultrafiltrate). On the blood side, there were no significant differences in the solute percent reduction between FX CorDiax 80 and FX 80. On the dialysate side, no difference was observed regarding the eliminated mass of different solutes including β2-microglobulin (143.1 ± 33.6 vs. 138.3 ± 41.9 mg, P = 0.8), while the solute mass removal of total protein (1.65 ± 0.51 vs. 2.14 ± 0.75 g, P = 0.04), and albumin (0.41 ± 0.21 vs. 1.22 ± 0.51 g, P < 0.001) were
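
    The direct dialysis quantification arithmetic is straightforward: the eliminated mass of a solute is its concentration in the pooled spent dialysate times the collected volume. A sketch with invented numbers:

    # Direct dialysis quantification: eliminated mass = concentration in the
    # total spent dialysate/ultrafiltrate times its collected volume.
    # All numbers below are illustrative, not taken from the study.
    dialysate_volume_l = 140.0    # total spent dialysate + ultrafiltrate (L)
    b2m_conc_mg_per_l = 1.0       # beta2-microglobulin concentration (mg/L)
    mass_removed_mg = b2m_conc_mg_per_l * dialysate_volume_l
    print(f"beta2-microglobulin removed: {mass_removed_mg:.1f} mg/session")

    # Blood-side percent reduction, the complementary measure:
    pre, post = 28.0, 8.0         # mg/L before and after treatment
    print(f"percent reduction: {100 * (pre - post) / pre:.0f} %")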

  11. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  12. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantification and updating of the APS Fire (it also covers floods and earthquakes). With the application, fire scenarios are quantified at the plant by integrating the tasks performed during the APS Fire. This paper describes the main features of the program that allow quantification of an APS Fire. (Author)

  13. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  14. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  15. Metal Stable Isotope Tagging: Renaissance of Radioimmunoassay for Multiplex and Absolute Quantification of Biomolecules.

    Science.gov (United States)

    Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong

    2016-05-17

    The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which had a long and significant impact on studies of biomolecules in the past decades. For instance, immunoassays have been developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be almost exclusively ascribed to the use of radioactive isotopes, which led to the development of nonradioactive tagging strategy-based assays such as enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassay. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and inability to provide absolute quantification of biomolecules. After recalling the sequences of tagging strategies, an apparent question is why not use stable isotopes from the start? A reasonable explanation is the lack of reliable means for accurate and precise quantification of stable isotopes at that time. The situation has changed greatly at present, since several atomic mass spectrometric measures for metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique to determine metal stable isotope-tagged biomolecules, for its high sensitivity, wide dynamic linear range, and more importantly multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification. An exciting research highlight in this area

  16. Quantification of osmotic water transport in vivo using fluorescent albumin.

    Science.gov (United States)

    Morelle, Johann; Sow, Amadou; Vertommen, Didier; Jamar, François; Rippe, Bengt; Devuyst, Olivier

    2014-10-15

    Osmotic water transport across the peritoneal membrane is applied during peritoneal dialysis to remove the excess water accumulated in patients with end-stage renal disease. The discovery of aquaporin water channels and the generation of transgenic animals have stressed the need for novel and accurate methods to unravel molecular mechanisms of water permeability in vivo. Here, we describe the use of fluorescently labeled albumin as a reliable indicator of osmotic water transport across the peritoneal membrane in a well-established mouse model of peritoneal dialysis. After detailed evaluation of intraperitoneal tracer mass kinetics, the technique was validated against direct volumetry, considered as the gold standard. The pH-insensitive dye Alexa Fluor 555-albumin was applied to quantify osmotic water transport across the mouse peritoneal membrane resulting from modulating dialysate osmolality and genetic silencing of the water channel aquaporin-1 (AQP1). Quantification of osmotic water transport using Alexa Fluor 555-albumin closely correlated with direct volumetry and with estimations based on radioiodinated (¹²⁵I) serum albumin (RISA). The low intraperitoneal pressure probably accounts for the negligible disappearance of the tracer from the peritoneal cavity in this model. Taken together, these data demonstrate the appropriateness of pH-insensitive Alexa Fluor 555-albumin as a practical and reliable intraperitoneal volume tracer to quantify osmotic water transport in vivo. Copyright © 2014 the American Physiological Society.
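
    Because tracer disappearance is negligible in this model, the intraperitoneal volume at each time point follows from simple indicator dilution: the instilled amount of labeled albumin is conserved, so V(t) = F0·V0/F(t). A sketch with invented readings:

    # Tracer-dilution estimate of intraperitoneal volume (illustrative values).
    # With negligible tracer disappearance, the amount of labeled albumin is
    # conserved: F0 * V0 = F(t) * V(t).
    V0 = 2.0                                  # instilled dialysate volume (mL)
    F0 = 1000.0                               # initial fluorescence (a.u.)
    F_t = {30: 800.0, 60: 700.0, 90: 650.0}   # fluorescence at t minutes

    for t, F in F_t.items():
        V = F0 * V0 / F
        print(f"t={t:3d} min  V={V:.2f} mL  net water transport={V - V0:+.2f} mL")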

  17. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
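
    As one concrete instance of the variance reduction mentioned here, importance sampling shifts the sampling density toward the failure region so that rare failures are actually observed. A minimal sketch for a rare-event failure probability P(g(X) < 0), with an invented limit state:

    import numpy as np

    rng = np.random.default_rng(1)

    def g(x):
        # Illustrative limit state: failure when g < 0
        # (resistance 6 vs. standard normal load effect x).
        return 6.0 - x

    N = 100_000

    # Crude Monte Carlo: essentially no failures are observed for this rare event.
    x = rng.standard_normal(N)
    p_crude = np.mean(g(x) < 0.0)

    # Importance sampling: shift the sampling density toward the failure
    # region (a unit-variance normal centered at the design point, mu = 6).
    mu = 6.0
    y = rng.normal(mu, 1.0, N)
    # Likelihood ratio phi(y) / phi_mu(y) for unit-variance normals:
    w = np.exp(-0.5 * y**2 + 0.5 * (y - mu)**2)
    p_is = np.mean((g(y) < 0.0) * w)

    print(f"crude MC: {p_crude:.2e}   importance sampling: {p_is:.2e}")
    # True value 1 - Phi(6) ~ 1e-9; IS recovers it while crude MC sees ~0 failures.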

  18. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  19. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large scale structures. Structural loading variations over the lifetime of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  20. SHARP1: A revised systematic human action reliability procedure

    International Nuclear Information System (INIS)

    Wakefield, D.J.; Parry, G.W.; Hannaman, G.W.; Spurgin, A.J.

    1990-12-01

    Individual plant examinations (IPE) are being performed by utilities to evaluate plant-specific vulnerabilities to severe accidents. A major tool in performing an IPE is a probabilistic risk assessment (PRA). The importance of human interactions in determining the plant response in past PRAs is well documented. The modeling and quantification of the probabilities of human interactions have been the subjects of considerable research by the Electric Power Research Institute (EPRI). A revised framework, SHARP1, for incorporating human interactions into PRA is summarized in this report. SHARP1 emphasizes that the process stages are iterative and directed at specific goals rather than being performed sequentially in a stepwise procedure. This expanded summary provides the reader with a flavor of the full report content. Excerpts from the full report are presented, following the same outline as the full report. In the full report, the interface of the human reliability analysis with the plant logic model development in a PRA is given special attention. In addition to describing a methodology framework, the report also discusses the types of human interactions to be evaluated, and how to formulate a project team to perform the human reliability analysis. A concise description and comparative evaluation of the selected existing methods of quantification of human error are also presented. Four case studies are also provided to illustrate the SHARP1 process

  1. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.)

  2. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Figure-list fragments: inverters connected in a chain; typical graph showing frequency versus square root of ...] The report describes developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and ... or FIT of the device. In other words, an accurate estimate of the device lifetime was found and thus the reliability that can be conveniently

  3. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. The practical analysis of food: the development of Sakalar quantification table of DNA (SQT-DNA).

    Science.gov (United States)

    Sakalar, Ergün

    2013-11-15

    A practical and highly sensitive Sakalar quantification table of DNA (SQT-DNA) has been developed for the detection of the percentage of species-specific DNA in food products. Cycle threshold (Ct) data were obtained from multiple curves of real-time qPCR. A statistical analysis was done to estimate the concentration of standard dilutions. Amplicon concentrations corresponding to each Ct value were assessed from predictions of targets at known concentrations. The SQT-DNA was prepared using the percentage corresponding to each Ct value. The applicability of SQT-DNA to commercial foods was proved using sausages containing varying ratios of beef, chicken, and soybean. The results showed that SQT-DNA can be used to directly quantify food DNA by a single PCR without the need to construct a standard curve in parallel with the samples every time the experiment is performed, and also that quantification by SQT-DNA is as reliable as standard curve quantification for a wide range of DNA concentrations. Copyright © 2013 Elsevier Ltd. All rights reserved.
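
    The lookup-table idea rests on the usual log-linear qPCR relationship between Ct and starting quantity. A sketch of how such a Ct-to-percentage table could be built once from standards and then reused (all values invented):

    import numpy as np

    # Fit a standard curve Ct = m * log10(percent DNA) + b from dilutions,
    # then invert it to build a Ct -> percent lookup table.
    pct = np.array([100, 10, 1, 0.1, 0.01])
    ct = np.array([18.1, 21.5, 24.9, 28.2, 31.6])   # illustrative Ct values

    m, b = np.polyfit(np.log10(pct), ct, 1)          # slope ~ -3.3 at ~100 % efficiency

    def percent_from_ct(ct_value: float) -> float:
        """Invert the standard curve: percent = 10 ** ((Ct - b) / m)."""
        return 10 ** ((ct_value - b) / m)

    for c in range(18, 33):
        print(f"Ct {c:2d}  ->  {percent_from_ct(c):8.3f} % species DNA")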

  5. DIRECT GEOREFERENCING ON SMALL UNMANNED AERIAL PLATFORMS FOR IMPROVED RELIABILITY AND ACCURACY OF MAPPING WITHOUT THE NEED FOR GROUND CONTROL POINTS

    Directory of Open Access Journals (Sweden)

    O. Mian

    2015-08-01

    This paper presents results from a Direct Mapping Solution (DMS) comprised of an Applanix APX-15 UAV GNSS-Inertial system integrated with a Sony a7R camera to produce highly accurate ortho-rectified imagery without Ground Control Points on a Microdrones md4-1000 platform. A 55 millimeter Nikkor f/1.8 lens was mounted on the Sony a7R and the camera was then focused and calibrated terrestrially using the Applanix camera calibration facility, and then integrated with the APX-15 UAV GNSS-Inertial system using a custom mount specifically designed for UAV applications. In July 2015, Applanix and Avyon carried out a test flight of this system. The goal of the test flight was to assess the performance of the DMS APX-15 UAV direct georeferencing system on the md4-1000. The area mapped during the test was a 250 x 300 meter block in a rural setting in Ontario, Canada. Several ground control points are distributed within the test area. The test included 8 North-South lines and 1 cross strip flown at 80 meters AGL, resulting in a ~1 centimeter Ground Sample Distance (GSD). Map products were generated from the test flight using Direct Georeferencing, and then compared for accuracy against the known positions of ground control points in the test area. The GNSS-Inertial data collected by the APX-15 UAV was post-processed in Single Base mode, using a base station located in the project area via POSPac UAV. The base-station's position was precisely determined by processing a 12-hour session using the CSRS-PPP Post Processing service. The ground control points were surveyed in using differential GNSS post-processing techniques with respect to the base-station.

  6. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation, that is, an ensemble of independent MD simulations, which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
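
    In practice the ensemble approach reduces to running N independent replicas and reporting the spread of the resulting free energies. A minimal sketch with synthetic ΔG values (all numbers invented):

    import numpy as np

    rng = np.random.default_rng(0)

    # Suppose 25 independent replicas of the same free energy calculation
    # (kcal/mol values are synthetic, for illustration only):
    dG = rng.normal(-7.3, 0.6, size=25)

    mean = dG.mean()
    # Bootstrap the uncertainty of the ensemble mean:
    boot = np.array([rng.choice(dG, size=dG.size, replace=True).mean()
                     for _ in range(10_000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"dG = {mean:.2f} kcal/mol, 95 % CI [{lo:.2f}, {hi:.2f}]")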

  7. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    Science.gov (United States)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities: phenomenological time and the operators' performance time. The sensitivity of each probability distribution in the human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in the parameters of that model. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential
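
    The reliability-physics estimate itself is just the probability that performance time exceeds the available phenomenological time, which is easy to approximate by simulation once distributions are assigned. A sketch with assumed (purely illustrative) distributions:

    import numpy as np

    rng = np.random.default_rng(42)
    N = 1_000_000

    # Reliability-physics HEP: an error occurs when the crew's performance
    # time exceeds the available phenomenological time. The distributions
    # and parameters below are assumptions for illustration only.
    t_phen = rng.weibull(2.0, N) * 45.0                          # available time (min)
    t_perf = rng.lognormal(mean=np.log(20), sigma=0.5, size=N)   # crew time (min)

    hep = np.mean(t_perf > t_phen)
    print(f"estimated HEP: {hep:.3f}")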

  8. Evaluation of flaw characteristics and their influence on inservice inspection reliability

    International Nuclear Information System (INIS)

    Becker, F.L.

    1980-01-01

    This report describes the results of the first year's effort of a five year program which is being conducted by Battelle, Pacific Northwest Laboratories, on behalf of the US Nuclear Regulatory Commission. This initial effort was directed toward identification and quantification of inspection uncertainties, which are likely to occur during inservice inspection of LWR primary piping systems, and their influence on inspection reliability. These experiments were conducted on 304 stainless steel samples, however, the results are equally applicable to other materials. Later portions of the program will extend these measurements and evaluations to other materials and conditions

  9. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  10. Human factors reliability benchmark exercise: a review

    International Nuclear Information System (INIS)

    Humphreys, P.

    1990-01-01

    The Human Factors Reliability Benchmark Exercise has addressed the issues of identification, analysis, representation and quantification of Human Error in order to identify the strengths and weaknesses of available techniques. Using a German PWR nuclear powerplant as the basis for the studies, fifteen teams undertook evaluations of a routine functional Test and Maintenance procedure plus an analysis of human actions during an operational transient. The techniques employed by the teams are discussed and reviewed on a comparative basis. The qualitative assessments performed by each team compare well, but at the quantification stage there is much less agreement. (author)

  11. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  12. Image cytometry: nuclear and chromosomal DNA quantification.

    Science.gov (United States)

    Carvalho, Carlos Roberto; Clarindo, Wellington Ronildo; Abreu, Isabella Santiago

    2011-01-01

    Image cytometry (ICM) associates microscopy, digital image and software technologies, and has been particularly useful in spatial and densitometric cytological analyses, such as DNA ploidy and DNA content measurements. Basically, ICM integrates methodologies of optical microscopy calibration, standard density filters, digital CCD camera, and image analysis softwares for quantitative applications. Apart from all system calibration and setup, cytological protocols must provide good slide preparations for efficient and reliable ICM analysis. In this chapter, procedures for ICM applications employed in our laboratory are described. Protocols shown here for human DNA ploidy determination and quantification of nuclear and chromosomal DNA content in plants could be used as described, or adapted for other studies.

  13. Safety and reliability criteria

    International Nuclear Information System (INIS)

    O'Neil, R.

    1978-01-01

    Nuclear power plants and, in particular, reactor pressure boundary components have unique reliability requirements, in that usually no significant redundancy is possible, and a single failure can give rise to possible widespread core damage and fission product release. Reliability may be required for availability or safety reasons, but in the case of the pressure boundary and certain other systems safety may dominate. Possible Safety and Reliability (S and R) criteria are proposed which would produce acceptable reactor design. Without some S and R requirement the designer has no way of knowing how far he must go in analysing his system or component, or whether his proposed solution is likely to gain acceptance. The paper shows how reliability targets for given components and systems can be individually considered against the derived S and R criteria at the design and construction stage. Since in the case of nuclear pressure boundary components there is often very little direct experience on which to base reliability studies, relevant non-nuclear experience is examined. (author)

  14. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

    For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability with the failure data collected during testing, assuming that the test environments represent the operational profile well. Experience shows, however, that the operational reliability is higher than the test reliability, and the user's interest is in the operational reliability rather than the test reliability. With the assumption that the difference in reliability results from the change of environment, testing environment factors comprising the aging factor and the coverage factor are defined in this study to predict the ultimate operational reliability from the failure data. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results are close to the actual data.

  15. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate (overview, reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement), as well as reliability in engineering design, reliability testing under failure rate assumptions, plotting of reliability data, prediction of system reliability, system maintenance, and failure topics including failure relay and analysis of system safety.

  16. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    Science.gov (United States)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g. in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and update of the prior beliefs about what the most likely model definitions are. Optimization problems for highly parametric physical models usually have multiple solutions, which impact the uncertainty of the predictions made. A stochastic search algorithm (e.g. a genetic algorithm) allows the identification of multiple "good enough" models in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. A machine learning algorithm, artificial neural networks, is used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows the development of different ways of integrating them into the Bayesian framework: as direct time
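
    The core of such a framework, stripped to essentials, is a sampler that weighs candidate reservoir models by their misfit to observed production data. A toy Metropolis sketch with a one-parameter stand-in forward model (everything here is invented for illustration):

    import numpy as np

    rng = np.random.default_rng(7)

    t_obs = np.linspace(1, 10, 10)
    y_obs = 5.0 * np.exp(-0.3 * t_obs) + rng.normal(0, 0.1, 10)  # synthetic data

    def forward(k):
        # Stand-in for the (expensive) reservoir simulator.
        return 5.0 * np.exp(-k * t_obs)

    def log_post(k):
        # Flat prior on k > 0, Gaussian misfit likelihood (sigma = 0.1).
        if k <= 0:
            return -np.inf
        return -0.5 * np.sum((forward(k) - y_obs) ** 2) / 0.1**2

    # Metropolis sampling of the posterior over the decline parameter k:
    k, lp, samples = 0.5, log_post(0.5), []
    for _ in range(20_000):
        k_new = k + rng.normal(0, 0.02)
        lp_new = log_post(k_new)
        if np.log(rng.uniform()) < lp_new - lp:
            k, lp = k_new, lp_new
        samples.append(k)

    post = np.array(samples[5000:])   # drop burn-in
    print(f"k = {post.mean():.3f} +/- {post.std():.3f}")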

  17. A Direct Aqueous Derivatization GC-MS Method for Determining Benzoylecgonine Concentrations in Human Urine.

    Science.gov (United States)

    Chericoni, Silvio; Stefanelli, Fabio; Da Valle, Ylenia; Giusiani, Mario

    2015-09-01

    A sensitive and reliable method for extraction and quantification of benzoylecgonine (BZE) and cocaine (COC) in urine is presented. Propyl chloroformate was used as the derivatizing agent, and it was added directly to the urine sample; the propyl derivative and COC were then recovered by a liquid-liquid extraction procedure. Gas chromatography-mass spectrometry was used to detect the analytes in selected ion monitoring mode. The method proved to be precise for BZE and COC in terms of both intraday and interday analysis (coefficient of variation, CV), and linear (>0.999 and >0.997, respectively) within the range investigated. The method, applied to thirty authentic samples, proved to be very simple, fast, and reliable, so it can easily be applied in routine analysis for the quantification of BZE and COC in urine samples. © 2015 American Academy of Forensic Sciences.

  18. Towards Reliable Integrated Services for Dependable Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Ravn, Anders Peter; Izadi-Zamanabadi, Roozbeh

    Reliability issues for various technical systems are discussed and focus is directed towards distributed systems, where communication facilities are vital to maintain system functionality. Reliability in communication subsystems is considered as a resource to be shared among a number of logical connections, and a reliability management framework is suggested. We suggest a network layer level reliability management protocol, RRSVP (Reliability Resource Reservation Protocol), as a counterpart of the RSVP for bandwidth and time resource management. Active and passive standby redundancy by background applications residing on alternative routes is considered. Details are provided for the operation of RRSVP based on reliability slack calculus. Conclusions summarize the considerations and give directions for future research.

  19. Towards Reliable Integrated Services for Dependable Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Ravn, Anders Peter; Izadi-Zamanabadi, Roozbeh

    2003-01-01

    Reliability issues for various technical systems are discussed and focus is directed towards distributed systems, where communication facilities are vital to maintain system functionality. Reliability in communication subsystems is considered as a resource to be shared among a number of logical connections, and a reliability management framework is suggested. We suggest a network layer level reliability management protocol, RRSVP (Reliability Resource Reservation Protocol), as a counterpart of the RSVP for bandwidth and time resource management. Active and passive standby redundancy by background applications residing on alternative routes is considered. Details are provided for the operation of RRSVP based on reliability slack calculus. Conclusions summarize the considerations and give directions for future research.

  20. Prediction of safety critical software operational reliability from test reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1999-01-01

    It has been a critical issue to predict safety-critical software reliability in the nuclear engineering area. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability with the failure data collected during testing, assuming that the test environments represent the operational profile well. The user's interest is, however, in the operational reliability rather than the test reliability, and experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment, from testing to operation, testing environment factors comprising the aging factor and the coverage factor are developed in this paper and used to predict the ultimate operational reliability from the failure data of the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig

  1. Reliability Based Management of Marine Fouling

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Hansen, Peter Friis

    1999-01-01

    The present paper describes the results of a recent study on the application of methods from structural reliability to optimise management of marine fouling on jacket type structures. In particular, the study addresses effects on the structural response by assessment and quantification of uncertainties of a set of parameters. These are the seasonal variation of marine fouling parameters, the wave loading (taking into account the seasonal variation in sea-state statistics), and the effects of spatial variations and seasonal effects of marine fouling parameters. Comparison of design values...

  2. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    Science.gov (United States)

    Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  3. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    Science.gov (United States)

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
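
    The quantification step can be approximated directly from the minimal cut sets: with independent link failures, the first-order (rare-event) estimate of network unreliability is the sum of the cut-set probabilities, refined by inclusion-exclusion. A sketch on an invented network (not the method's actual quantification scheme, which the abstract only outlines):

    from itertools import combinations

    # Minimal cut sets (sets of links whose joint failure disconnects the
    # network) for a small illustrative network; link failure probabilities
    # are invented and links are assumed independent.
    p_fail = {"a": 0.01, "b": 0.02, "c": 0.01, "d": 0.05}
    min_cuts = [{"a", "b"}, {"c", "d"}, {"a", "d"}]

    def cut_prob(cut):
        # Probability that every link in the cut set fails.
        prob = 1.0
        for link in cut:
            prob *= p_fail[link]
        return prob

    # First-order (rare-event) approximation: sum over minimal cut sets.
    q1 = sum(cut_prob(c) for c in min_cuts)

    # Second-order inclusion-exclusion correction: subtract pairwise overlaps,
    # where the joint event is "all links in the union of both cuts fail".
    q2 = q1 - sum(cut_prob(c1 | c2) for c1, c2 in combinations(min_cuts, 2))

    print(f"network unreliability: {q1:.2e} (1st order), {q2:.2e} (2nd order)")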

  4. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    A liquid chromatography-electrospray mass spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode was applied for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, a LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high capacity trap mass spectrometer using positive ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase...

  5. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiments, and mixtures of Weibull...

  6. Quantification of Safety-Critical Software Test Uncertainty

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Cho, Jaehyun; Lee, Seung Jun; Jung, Wondea

    2015-01-01

    The method conservatively assumes that the failure probability of the software for the untested inputs is 1, and that the failure probability becomes 0 upon successful testing of all test cases. However, in reality the chance of failure exists due to test uncertainty. Some studies have been carried out to identify the test attributes that affect test quality. Cao discussed the testing effort, testing coverage, and testing environment. Management of the test uncertainties has also been discussed in the literature. In this study, the test uncertainty has been considered in estimating the software failure probability, because the software testing process is considered to be inherently uncertain. A reliability estimate of software is very important for a probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on the estimation of the probability of a software failure that considers the uncertainty in software testing. In our study, a Bayesian belief network (BBN) has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation.
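
    To make the BBN idea concrete, the sketch below evaluates a deliberately tiny two-node network by enumeration: test quality influences the chance that a fault survives testing, and observing a failure updates the belief about test quality. All probabilities are invented, not taken from the study:

    # A minimal two-node Bayesian belief network, evaluated by enumeration.
    # Node 1: test quality (summarizing effort, coverage, environment);
    # Node 2: software failure, conditional on test quality.
    p_quality = {"high": 0.6, "medium": 0.3, "low": 0.1}
    p_fail_given_quality = {"high": 1e-4, "medium": 1e-3, "low": 1e-2}

    # Marginal failure probability: sum over the parent node's states.
    p_fail = sum(p_quality[q] * p_fail_given_quality[q] for q in p_quality)
    print(f"marginal software failure probability: {p_fail:.2e}")

    # Diagnostic direction (Bayes): given an observed failure, which test
    # quality state is most likely?
    posterior = {q: p_quality[q] * p_fail_given_quality[q] / p_fail
                 for q in p_quality}
    print(posterior)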

  7. Direct estimation of diffuse gaseous emissions from coal fires: current methods and future directions

    Science.gov (United States)

    Engle, Mark A.; Olea, Ricardo A.; O'Keefe, Jennifer M. K.; Hower, James C.; Geboy, Nicholas J.

    2013-01-01

    Coal fires occur in nature spontaneously, contribute to increases in greenhouse gases, and emit atmospheric toxicants. Increasing interest in quantifying coal fire emissions has resulted in the adaptation and development of specialized approaches and the adoption of numerical modeling techniques. An overview of these methods for direct estimation of diffuse gas emissions from coal fires is presented in this paper. Here we take advantage of stochastic Gaussian simulation to interpolate CO2 fluxes measured using a dynamic closed chamber at the Ruth Mullins coal fire in Perry County, Kentucky. This approach allows for preparing a map of diffuse gas emissions, one of the two primary ways that gases emanate from coal fires, and establishing the reliability of the study both locally and for the entire fire. Future research directions include continuous and automated sampling to improve quantification of gaseous coal fire emissions.

  8. Aerospace reliability applied to biomedicine.

    Science.gov (United States)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  9. The value of reliability

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Karlström, Anders

    2010-01-01

    We derive the value of reliability in the scheduling of an activity of random duration, such as travel under congested conditions. Using a simple formulation of scheduling utility, we show that the maximal expected utility is linear in the mean and standard deviation of trip duration, regardless...... of the form of the standardised distribution of trip durations. This insight provides a unification of the scheduling model and models that include the standard deviation of trip duration directly as an argument in the cost or utility function. The results generalise approximately to the case where the mean...
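
    In symbols (a sketch; the notation A, α, β and the standardized deviate X are introduced here, not taken from the paper): writing trip duration as T = μ + σX with X standardized, the linearity result takes the form

    \[
      \max_{d}\; \mathbb{E}\big[U(d,\,T)\big] \;=\; A \;-\; \alpha\,\mu \;-\; \beta\,\sigma ,
    \]

    where d is the departure time chosen by the traveller, α prices mean travel time, and β, the value of reliability, depends on the scheduling preferences and the standardized distribution of X but not on μ or σ, so that maximal expected utility is linear in the mean and standard deviation of trip duration.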

  10. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    Science.gov (United States)

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher-throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid handling robot to provide reproducible high-throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red, with dimethyl sulfoxide as the solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96-well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8% to ±2% on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration, so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids, avoiding limitations of currently established
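
    The gravimetric calibration amounts to a linear regression of fluorescence on extractively determined lipid content, which is then inverted for unknowns. A sketch with invented calibration points:

    import numpy as np

    # Gravimetric calibration of the Nile red signal (values are invented).
    # Fluorescence is regressed on lipid content measured by the extractive
    # gravimetric reference method, then the fit is inverted for unknowns.
    lipid_g_per_l = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
    fluorescence = np.array([210, 430, 840, 1690, 3350])   # a.u.

    slope, intercept = np.polyfit(lipid_g_per_l, fluorescence, 1)

    def lipid_from_fluorescence(f: float) -> float:
        """Invert the linear calibration to obtain g/L of lipid."""
        return (f - intercept) / slope

    print(f"sample at 1200 a.u. -> {lipid_from_fluorescence(1200.0):.2f} g/L")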

  11. Human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-08-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (TPM) procedures: with the aim of assessing the probability of test induced failures, the probability of failures to remain unrevealed and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient: with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses these contributions on a comparative basis. The aim of this analysis was to compare the procedures, modelling techniques and quantification methods used, to obtain insight in the causes and magnitude of the variability observed in the results, to try to identify preferred human reliability assessment approaches and to get an understanding of the current state of the art in the field identifying the limitations that are still inherent to the different approaches

  12. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation (e.g. for the exponential distribution), reliability of systems, availability, preventive maintenance (replacement policies, minimal repair policy, shock models, spares, group maintenance, and periodic inspection), analysis of common cause failures, and analysis models of repair effect.

  13. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    International Nuclear Information System (INIS)

    Ibanez-Llano, Cristina; Rauzy, Antoine; Melendez, Enrique; Nieto, Francisco

    2010-01-01

    Over the last two decades, binary decision diagrams (BDDs) have been applied successfully to improve Boolean reliability models. In contrast to the classical approach based on the computation of the MCS, the BDD approach involves no approximation in the quantification of the model and is able to handle negative logic correctly. However, when models are sufficiently large and complex, as for example the ones coming from the PSA studies of the nuclear industry, it becomes infeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model has to be considered in some way to adapt the application of the BDD technology to the assessment of such models in practice. This paper proposes a reduction process based on using information provided by the set of the most relevant minimal cutsets of the model in order to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of such simplification on the final quantification results. This reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and therefore adaptable to the recent dynamic developments and extensions of the PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.
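
    The exactness claimed for the BDD approach comes from evaluating the diagram with the Shannon decomposition, P(node) = p·P(high) + (1 − p)·P(low), which needs no rare-event approximation. A hand-built toy example (the fault tree TOP = A·B + C and all probabilities are invented):

    # Exact top-event probability from a reduced, ordered BDD via the
    # Shannon decomposition. The tiny BDD below encodes TOP = (A AND B) OR C
    # with variable order A < B < C; nodes are (variable, low, high) tuples
    # and the leaves are the booleans True/False.
    C = ("C", False, True)
    B = ("B", C, True)
    TOP = ("A", C, B)

    p = {"A": 1e-3, "B": 2e-3, "C": 5e-4}   # basic event probabilities (assumed)

    def prob(node) -> float:
        if node is True:
            return 1.0
        if node is False:
            return 0.0
        var, low, high = node
        return p[var] * prob(high) + (1.0 - p[var]) * prob(low)

    print(f"P(TOP) = {prob(TOP):.6e}")
    # Check: P(A)P(B) + P(C) - P(A)P(B)P(C) = 2e-6 + 5e-4 - 1e-9 ~ 5.02e-4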

  14. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez-Llano, Cristina, E-mail: cristina.ibanez@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain); Rauzy, Antoine, E-mail: Antoine.RAUZY@3ds.co [Dassault Systemes, 10 rue Marcel Dassault CS 40501, 78946 Velizy Villacoublay, Cedex (France); Melendez, Enrique, E-mail: ema@csn.e [Consejo de Seguridad Nuclear (CSN), C/Justo Dorado 11, 28040 Madrid (Spain); Nieto, Francisco, E-mail: nieto@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain)

    2010-12-15

    Over the last two decades, binary decision diagrams (BDDs) have been applied successfully to improve Boolean reliability models. In contrast to the classical approach based on the computation of minimal cutsets (MCS), the BDD approach involves no approximation in the quantification of the model and correctly handles negative logic. However, when models are sufficiently large and complex, as for example those coming from the PSA studies of the nuclear industry, it becomes infeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model must be considered in some way to adapt the application of BDD technology to the assessment of such models in practice. This paper proposes a reduction process that uses the information provided by the set of the most relevant minimal cutsets of the model to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of such simplification on the final quantification results. The reduction is integrated in an incremental procedure that is compatible with the dynamic generation of event trees and therefore adaptable to the recent dynamic developments and extensions of PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.
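The exactness claim in the two records above comes from the way a BDD evaluates the Shannon decomposition of the top-event function rather than summing minimal-cutset probabilities. A minimal sketch of that idea, using a hypothetical three-event fault tree and made-up failure probabilities; a real BDD additionally memoizes the recursion on shared subgraphs, which this sketch omits:

```python
# Basic-event failure probabilities (hypothetical values).
P = {"A": 1e-3, "B": 2e-3, "C": 5e-4}

# Fault tree as nested tuples: ("and", ...), ("or", ...), ("not", x) or an event name.
# Note the negated event: negative logic is what defeats MCS-based approximations.
TOP = ("or", ("and", "A", "B"), ("and", "A", ("not", "C")))

VARS = sorted(P)  # fixed variable ordering, as a BDD would impose

def evaluate(node, assignment):
    """Evaluate the Boolean structure under a complete truth assignment."""
    if isinstance(node, str):
        return assignment[node]
    op, *args = node
    if op == "not":
        return not evaluate(args[0], assignment)
    vals = [evaluate(a, assignment) for a in args]
    return all(vals) if op == "and" else any(vals)

def top_event_probability(i=0, assignment=None):
    """Shannon decomposition over the variable order (the recursion a BDD
    caches on shared subgraphs), giving the exact probability with no
    rare-event approximation."""
    assignment = assignment or {}
    if i == len(VARS):
        return 1.0 if evaluate(TOP, assignment) else 0.0
    v = VARS[i]
    hi = top_event_probability(i + 1, {**assignment, v: True})
    lo = top_event_probability(i + 1, {**assignment, v: False})
    return P[v] * hi + (1 - P[v]) * lo

print(f"exact top-event probability: {top_event_probability():.6e}")
```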

  15. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR

    OpenAIRE

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen

    2017-01-01

    Meat products often consist of meat from multiple animal species, and inaccurate food product adulteration and mislabeling can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos t...
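The record above is truncated, but droplet digital PCR quantification rests on a standard Poisson correction: if a fraction p of droplets is positive, the mean number of target copies per droplet is lambda = -ln(1 - p). A minimal sketch, assuming hypothetical droplet counts and the commonly cited ~0.85 nL droplet volume (both are assumptions, not values from the paper):

```python
import math

def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
    """Concentration from droplet counts via the Poisson correction:
    lambda = -ln(1 - p) mean copies per droplet, p = positive fraction."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)
    return lam / (droplet_volume_nl * 1e-3)   # copies per microlitre

# Hypothetical duplex read-out for a beef/pork mixture (counts invented).
beef = ddpcr_copies_per_ul(2500, 15000)
pork = ddpcr_copies_per_ul(300, 15000)
print(f"beef: {beef:.0f} copies/uL, pork: {pork:.0f} copies/uL")
print(f"beef share of total target copies: {beef / (beef + pork):.1%}")
```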

  16. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    Full Text Available The majority of Internet users use the global network to search for information using full-text search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of different optimization techniques, to reach the top places in full-text search engine results. This is where Search Engine Optimization and Search Engine Marketing matter greatly, because typical users usually follow links only on the first few pages of the search results for given keywords, and in catalogs they primarily use the hierarchically higher-placed links in each category. The key to success is the application of optimization methods that deal with keywords, the structure and quality of content, domain names, individual pages, and the quantity and reliability of backward links. The process is demanding, long-lasting, and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire web site consists. If web presentation operators want an overview of their documents and of the web site globally, it is appropriate to quantify these positions in a specific way, depending on specific keywords. This is the purpose of the quantification of the competitive value of documents, which in turn yields the global competitive value of a web site. Quantification of competitive values is performed on a specific full-text search engine; for each full-text search engine the results can be, and often are, different. According to published reports by the ClickZ agency and Market Share, Google is, by number of searches by English-speaking users, the most widely used search engine, with a market share of more than 80%. The overall procedure for quantifying competitive values is common to all engines; however, the initial step, the analysis of keywords, depends on the choice of full-text search engine.

  17. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
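The "uncertainty propagation equation" mentioned above is, in first-order form, u_y^2 = sum_i (df/dx_i)^2 u_i^2 for uncorrelated inputs. A generic sketch of that step, using finite-difference derivatives and a toy symmetric-camera reconstruction w = (u1 - u2) / (2 tan(theta)) as the mapping; the values and the simplified mapping are illustrative, not the paper's full stereo-PIV equations:

```python
import numpy as np

def propagate_uncertainty(f, x, u_x, eps=1e-6):
    """First-order (Taylor) propagation for uncorrelated inputs:
    u_y^2 = sum_i (df/dx_i)^2 * u_i^2, with central-difference derivatives."""
    x = np.asarray(x, dtype=float)
    u_x = np.asarray(u_x, dtype=float)
    grads = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps * max(1.0, abs(x[i]))
        grads[i] = (f(x + step) - f(x - step)) / (2 * step[i])
    return float(np.sqrt(np.sum((grads * u_x) ** 2)))

# Toy out-of-plane reconstruction for a symmetric camera arrangement:
# w = (u1 - u2) / (2 * tan(theta)); inputs are (u1, u2, theta).
f = lambda v: (v[0] - v[1]) / (2.0 * np.tan(v[2]))
u_w = propagate_uncertainty(f,
                            x=[1.2, 0.8, np.deg2rad(30.0)],      # m/s, m/s, rad
                            u_x=[0.02, 0.02, np.deg2rad(0.5)])   # planar and angle uncertainties
print(f"propagated uncertainty in w: {u_w:.4f} m/s")
```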

  18. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    Current data infrastructure and systems supporting GHG quantification in the agricultural sector: to understand the challenges facing GHG quantification, it is helpful to understand the existing supporting infrastructure and systems for quantification. The existing and developing structures for national and local data acquisition and management are the foundation for the empirical and process-based models used by most countries and projects currently quantifying agricultural greenhouse gases. Direct measurement can be used to complement and supplement such models, but this is not yet sufficient by itself given costs, complexities, and uncertainties. One of the primary purposes of data acquisition and quantification is national-level inventories and planning. For such efforts, countries are conducting national-level collection of activity data (who is doing which agricultural practices where) and some are also developing national or regional-level emission factors. Infrastructure that supports these efforts includes intergovernmental panels, global alliances, and data-sharing networks. Multilateral data-sharing applications, such as the FAO Statistical Database (FAOSTAT) (FAO 2012), the IPCC Emission Factor Database (IPCC 2012), and the UNFCCC national inventories (UNFCCC 2012), are building greater consistency and standardization by using global standards such as the IPCC's Good Practice Guidance for Land Use, Land-Use Change and Forestry (e.g., IPCC 1996, 2003, 2006). There is also work on common quantification methods and accounting, for example agreed global warming potentials for the different contributing gases and GHG quantification methodologies for projects (e.g., the Verified Carbon Standard Sustainable Agricultural Land Management [SALM] protocol, VCS 2011). Other examples include the Global Research Alliance on Agricultural Greenhouse Gases (2012) and GRACEnet (Greenhouse gas Reduction through Agricultural Carbon Enhancement network) (USDA

  19. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    Science.gov (United States)

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for analyzing the localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, because in many cases it relies on manual counting, thus bearing the risk of introducing rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization of two cells results from random positioning, especially when the cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow is presented, spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells. This also includes a neighborhood analysis, which provides information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether the co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool suitable for testing this hypothesis for hematopoietic as well as stromal cells is used. This approach is not limited to the bone marrow and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
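The simulation tool described above tests whether observed co-localization exceeds what random cell placement would produce. A minimal sketch of that kind of Monte Carlo test, with hypothetical 2D coordinates and a made-up contact radius; the published workflow operates on segmented microscopy data, not synthetic points:

```python
import numpy as np

rng = np.random.default_rng(0)

def contact_count(a, b, radius):
    """Number of type-A cells with at least one type-B cell within `radius`."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return int(np.sum(d.min(axis=1) <= radius))

def colocalization_p_value(a, b, radius, bounds, n_sim=1000):
    """One-sided Monte Carlo test: do observed A-B contacts exceed what
    random (uniform) placement of the B cells would give?"""
    observed = contact_count(a, b, radius)
    null = [contact_count(a, rng.uniform(bounds[0], bounds[1], size=b.shape), radius)
            for _ in range(n_sim)]
    p = (1 + sum(n >= observed for n in null)) / (n_sim + 1)
    return observed, p

# Hypothetical 2D coordinates (micrometres) for two cell types.
plasma_cells = rng.uniform(0, 500, size=(40, 2))
stromal_cells = rng.uniform(0, 500, size=(120, 2))
obs, p = colocalization_p_value(plasma_cells, stromal_cells, radius=10.0, bounds=(0, 500))
print(f"A cells in contact with a B cell: {obs}, p = {p:.3f}")
```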

  20. Benchmark of systematic human action reliability procedure

    International Nuclear Information System (INIS)

    Spurgin, A.J.; Hannaman, G.W.; Moieni, P.

    1986-01-01

    Probabilistic risk assessment (PRA) methodology has emerged as one of the most promising tools for assessing the impact of human interactions on plant safety and understanding the importance of the man/machine interface. Human interactions are considered one of the key elements in the quantification of accident sequences in a PRA. The approach to quantification of human interactions in past PRAs has not been very systematic. The Electric Power Research Institute sponsored the development of SHARP to aid analysts in developing a systematic approach to the evaluation and quantification of human interactions in a PRA. The SHARP process has been extensively peer reviewed and has been adopted by the Institute of Electrical and Electronics Engineers as the basis of a draft guide for the industry. By carrying out a benchmark process in which SHARP is an essential ingredient, it appears possible to assess the strengths and weaknesses of SHARP and thereby to aid human reliability analysts in carrying out human reliability analysis as part of a PRA.

  1. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Full Text Available In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that the members of a verb aspect pair are different lexical units with different (although related) meanings, different argument structure (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases be interpreted as lexical quantifiers as well. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of the lexical quantification by means of verbal prefixes is the quantified verb phrase, and that the scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while the detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  2. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    There are also bootstrapping and cross-validation approaches. Sometimes analyses are conducted using surrogate models [12]. The availability of so many options can be confusing. Categorizing methods based on fundamental questions assists in communicating the essential results of uncertainty analyses to stakeholders. Such questions can focus on model adequacy (e.g., How well does the model reproduce observed system characteristics and dynamics?) and sensitivity analysis (e.g., What parameters can be estimated with available data? What observations are important to parameters and predictions? What parameters are important to predictions?), as well as on uncertainty quantification (e.g., How accurate and precise are the predictions?). The methods can also be classified by the number of model runs required: few (10s to 1000s) or many (10,000s to 1,000,000s). Of the methods listed above, the most computationally frugal are generally those based on local derivatives; MCMC methods tend to be among the most computationally demanding. Surrogate models (emulators) do not necessarily produce computational frugality, because many runs of the full model are generally needed to create a meaningful surrogate model. With this categorization we can, in general, address all the fundamental questions mentioned above using either computationally frugal or demanding methods. Model development and analysis can thus be conducted consistently using either computationally frugal or demanding methods; alternatively, different fundamental questions can be addressed using methods that require different levels of effort. Based on this perspective, we pose the question: Can computationally frugal methods be useful companions to computationally demanding methods? The reliability of computationally frugal methods generally depends on the model being reasonably linear, which usually means smooth nonlinearities and the assumption of Gaussian errors; both tend to be more valid with more linear models
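A concrete example of the "computationally frugal, local-derivative" end of this spectrum is a scaled-sensitivity screen of the kind used in model calibration. A sketch under simple assumptions (central differences, uncorrelated weighted observations, a toy two-parameter decay model); the function names and test values are invented for illustration:

```python
import numpy as np

def composite_scaled_sensitivities(model, params, obs_weights, rel_step=0.01):
    """Dimensionless scaled sensitivities ss_ij = (dy_i/dp_j) * p_j * sqrt(w_i),
    from central differences, summarised per parameter as a root mean square.
    A frugal screen for which parameters the observations can constrain."""
    params = np.asarray(params, dtype=float)
    y0 = np.asarray(model(params))
    ss = np.empty((y0.size, params.size))
    for j, p in enumerate(params):
        dp = rel_step * max(abs(p), 1e-12)
        up, dn = params.copy(), params.copy()
        up[j] += dp
        dn[j] -= dp
        dy_dp = (np.asarray(model(up)) - np.asarray(model(dn))) / (2.0 * dp)
        ss[:, j] = dy_dp * p * np.sqrt(obs_weights)
    return np.sqrt(np.mean(ss ** 2, axis=0))

# Toy two-parameter "environmental model": exponential decay observed 20 times.
times = np.linspace(0.5, 10.0, 20)
model = lambda p: p[0] * np.exp(-p[1] * times)
css = composite_scaled_sensitivities(model, [5.0, 0.3], np.ones_like(times))
print(dict(zip(["amplitude", "rate"], np.round(css, 3))))
```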

  3. Quantification of renal function

    International Nuclear Information System (INIS)

    Mubarak, Amani Hayder

    1999-06-01

    The evaluation of glomerular filtration rate (GFR) with Tc99m-DTPA using a single injection with the multiple-blood-sample method (plasma clearance) is a standard and reliable method, but the procedure is complicated and may not be suitable for routine clinical use. Alternatively, estimation of GFR using Tc99m-DTPA and a gamma camera computer system is very simple, does not require sampling of blood or urine, and provides individual kidney values of GFR (integral and uptake index methods)

  4. HCV-RNA quantification in liver bioptic samples and extrahepatic compartments, using the abbott RealTime HCV assay.

    Science.gov (United States)

    Antonucci, FrancescoPaolo; Cento, Valeria; Sorbo, Maria Chiara; Manuelli, Matteo Ciancio; Lenci, Ilaria; Sforza, Daniele; Di Carlo, Domenico; Milana, Martina; Manzia, Tommaso Maria; Angelico, Mario; Tisone, Giuseppe; Perno, Carlo Federico; Ceccherini-Silberstein, Francesca

    2017-08-01

    We evaluated the performance of a rapid method to quantify HCV-RNA in the hepatic and extrahepatic compartments, using for the first time the Abbott RealTime HCV assay. Non-tumoral (NT) and tumoral (TT) liver samples, lymph nodes and ascitic fluid from patients undergoing orthotopic liver transplantation (N=18) or liver resection (N=4) were used for HCV-RNA quantification; 5/22 patients were tested after or during direct-acting antiviral (DAA) treatment. Total RNA and DNA quantification from tissue biopsies allowed normalization of HCV-RNA concentrations in IU/μg of total RNA and IU/10⁶ liver cells, respectively. HCV-RNA was successfully quantified with high reliability in liver biopsies, lymph nodes and ascitic fluid samples. Among the 17 untreated patients, a positive and significant HCV-RNA correlation between serum and NT liver samples was observed (Pearson: rho=0.544, p=0.024). Three DAA-treated patients were HCV-RNA "undetectable" in serum, but still "detectable" in all tested liver tissues. Differently, only one DAA-treated patient, tested after sustained virological response, showed HCV-RNA "undetectability" in liver tissue. HCV-RNA was successfully quantified with high reliability in liver bioptic samples and extrahepatic compartments, even when HCV-RNA was "undetectable" in serum. The Abbott RealTime HCV assay is a good diagnostic tool for HCV quantification in intra- and extra-hepatic compartments, whenever a bioptic sample is available. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Quantification of informed opinion

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1985-01-01

    The objective of this session, Quantification of Informed Opinion, is to provide the statistician with a better understanding of this important area. The NRC uses informed opinion, sometimes called engineering judgment or subjective judgment, in many areas. Sometimes informed opinion is the only source of information that exists, especially in phenomenological areas, such as steam explosions, where experiments are costly and phenomena are very difficult to measure. There are many degrees of informed opinion. These vary from the weatherman who makes predictions concerning relatively high probability events with a large data base to the phenomenological expert who must use his intuition tempered with basic knowledge and little or no measured data to predict the behavior of events with a low probability of occurrence. The first paper in this session provides the reader with an overview of the subject area. The second paper provides some aspects that must be considered in the collection of informed opinion to improve the quality of the information. The final paper contains an example of the use of informed opinion in the area of seismic hazard characterization. These papers should be useful to researchers and statisticians who need to collect and use informed opinion in their work

  6. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    Full Text Available There has been a distinct shift of emphasis in clinical neurology in the last few decades. A few years ago, it was sufficient for a clinician to precisely record the history, document the signs, establish a diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnoses for many neurological disorders. Introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence-based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification has become essential in neurology, both in practice and in research methodology. While this aspect is widely acknowledged, there is limited access to a comprehensive document on measurements in neurology. The following description is a critical appraisal of various measurements and also provides certain commonly used rating scales/scores in neurological practice.

  7. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    The U.S. Army Materiel Systems Analysis Activity (AMSAA) has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.
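A core piece of AMSAA-style reliability growth tracking is the Crow/AMSAA model, which treats failures as a nonhomogeneous Poisson process with mean function N(t) = lambda * t^beta. A sketch of the standard failure-truncated point estimates, with hypothetical failure times; this illustrates the model, not the guide's full planning and projection machinery:

```python
import math

def crow_amsaa_mle(failure_times):
    """Failure-truncated MLEs for the Crow/AMSAA NHPP model N(t) = lam * t**beta.
    beta < 1 indicates reliability growth (decreasing failure intensity)."""
    t = sorted(failure_times)
    n, T = len(t), t[-1]
    beta = n / sum(math.log(T / ti) for ti in t[:-1])  # ln(T/T) = 0 is omitted
    lam = n / T ** beta
    mtbf_now = 1.0 / (lam * beta * T ** (beta - 1))    # instantaneous MTBF at T
    return beta, lam, mtbf_now

# Hypothetical cumulative failure times (hours) from a development test.
beta, lam, mtbf = crow_amsaa_mle([12, 35, 80, 160, 290, 500, 780, 1100])
print(f"beta = {beta:.2f}, lambda = {lam:.4f}, current MTBF ~ {mtbf:.0f} h")
```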

  8. The fair value of operational reliability

    International Nuclear Information System (INIS)

    Patino-Echeverri, Dalia; Morel, Benoit

    2005-01-01

    Information about the uncertainties that surround the operation of the power system can be used to enlighten the debate over how much reliability should be pursued and how resources should be allocated to pursue it. In this paper we present a method to determine the value of having flexible generators able to react to load fluctuations. This value can be seen as the value of hedging against uncertainty in the load due to the volatility of demand and the possibility of congestion. Because having this flexibility can be likened to holding a financial option, we use an extension of options theory, and in particular the risk-neutral valuation method, to find a risk-neutral quantification of its value. We illustrate our point by valuing the flexibility that provides 'operational reliability' in the PJM market. Our formula for that value is what we call 'the fair value' of operational reliability. (Author)

  9. Direct and indirect methods for the quantification of leg volume: Comparison between water displacement volumetry, the disk model method and the frustum sign model method, using the correlation coefficient and the limits of agreement

    NARCIS (Netherlands)

    D.M.K.S. Kaulesar Sukul (D. M K S); P.Th. den Hoed (Pieter); T. Johannes (Tanja); R. van Dolder (R.); E. Benda (Eric)

    1993-01-01

    Volume changes can be measured either directly by water-displacement volumetry or by various indirect methods in which calculation of the volume is based on circumference measurements. The aim of the present study was to determine the most appropriate indirect method for lower leg volume
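The frustum model named in the title treats the limb as stacked conical frusta, so each segment's volume follows from its two end circumferences: V = h * (C1^2 + C1*C2 + C2^2) / (12 * pi). A sketch with hypothetical girth measurements; the study's measurement spacing and protocol may differ:

```python
import math

def frustum_leg_volume(circumferences_cm, segment_height_cm):
    """Volume (mL) from evenly spaced circumference measurements, treating each
    slice as a conical frustum: V = h * (C1^2 + C1*C2 + C2^2) / (12 * pi)."""
    return sum(segment_height_cm * (c1 * c1 + c1 * c2 + c2 * c2) / (12.0 * math.pi)
               for c1, c2 in zip(circumferences_cm, circumferences_cm[1:]))

# Hypothetical ankle-to-knee girths (cm) measured every 4 cm.
girths = [22.0, 24.5, 28.0, 32.0, 34.5, 36.0, 36.5, 35.5]
print(f"estimated lower-leg volume: {frustum_leg_volume(girths, 4.0):.0f} mL")
```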

  10. Human factors considerations for reliability and safety

    International Nuclear Information System (INIS)

    Carnino, A.

    1985-01-01

    Human factors have become an important issue in many industries over the last few years. They should be considered during the whole lifetime of a plant: design, fabrication and construction, licensing, and operation. Improvements have been made in the field of the man-machine interface, such as procedures, control room layout, operator aids, and training. In order to meet the needs of reliability and probabilistic risk studies, quantification of human errors has been developed, but it still needs improvement in the areas of cognitive behaviour, diagnosis, and representation errors. Data banks to support these quantifications are still at a development stage. This applies to nuclear power plants, and several examples are given to illustrate the above ideas. In conclusion, the human factors field is evolving very quickly, but the tendency is still to adapt the man to the machines, whilst the reverse would be desirable

  11. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    This paper describes the results of work being undertaken to develop a Reliability Description Language (RDL), which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way. Component and system features can be stated in a formal manner and subsequently used, along with control statements, to form a structured program. The program can be compiled and executed on a general-purpose computer or a special-purpose simulator. (DG)

  12. Direct Quantification of Cd2+ in the Presence of Cu2+ by a Combination of Anodic Stripping Voltammetry Using a Bi-Film-Modified Glassy Carbon Electrode and an Artificial Neural Network.

    Science.gov (United States)

    Zhao, Guo; Wang, Hui; Liu, Gang

    2017-07-03

    In this study, a novel method based on a Bi/glassy carbon electrode (Bi/GCE) for quantitatively and directly detecting Cd2+ in the presence of Cu2+ without further electrode modification, combining square-wave anodic stripping voltammetry (SWASV) and a back-propagation artificial neural network (BP-ANN), is proposed. The influence of the Cu2+ concentration on the stripping response to Cd2+ was studied. In addition, the effect of the ferrocyanide concentration on the SWASV detection of Cd2+ in the presence of Cu2+ was investigated. A BP-ANN with two inputs and one output was used to establish the nonlinear relationship between the concentration of Cd2+ and the stripping peak currents of Cu2+ and Cd2+. The factors affecting the SWASV detection of Cd2+ and the key parameters of the BP-ANN were optimized. Moreover, the direct calibration model (i.e., adding 0.1 mM ferrocyanide before detection), the BP-ANN model and other prediction models were compared to verify their prediction performance in terms of mean absolute errors (MAEs), root mean square errors (RMSEs) and correlation coefficients. The BP-ANN model exhibited higher prediction accuracy than the direct calibration model and the other prediction models. Finally, the proposed method was used to detect Cd2+ in soil samples with satisfactory results.
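The paper's BP-ANN maps two measured inputs (the Cd and Cu stripping peak currents) to one output (the Cd2+ concentration). A minimal sketch of that regression setup using a small multilayer perceptron; the training data below are simulated with a made-up suppression term, not the paper's voltammograms:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Simulated training set (illustrative only): known Cd2+ concentrations and
# a made-up suppression of the Cd stripping peak as Cu2+ increases.
cd_true = rng.uniform(5, 100, size=200)          # ug/L
cu = rng.uniform(0, 50, size=200)                # ug/L
i_cd = 0.8 * cd_true * (1 - 0.004 * cu) + rng.normal(0, 0.5, 200)  # peak current, uA
i_cu = 0.6 * cu + rng.normal(0, 0.5, 200)                          # peak current, uA

# Two inputs (Cd and Cu peak currents), one output (Cd2+ concentration).
X = np.column_stack([i_cd, i_cu])
net = MLPRegressor(hidden_layer_sizes=(8,), activation="logistic",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(X, cd_true)

rmse = float(np.sqrt(np.mean((net.predict(X) - cd_true) ** 2)))
print(f"training RMSE: {rmse:.2f} ug/L")
```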

  13. A taxonomy for human reliability analysis

    International Nuclear Information System (INIS)

    Beattie, J.D.; Iwasa-Madge, K.M.

    1984-01-01

    A human interaction taxonomy (classification scheme) was developed to facilitate human reliability analysis in a probabilistic safety evaluation of a nuclear power plant, being performed at Ontario Hydro. A human interaction occurs, by definition, when operators or maintainers manipulate, or respond to indication from, a plant component or system. The taxonomy aids the fault tree analyst by acting as a heuristic device. It helps define the range and type of human errors to be identified in the construction of fault trees, while keeping the identification by different analysts consistent. It decreases the workload associated with preliminary quantification of the large number of identified interactions by including a category called 'simple interactions'. Fault tree analysts quantify these according to a procedure developed by a team of human reliability specialists. The interactions which do not fit into this category are called 'complex' and are quantified by the human reliability team. The taxonomy is currently being used in fault tree construction in a probabilistic safety evaluation. As far as can be determined at this early stage, the potential benefits of consistency and completeness in identifying human interactions and streamlining the initial quantification are being realized

  14. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is an essential part of obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time PCR-based DNA quantification kits are most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experimental results. In this study, we examined the degree to which DNA quantification results are altered in DNA samples containing a PCR inhibitor, using a Quantifiler® Human DNA Quantification kit. For the experiments, we prepared approximately 0.25 ng/μl DNA samples containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. Most DNA quantities were undetermined when the HA concentration was higher than 4.8 ng/μl. The CT values of an internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained for DNA samples with high IPC CT values. Thus, researchers should carefully interpret DNA quantification results. We additionally examined the effects of HA on STR amplification using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, a better understanding of the various effects of HA should help researchers recognize and manipulate samples containing HA.
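Real-time PCR kits of this kind estimate DNA quantity by inverting a log-linear standard curve, CT = slope * log10(Q) + intercept, and use the internal PCR control's CT shift to flag inhibition, which is exactly the failure mode the study provokes with humic acid. A sketch with hypothetical standard-curve and IPC values (none taken from the paper):

```python
import numpy as np

# Hypothetical dilution-series standards (ng/uL) and measured Ct values.
std_qty = np.array([50.0, 5.0, 0.5, 0.05, 0.005])
std_ct = np.array([22.1, 25.5, 28.9, 32.3, 35.6])
slope, intercept = np.polyfit(np.log10(std_qty), std_ct, 1)

def quantity_from_ct(ct):
    """Invert the log-linear standard curve Ct = slope*log10(Q) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def inhibition_suspected(ipc_ct, ipc_expected=28.5, tolerance=1.0):
    """Flag likely PCR inhibition when the internal PCR control (IPC) Ct
    drifts above its expected value, as happens with increasing humic acid."""
    return ipc_ct - ipc_expected > tolerance

sample_ct, ipc_ct = 30.1, 36.8
print(f"estimated quantity: {quantity_from_ct(sample_ct):.3f} ng/uL")
print(f"inhibition suspected: {inhibition_suspected(ipc_ct)}")
```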

  15. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Full Text Available Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was

  16. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to

  17. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
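The slope of the standard curve also yields the amplification efficiency that these three records single out as the crucial parameter: E = 10^(-1/slope) - 1, with a slope of about -3.32 corresponding to 100% efficiency. A sketch contrasting two hypothetical matrixes; the values are invented to show an inhibited, low-efficiency curve, not taken from the study:

```python
import numpy as np

def pcr_efficiency(log10_quantities, ct_values):
    """Amplification efficiency from a standard curve: E = 10**(-1/slope) - 1.
    A slope of -3.32 corresponds to E = 100% (perfect doubling per cycle)."""
    slope, _ = np.polyfit(log10_quantities, ct_values, 1)
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical curves for a reference material and a processed-food extract.
logq = np.log10([50, 5, 0.5, 0.05])
print(f"reference matrix: E = {pcr_efficiency(logq, [22.0, 25.3, 28.6, 31.9]):.1%}")
print(f"processed matrix: E = {pcr_efficiency(logq, [23.1, 26.9, 30.7, 34.5]):.1%}")
```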

  18. Results of the event sequence reliability benchmark exercise

    International Nuclear Information System (INIS)

    Silvestri, E.

    1990-01-01

    The Event Sequence Reliability Benchmark Exercise is the fourth in a series of benchmark exercises on reliability and risk assessment, with specific reference to nuclear power plant applications, and is the logical continuation of the previous benchmark exercises on system analysis, common cause failure and human factors. The reference plant is the nuclear power plant at Grohnde, Federal Republic of Germany, a 1300 MW PWR plant of KWU design. The specific objective of the exercise is to model, quantify and analyze the event sequences initiated by a loss of offsite power that involve the steam generator feed. The general aim is to develop a segment of a risk assessment which includes all the specific aspects and models of quantification, such as common cause failure, human factors and system analysis, developed in the previous reliability benchmark exercises, with the addition of the specific topics of dependences between homologous components belonging to different systems featuring in a given event sequence and of uncertainty quantification, ending with an overall assessment of the state of the art in risk assessment and of the relative influence of quantification problems in a general risk assessment framework. The exercise was carried out in two phases, both requiring modelling and quantification, with the second phase adopting more restrictive rules and fixing certain common data, as proved necessary from the first phase. Fourteen teams participated in the exercise, mostly from EEC countries, with one from Sweden and one from the USA. (author)

  19. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the following chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European Reliability Data System (ERDS) and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and at future developments. (UK)

  20. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  1. Development of a highly reliable CRT processor

    International Nuclear Information System (INIS)

    Shimizu, Tomoya; Saiki, Akira; Hirai, Kenji; Jota, Masayoshi; Fujii, Mikiya

    1996-01-01

    Although CRT processors have been employed by the main control board to reduce the operator's workload during monitoring, the control systems are still operated by hardware switches. For further advancement, direct controller operation through a display device is expected. A CRT processor providing direct controller operation must be as reliable as the hardware switches are. The authors are developing a new type of highly reliable CRT processor that enables direct controller operations. In this paper, we discuss the design principles behind a highly reliable CRT processor. The principles are defined by studies of software reliability and of the functional reliability of the monitoring and operation systems. The functional configuration of an advanced CRT processor is also addressed. (author)

  2. Reliability evaluation of smart distribution grids

    OpenAIRE

    Kazemi, Shahram

    2011-01-01

    The term "Smart Grid" generally refers to a power grid equipped with the advanced technologies dedicated for purposes such as reliability improvement, ease of control and management, integrating of distributed energy resources and electricity market operations. Improving the reliability of electric power delivered to the end users is one of the main targets of employing smart grid technologies. The smart grid investments targeted for reliability improvement can be directed toward the generati...

  3. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.

  4. A high area, porous and resistant platinized stainless steel fiber coated by nanostructured polypyrrole for direct HS-SPME of nicotine in biological samples prior to GC-FID quantification.

    Science.gov (United States)

    Abdolhosseini, Sana; Ghiasvand, Alireza; Heidari, Nahid

    2017-09-01

    The surface of a stainless steel fiber was made porous, resistant and cohesive using electrophoretic deposition, and was coated with nanostructured polypyrrole using an amended in-situ electropolymerization method. The coated fiber was applied to the direct extraction of nicotine in biological samples through a headspace solid-phase microextraction (HS-SPME) method followed by GC-FID determination. The effects of the important experimental variables on the efficiency of the developed HS-SPME-GC-FID method, including pH of the sample solution, extraction temperature and time, stirring rate, and ionic strength, were evaluated and optimized. Under the optimal experimental conditions, the calibration curve was linear over the range of 0.1-20 μg mL⁻¹ and the detection limit was 20 ng mL⁻¹. The relative standard deviation (RSD, n=6) was 7.6%. The results demonstrated the superiority of the proposed fiber compared with the most commonly used commercial types. The proposed HS-SPME-GC-FID method was successfully used for the analysis of nicotine in urine and human plasma samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Quantification of 2,5-dimethyl-4-hydroxy-3(2H)-furanone using solid-phase extraction and direct microvial insert thermal desorption gas chromatography-mass spectrometry.

    Science.gov (United States)

    Du, Xiaofen; Qian, Michael

    2008-10-24

    A GC-MS method for the determination of furaneol in fruit juice was developed using Lichrolut-EN solid-phase extraction (SPE) coupled to microvial insert thermal desorption. Lichrolut-EN can effectively extract furaneol from juice, and retained much less pigment and other non-volatiles than HLB and C18 columns. The furaneol can be completely eluted from the Lichrolut-EN SPE column with 1 mL of methanol, which can be analyzed directly on GC-MS using an automated large-volume microvial insert thermal desorption technique without further purification or concentration. The method is sensitive and has good recovery (98%) and reproducibility. The furaneol content of some commonly grown strawberry, raspberry, and blackberry cultivars in the Pacific Northwest of the United States was determined. Strawberries had the highest concentration of furaneol, with the 'Totem' and 'Pinnacle' cultivars above 13 mg kg⁻¹ fruit. 'Marion' blackberry had 5 times more furaneol than 'Black Diamond', and 16 times more than 'Thornless Evergreen' blackberry. Raspberries had furaneol concentrations ranging from 0.8 to 1.1 mg kg⁻¹ fruit.

  6. Prospective comparison of liver stiffness measurements between two point shear wave elastography methods: Virtual touch quantification and elastography point quantification

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyun Suk; Lee, Jeong Min; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo [Seoul National University Hospital, Seoul (Korea, Republic of)

    2016-09-15

    To prospectively compare the technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ² analysis. LS values measured using the two techniques were compared and correlated via Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by 95% Bland-Altman limit of agreement and intraclass correlation coefficient (ICC). The two pSWE techniques showed similar technical success rates (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Virtual touch quantification and ElastPQ showed similar technical success rates and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement.
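The 95% limit-of-agreement figures quoted above come from a Bland-Altman analysis: the bias is the mean of the paired differences and the limits are bias plus or minus 1.96 times the SD of the differences. A sketch with hypothetical paired stiffness readings, not the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods:
    mean difference +/- 1.96 * SD of the paired differences."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical paired liver-stiffness readings (m/s) from VTQ and ElastPQ.
vtq     = [1.10, 1.35, 1.62, 1.48, 2.10, 1.95, 1.30, 1.75]
elastpq = [1.05, 1.40, 1.55, 1.52, 2.02, 1.88, 1.27, 1.70]
bias, (lo, hi) = bland_altman(vtq, elastpq)
print(f"bias = {bias:+.3f} m/s, 95% LoA = [{lo:+.3f}, {hi:+.3f}] m/s")
```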

  7. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

    Garcia G, N.; Barrera D, C.E.; Ordonez R, E.

    2007-01-01

    To determine the influence of soil organic matter on uranyl sorption on some solids, a detection and quantification technique for uranyl that is reliable and sufficiently quick in obtaining results is necessary. This work therefore proposes to carry out uranyl quantification in the presence of citric acid by modifying the UV-Vis radiation-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that brings out the fluorescence of the uranyl ion while avoiding that produced by the organic acids. (Author)

  8. Post-event human decision errors: operator action tree/time reliability correlation

    Energy Technology Data Exchange (ETDEWEB)

    Hall, R E; Fragola, J; Wreathall, J

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.

  9. Post-event human decision errors: operator action tree/time reliability correlation

    International Nuclear Information System (INIS)

    Hall, R.E.; Fragola, J.; Wreathall, J.

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations
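Time-reliability correlations such as the one in the framework of records 8 and 9 express the probability of non-response as a decreasing function of the time available; a lognormal form is a common choice in human reliability analysis. A sketch with hypothetical parameters; the report's actual correlation and parameter values may differ:

```python
import math

def nonresponse_probability(t_minutes, median_minutes, sigma):
    """Lognormal time-reliability correlation: probability that the crew has
    NOT taken the correct action within the available time t,
    P(T > t) = 1 - Phi((ln t - ln median) / sigma)."""
    z = (math.log(t_minutes) - math.log(median_minutes)) / sigma
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical parameters: median diagnosis/response time 5 min, sigma = 0.8.
for t in (5, 10, 30, 60):
    print(f"t = {t:>2} min -> P(no correct action yet) = "
          f"{nonresponse_probability(t, 5.0, 0.8):.3f}")
```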

  10. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  11. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP)

  12. Quantification of heterogeneity observed in medical images

    International Nuclear Information System (INIS)

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity

  13. Quantification of heterogeneity observed in medical images.

    Science.gov (United States)

    Brooks, Frank J; Grigsby, Perry W

    2013-03-02

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.

  14. Reliability of lifeline networks under seismic hazard

    International Nuclear Information System (INIS)

    Selcuk, A. Sevtap; Yuecemen, M. Semih

    1999-01-01

    Lifelines, such as pipelines, transportation, communication and power transmission systems, are networks which extend spatially over large geographical regions. The quantification of the reliability (survival probability) of a lifeline under seismic threat requires attention, as the proper functioning of these systems during or after a destructive earthquake is vital. In this study, a lifeline is idealized as an equivalent network with the capacity of its elements being random and spatially correlated and a comprehensive probabilistic model for the assessment of the reliability of lifelines under earthquake loads is developed. The seismic hazard that the network is exposed to is described by a probability distribution derived by using the past earthquake occurrence data. The seismic hazard analysis is based on the 'classical' seismic hazard analysis model with some modifications. An efficient algorithm developed by Yoo and Deo (Yoo YB, Deo N. A comparison of algorithms for terminal pair reliability. IEEE Transactions on Reliability 1988; 37: 210-215) is utilized for the evaluation of the network reliability. This algorithm eliminates the CPU time and memory capacity problems for large networks. A comprehensive computer program, called LIFEPACK is coded in Fortran language in order to carry out the numerical computations. Two detailed case studies are presented to show the implementation of the proposed model
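
    The exact Yoo-Deo algorithm is not reproduced in the record; the quantity it computes, terminal-pair reliability, can however be illustrated with a brute-force Monte Carlo sketch on an invented five-edge network with independent element survival probabilities:

        # Monte Carlo estimate of terminal-pair reliability: the probability that
        # source and sink remain connected when each edge survives independently.
        # (The network and survival probabilities below are invented for
        # illustration; the referenced Yoo-Deo algorithm is exact, not sampled.)
        import random

        edges = {  # edge: survival probability (hypothetical lifeline segments)
            ("s", "a"): 0.95, ("s", "b"): 0.90, ("a", "b"): 0.85,
            ("a", "t"): 0.92, ("b", "t"): 0.88,
        }

        def connected(surviving, src="s", dst="t"):
            """Depth-first search over surviving edges."""
            stack, seen = [src], {src}
            while stack:
                node = stack.pop()
                if node == dst:
                    return True
                for (u, v) in surviving:
                    for nxt in ((v,) if u == node else (u,) if v == node else ()):
                        if nxt not in seen:
                            seen.add(nxt)
                            stack.append(nxt)
            return False

        random.seed(1)
        trials = 100_000
        hits = sum(
            connected([e for e, p in edges.items() if random.random() < p])
            for _ in range(trials)
        )
        print(f"terminal-pair reliability ~ {hits / trials:.4f}")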

  15. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  16. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, O.C. curves, Bayesian analysis. ... An introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. ... includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future

  17. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated......, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown....

  18. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.'' (J.B.)

  19. Prediction of software operational reliability using testing environment factor

    International Nuclear Information System (INIS)

    Jung, Hoan Sung

    1995-02-01

Software reliability is especially important to customers these days. The need to quantify the software reliability of safety-critical systems has received special attention, and reliability is rated as one of software's most important attributes. Since software is an intellectual product of human activity, and since it is logically complex, failures are inevitable. No standard models have been established to prove the correctness and to estimate the reliability of software systems by analysis and/or testing. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability with the failure data collected during testing, assuming that the test environment well represents the operational profile. The user's interest, however, is in the operational reliability rather than the test reliability. Experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment, a testing environment factor comprising an aging factor and a coverage factor is defined in this work to predict the ultimate operational reliability from the failure data, by incorporating test environments applied beyond the operational profile into the testing environment factor. Test reliability can also be estimated with this approach without any model change. The application results are close to the actual data. The approach used in this thesis is expected to be applicable to ultra-high-reliability software systems such as those used in nuclear power plants, airplanes, and other safety-critical applications

  20. Developing Reliable Life Support for Mars

    Science.gov (United States)

    Jones, Harry W.

    2017-01-01

A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human-tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and

  1. DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR

    Science.gov (United States)

    Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...

  2. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  3. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

The human factor reliability program was introduced at Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Initiatives of Excellent Performance in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to three major areas of improvement - need for improvement of the results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program includes: - tools to prevent human error; - managerial observation and coaching; - human factor analysis; - quick information about events involving the human factor; - human reliability timeline and performance indicators; - basic, periodic and extraordinary training in human factor reliability (authors)

  4. Reliability analysis of reactor pressure vessel intensity

    International Nuclear Information System (INIS)

    Zheng Liangang; Lu Yongbo

    2012-01-01

This paper performs the reliability analysis of the reactor pressure vessel (RPV) with ANSYS. The analysis methods include the direct Monte Carlo simulation method, Latin hypercube sampling, central composite design and Box-Behnken matrix design. The RPV integrity reliability under given input conditions is presented. The results show that the factors affecting the RPV base material reliability are internal pressure, allowable basic stress and elasticity modulus of the base material, in descending order, and the factors affecting the bolt reliability are allowable basic stress of the bolt material, preload of the bolt and internal pressure, in descending order. (authors)

  5. Photoacoustic-fluorescence in vitro flow cytometry for quantification of absorption, scattering and fluorescence properties of the cells

    Science.gov (United States)

    Nedosekin, D. A.; Sarimollaoglu, M.; Foster, S.; Galanzha, E. I.; Zharov, V. P.

    2013-03-01

Fluorescence flow cytometry is a well-established analytical tool that provides quantification of multiple biological parameters of cells at molecular levels, including their functional states, morphology, composition, proliferation, and protein expression. However, only the fluorescence and scattering parameters of the cells or labels are available for detection. Cell pigmentation and the presence of non-fluorescent dyes or nanoparticles cannot be reliably quantified. Herewith, we present a novel photoacoustic (PA) flow cytometry design for simple integration of absorbance measurements into the schematics of conventional in vitro flow cytometers. The integrated system allows simultaneous measurement of light absorbance, scattering, and multicolor fluorescence from single cells in flow at rates up to 2 m/s. We compared various combinations of excitation laser sources for multicolor detection, including simultaneous excitation of PA and fluorescence using a single 500 kHz pulsed nanosecond laser. The multichannel detection scheme allows simultaneous detection of up to 8 labels, including 4 fluorescent tags and 4 PA colors. The in vitro PA-fluorescence flow cytometer was used for studies of nanoparticle uptake and for the analysis of cell line pigmentation, including genetically encoded melanin expression in a breast cancer cell line. We demonstrate that this system can be used for direct nanotoxicity studies with simultaneous quantification of nanoparticle content and assessment of cell viability using conventional fluorescent apoptosis assays.

  6. Guarantee of reliability of devices complexes for plastic tube welding

    International Nuclear Information System (INIS)

    Voskresenskij, L.A.; Zajtsev, A.I.; Nelyubov, V.I.; Fedorov, M.A.

    1988-01-01

Results of calculations and experimental studies on providing the reliability of complexes for plastic tube welding are presented. The choice of reliability indices and standards is substantiated. Reliability levels of components are determined. The most loaded parts are calculated. It is shown that they meet the requirements of strength and reliability. Service life tests supported the correct choice of springs. Recommendations on improving reliability are given. Directions of further development are shown. 8 refs.; 2 figs.; 1 tab

  7. NEW MODEL FOR QUANTIFICATION OF ICT DEPENDABLE ORGANIZATIONS RESILIENCE

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2011-03-01

Full Text Available Today's business environment demands highly reliable organizations in every segment to be competitive on the global market. Besides that, the ICT sector is becoming irreplaceable in many fields of business, from communication to complex systems for process control and production. To fulfill those requirements and to develop further, many organizations worldwide are implementing a business paradigm called organizational resilience. Although resilience is a well-known term in many science fields, it is not well studied due to its complex nature. This paper deals with developing a new model for the assessment and quantification of the resilience of ICT-dependable organizations.

  8. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    Science.gov (United States)

    Cozmuta, Ioana

    2011-01-01

NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) tailorable for low/high reliability missions; b) tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) strategies combining both theoretical tools and experimental methods need to be defined. The main reason for this lecture is to give a flavor of where UQ and SE could contribute, and to express the hope that the broader community will work with us to improve in these areas.

  9. Uncertainty quantification in ion–solid interaction simulations

    Energy Technology Data Exchange (ETDEWEB)

    Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von

    2017-02-15

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but provides simultaneously a quantitative measure for which combinations of inputs have the most important impact on the result. It is applied to SDTRIM-simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition) and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
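
    A minimal one-dimensional sketch of the non-intrusive spectral idea, with a cheap analytic toy model standing in for an expensive code such as SDTRIM (the model, expansion order and quadrature size below are invented for illustration):

        # Minimal 1-D polynomial chaos sketch (non-intrusive, probabilists'
        # Hermite basis for a standard-normal input). The "simulation" f is a
        # toy stand-in for an expensive code such as SDTRIM.
        import numpy as np
        from numpy.polynomial import hermite_e as He
        from math import factorial

        f = lambda x: np.exp(0.3 * x)          # toy model response
        order, nquad = 6, 20
        x, w = He.hermegauss(nquad)            # Gauss-Hermite(e) nodes/weights
        w = w / np.sqrt(2 * np.pi)             # normalize: weights now sum to 1

        # Spectral coefficients c_n = E[f(X) He_n(X)] / n!
        coeffs = []
        for n in range(order + 1):
            Hn = He.hermeval(x, [0] * n + [1])
            coeffs.append(np.sum(w * f(x) * Hn) / factorial(n))

        mean = coeffs[0]
        var = sum(c**2 * factorial(n) for n, c in enumerate(coeffs) if n > 0)
        print(f"PCE mean ~ {mean:.5f}, variance ~ {var:.5f}")
        # Analytic check for f(x)=exp(0.3x): mean=exp(0.045),
        # variance=exp(0.09)*(exp(0.09)-1)
        print(f"exact: {np.exp(0.045):.5f}, {np.exp(0.09)*(np.exp(0.09)-1):.5f}")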

  10. DNA imaging and quantification using chemi-luminescent probes

    International Nuclear Information System (INIS)

    Dorner, G.; Redjdal, N.; Laniece, P.; Siebert, R.; Tricoire, H.; Valentin, L.

    1999-01-01

During this interdisciplinary study we have developed an ultra sensitive and reliable imaging system of DNA labelled by chemiluminescence. Based on a liquid nitrogen cooled CCD, the system achieves sensitivities down to 10 fg/mm² labelled DNA over a surface area of 25 x 25 cm² with a sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors)

  11. Equipment Reliability Program in NPP Krsko

    International Nuclear Information System (INIS)

    Skaler, F.; Djetelic, N.

    2006-01-01

Operation that is safe, reliable, effective and acceptable to the public is the common message in the mission statements of commercial nuclear power plants (NPPs). To fulfill these goals, the nuclear industry, among other areas, has to focus on: 1) Human Performance (HU) and 2) Equipment Reliability (EQ). The performance objective of HU is as follows: the behaviors of all personnel result in safe and reliable station operation. While unwanted human behaviors in operations mostly result directly in an event, behavior flaws in the areas of maintenance or engineering usually cause decreased equipment reliability. Unsatisfactory human performance has led even the best designed power plants into significant operating events, well-known examples of which can be found across the nuclear industry. Equipment reliability is today recognized as the key to success. While human performance at most NPPs has been improving since the start of WANO / INPO / IAEA evaluations, the open energy market has forced nuclear plants to reduce production costs and operate more reliably and effectively. The balance between these two (opposite) goals has made equipment reliability even more important for safe, reliable and efficient production. Nowadays, in a well-developed safety culture and human performance environment, the cost of insisting on on-line operation while ignoring some principles of safety could exceed the cost of electricity losses. In the last decade the leading USA nuclear companies put a lot of effort into improving equipment reliability at their NPP stations, primarily based on the INPO Equipment Reliability Program AP-913. The Equipment Reliability Program is the key program not only for safe and reliable operation, but also for Life Cycle Management and Aging Management on the way to nuclear power plant life extension. The purpose of the Equipment Reliability process is to identify, organize, integrate and coordinate equipment reliability activities (preventive and predictive maintenance, maintenance

  12. Quality assurance and reliability

    International Nuclear Information System (INIS)

    Normand, J.; Charon, M.

    1975-01-01

    Concern for obtaining high-quality products which will function properly when required to do so is nothing new - it is one manifestation of a conscientious attitude to work. However, the complexity and cost of equipment and the consequences of even temporary immobilization are such that it has become necessary to make special arrangements for obtaining high-quality products and examining what one has obtained. Each unit within an enterprise must examine its own work or arrange for it to be examined; a unit whose specific task is quality assurance is responsible for overall checking, but does not relieve other units of their responsibility. Quality assurance is a form of mutual assistance within an enterprise, designed to remove the causes of faults as far as possible. It begins very early in a project and continues through the ordering stage, construction, start-up trials and operation. Quality and hence reliability are the direct result of what is done at all stages of a project. They depend on constant attention to detail, for even a minor piece of poor workmanship can, in the case of an essential item of equipment, give rise to serious operational difficulties

  13. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  14. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  15. DNA imaging and quantification using chemi-luminescent probes; Imagerie et quantification d`ADN par chimiluminescence

    Energy Technology Data Exchange (ETDEWEB)

    Dorner, G; Redjdal, N; Laniece, P; Siebert, R; Tricoire, H; Valentin, L [Groupe I.P.B., Experimental Research Division, Inst. de Physique Nucleaire, Paris-11 Univ., 91 - Orsay (France)

    1999-11-01

During this interdisciplinary study we have developed an ultra sensitive and reliable imaging system of DNA labelled by chemiluminescence. Based on a liquid nitrogen cooled CCD, the system achieves sensitivities down to 10 fg/mm² labelled DNA over a surface area of 25 x 25 cm² with a sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors) 1 fig.

  16. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R² > 0.993) for the four lipids tested. The corresponding limit of detection (LOD) and limit of quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
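
    The record reports LOD and LOQ values but not how they were computed. A common convention (ICH: LOD = 3.3 sigma/S and LOQ = 10 sigma/S, with S the calibration slope and sigma the residual standard deviation) can be sketched as follows; the calibration data below are invented, not the paper's HPLC-ELSD measurements:

        # Hedged sketch: LOD/LOQ from a calibration line via the common ICH
        # convention LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where S is the slope
        # and sigma the residual standard deviation. Data below are invented.
        import numpy as np

        conc = np.array([0.1, 0.25, 0.5, 1.0, 2.0])          # mg/mL standards
        resp = np.array([12.1, 30.5, 59.8, 121.0, 239.5])    # detector response

        slope, intercept = np.polyfit(conc, resp, 1)
        residuals = resp - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)        # 2 fitted parameters

        print(f"LOD ~ {3.3 * sigma / slope:.3f} mg/mL")
        print(f"LOQ ~ {10 * sigma / slope:.3f} mg/mL")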

  17. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  18. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach

  19. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

Reliability techniques have been developed over time to meet the needs of the various engineering disciplines; nevertheless, more than a few people consider that a great deal of reliability work was done before the word was used in its current context. The military, space and nuclear industries were the first to become involved in this topic, but this quiet revolution in improving the reliability figures of products has not been confined to those environments; rather, it has extended to the whole of industry. Mass production, characteristic of modern industry, led four decades ago to a fall in the reliability of its products, on the one hand because of mass production itself and, on the other, because of newly introduced and not yet stabilized industrial techniques. Industry had to change in response to these two new requirements, creating products of medium complexity while assuring a reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. With this philosophy in view, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, supplying a unifying scientific basis for the entire subject. It consists of eight chapters plus numerous statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience, serving as a text for courses from academic to industrial. (author)

  20. French power system reliability report 2008

    International Nuclear Information System (INIS)

    Tesseron, J.M.

    2009-06-01

The reliability of the French power system was fully under control in 2008, despite the power outage in the eastern part of the Provence-Alpes-Cote d'Azur region on November 3, an event which had been dreaded for several years, since it had not been possible to set up a structurally adequate network. Pursuant to a consultation meeting, the reinforcement solution proposed by RTE was approved by the Minister of Energy, boding well for greater reliability in the future. Based on the observations presented in this 2008 report, RTE's Power System Reliability Audit Mission considers that no new recommendations are needed beyond those expressed in previous reliability reports and during reliability audits. The publication of this yearly report is in keeping with RTE's goal of tracking the evolution of reliability in its various aspects over time. RTE thus aims to contribute to the development of a reliability culture, by encouraging an improved assessment by the different players (both RTE and network users) of the role they play in building reliability, and by advocating that reliability and benchmarking be taken into account in the European organisations of Transmission System Operators. Contents: 1 - Brief overview of the evolution of the internal and external environment; 2 - Operating situations encountered: climatic conditions, supply / demand balance management, operation of interconnections, management of internal congestion, contingencies affecting the transmission facilities; 3 - Evolution of the reliability reference guide: external reference guide: directives, laws, decrees, etc, ETSO, UCTE, ENTSO-E, contracting contributing to reliability, RTE internal reference guide; 4 - Evolution of measures contributing to reliability in the equipment field: intrinsic performances of components (generating sets, protection systems, operation PLC's, instrumentation and control, automatic frequency and voltage controls, transmission facilities, control systems, load

  1. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant

  2. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective in enhancing the reliability of various circuit units. The authors provide readers with techniques for state-of-the-art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides a comprehensive review of various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  3. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

The RISP (Rare Isotope Science Project) plans to provide neutron-rich isotopes (RIs) and stable heavy ion beams. The accelerator is defined as a radiation production system according to the Nuclear Safety Law. Therefore, it needs strict operating procedures and safety assurance to prevent radiation exposure. In order to satisfy this condition, there is a need for evaluating the potential risk of the accelerator from the design stage itself. Though some PSA research has been conducted for accelerators, most of it focuses not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed by Event Tree analysis, and a new quantification methodology for the Event Tree is deduced. In this study, some initial events, which may occur in the accelerator, are selected. Using the selected initial events, the accident scenarios of the accelerator facility are developed with Event Trees. These results can be used as basic data of the accelerator for future risk assessments. After analyzing the probability of each heading, it is possible to conduct quantification and evaluate the significance of the accident result. If the accident scenarios for external events are also developed, risk assessment of the entire accelerator facility will be completed. To reduce the uncertainty of the Event Tree, it is possible to produce reliable data via the presented quantification techniques
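
    The heading-by-heading quantification described above can be illustrated with a minimal event tree sketch; the initiating-event frequency and headings below are invented, not the RAON analysis:

        # Minimal event tree quantification sketch: each path probability is
        # the initiating-event frequency times the product of the branch
        # probabilities of its headings. For simplicity this assumes heading
        # branch probabilities independent of the path taken; headings and
        # numbers are invented for illustration.
        from itertools import product

        init_freq = 1e-2                  # initiating events per year (invented)
        headings = {                      # heading: probability of SUCCESS branch
            "beam shutoff": 0.999,
            "shielding intact": 0.99,
            "area evacuated": 0.95,
        }

        for outcomes in product([True, False], repeat=len(headings)):
            p = init_freq
            for (name, p_success), ok in zip(headings.items(), outcomes):
                p *= p_success if ok else (1.0 - p_success)
            label = ", ".join(f"{n}={'ok' if ok else 'fail'}"
                              for n, ok in zip(headings, outcomes))
            print(f"{label}: {p:.3e} /yr")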

  4. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
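
    The cost-balance idea described above -- integrating system cost with outage cost to find the optimal resource adequacy -- can be sketched numerically; all figures below are invented, not Hawaii data:

        # Toy sketch of the reliability-worth balance: choose the reserve
        # margin that minimizes capacity cost plus customers' outage cost
        # (expected unserved energy times a value of lost load). All numbers
        # and the outage model are invented for illustration.
        import numpy as np

        cap_cost = 80_000.0                  # $/MW-yr of reserve capacity
        voll = 10_000.0                      # $/MWh value of lost load
        reserves = np.arange(0, 301, 10)     # candidate reserve margins, MW

        def expected_unserved(reserve_mw):
            """Invented outage model: unserved energy falls with reserve."""
            return 5_000.0 * np.exp(-reserve_mw / 60.0)   # MWh/yr

        total = cap_cost * reserves + voll * expected_unserved(reserves)
        best = reserves[int(np.argmin(total))]
        print(f"cost-minimizing reserve ~ {best} MW "
              f"(total cost ${total.min():,.0f}/yr)")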

  5. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  6. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  7. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  8. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  9. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  10. Considerations on the elements of quantifying human reliability

    International Nuclear Information System (INIS)

    Straeter, Oliver

    2004-01-01

This paper attempts to provide a contribution to the discussion of what the term 'data' means and how the qualitative perspective can be linked with the quantitative one. It argues that the terms 'quantitative data' and 'qualitative data' are not distinct but form a continuum that spans the entire spectrum of the expertise that has to be involved in the HRA process. It elaborates the rationale behind any human reliability quantification figure and suggests a scientific way forward to better data for human reliability assessment

  11. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  12. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
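
    One of the course's recurring themes, Bayesian estimation of a failure rate, reduces in the conjugate Gamma-Poisson case to a two-line update; the prior and the observed data below are invented for illustration:

        # Minimal Bayesian failure-rate sketch (conjugate Gamma prior, Poisson
        # failure counts), in the spirit of the course topics above; the prior
        # parameters and observed data are invented.
        from scipy import stats

        a0, b0 = 2.0, 1000.0       # Gamma prior: ~2 failures per 1000 h
        n_fail, hours = 3, 5000.0  # observed: 3 failures in 5000 h

        a1, b1 = a0 + n_fail, b0 + hours     # conjugate posterior update
        post = stats.gamma(a=a1, scale=1.0 / b1)
        print(f"posterior mean rate = {post.mean():.2e} /h")
        lo, hi = post.ppf(0.05), post.ppf(0.95)
        print(f"90% credible interval = ({lo:.2e}, {hi:.2e}) /h")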

  13. 77 FR 7526 - Interpretation of Protection System Reliability Standard

    Science.gov (United States)

    2012-02-13

    ... Federal Power Act (FPA) requires a Commission-certified Electric Reliability Organization (ERO) to develop.... Cir. 2009). \\8\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats... a person that is ``directly and materially affected'' by Bulk-Power System reliability may request...

  14. A reliability program approach to operational safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques is being formulated for potential application in the nuclear power industry. Methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed and a review of current nuclear risk-dominant issues conducted. The need for a reliability approach to address dependent system failures, operating and emergency procedures and human performance, and develop a plant-specific performance data base for safety decision making is demonstrated. Current research has concentrated on developing a Reliability Program approach for the operating phase of a nuclear plant's lifecycle. The approach incorporates performance monitoring and evaluation activities with dedicated tasks that integrate these activities with operation, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the Reliability Program. (orig./HP)

  15. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

One can also speak of reliability with respect to materials. While for the reliability of components the MTBF (mean time between failures) is regarded as the main criterion, with regard to materials this is replaced by possible failure mechanisms such as physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of the system. The main tasks of the reliability analysis of materials are therefore the prediction of the various failure causes, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  16. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...... of the uncertainties and their interplay is then developed, step-by-step. The concepts presented are illustrated by numerous examples throughout the text....
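
    The link between failure probability and reliability index treated in the book can be shown in its simplest form, a limit state g = R - S with independent normal resistance and load; the means and standard deviations below are invented:

        # Simplest illustration of the failure probability / reliability index
        # link: limit state g = R - S with independent normal resistance R and
        # load effect S, so that
        #   beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2),  Pf = Phi(-beta).
        # The numbers below are invented.
        from math import sqrt
        from statistics import NormalDist

        mu_R, sd_R = 30.0, 3.0     # resistance (e.g., kN)
        mu_S, sd_S = 20.0, 4.0     # load effect

        beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)
        pf = NormalDist().cdf(-beta)
        print(f"reliability index beta = {beta:.2f}, Pf = {pf:.2e}")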

  17. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

Many results in mechanical design are obtained from a model of physical reality and from a numerical solution which leads to the evaluation of needs and resources. The goal of reliability analysis is to evaluate the confidence that can be granted to the chosen design, through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS)

  18. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report not only states the facts of the Significant System Events (ESS), but also underlines the main elements dealing with the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  19. Calibration transfer of a Raman spectroscopic quantification method for the assessment of liquid detergent compositions between two at-line instruments installed at two liquid detergent production plants.

    Science.gov (United States)

    Brouckaert, D; Uyttersprot, J-S; Broeckx, W; De Beer, T

    2017-09-01

Calibration transfer of partial least squares (PLS) quantification models is established between two Raman spectrometers located at two liquid detergent production plants. As full recalibration of existing calibration models is time-consuming, labour-intensive and costly, it is investigated whether the use of mathematical correction methods requiring only a handful of standardization samples can overcome the dissimilarities in spectral response observed between both measurement systems. Univariate and multivariate standardization approaches are investigated, ranging from simple slope/bias correction (SBC), local centring (LC) and single wavelength standardization (SWS) to more complex direct standardization (DS) and piecewise direct standardization (PDS). The results of these five calibration transfer methods are compared with one another, as well as with a full recalibration. Four PLS quantification models, each predicting the concentration of one of the four main ingredients in the studied liquid detergent composition, are targeted for transfer. Accuracy profiles are established from the original and transferred quantification models for validation purposes. A reliable representation of the calibration models' performance before and after transfer is thus established, based on β-expectation tolerance intervals. For each transferred model, it is investigated whether every future measurement performed in routine use will be close enough to the unknown true value of the sample. From this validation, it is concluded that instrument standardization is successful for three out of four investigated calibration models using multivariate (DS and PDS) transfer approaches. The fourth transferred PLS model could not be validated over the investigated concentration range, due to a lack of precision of the slave instrument. Comparing these transfer results to a full recalibration on the slave instrument allows comparison of the predictive power of both Raman
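
    Of the transfer methods listed, slope/bias correction is the simplest: the master model's predictions on slave spectra are corrected by a univariate line fitted on the standardization samples. A hedged sketch with invented numbers (not the paper's detergent data):

        # Hedged sketch of slope/bias correction (SBC), the simplest of the
        # transfer methods listed above. All numbers are invented.
        import numpy as np

        # Predictions of the master PLS model applied to slave-instrument
        # spectra of the standardization samples, and reference concentrations:
        y_pred_slave = np.array([4.8, 9.1, 14.2, 18.9, 24.1])
        y_reference = np.array([5.0, 10.0, 15.0, 20.0, 25.0])

        slope, bias = np.polyfit(y_pred_slave, y_reference, 1)

        def transfer(y_pred):
            """Apply the SBC correction to new slave-side predictions."""
            return slope * y_pred + bias

        print(f"slope = {slope:.3f}, bias = {bias:.3f}")
        print("corrected:", np.round(transfer(np.array([7.0, 16.5])), 2))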

  20. When and why are reliable organizations favored?

    DEFF Research Database (Denmark)

    Ethiraj, Sendil; Yi, Sangyoon

    of the ensuing work examined only corollary implications of this observation. We treat the observation as a research question and ask: when and why are reliable organizations favored by evolutionary forces? Using a simple theoretical model, we direct attention at a minimal set of variables that are implicated...... shocks, reliable organizations can in fact outperform their less reliable counterparts if they can take advantage of the knowledge resident in their historical choices. While these results are counter-intuitive, the caveat is that our results are only an existence proof for our theory rather than...

  1. The fair value of operational reliability

    Energy Technology Data Exchange (ETDEWEB)

    Patino-Echeverri, Dalia; Morel, Benoit

    2005-12-15

Information about the uncertainties that surround the operation of the power system can be used to enlighten the debate of how much reliability should be pursued and how resources should be allocated to pursue it. In this paper we present a method to determine the value of having flexible generators to react to load fluctuations. This value can be seen as the value of hedging against uncertainty on the load due to the volatility of demand and the possibility of congestion. Because having this flexibility can be related to a financial option, we use an extension of options theory, and in particular the risk-neutral valuation method, to find a risk-neutral quantification of its value. We illustrate our point by valuing the flexibility that leads to ''operational reliability'' in the PJM market. Our formula for that value is what we call ''the fair value'' of operational reliability. (Author)
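
    The option analogy can be made concrete with the textbook risk-neutral valuation of a European call, here standing in for the value of flexible generation that pays off when load runs high; the numbers are illustrative only, and this is not the paper's PJM formula:

        # Standard risk-neutral (Black-Scholes) value of a European call, as a
        # stand-in for the value of flexibility exercised when the underlying
        # (here, loosely, load) exceeds a threshold. Illustrative numbers only.
        from math import exp, log, sqrt
        from statistics import NormalDist

        def bs_call(S, K, r, sigma, T):
            """Risk-neutral value of a European call."""
            N = NormalDist().cdf
            d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
            d2 = d1 - sigma * sqrt(T)
            return S * N(d1) - K * exp(-r * T) * N(d2)

        # e.g. expected level 100, flexible capacity triggered above 105:
        print(f"flexibility value ~ {bs_call(100, 105, 0.03, 0.25, 1.0):.2f}")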

  2. Sensitivity analysis in a structural reliability context

    International Nuclear Information System (INIS)

    Lemaitre, Paul

    2014-01-01

This thesis' subject is sensitivity analysis in a structural reliability context. The general framework is the study of a deterministic numerical model that allows one to reproduce a complex physical phenomenon. The aim of a reliability study is to estimate the failure probability of the system from the numerical model and the uncertainties of the inputs. In this context, the quantification of the impact of the uncertainty of each input parameter on the output might be of interest. This step is called sensitivity analysis. Many scientific works deal with this topic but not in the reliability scope. This thesis' aim is to test existing sensitivity analysis methods, and to propose more efficient original methods. A bibliographical review of sensitivity analysis on the one hand and of the estimation of small failure probabilities on the other hand is first proposed. This step highlights the need to develop appropriate techniques. Two variable-ranking methods are then explored. The first one proposes to make use of binary classifiers (random forests). The second one measures the departure, at each step of a subset method, between each input's original density and its density conditional on the subset reached. A more general and original methodology reflecting the impact of the input density modification on the failure probability is then explored. The proposed methods are then applied to the CWNR case, which motivates this thesis. (author)

  3. Data Used in Quantified Reliability Models

    Science.gov (United States)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

Data is the crux of developing quantitative risk and reliability models; without data there is no quantification. The means to find and identify reliability data or failure numbers to quantify fault tree models during conceptual and design phases is often the quagmire that precludes early decision makers' consideration of potential risk drivers that will influence design. The analyst tasked with addressing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have data available similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only places checked. A common philosophy is "I have a number - that is good enough". But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data was derived, and interpreting the information and details associated with the data, is as important as the data itself.

  4. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which have been outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on the understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions

  5. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

selected reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. Further, the work reported here forms a highly useful platform for future gene expression quantification in S. viridis and may also be directly translatable to other closely related and agronomically important C4 crop species.

  6. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

Full Text Available Abstract Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program.
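
    The Spearman-Brown prophecy step mentioned in the Results is a one-line formula: the reliability of the mean of k ratings, given single-rating reliability r, is k*r / (1 + (k-1)*r). A minimal sketch:

        # The Spearman-Brown prophecy formula used in the last step above: the
        # reliability of the mean of k ratings, given single-rating reliability r.
        def spearman_brown(r, k):
            return k * r / (1 + (k - 1) * r)

        print(spearman_brown(0.40, 1))             # 0.40 -- single judge
        print(round(spearman_brown(0.40, 4), 3))   # ~0.727 with four judges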

  7. Structural systems reliability analysis

    International Nuclear Information System (INIS)

    Frangopol, D.

    1975-01-01

    For an exact evaluation of the reliability of a structure it appears necessary to determine the distribution densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long activity period. Where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP)
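
    Frangopol's exact bounds are not reproduced in this abstract, but the standard first-order bounds that apply when correlation data are missing can be sketched. A minimal Python example for a series system (the failure probabilities are illustrative, and these independence/full-dependence bounds are the textbook ones, not necessarily the paper's formulation):

```python
import math

def series_failure_bounds(pf):
    """Simple bounds on the failure probability of a series system.

    Lower bound: fully (positively) dependent components -> max p_i.
    Upper bound: statistically independent components -> 1 - prod(1 - p_i).
    """
    lower = max(pf)
    upper = 1.0 - math.prod(1.0 - p for p in pf)
    return lower, upper

pf = [1e-3, 5e-4, 2e-3]  # illustrative component failure probabilities
lo, hi = series_failure_bounds(pf)
print(f"{lo:.3e} <= Pf_system <= {hi:.3e}")
```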

  8. Reliability and maintainability

    International Nuclear Information System (INIS)

    1994-01-01

    Several communications in this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by Diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by the means of functional analysis; operation characteristic analysis for a power industry plant park, as a function of influence parameters

  9. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective for the report is to improve failure data for reliability calculations as parts of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as information provided by the operation and maintenance staff of each plant. In the report are presented charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  10. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.

  11. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; reliability functions and failure rates; life distributions and reliability assumptions; the reliability of non-repairable and repairable systems; reliability sampling tests; failure analysis by FMEA and FTA, with cases; accelerated life testing, including basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies for replacement and inspection.

  12. Reliability databases: State-of-the-art and perspectives

    DEFF Research Database (Denmark)

    Akhmedjanov, Farit

    2001-01-01

    The report gives a history of development and an overview of the existing reliability databases. This overview also describes some sources of reliability and failure information other than computer databases, e.g. reliability handbooks, but the main attention is paid to standard models and software packages containing the data mentioned. The standards governing the collection and exchange of reliability data are reviewed as well. Finally, promising directions in the development of such data sources are shown.

  13. Quantification of the proliferation of arbuscular mycorrhizal fungi in soil

    Science.gov (United States)

    Zhang, Ning; Lilje, Osu; McGee, Peter

    2013-04-01

    Good soil structure is important for sustaining agricultural production and preserving functions of the soil ecosystem. Soil aggregation is a critically important component of soil structure. Stable aggregates enable water infiltration, gas exchange for biological activities of plant roots and microorganisms, living space and surfaces for soil microbes, and contribute to stabilization of organic matter and storage of organic carbon (OC) in soil. Soil aggregation involves fine roots, organic matter and hyphae of arbuscular mycorrhizal (AM) fungi. Hyphal proliferation is essential for soil aggregation and sequestration of OC in soil. We do not yet have a mechanism to directly quantify the density of hyphae in soil. Organic materials and available phosphorus are two of the major factors that influence fungi in soil. Organic materials are a source of energy for saprotrophic microbes, and fungal hyphae increase in the presence of organic matter. Phosphorus is an important element whose low availability in ecosystems limits the biological activity of microbes. AM fungi benefit plants by delivering phosphorus to the root system; however, the density and length of hyphae of AM fungi do not appear to be influenced by available phosphorus. A number of indirect methods have been used to visualize the distribution of fungi in soil, but reliable analyses are limited by soil characteristics: soils are fragile, which limits the opportunity for non-destructive analysis; the soil ecosystem is complex; and soil particles are dense, which obscures the visualization of fungal hyphae. Fungal hyphae are relatively fine, and information at the small scale is largely missing, particularly for hyphae of AM fungi. Here, hyphae were quantified in an artificial soil matrix using micro-computer aided tomography, which provides three-dimensional images of hyphal ramification through electron-lucent materials and enables the visualization and quantification of hyphae.

  14. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable in the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. The checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as is software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  15. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance
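
    The cut set quantification step that IRRAS automates can be illustrated compactly. A minimal Python sketch of the standard rare-event and min-cut upper-bound approximations (the cut sets and basic-event probabilities are invented for illustration, not taken from IRRAS):

```python
import math

# Hypothetical minimal cut sets (sets of basic events) and event probabilities
cut_sets = [{"PUMP_A", "PUMP_B"}, {"VALVE_1"}, {"PUMP_A", "POWER"}]
p = {"PUMP_A": 1e-2, "PUMP_B": 2e-2, "VALVE_1": 1e-4, "POWER": 5e-3}

cut_probs = [math.prod(p[e] for e in cs) for cs in cut_sets]

rare_event = sum(cut_probs)                            # first-order approximation
min_cut_ub = 1 - math.prod(1 - q for q in cut_probs)   # min-cut upper bound

print(f"rare-event approx: {rare_event:.3e}; min-cut upper bound: {min_cut_ub:.3e}")
```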

  16. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  17. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  18. Human reliability analysis of errors of commission: a review of methods and applications

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2007-06-15

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because this scheme provides a formalized way for identifying relatively important scenarios with EOC opportunities; (2) EOC identification guidance like CESA, which is strongly based on procedural guidance and on important measures of systems or components affected by inappropriate actions, should however pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations of advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  19. Human reliability analysis of errors of commission: a review of methods and applications

    International Nuclear Information System (INIS)

    Reer, B.

    2007-06-01

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because this scheme provides a formalized way for identifying relatively important scenarios with EOC opportunities; (2) EOC identification guidance like CESA, which is strongly based on procedural guidance and on important measures of systems or components affected by inappropriate actions, should however pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations of advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  20. Sensitive Quantification of Aflatoxin B1 in Animal Feeds, Corn Feed Grain, and Yellow Corn Meal Using Immunomagnetic Bead-Based Recovery and Real-Time Immunoquantitative-PCR

    Directory of Open Access Journals (Sweden)

    Dinesh Babu

    2014-12-01

    Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole kernel corn feed grains, and retail yellow ground corn meal. Upon testing methanol/water (60:40) extractions of the above samples using competitive direct enzyme linked immunosorbent assay, the aflatoxin content was found to be <20 μg/kg. The RT iq-PCR assay exhibited a high antigen hook effect in samples containing aflatoxin levels higher than the quantification limits (0.1–10 μg/kg), addressed by comparing the quantification results of undiluted and diluted extracts. In testing the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup.

  1. Sensitive quantification of aflatoxin B1 in animal feeds, corn feed grain, and yellow corn meal using immunomagnetic bead-based recovery and real-time immunoquantitative-PCR.

    Science.gov (United States)

    Babu, Dinesh; Muriana, Peter M

    2014-12-02

    Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole kernel corn feed grains, and retail yellow ground corn meal. Upon testing methanol/water (60:40) extractions of the above samples using competitive direct enzyme linked immunosorbent assay, the aflatoxin content was found to be <20 μg/kg. The RT iq-PCR assay exhibited a high antigen hook effect in samples containing aflatoxin levels higher than the quantification limits (0.1-10 μg/kg), addressed by comparing the quantification results of undiluted and diluted extracts. In testing the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup.

  2. Wireless, intraoral hybrid electronics for real-time quantification of sodium intake toward hypertension management.

    Science.gov (United States)

    Lee, Yongkuk; Howe, Connor; Mishra, Saswat; Lee, Dong Sup; Mahmood, Musa; Piper, Matthew; Kim, Youngbin; Tieu, Katie; Byun, Hun-Soo; Coffey, James P; Shayan, Mahdis; Chun, Youngjae; Costanzo, Richard M; Yeo, Woon-Hong

    2018-05-22

    Recent wearable devices offer portable monitoring of biopotentials, heart rate, or physical activity, allowing for active management of human health and wellness. Such systems can be inserted in the oral cavity for measuring food intake in regard to controlling eating behavior, directly related to diseases such as hypertension, diabetes, and obesity. However, existing devices using plastic circuit boards and rigid sensors are not ideal for oral insertion. A user-comfortable system for the oral cavity requires an ultrathin, low-profile, and soft electronic platform along with miniaturized sensors. Here, we introduce a stretchable hybrid electronic system that has an exceptionally small form factor, enabling a long-range wireless monitoring of sodium intake. Computational study of flexible mechanics and soft materials provides fundamental aspects of key design factors for a tissue-friendly configuration, incorporating a stretchable circuit and sensor. Analytical calculation and experimental study enables reliable wireless circuitry that accommodates dynamic mechanical stress. Systematic in vitro modeling characterizes the functionality of a sodium sensor in the electronics. In vivo demonstration with human subjects captures the device feasibility for real-time quantification of sodium intake, which can be used to manage hypertension.

  3. Importance of independent and dependent human error to system reliability and plant safety

    International Nuclear Information System (INIS)

    Dach, K.

    1988-08-01

    An uncertainty analysis of the quantification of the unavailability of the emergency core cooling system was performed. A reliability analysis of the low pressure injection system (LPIS) of the ECCS of the WWER-440 reactor was also carried out. The results proved that LPIS reliability under normal conditions is sufficient and can be increased by two orders of magnitude. This increase in reliability can be achieved by means of simple changes, such as securing the opening of the quick-acting fittings at the LPIS discharge line. A method for analyzing the uncertainty of systems with periodically inspected components was elaborated and verified by analyzing a medium-size system. Refs, figs and tabs

  4. Reliability of reference distances used in photogrammetry.

    Science.gov (United States)

    Aksu, Muge; Kaya, Demet; Kocadereli, Ilken

    2010-07-01

    To determine the reliability of the reference distances used for photogrammetric assessment. The sample consisted of 100 subjects with a mean age of 22.97 +/- 2.98 years. Five lateral and four frontal parameters were measured directly on the subjects' faces. For photogrammetric assessment, two reference distances for the profile view and three reference distances for the frontal view were established. Standardized photographs were taken, and all the parameters that had been measured directly on the face were measured on the photographs. The reliability of the reference distances was checked by comparing the direct and indirect values of the parameters obtained from the subjects' faces and photographs. Repeated-measures analysis of variance (ANOVA) and Bland-Altman analyses were used for statistical assessment. For profile measurements, the indirect values were statistically different from the direct values except for Sn-Sto in male subjects and Prn-Sn and Sn-Sto in female subjects. The indirect values of Prn-Sn and Sn-Sto were reliable in both sexes. The poorest results were obtained for the indirect values of the N-Sn parameter in female subjects and the Sn-Me parameter in male subjects according to the Sa-Sba reference distance. For frontal measurements, the indirect values were statistically different from the direct values in both sexes except for one parameter in male subjects: the indirect values were not statistically different from the direct values for Go-Go. The indirect values of Ch-Ch were reliable in male subjects. The poorest results were obtained according to the P-P reference distance. For profile assessment, the T-Ex reference distance was reliable for Prn-Sn and Sn-Sto in both sexes. For frontal assessment, the Ex-Ex and En-En reference distances were reliable for Ch-Ch in male subjects.
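
    Agreement checks of this kind are straightforward to reproduce. A minimal Python sketch of Bland-Altman limits of agreement (the paired measurements are invented for illustration):

```python
import numpy as np

# Hypothetical paired measurements: direct (face) vs indirect (photograph), in mm
direct = np.array([31.2, 28.5, 30.1, 29.8, 32.0, 27.9])
indirect = np.array([30.8, 28.9, 29.6, 30.2, 31.5, 28.4])

diff = indirect - direct
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(f"bias = {bias:.2f} mm, limits of agreement = [{loa[0]:.2f}, {loa[1]:.2f}] mm")
```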

  5. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, the Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than ... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use ...

  6. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and a basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields a system reliability matrix, and the reliability of the network system is deduced by integrating all of the reliability indexes in the matrix. Using this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and for multi-path task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of SpaceWire network systems.
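
    The task-based scheme described can be sketched in miniature. A Python illustration of path and task reliability under an independence assumption (the component names, reliability values, and the independence of redundant paths are all simplifying assumptions, not details from the paper):

```python
import math

def path_reliability(components, r):
    """Series reliability of one path: product of its component reliabilities."""
    return math.prod(r[c] for c in components)

def task_reliability(paths, r):
    """Task succeeds if at least one path works (paths treated as independent;
    shared end nodes make this a first-cut approximation)."""
    return 1 - math.prod(1 - path_reliability(p, r) for p in paths)

# Hypothetical component reliabilities over one mission phase
r = {"nodeA": 0.999, "routerX": 0.995, "routerY": 0.995, "nodeB": 0.999}

basic = [["nodeA", "routerX", "nodeB"]]
redundant = [["nodeA", "routerX", "nodeB"], ["nodeA", "routerY", "nodeB"]]

print(f"basic:     {task_reliability(basic, r):.6f}")
print(f"redundant: {task_reliability(redundant, r):.6f}")
```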

  7. Integrating the Carbon and Water Footprints’ Costs in the Water Framework Directive 2000/60/EC Full Water Cost Recovery Concept: Basic Principles Towards Their Reliable Calculation and Socially Just Allocation

    Directory of Open Access Journals (Sweden)

    Anastasia Papadopoulou

    2012-01-01

    This paper presents the basic principles for integrating the water and carbon footprint costs into the resource and environmental costs, respectively, taking the suggestions set by the Water Framework Directive (WFD) 2000/60/EC one step forward. The WFD states that full water cost recovery (FWCR) should be based on the estimation of the three related sub-costs: direct, environmental, and resource costs. It also strongly suggests that EU Member States develop and apply effective water pricing policies to achieve FWCR. These policies must be socially just to avoid social injustice phenomena, a delicate task to handle, especially within the fragile economic conditions that the EU is facing today. Water losses play a crucial role in FWC estimation and should not be neglected, since they are one of the major "water uses" in any water supply network. A methodology is suggested to reduce water losses and the related Non-Revenue Water (NRW) index, and an Expert Decision Support System is proposed to assess the FWC incorporating the water and carbon footprint costs.
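
    The NRW index mentioned above follows from a simple water balance. A minimal Python sketch (the volumes are invented, and this is the common water-balance definition of NRW rather than anything specific to the paper):

```python
def non_revenue_water(system_input_m3: float, billed_consumption_m3: float) -> float:
    """NRW index: fraction of the system input volume that produces no revenue."""
    return (system_input_m3 - billed_consumption_m3) / system_input_m3

# Hypothetical annual volumes for a small supply network
print(f"NRW = {non_revenue_water(1_200_000, 870_000):.1%}")
```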

  8. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided, as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs, in devising a suitable cost-effective policy.

  9. A novel quantification strategy of transferrin and albumin in human serum by species-unspecific isotope dilution laser ablation inductively coupled plasma mass spectrometry (ICP-MS)

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Liuxing, E-mail: fenglx@nim.ac.cn; Zhang, Dan; Wang, Jun; Shen, Dairui; Li, Hongmei

    2015-07-16

    process. Moreover, the application of species-unspecific isotope dilution GE-LA-ICP-MS has the potential to offer reliable, direct and simultaneous quantification of proteins after conventional 1D and 2D gel electrophoretic separations.
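
    Although the abstract is truncated, the isotope dilution principle it relies on can be sketched generically. A minimal Python version of the classical two-isotope IDMS equation (the symbols, abundances, and amounts are generic illustrations, not the paper's calibration for any specific element):

```python
def idms_amount(n_spike, a_spike, b_spike, a_sample, b_sample, r_blend):
    """Amount of analyte element in the sample (same units as n_spike).

    a and b are two isotopes of the element; a_*/b_* are their abundances
    in the spike and the sample, and r_blend is the measured a/b ratio in
    the spiked (blended) sample. Derived by solving
        r_blend = (nx*a_sample + ns*a_spike) / (nx*b_sample + ns*b_spike)
    for nx.
    """
    return n_spike * (a_spike - r_blend * b_spike) / (r_blend * b_sample - a_sample)

# Generic illustration: natural sample rich in isotope a, spike enriched in b
nx = idms_amount(n_spike=1.0e-6, a_spike=0.02, b_spike=0.98,
                 a_sample=0.95, b_sample=0.05, r_blend=0.60)
print(f"analyte amount ≈ {nx:.3e} mol")
```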

  10. A novel quantification strategy of transferrin and albumin in human serum by species-unspecific isotope dilution laser ablation inductively coupled plasma mass spectrometry (ICP-MS)

    International Nuclear Information System (INIS)

    Feng, Liuxing; Zhang, Dan; Wang, Jun; Shen, Dairui; Li, Hongmei

    2015-01-01

    application of species-unspecific isotope dilution GE-LA-ICP-MS has the potential to offer reliable, direct and simultaneous quantification of proteins after conventional 1D and 2D gel electrophoretic separations

  11. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  12. Issues in cognitive reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Hitchler, M.J.; Rumancik, J.A.

    1984-01-01

    This chapter examines some problems in current methods to assess reactor operator reliability at cognitive tasks and discusses new approaches to solve these problems. The two types of human failures are errors in the execution of an intention and errors in the formation/selection of an intention. Topics considered include the types of description, error correction, cognitive performance and response time, the speed-accuracy tradeoff function, function based task analysis, and cognitive task analysis. One problem of human reliability analysis (HRA) techniques in general is the question of what are the units of behavior whose reliability are to be determined. A second problem for HRA is that people often detect and correct their errors. The use of function based analysis, which maps the problem space for plant control, is recommended

  13. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupling with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
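
    A single node of the kind of Bayesian updating described can be illustrated with a conjugate Beta-Binomial sketch in Python (the prior and failure counts are invented; the paper's actual models are multilevel Bayesian networks, not this single-parameter case):

```python
from scipy import stats

# Expert opinion encoded as a Beta prior on a conditional failure probability
alpha0, beta0 = 1.0, 49.0            # prior mean = 0.02

failures, demands = 3, 120           # hypothetical observed failure data

alpha1 = alpha0 + failures
beta1 = beta0 + (demands - failures)

mean = alpha1 / (alpha1 + beta1)
lo, hi = stats.beta.ppf([0.05, 0.95], alpha1, beta1)
print(f"posterior mean = {mean:.4f}, 90% interval = [{lo:.4f}, {hi:.4f}]")
```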

  14. Reliability issues in PACS

    Science.gov (United States)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment, where any system downtime may seriously affect patient care. The authors report on several classes of errors encountered during the pre-clinical release of the PACS over the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitoring routines for critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  15. Reliability Parts Derating Guidelines

    Science.gov (United States)

    1982-06-01

    "Reliability of GaAs Injection Lasers", De Loach, B. C., Jr., 1973 IEEE/OSA Conference on Laser Engineering and Applications; Vol. R-23, No. 4, 226-30, October 1974. ... mounted on a 4-inch square, 0.250-inch thick alloy aluminum panel; this mounting technique should be taken into consideration.

  16. Improved quantification of alite and belite in anhydrous Portland cements by 29Si MAS NMR: Effects of paramagnetic ions

    DEFF Research Database (Denmark)

    Poulsen, Søren Lundsted; Kocaba, Vanessa; Le Saoût, Gwenn

    2009-01-01

    The applicability, reliability, and repeatability of 29Si MAS NMR for determination of the quantities of alite (Ca3SiO5) and belite (Ca2SiO4) in anhydrous Portland cement was investigated in detail for 11 commercial Portland cements and the results compared with phase quantifications based...

  17. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    Science.gov (United States)

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures by simulating known 3D surgical changes within CMFapp. Results Our results show that SPHARM-PDM analysis accurately measures surgical displacements when compared with known displacement values. Visualizations of color maps of virtually simulated surgical displacements show corresponding surface distances that precisely locate the changes, and difference vectors indicate the directionality and magnitude of the changes. Conclusions SPHARM-PDM-based quantification of surgical outcome is feasible. Compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control, and enhance the follow-up and documentation of clinical cases. PMID:21161693

  18. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

    Intra-cellular and inter-cellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. A manual densitometric analysis is time-consuming, subjective and error-prone. An automated quantification is faster, more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated in comparison to an established manual method. Firstly, regions of interest (membrane fragments) are identified in confocal microscopy images. Further, densitometric intensity profiles are extracted orthogonally to membrane fragments, following the direction from the plasma membrane to cytoplasm. Finally, several different quantitative descriptors were derived from the densitometric profiles and were compared regarding their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested using several independent experimental datasets. A fully automated workflow for the information extraction and statistical evaluation has been developed and produces robust results. New descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used in previous research publications for the translocation quantification. The slow manual calculation can be substituted by the fast and unbiased automated method.
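
    One way to turn the densitometric profiles described into a scalar descriptor can be sketched simply. A minimal numpy example computing the centroid of a membrane-to-cytoplasm intensity profile as a translocation indicator (the profile values are invented, and the centroid is a generic descriptor choice, not necessarily one of the paper's):

```python
import numpy as np

# Hypothetical densitometric profile sampled from the plasma membrane (index 0)
# toward the cytoplasm; larger centroid values indicate intracellular accumulation.
profile = np.array([5, 12, 48, 90, 60, 25, 14, 9, 7, 6], dtype=float)

positions = np.arange(profile.size)
centroid = (positions * profile).sum() / profile.sum()

print(f"profile centroid = {centroid:.2f} pixels from the membrane")
```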

  19. Cross recurrence quantification for cover song identification

    Energy Technology Data Exchange (ETDEWEB)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G [Department of Information and Communication Technologies, Universitat Pompeu Fabra, Roc Boronat 138, 08018 Barcelona (Spain)], E-mail: joan.serraj@upf.edu

    2009-09-15

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.

  20. Cross recurrence quantification for cover song identification

    International Nuclear Information System (INIS)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G

    2009-01-01

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
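
    The cross recurrence plot construction underlying both versions of this record is straightforward to sketch. A minimal numpy implementation using delay embedding and a fixed distance threshold (the embedding dimension, delay, and threshold are illustrative choices; the authors' curved-trace tracking measure itself is not reproduced here):

```python
import numpy as np

def embed(x, dim=3, tau=2):
    """Delay embedding of a 1-D series into state-space vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def cross_recurrence_plot(x, y, eps, dim=3, tau=2):
    """CRP[i, j] = 1 where the embedded states of x and y are closer than eps."""
    X, Y = embed(x, dim, tau), embed(y, dim, tau)
    dists = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return (dists < eps).astype(int)

t = np.linspace(0, 20, 500)
crp = cross_recurrence_plot(np.sin(t), np.sin(1.05 * t + 0.3), eps=0.5)
print(crp.shape, crp.mean())  # matrix size and recurrence rate
```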

  1. Tributyltin--critical pollutant in whole water samples--development of traceable measurement methods for monitoring under the European Water Framework Directive (WFD) 2000/60/EC.

    Science.gov (United States)

    Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert

    2015-07-01

    Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input into the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. By developing such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of the project aims and potential analytical tools is given.

  2. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted upon activation of luminescent luciferase. The photons measured in this method indicate the degree of molecular alteration or the number of cells, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method that yields a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light sources: three bacterial light-emitting sources containing different numbers of bacteria, and three different non-bacterial sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to the measurement time presented a constant value even though different light sources were applied. The quantification function for linear response could be applicable to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment exhibiting a linear response of constant light-emitting sources to measurement time.
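
    The linearity check at the heart of the proposed method amounts to regressing counts on measurement time. A minimal numpy sketch (the count data are invented):

```python
import numpy as np

# Hypothetical cumulative photon counts at increasing measurement times (s)
t = np.array([10, 20, 40, 80, 160], dtype=float)
counts = np.array([1050, 2080, 4180, 8300, 16700], dtype=float)

slope, intercept = np.polyfit(t, counts, 1)   # linear fit: counts ~ slope * t
rate = counts / t                             # ~constant for a steady source

print(f"flux estimate = {slope:.1f} counts/s, per-point rates = {np.round(rate, 1)}")
```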

  3. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating software reliability for nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), where the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects based on a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is identified that this method is capable of accurately estimating the remaining number of software defects of the on-demand type that directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed.
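
    A common NHPP-based SRGM of the kind described is the Goel-Okumoto model. A minimal Python sketch fitting its mean value function to cumulative failure counts (the failure data are invented, and a simple least-squares fit stands in for the paper's Bayesian inference):

```python
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative failures by time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical test times (days) and cumulative failures observed
t = np.array([5, 10, 20, 40, 60, 90], dtype=float)
n = np.array([4, 7, 12, 17, 19, 21], dtype=float)

(a, b), _ = curve_fit(mean_value, t, n, p0=(25.0, 0.02))
remaining = a - mean_value(t[-1], a, b)   # expected residual defects after testing

print(f"a = {a:.1f} total defects, b = {b:.4f}/day, remaining ≈ {remaining:.1f}")
```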

  4. Solder joint technology materials, properties, and reliability

    CERN Document Server

    Tu, King-Ning

    2007-01-01

    Solder joints are ubiquitous in electronic consumer products. The European Union has a directive to ban the use of Pb-based solders in these products on July 1st, 2006. There is an urgent need for an increase in the research and development of Pb-free solders in electronic manufacturing. For example, spontaneous Sn whisker growth and electromigration induced failure in solder joints are serious issues. These reliability issues are quite complicated due to the combined effect of electrical, mechanical, chemical, and thermal forces on solder joints. To improve solder joint reliability, the science of solder joint behavior under various driving forces must be understood. In this book, the advanced materials reliability issues related to copper-tin reaction and electromigration in solder joints are emphasized and methods to prevent these reliability problems are discussed.

  5. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  6. Calculations of mechanisms for balance control during narrow and single-leg standing in fit older adults: A reliability study.

    Science.gov (United States)

    Aberg, A C; Thorstensson, A; Tarassova, O; Halvorsen, K

    2011-07-01

    For older people, balance control in standing is critical for performing activities of daily living without falling. The aims were to investigate the reliability of quantifying the usage of the two balance mechanisms M(1) 'moving the centre of pressure' and M(2) 'segment acceleration', and to compare calculation methods based on a combination of kinetic (K) and kinematic (Km) data (K-Km), or on Km data only, concerning M(2). For this purpose nine physically fit persons aged 70-78 years were tested in narrow and single-leg standing. Data were collected by a 7-camera motion capture system and two force plates. Repeated-measures ANOVA and Tukey's post hoc tests were used to detect differences between the standing tasks. Reliability was estimated by ICCs, the standard error of measurement including its 95% CI, and the minimal detectable change, whereas Pearson's correlation coefficient was used to investigate agreement between the two calculation methods. The results indicated that, for the tasks investigated, M(1) and M(2) can be measured with acceptable inter- and intra-session reliability, and that both Km- and K-Km-based calculations may be useful for M(2), although Km data may give slightly lower values. The proportional M(1):M(2) usage was approximately 9:1 in both the anterio-posterior (AP) and medio-lateral (ML) directions for narrow standing, and about 2:1 in the AP and 1:2 in the ML direction in single-leg standing. In conclusion, the tested measurements and calculations appear to constitute a reliable way of quantifying one important aspect of balance capacity in fit older people. Copyright © 2011 Elsevier B.V. All rights reserved.
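
    The reliability statistics named here follow standard formulas. A minimal Python sketch relating the ICC to the standard error of measurement and the minimal detectable change (the ICC and SD inputs are invented):

```python
import math

def sem(sd_between: float, icc: float) -> float:
    """Standard error of measurement from the between-subject SD and the ICC."""
    return sd_between * math.sqrt(1.0 - icc)

def mdc95(sem_value: float) -> float:
    """Minimal detectable change at 95% confidence for a test-retest design."""
    return 1.96 * math.sqrt(2.0) * sem_value

s = sem(sd_between=4.2, icc=0.85)   # hypothetical units of the balance measure
print(f"SEM = {s:.2f}, MDC95 = {mdc95(s):.2f}")
```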

  7. A performance study on three qPCR quantification kits and their compatibilities with the 6-dye DNA profiling systems.

    Science.gov (United States)

    Lin, Sze-Wah; Li, Christina; Ip, Stephen C Y

    2018-03-01

    DNA quantification plays an integral role in forensic DNA profiling. Not only does it estimate the total amount of amplifiable human autosomal and male DNA to ensure optimal amplification of target DNA for subsequent analysis, but it also assesses the extraction efficiency and purity of the DNA extract. The latest DNA quantification systems even offer an estimate of the degree of DNA degradation in a sample. Here, we report the performance of three new-generation qPCR kits, namely the Investigator ® Quantiplex HYres Kit from QIAGEN, the Quantifiler ® Trio DNA Quantification Kit from Applied Biosystems™, and the PowerQuant ® System from Promega, and their compatibilities with three 6-dye DNA profiling systems. Our results demonstrate that all three kits generate standard curves with satisfactory consistency and reproducibility, and are capable of screening out traces of male DNA in the presence of a 30-fold excess of female DNA. They also exhibit a higher tolerance to PCR inhibition than the Quantifiler ® Human DNA Quantification Kit from Applied Biosystems™ in autosomal DNA quantification. PowerQuant ® , compared to Quantiplex HYres and Quantifiler ® Trio, shows better precision for both autosomal and male DNA quantification. Quantifiler ® Trio and PowerQuant ® , in contrast to Quantiplex HYres, offer better correlations with lower discrepancies between autosomal and male DNA quantification, and their additional degradation index features provide a detection platform for inhibited and/or degraded DNA templates. Regarding the compatibility between these quantification and profiling systems: (1) both Quantifiler ® Trio and PowerQuant ® work well with GlobalFiler and Fusion 6C, allowing a fairly accurate prediction of their DNA typing results based on the quantification values; (2) Quantiplex HYres offers a fairly reliable IPC system for detecting any potential inhibitions on Investigator 24plex, whereas Quantifiler ® Trio and PowerQuant ® suit better for Global

  8. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2004-03-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)
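
    The SSSB technique itself is not detailed in the abstract. As a generic stand-in, a minimal Python sketch of an exact one-sided bound for a rare-event probability estimated from a (sub-)sample, here the zero-failure "rule of three" case (a textbook bound, not the SSSB method):

```python
def upper_bound_zero_events(n: int, confidence: float = 0.95) -> float:
    """Exact one-sided upper bound on p when 0 events are seen in n trials.

    Solves (1 - p)^n = 1 - confidence; at 95% this is the 'rule of three' (~3/n).
    """
    return 1.0 - (1.0 - confidence) ** (1.0 / n)

print(f"0 EOCs in 180 sequences -> p <= {upper_bound_zero_events(180):.4f} (95%)")
```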

  9. Working group of experts on rare events in human error analysis and quantification

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system

  10. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    International Nuclear Information System (INIS)

    Reer, B.

    2004-01-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  11. Columbus safety and reliability

    Science.gov (United States)

    Longhurst, F.; Wessels, H.

    1988-10-01

    Analyses carried out to ensure Columbus reliability, availability, and maintainability, and operational and design safety are summarized. Failure modes, effects and criticality analysis is the main qualitative tool used. The main aspects studied are fault tolerance, hazard consequence control, risk minimization, human error effects, restorability, and safe-life design.

  12. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination, or the reproducibility of results is very poor. On the other hand, a defect can be detected on each of several subsequent examinations, giving high reliability, while the reproducibility of the results remains poor.

  13. Power transformer reliability modelling

    NARCIS (Netherlands)

    Schijndel, van A.

    2010-01-01

    Problem description Electrical power grids serve to transport and distribute electrical power with high reliability and availability at acceptable costs and risks. These grids play a crucial though preferably invisible role in supplying sufficient power in a convenient form. Today’s society has

  14. Designing reliability into accelerators

    International Nuclear Information System (INIS)

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis

  15. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

    In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability testing began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been highly appreciated internationally. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, considering the periods of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to a leading position in the world after having overcome its defects. The significance of the reliability tests is to secure functioning until the age limit is reached, to confirm correct forecasting of the deterioration process, to confirm the effectiveness of remedies for defects, and to confirm the accuracy of predictions of the behavior of facilities. The reliability of nuclear valves, fuel assemblies, the heat-affected zones in welding, reactor cooling pumps and electric instruments has been tested or is being tested. (Kako, I.)

  16. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  17. Reliability of Plastic Slabs

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    1989-01-01

    In the paper it is shown how upper and lower bounds for the reliability of plastic slabs can be determined. For the fundamental case it is shown that optimal bounds of a deterministic and a stochastic analysis are obtained on the basis of the same failure mechanisms and the same stress fields....

  18. Reliability based structural design

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.

    2014-01-01

    According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A

  19. Travel time reliability modeling.

    Science.gov (United States)

    2011-07-01

    This report includes the following three papers: 1. Guo F., Rakha H., and Park S. (2010), "A Multi-state Travel Time Reliability Model," Transportation Research Record: Journal of the Transportation Research Board, n 2188, pp. 46-54. 2. Park S., ...

  20. Reliability and Model Fit

    Science.gov (United States)

    Stanley, Leanne M.; Edwards, Michael C.

    2016-01-01

    The purpose of this article is to highlight the distinction between the reliability of test scores and the fit of psychometric measurement models, reminding readers why it is important to consider both when evaluating whether test scores are valid for a proposed interpretation and/or use. It is often the case that an investigator judges both the…

  1. Parametric Mass Reliability Study

    Science.gov (United States)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.

  2. FRANX. Application for analysis and quantification of the APS fire; FRANX. Aplicacion para el analisis y cuantificacion de los APS de incendios

    Energy Technology Data Exchange (ETDEWEB)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-07-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantifying and updating fire PSAs (the tool also covers floods and earthquakes). With this application, fire scenarios are quantified at the plant, integrating the tasks performed during the fire PSA. This paper describes the main features of the program that enable quantification of a fire PSA. (Author)

  3. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability block diagrams (RBD). Keywords: compressor system, reliability, reliability block diagram, RBD. ... the same structure has been kept with the three subsystems: air flow, oil flow and ...

  4. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  5. Comparison of the direct enzyme assay method with the membrane ...

    African Journals Online (AJOL)

    Comparison of the direct enzyme assay method with the membrane filtration technique in the quantification and monitoring of microbial indicator organisms – seasonal variations in the activities of coliforms and E. coli, temperature and pH.

  6. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    Science.gov (United States)

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429

  7. Development of a methodology for conducting an integrated HRA/PRA -- Task 1: An assessment of human reliability influences during LP&S conditions in PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States)]; Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States)]; Cooper, S.E. [Science Applications International Corp., McLean, VA (United States)]

    1993-06-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  8. Quantification in histopathology-Can magnetic particles help?

    International Nuclear Information System (INIS)

    Mitchels, John; Hawkins, Peter; Luxton, Richard; Rhodes, Anthony

    2007-01-01

    Every year, more than 270,000 people are diagnosed with cancer in the UK alone, and one in three people worldwide contract cancer within their lifetime. Histopathology is the principal method for confirming cancer and directing treatment. In this paper, a novel application of magnetic particles is proposed to help address the problem of subjectivity in histopathology. Preliminary results indicate that magnetic nanoparticles can not only be used to assist diagnosis by improving quantification but also potentially increase throughput, hence offering a way of dramatically reducing costs within the routine histopathology laboratory.

  9. Human reliability assessors guide: an overview

    International Nuclear Information System (INIS)

    Humphreys, P.

    1988-01-01

    The Human Reliability Assessors Guide provides a review of techniques currently available for the quantification of Human Error Probabilities. The Guide has two main objectives. The first is to provide a clear and comprehensive description of eight major techniques which can be used to assess human reliability. This is supplemented by case studies taken from practical applications of each technique to industrial problems. The second objective is to provide practical guidelines for the selection of techniques. The selection process is aided by reference to a set of criteria against which each of the eight techniques has been evaluated. Utilising the criteria and critiques, a selection method is presented. This is designed to assist the potential user in choosing the technique, or combination of techniques, most suited to answering the user's requirements. For each of the eight selected techniques, a summary of the origins of the technique is provided, together with a method description, detailed case studies, abstracted case studies and supporting references. (author)

  10. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. The development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms using wavefield data analysis. Trapped guided waves in the delamination region are observed in the wavefield data and further quantitatively interpreted using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also for estimating the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
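
    As a concrete illustration of the frequency-wavenumber analysis described above, the short Python sketch below computes an f-k spectrum of a space-time wavefield with a 2D FFT. The synthetic two-mode wavefield, the sampling steps and all names are illustrative assumptions, not the paper's data or code.

        import numpy as np

        def fk_spectrum(u, dx, dt):
            """u: wavefield array of shape (nx, nt). Returns |U(k, f)| and axes."""
            U = np.fft.fftshift(np.fft.fft2(u))                    # space -> wavenumber, time -> frequency
            k = np.fft.fftshift(np.fft.fftfreq(u.shape[0], d=dx))  # cycles per metre
            f = np.fft.fftshift(np.fft.fftfreq(u.shape[1], d=dt))  # Hz
            return np.abs(U), k, f

        # Synthetic wavefield with two modes; a trapped/new mode would show up
        # as an extra ridge at a distinct wavenumber in the f-k magnitude map.
        x = np.arange(0, 0.2, 0.001)[:, None]                      # 200 points, 1 mm pitch
        t = np.arange(0, 2e-4, 1e-6)[None, :]                      # 200 samples, 1 us pitch
        u = np.sin(2*np.pi*(50e3*t - 200*x)) + 0.5*np.sin(2*np.pi*(50e3*t - 500*x))
        mag, k, f = fk_spectrum(u, dx=0.001, dt=1e-6)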

  11. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  12. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomic reads and the selected sequencing platform had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.

  13. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, as a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.

  14. RTE - Reliability report 2016

    International Nuclear Information System (INIS)

    2017-06-01

    Every year, RTE produces a reliability report for the past year. This document lays out the main factors that affected the electrical power system's operational reliability in 2016 and the initiatives currently under way intended to ensure its reliability in the future. Within a context of the energy transition, changes to the European interconnected network mean that RTE has to adapt on an on-going basis. These changes include the increase in the share of renewables injecting an intermittent power supply into networks, resulting in a need for flexibility, and a diversification in the numbers of stakeholders operating in the energy sector and changes in the ways in which they behave. These changes are dramatically changing the structure of the power system of tomorrow and the way in which it will operate - particularly the way in which voltage and frequency are controlled, as well as the distribution of flows, the power system's stability, the level of reserves needed to ensure supply-demand balance, network studies, assets' operating and control rules, the tools used and the expertise of operators. The results obtained in 2016 are evidence of a globally satisfactory level of reliability for RTE's operations in somewhat demanding circumstances: more complex supply-demand balance management, cross-border schedules at interconnections indicating operation that is closer to its limits and - most noteworthy - having to manage a cold spell just as several nuclear power plants had been shut down. In a drive to keep pace with the changes expected to occur in these circumstances, RTE implemented numerous initiatives to ensure high levels of reliability: - maintaining investment levels of euro 1.5 billion per year; - increasing cross-zonal capacity at borders with our neighbouring countries, thus bolstering the security of our electricity supply; - implementing new mechanisms (demand response, capacity mechanism, interruptibility, etc.); - involvement in tests or projects

  15. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  16. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  17. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  18. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease syndromes in regards to this crop. The result of these ... parison of treatments such as cultivars or control measures and ... Vascular discoloration and stem necrosis.

  19. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, which employ matrix factorizations, incur a cubic cost

  20. Building and integrating reliability models in a Reliability-Centered-Maintenance approach

    International Nuclear Information System (INIS)

    Verite, B.; Villain, B.; Venturini, V.; Hugonnard, S.; Bryla, P.

    1998-03-01

    Electricite de France (EDF) has recently developed its OMF-Structures method, designed to optimize risk-based preventive maintenance of passive structures such as pipes and supports. In particular, the reliability performance of components needs to be determined; this is a two-step process, consisting of a qualitative sort followed by a quantitative evaluation, involving two types of models. Initially, degradation models are widely used to exclude some components from the field of preventive maintenance. The reliability of the remaining components is then evaluated by means of quantitative reliability models. The results are then included in a risk indicator that is used to directly optimize preventive maintenance tasks. (author)

  1. Estimation of immune cell densities in immune cell conglomerates: an approach for high-throughput quantification.

    Directory of Open Access Journals (Sweden)

    Niels Halama

    2009-11-01

    Determining the correct number of positive immune cells in immunohistological sections of colorectal cancer and other tumor entities is emerging as an important clinical predictor and therapy selector for an individual patient. This task is usually obstructed by cell conglomerates of various sizes. We here show that, at least in colorectal cancer, the inclusion of immune cell conglomerates is indispensable for estimating reliable patient cell counts. Integrating virtual microscopy and image processing principally allows the high-throughput evaluation of complete tissue slides. For such large-scale systems we demonstrate a robust quantitative image processing algorithm for the reproducible quantification of conglomerates of CD3 positive T cells in colorectal cancer. While isolated cells (28 to 80 µm²) are counted directly, the number of cells contained in a conglomerate is estimated by dividing the area of the conglomerate in thin tissue sections (≤6 µm) by the median area covered by an isolated T cell, which we determined as 58 µm². We applied our algorithm to large numbers of CD3 positive T cell conglomerates and compared the results to cell counts obtained manually by two independent observers. While, especially for high cell counts, the manual counting showed a deviation of up to 400 cells/mm² (41% variation), algorithm-determined T cell numbers generally lay in between the manually observed cell numbers but with perfect reproducibility. In summary, we recommend our approach as an objective and robust strategy for quantifying immune cell densities in immunohistological sections which can be directly implemented into automated full slide image processing systems.
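
    A minimal sketch of the counting rule just described, with the area thresholds taken from the abstract (isolated cells span 28-80 µm², median isolated-cell area 58 µm²); the helper name and input format are hypothetical:

        def estimate_t_cell_count(region_areas_um2,
                                  isolated_range=(28.0, 80.0),
                                  median_cell_area=58.0):
            """Estimate total T cells from a list of CD3+ region areas in um^2."""
            total = 0.0
            for area in region_areas_um2:
                if area < isolated_range[0]:
                    continue                          # too small: staining noise/debris
                elif area <= isolated_range[1]:
                    total += 1                        # isolated cell: count directly
                else:
                    total += area / median_cell_area  # conglomerate: area-based estimate
            return round(total)

        # Example: two isolated cells plus one 580 um^2 conglomerate (~10 cells)
        print(estimate_t_cell_count([45.0, 60.0, 580.0]))  # -> 12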

  2. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  3. Accelerator reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, L; Duru, Ph; Koch, J M; Revol, J L; Van Vaerenbergh, P; Volpe, A M; Clugnet, K; Dely, A; Goodhew, D

    2002-07-01

    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  4. Human Reliability Program Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  5. Reliability and construction control

    Directory of Open Access Journals (Sweden)

    Sherif S. AbdelSalam

    2016-06-01

    The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using the EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulas. Among the major outcomes, the lowest coefficient of variation is associated with Davisson's criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.

  6. Scyllac equipment reliability analysis

    International Nuclear Information System (INIS)

    Gutscher, W.D.; Johnson, K.J.

    1975-01-01

    Most of the failures in Scyllac can be related to crowbar trigger cable faults. A new cable has been designed, procured, and is currently undergoing evaluation. When the new cable has been proven, it will be worked into the system as quickly as possible without causing too much additional down time. The cable-tip problem may not be easy or even desirable to solve. A tightly fastened permanent connection that maximizes contact area would be more reliable than the plug-in type of connection in use now, but it would make system changes and repairs much more difficult. The balance of the failures have such a low occurrence rate that they do not cause much down time and no major effort is underway to eliminate them. Even though Scyllac was built as an experimental system and has many thousands of components, its reliability is very good. Because of this the experiment has been able to progress at a reasonable pace

  7. Improving Power Converter Reliability

    DEFF Research Database (Denmark)

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon

    2014-01-01

    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high and low-voltage sides of a half-bridge IGBT separately in every fundamental cycle. The on-state collector-emitter voltage is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation.

  8. Accelerator reliability workshop

    International Nuclear Information System (INIS)

    Hardy, L.; Duru, Ph.; Koch, J.M.; Revol, J.L.; Van Vaerenbergh, P.; Volpe, A.M.; Clugnet, K.; Dely, A.; Goodhew, D.

    2002-01-01

    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  9. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.
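
    As a small, self-contained taste of the quasi-Monte Carlo ingredient mentioned above, the sketch below propagates an uncertain angle of attack through a stand-in lift model using a Sobol sequence. The placeholder model, bounds and sample size are assumptions for illustration; the study itself used the TAU flow solver.

        import numpy as np
        from scipy.stats import qmc

        def lift_coefficient(alpha_deg):
            """Placeholder response: thin-airfoil-like lift curve, not the TAU code."""
            return 2.0 * np.pi * np.deg2rad(alpha_deg)

        sampler = qmc.Sobol(d=1, scramble=True, seed=3)
        u = sampler.random(1024)                                      # uniform (0, 1) samples
        alpha = qmc.scale(u, l_bounds=[1.0], u_bounds=[3.0]).ravel()  # uncertain AoA, deg
        cl = lift_coefficient(alpha)
        print(cl.mean(), cl.std())                                    # propagated statistics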

  10. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods using empirical calibration with a standard in order to quantify the activity of nuclear materials by determining a calibration coefficient are useless on non-reproducible, complex and singular nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them are strongly dependent on package data knowledge and operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment and without knowledge of the internal package configuration. This method combines a global stochastic approach which uses, among others, surrogate models available to simulate the gamma attenuation behaviour, a Bayesian approach which considers conditional probability densities of problem inputs, and Markov Chain Monte Carlo algorithms (MCMC) which solve inverse problems, with the gamma-ray emission radionuclide spectrum and the outside dimensions of the objects of interest. The methodology is being tested on the quantification of actinide activity in different kinds of matrices, compositions, and configurations of standard sources in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
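
    A toy sketch of the stochastic machinery outlined above: Bayesian inversion of a single count-rate measurement for a source activity via a Metropolis-Hastings sampler. The surrogate response, prior, noise level and all numbers are invented stand-ins for the surrogate models and data of the actual methodology.

        import numpy as np

        rng = np.random.default_rng(0)

        def response(activity, mu=0.8):
            """Hypothetical surrogate: expected count rate for a given activity."""
            return activity * np.exp(-mu)     # attenuation through a fixed screen

        measured, sigma = 120.0, 10.0         # counts/s and its uncertainty (illustrative)

        def log_post(activity):
            if activity <= 0.0:
                return -np.inf                # flat prior on positive activities
            return -0.5 * ((measured - response(activity)) / sigma) ** 2

        samples, a = [], 100.0
        for _ in range(20000):                # random-walk Metropolis-Hastings
            prop = a + rng.normal(scale=5.0)
            if np.log(rng.uniform()) < log_post(prop) - log_post(a):
                a = prop
            samples.append(a)

        post = np.array(samples[5000:])       # discard burn-in
        print(post.mean(), post.std())        # activity estimate with uncertainty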

  11. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ), the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  12. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ), the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  13. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ), the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
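
    For reference, the linear Bayesian update these records compare against the quadratic one reduces, for a linear observation operator and Gaussian uncertainties, to the familiar Kalman-type formula x_a = x_f + K(y - Hx_f). A minimal numpy sketch with invented toy numbers:

        import numpy as np

        x_f = np.array([1.0, 0.0])              # prior (forecast) mean
        P = np.array([[1.0, 0.3], [0.3, 0.5]])  # prior covariance
        H = np.array([[1.0, 0.0]])              # observation operator
        R = np.array([[0.1]])                   # observation noise covariance
        y = np.array([1.4])                     # measurement

        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # gain
        x_a = x_f + K @ (y - H @ x_f)           # updated mean
        P_a = P - K @ H @ P                     # updated covariance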

  14. Accurate and precise DNA quantification in the presence of different amplification efficiencies using an improved Cy0 method.

    Science.gov (United States)

    Guescini, Michele; Sisti, Davide; Rocchi, Marco B L; Panebianco, Renato; Tibollo, Pasquale; Stocchi, Vilberto

    2013-01-01

    Quantitative real-time PCR represents a highly sensitive and powerful technology for the quantification of DNA. Although real-time PCR is well accepted as the gold standard in nucleic acid quantification, there is a largely unexplored area of experimental conditions that limit the application of the Ct method. As an alternative, our research team has recently proposed the Cy0 method, which can compensate for small amplification variations among the samples being compared. However, when there is a marked decrease in amplification efficiency, the Cy0 is impaired, hence determining reaction efficiency is essential to achieve a reliable quantification. The proposed improvement in Cy0 is based on the use of the kinetic parameters calculated in the curve inflection point to compensate for efficiency variations. Three experimental models were used: inhibition of primer extension, non-optimal primer annealing and a very small biological sample. In all these models, the improved Cy0 method increased quantification accuracy up to about 500% without affecting precision. Furthermore, the stability of this procedure was enhanced integrating it with the SOD method. In short, the improved Cy0 method represents a simple yet powerful approach for reliable DNA quantification even in the presence of marked efficiency variations.
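
    The sketch below illustrates a Cy0-style read-out in simplified form: fit a sigmoid to the amplification curve and intersect the tangent at the inflection point with the cycle axis. The published method uses a richer five-parameter fit plus the efficiency compensation described above; the plain logistic model here is an assumption made for brevity.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(x, fmax, x0, b):
            return fmax / (1.0 + np.exp(-(x - x0) / b))

        def cy0(cycles, fluorescence):
            (fmax, x0, b), _ = curve_fit(logistic, cycles, fluorescence,
                                         p0=[fluorescence.max(), np.median(cycles), 1.0])
            slope = fmax / (4.0 * b)             # tangent slope at the inflection point
            return x0 - (fmax / 2.0) / slope     # x-intercept of that tangent (= x0 - 2b)

        cycles = np.arange(1, 41, dtype=float)
        fluo = logistic(cycles, 100.0, 24.0, 1.6)
        fluo += np.random.default_rng(1).normal(0.0, 0.5, cycles.size)  # noisy curve
        print(cy0(cycles, fluo))                 # close to 24 - 2*1.6 = 20.8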

  15. Evaluation of peroxidative stress of cancer cells in vitro by real-time quantification of volatile aldehydes in culture headspace.

    Science.gov (United States)

    Shestivska, Violetta; Rutter, Abigail V; Sulé-Suso, Josep; Smith, David; Španěl, Patrik

    2017-08-30

    Peroxidation of lipids in cellular membranes results in the release of volatile organic compounds (VOCs), including saturated aldehydes. The real-time quantification of trace VOCs produced by cancer cells during peroxidative stress presents a new challenge to non-invasive clinical diagnostics, which, as described here, we have met with some success. A combination of selected ion flow tube mass spectrometry (SIFT-MS), a technique that allows rapid, reliable quantification of VOCs in humid air and liquid headspace, and electrochemistry to generate reactive oxygen species (ROS) in vitro has been used. Thus, VOCs present in the headspace of CALU-1 cancer cell line cultures exposed to ROS have been monitored and quantified in real time using SIFT-MS. The CALU-1 lung cancer cells were cultured in 3D collagen to mimic in vivo tissue. Real-time SIFT-MS analyses focused on the volatile aldehydes propanal, butanal, pentanal, hexanal, heptanal and malondialdehyde (propanedial), which are expected to be products of cellular membrane peroxidation. All six aldehydes were identified in the culture headspace, each reaching peak concentrations during the time of exposure to ROS and eventually decreasing as the reactants were depleted in the culture. Pentanal and hexanal were the most abundant, reaching concentrations of a few hundred parts-per-billion by volume, ppbv, in the culture headspace. The results of these experiments demonstrate that peroxidation of cancer cells in vitro can be monitored and evaluated by direct real-time analysis of the volatile aldehydes produced. The adopted combination of methodologies potentially has value for the study of other types of VOCs that may be produced by cellular damage. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Development of web-based reliability data base platform

    International Nuclear Information System (INIS)

    Hwang, Seok Won; Lee, Chang Ju; Sung, Key Yong

    2004-01-01

    Probabilistic safety assessment (PSA) is a systematic technique which estimates the degree of risk impact to the public due to an accident scenario. Estimating the occurrence frequencies and consequences of potential scenarios requires a thorough analysis of the accident details and all fundamental parameters. The robustness of PSA in checking weaknesses in a design and operation allows a better informed and balanced decision to be reached. The fundamental parameters for PSA, such as the component failure rates, should be estimated under the condition of steady collection of the evidence throughout the operational period. However, since data from any single plant are not sufficient to provide an adequate PSA result, in practice the operating data of whole plants are commonly used to estimate the reliability parameters for the same type of components. The reliability data of any component type fall into two categories: the generic, based on the operating experience of whole plants, and the plant-specific, based on the operation of the specific plant of interest. Generic data are highly essential for new or recently-built nuclear power plants (NPPs). Generally, the reliability data base may be categorized into component reliability, initiating event frequencies, human performance, and so on. Among these data, component reliability seems a key element because it has the most abundant population. Therefore, component reliability data are essential for taking part in the quantification of accident sequences because they become inputs to the various basic events which constitute the fault tree

  17. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs: Interpersonal Adjective List (IAL), Interpersonal Adjective Scales (revised; IAS-R), Inventory of Interpersonal Problems (IIP), Impact Messages Inventory (IMI), Circumplex Scales of Interpersonal Values (CSIV), Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), Team Role Circle (TRC), Competing Values Leadership Instrument (CV-LI), Love Styles, Organizational Culture Assessment Instrument (OCAI), Customer Orientation Circle (COC), and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG), in 17 German-speaking samples (29 subsamples), grouped by self-report, other report, and metaperception assessments. The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%, and scale-specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large scale-specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.

  18. The cost of reliability

    International Nuclear Information System (INIS)

    Ilic, M.

    1998-01-01

    In this article the restructuring process under way in the US power industry is revisited from the point of view of transmission system provision and reliability. While in the past this cost was rolled into the average cost of electricity to all, it is not so obvious how it is managed in the new industry. A new MIT approach to transmission pricing is suggested here as a possible solution.

  19. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually based on computed tomography (CT) scans. This quantification is important for assessing the evolution and treatment of TB and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm for the computational processing of the exams was developed in Matlab; it creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients using CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)
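
    A minimal sketch of the Bland-Altman agreement analysis mentioned above, assuming paired per-patient involvement estimates from the two methods (the arrays are placeholders, not study data):

        import numpy as np

        def bland_altman(a, b):
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a - b
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)      # 95% limits of agreement
            return bias, (bias - half_width, bias + half_width)

        radiograph = [12.0, 30.5, 8.2, 22.1]          # % lung involvement (placeholder)
        ct = [13.5, 28.0, 9.0, 24.3]
        bias, limits = bland_altman(radiograph, ct)
        print(bias, limits)   # samples falling inside `limits` indicate agreement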

  20. Improvement of Reliability of Diffusion Tensor Metrics in Thigh Skeletal Muscles.

    Science.gov (United States)

    Keller, Sarah; Chhabra, Avneesh; Ahmed, Shaheen; Kim, Anne C; Chia, Jonathan M; Yamamura, Jin; Wang, Zhiyue J

    2018-05-01

    Quantitative diffusion tensor imaging (DTI) of skeletal muscles is challenging due to the bias in DTI metrics, such as fractional anisotropy (FA) and mean diffusivity (MD), related to an insufficient signal-to-noise ratio (SNR). This study compares the bias of DTI metrics in skeletal muscles via pixel-based and region-of-interest (ROI)-based analysis. DTI of the thigh muscles was conducted on a 3.0-T system in N = 11 volunteers using a fat-suppressed single-shot spin-echo echo planar imaging (SS SE-EPI) sequence with eight repetitions (number of signal averages (NSA) = 4 or 8 for each repeat). The SNR was calculated for different NSAs and estimated for the composite images combining all data (effective NSA = 48) as the standard reference. The bias of MD and FA derived by pixel-based and ROI-based quantification was compared at different NSAs. An "intra-ROI diffusion direction dispersion angle (IRDDDA)" was calculated to assess the uniformity of diffusion within the ROI. Using our standard reference image with NSA = 48, the ROI-based and pixel-based measurements agreed for FA and MD. Larger disagreements were observed for the pixel-based quantification at NSA = 4. MD was less sensitive than FA to the noise level. The IRDDDA decreased with higher NSA. At NSA = 4, ROI-based FA showed a lower average bias (0.9% vs. 37.4%) and narrower 95% limits of agreement compared to the pixel-based method. The ROI-based estimation of FA is less prone to bias than the pixel-based estimation when SNR is low. The IRDDDA can be applied as a quantitative quality measure to assess the reliability of ROI-based DTI metrics. Copyright © 2018 Elsevier B.V. All rights reserved.
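
    To make the pixel-based versus ROI-based contrast concrete, the sketch below computes FA and MD from diffusion-tensor eigenvalues either per pixel (then averaged) or from the ROI-averaged eigenvalues. The synthetic eigenvalues and noise level are assumptions, not the study's data.

        import numpy as np

        def fa_md(evals):
            """evals: (..., 3) eigenvalues. Returns (FA, MD)."""
            md = evals.mean(axis=-1)
            num = np.sqrt(((evals - md[..., None]) ** 2).sum(axis=-1))
            den = np.sqrt((evals ** 2).sum(axis=-1))
            return np.sqrt(1.5) * num / den, md

        rng = np.random.default_rng(2)
        true_evals = np.array([1.9, 1.5, 1.4]) * 1e-3          # mm^2/s, muscle-like
        pixels = true_evals + rng.normal(0.0, 2e-4, (500, 3))  # noisy pixels in one ROI

        fa_px, _ = fa_md(pixels)                # pixel-based: FA per pixel, then mean
        fa_roi, _ = fa_md(pixels.mean(axis=0))  # ROI-based: mean eigenvalues first
        print(fa_px.mean(), fa_roi)             # noise inflates the pixel-based FA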

  1. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  2. Investment in new product reliability

    International Nuclear Information System (INIS)

    Murthy, D.N.P.; Rausand, M.; Virtanen, S.

    2009-01-01

    Product reliability is of great importance to both manufacturers and customers. Building reliability into a new product is costly, but the consequences of inadequate product reliability can be costlier. This implies that manufacturers need to decide on the optimal investment in new product reliability by achieving a suitable trade-off between the two costs. This paper develops a framework and proposes an approach to help manufacturers decide on the investment in new product reliability.

  3. A real-time PCR assay for detection and quantification of Verticillium dahliae in spinach seed.

    Science.gov (United States)

    Duressa, Dechassa; Rauscher, Gilda; Koike, Steven T; Mou, Beiquan; Hayes, Ryan J; Maruthachalam, Karunakaran; Subbarao, Krishna V; Klosterman, Steven J

    2012-04-01

    Verticillium dahliae is a soilborne fungus that causes Verticillium wilt on multiple crops in central coastal California. Although spinach crops grown in this region for fresh and processing commercial production do not display Verticillium wilt symptoms, spinach seeds produced in the United States or Europe are commonly infected with V. dahliae. Planting of the infected seed increases the soil inoculum density and may introduce exotic strains that contribute to Verticillium wilt epidemics on lettuce and other crops grown in rotation with spinach. A sensitive, rapid, and reliable method for quantification of V. dahliae in spinach seed may help identify highly infected lots, curtail their planting, and minimize the spread of exotic strains via spinach seed. In this study, a quantitative real-time polymerase chain reaction (qPCR) assay was optimized and employed for detection and quantification of V. dahliae in spinach germplasm and 15 commercial spinach seed lots. The assay used a previously reported V. dahliae-specific primer pair (VertBt-F and VertBt-R) and an analytical mill for grinding tough spinach seed for DNA extraction. The assay enabled reliable quantification of V. dahliae in spinach seed, with a sensitivity limit of ≈1 infected seed per 100 (1.3% infection in a seed lot). The quantification was highly reproducible between replicate samples of a seed lot and in different real-time PCR instruments. When tested on commercial seed lots, a pathogen DNA content corresponding to a quantification cycle value of ≥31 corresponded with a percent seed infection of ≤1.3%. The assay is useful in qualitatively assessing seed lots for V. dahliae infection levels, and the results of the assay can be helpful to guide decisions on whether to apply seed treatments.

  4. 'Motion frozen' quantification and display of myocardial perfusion gated SPECT

    International Nuclear Information System (INIS)

    Slomka, P.J.; Hurwitz, G.A.; Baddredine, M.; Baranowski, J.; Aladl, U.E.

    2002-01-01

    Aim: Gated SPECT imaging incorporates both functional and perfusion information of the left ventricle (LV). However, perfusion data are confounded by the effect of ventricular motion. Most existing quantification paradigms simply add all gated frames and then proceed to extract the perfusion information from static images, discarding the effects of cardiac motion. In an attempt to improve the reliability and accuracy of cardiac SPECT quantification we propose to eliminate the LV motion prior to the perfusion quantification via an automated image warping algorithm. Methods: A pilot series of 14 male and 11 female gated stress SPECT images acquired with 8 time bins has been co-registered to the coordinates of the 3D normal templates. Subsequently the LV endo- and epicardial 3D points (300-500) were identified on end-systolic (ES) and end-diastolic (ED) frames, defining the ES-ED motion vectors. The nonlinear image warping algorithm (thin-plate spline) was then applied to warp the end-systolic frame onto the end-diastolic frame using the corresponding ES-ED motion vectors. The remaining 6 intermediate frames were also transformed to the ED coordinates using fractions of the motion vectors. The warped images were then summed to provide the LV perfusion image in the ED phase but with counts from the full cycle. Results: The identification of the ED/ES corresponding points was successful in all cases. The corrected displacement between ED and ES images was up to 25 mm. The summed images had the appearance of the ED frames but were much less noisy, since all the counts were used. The spatial resolution of such images appeared higher than that of summed gated images, especially in the female scans. These 'motion frozen' images could be displayed and quantified as regular non-gated tomograms, including the polar map paradigm. Conclusions: This image processing technique may improve the effective image resolution of summed gated myocardial perfusion images used for
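
    The warping step can be sketched with an off-the-shelf thin-plate-spline interpolator; the landmark coordinates below are random stand-ins for the 300-500 matched surface points, so this illustrates the technique rather than the authors' implementation.

    ```python
    # Sketch: a 3D thin-plate-spline displacement field fitted to ES->ED
    # motion vectors, then applied fractionally to intermediate frames.
    # Landmark coordinates are synthetic stand-ins.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(0)
    es_points = rng.uniform(0, 60, size=(40, 3))            # ES surface landmarks (mm)
    ed_points = es_points + rng.normal(0, 4, size=(40, 3))  # matched ED positions

    # Smooth displacement field from the ES->ED motion vectors.
    field = RBFInterpolator(es_points, ed_points - es_points,
                            kernel="thin_plate_spline")

    def warp_to_ed(points, phase_fraction=1.0):
        """Move points toward ED by a fraction of the ES->ED motion,
        as done for the intermediate gated frames."""
        return points + phase_fraction * field(points)

    frame_voxels = rng.uniform(0, 60, size=(5, 3))
    print(warp_to_ed(frame_voxels, phase_fraction=0.5))
    ```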

  5. Hepatic Iron Quantification on 3 Tesla (3 T) Magnetic Resonance (MR): Technical Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Muhammad Anwar

    2013-01-01

    MR has become a reliable and noninvasive method of hepatic iron quantification. Currently, most hepatic iron quantification is performed on 1.5 T MR, and the biopsy measurements have been paired with R2 and R2* values for 1.5 T MR. As the use of 3 T MR scanners is steadily increasing in clinical practice, it has become important to evaluate the practicality of calculating iron burden at 3 T MR. Hepatic iron quantification on 3 T MR requires a better understanding of the process and more stringent technical considerations. The purpose of this work is to focus on the technical challenges in establishing a relationship between T2* values at 1.5 T MR and 3 T MR for hepatic iron concentration (HIC) and to develop an appropriately optimized MR protocol for the evaluation of T2* values in the liver at 3 T magnetic field strength. We studied 22 sickle cell patients using a multiecho fast gradient-echo (MFGRE) sequence at 3 T MR and compared the results with serum ferritin and liver biopsy results. Our study showed that quantification of hepatic iron on 3 T MRI in sickle cell disease patients correlates well with clinical blood test results and biopsy results. 3 T MR liver iron quantification based on MFGRE can be used for hepatic iron quantification in transfused patients.
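
    The underlying T2* estimation is a mono-exponential fit to the multiecho signal decay; the sketch below uses hypothetical echo times and signal values for illustration.

    ```python
    # Sketch: T2* from multiecho gradient-echo decay, S(TE) = S0 * exp(-TE/T2*).
    # Echo times and ROI signals are hypothetical.
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(te_ms, s0, t2star_ms):
        return s0 * np.exp(-te_ms / t2star_ms)

    te = np.array([0.9, 2.2, 3.5, 4.8, 6.1, 7.4])        # echo times (ms)
    signal = np.array([980, 610, 390, 250, 160, 105.0])  # mean liver ROI signal

    (s0, t2star), _ = curve_fit(decay, te, signal, p0=(1000.0, 3.0))
    print(f"T2* = {t2star:.2f} ms, R2* = {1000.0 / t2star:.0f} 1/s")
    ```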

  6. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J., E-mail: e.j.palmiere@sheffield.ac.uk

    2017-01-15

    Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain averaged misorientation angles, aspect ratios, high angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that this EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD based method.
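
    The grain-wise identification logic can be sketched as a set of threshold rules; the thresholds below are hypothetical placeholders, whereas the paper derives its criteria from the measured distributions.

    ```python
    # Sketch: rule-based grain classification from EBSD-derived features.
    # Threshold values are hypothetical, for illustration only.
    def classify_grain(gam_deg, aspect_ratio, hagb_fraction, size_um):
        if gam_deg < 0.6 and hagb_fraction > 0.5:
            return "polygonal/quasi-polygonal ferrite"  # low internal misorientation
        if aspect_ratio > 2.5 and size_um < 10.0:
            return "acicular ferrite"                   # fine, elongated grains
        return "bainitic ferrite"

    grains = [
        dict(gam_deg=0.4, aspect_ratio=1.3, hagb_fraction=0.7, size_um=18.0),
        dict(gam_deg=1.1, aspect_ratio=3.2, hagb_fraction=0.4, size_um=6.0),
        dict(gam_deg=1.4, aspect_ratio=1.8, hagb_fraction=0.3, size_um=22.0),
    ]
    for g in grains:
        print(classify_grain(**g), g)
    ```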

  7. Reliability analysis of prestressed concrete containment structures

    International Nuclear Information System (INIS)

    Jiang, J.; Zhao, Y.; Sun, J.

    1993-01-01

    The reliability analysis of prestressed concrete containment structures subjected to combinations of static and dynamic loads, with consideration of uncertainties in structural and load parameters, is presented. Limit state probabilities for given parameters are calculated using the procedure developed at BNL, while those with consideration of parameter uncertainties are calculated by a fast integration method for time-variant structural reliability. The limit state surface of the prestressed concrete containment is constructed directly, incorporating the prestress. The sensitivities of the Cholesky decomposition matrix and the natural vibration characteristics are calculated by simplified procedures. (author)

  8. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, boundary conditions, etc. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO) is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables; it combines reliability theory and optimization. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
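
    The reliability evaluation at the core of an RBDO loop can be sketched by Monte Carlo estimation of a failure probability for a simple limit state (FORM would typically be used instead for efficiency); all distributions, units and the design variable below are hypothetical.

    ```python
    # Sketch: failure probability P(g <= 0) for limit state g = capacity - demand,
    # estimated by Monte Carlo. An RBDO loop would wrap an optimizer around this
    # to meet a target reliability at minimum cost. All values hypothetical.
    import numpy as np

    def failure_probability(design_area_cm2, n=200_000, seed=1):
        rng = np.random.default_rng(seed)
        yield_strength = rng.normal(250.0, 20.0, n)                  # MPa
        load = rng.lognormal(mean=np.log(40.0), sigma=0.25, size=n)  # kN
        stress = load * 10.0 / design_area_cm2                       # kN/cm^2 -> MPa
        g = yield_strength - stress                                  # capacity - demand
        return np.mean(g <= 0.0)

    for area in (2.0, 2.5, 3.0):
        print(f"area = {area} cm^2 -> Pf ~= {failure_probability(area):.2e}")
    ```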

  9. The National Centre of Systems Reliability and some aspects of its future activities

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-01-01

    The National Centre of Systems Reliability (NCSR) has been set up to enhance the work of the Systems Reliability Service (SRS), which during its four years of operation by the UKAEA has offered to industry expertise in the quantification of reliability of systems in various technological applications. An outline is presented of the background to the establishment of the NCSR, including a brief summary of the work of the SRS. Certain aspects of the future activities of the NCSR particularly in relation to research and collaboration with universities are discussed. (U.K.)

  10. Nuclear performance and reliability

    International Nuclear Information System (INIS)

    Rothwell, G.

    1993-01-01

    There has been a significant improvement in nuclear power plant performance, due largely to a decline in the forced outage rate and a dramatic drop in the average number of forced outages per fuel cycle. If fewer forced outages are a sign of improved safety, nuclear power plants have become safer and more productive over time. To encourage further increases in performance, regulatory incentive schemes should reward reactor operators for improved reliability and safety, as well as for improved performance.

  11. [How Reliable is Neuronavigation?].

    Science.gov (United States)

    Stieglitz, Lennart Henning

    2016-02-17

    Neuronavigation plays a central role in modern neurosurgery. It allows instruments and three-dimensional image data to be visualized intraoperatively and supports spatial orientation, thus helping to reduce surgical risks and speed up complex surgical procedures. The growing availability and importance of neuronavigation makes clear how relevant it is to know about its reliability and accuracy. Different factors may influence accuracy unnoticed during surgery, misleading the surgeon. Besides the best possible optimization of the systems themselves, a good knowledge of their weaknesses is mandatory for every neurosurgeon.

  12. Load Control System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc.; and MSE. Research focused on two areas: real-time power-system load control methodologies; and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  13. Microprocessor hardware reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wright, R I

    1982-01-01

    Microprocessor-based technology has had an impact in nearly every area of industrial electronics and many applications have important safety implications. Microprocessors are being used for the monitoring and control of hazardous processes in the chemical, oil and power generation industries, for the control and instrumentation of aircraft and other transport systems and for the control of industrial machinery. Even in the field of nuclear reactor protection, where designers are particularly conservative, microprocessors are used to implement certain safety functions and may play increasingly important roles in protection systems in the future. Where microprocessors are simply replacing conventional hard-wired control and instrumentation systems no new hazards are created by their use. In the field of robotics, however, the microprocessor has opened up a totally new technology and with it has created possible new and as yet unknown hazards. The paper discusses some of the design and manufacturing techniques which may be used to enhance the reliability of microprocessor based systems and examines the available reliability data on lsi/vlsi microcircuits. 12 references.

  14. Supply chain reliability modelling

    Directory of Open Access Journals (Sweden)

    Eugen Zaitsev

    2012-03-01

    Background: Today it is virtually impossible to operate alone on the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum plan of supplies, using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Methods: The mathematical model and algorithms were developed and formulated using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of the goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations to be taken into account during supply planning with the supplier's functional reliability was presented.
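
    The reduction to linear programming can be sketched as follows; the suppliers, costs, reliabilities and constraint form below are hypothetical illustrations, not the paper's model.

    ```python
    # Sketch: minimum-cost supply volumes subject to demand coverage and a
    # minimum volume-weighted channel reliability. All numbers hypothetical.
    import numpy as np
    from scipy.optimize import linprog

    cost = [4.0, 5.5, 6.2]            # unit cost per supplier
    reliability = [0.92, 0.97, 0.99]  # probability of failure-free delivery
    demand = 100.0

    # Constraints: total volume >= demand; weighted reliability >= 0.95,
    # written as sum((0.95 - r_i) * x_i) <= 0.
    A_ub = [[-1.0, -1.0, -1.0],
            [0.95 - reliability[0], 0.95 - reliability[1], 0.95 - reliability[2]]]
    b_ub = [-demand, 0.0]

    res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
    print("optimal volumes per supplier:", np.round(res.x, 1), "cost:", res.fun)
    ```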

  15. Quantification of arbuscular mycorrhizal fungal DNA in roots: how important is material preservation?

    Science.gov (United States)

    Janoušková, Martina; Püschel, David; Hujslová, Martina; Slavíková, Renata; Jansa, Jan

    2015-04-01

    Monitoring populations of arbuscular mycorrhizal fungi (AMF) in roots is a pre-requisite for improving our understanding of AMF ecology and functioning of the symbiosis in natural conditions. Among other approaches, quantification of fungal DNA in plant tissues by quantitative real-time PCR is one of the advanced techniques with a great potential to process large numbers of samples and to deliver truly quantitative information. Its application potential would greatly increase if the samples could be preserved by drying, but little is currently known about the feasibility and reliability of fungal DNA quantification from dry plant material. We addressed this question by comparing quantification results based on dry root material to those obtained from deep-frozen roots of Medicago truncatula colonized with Rhizophagus sp. The fungal DNA was well conserved in the dry root samples with overall fungal DNA levels in the extracts comparable with those determined in extracts of frozen roots. There was, however, no correlation between the quantitative data sets obtained from the two types of material, and data from dry roots were more variable. Based on these results, we recommend dry material for qualitative screenings but advocate using frozen root materials if precise quantification of fungal DNA is required.

  16. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    Science.gov (United States)

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios expressed in percentages derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors compared to the human and male DNA targets included in the Investigator® Quantiplex HYres kit, serving as a good quality assessor of DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Developing safety performance functions incorporating reliability-based risk measures.

    Science.gov (United States)

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
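
    A probability of non-compliance of the kind embedded in these SPFs can be sketched by Monte Carlo simulation of the sight-distance limit state (the paper itself uses FORM); all parameter distributions below are hypothetical.

    ```python
    # Sketch: P(nc) = probability that demanded stopping sight distance
    # exceeds the supplied sight distance on a horizontal curve.
    # Distributions and values are hypothetical.
    import numpy as np

    def p_noncompliance(supply_m=120.0, n=100_000, seed=7):
        rng = np.random.default_rng(seed)
        speed = rng.normal(90.0, 8.0, n) / 3.6         # km/h -> m/s
        reaction = rng.lognormal(np.log(1.5), 0.3, n)  # perception-reaction time (s)
        decel = rng.normal(3.4, 0.5, n)                # m/s^2
        demand = speed * reaction + speed**2 / (2.0 * decel)  # stopping distance
        return np.mean(demand > supply_m)              # limit state: supply - demand < 0

    print(f"P(nc) ~= {p_noncompliance():.3f}")
    ```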

  18. Reliability-Based Optimization of Series Systems of Parallel Systems

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    Reliability-based design of structural systems is considered. In particular, systems where the reliability model is a series system of parallel systems are analysed. A sensitivity analysis for this class of problems is presented. Direct and sequential optimization procedures to solve the optimization...
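
    The reliability model named here evaluates in a few lines when components fail independently; the component reliabilities below are hypothetical.

    ```python
    # Sketch: reliability of a series system of parallel subsystems with
    # independent components. Component reliabilities are hypothetical.
    import math

    def system_reliability(subsystems):
        """subsystems: list of lists of component reliabilities.
        A parallel subsystem works unless all its components fail;
        the series system works only if every subsystem works."""
        return math.prod(1.0 - math.prod(1.0 - r for r in comps)
                         for comps in subsystems)

    print(system_reliability([[0.95, 0.95], [0.90, 0.90, 0.90], [0.99]]))
    ```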

  19. DoD Nuclear Weapons Personnel Reliability Assurance

    Science.gov (United States)

    2016-04-27

    systems, positive control material (PCM) and equipment, and special nuclear material (SNM) and subject to a nuclear weapons personnel reliability...assurance implementation guidance for consistency and compliance with this issuance. c. Conducts programmatic reviews, manages audits, and directs...personnel reliability assurance education and training materials. 2.4. ASSISTANT SECRETARY OF DEFENSE FOR HEALTH AFFAIRS (ASD(HA)). Under the authority

  20. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolation forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification is briefly reviewed and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience - often personal - of the researcher, and this scholarly endeavour is, therefore, not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourist industry.

  1. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers working on reliability and modelling for open source software (OSS), the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.

  2. Transit ridership, reliability, and retention.

    Science.gov (United States)

    2008-10-01

    This project explores two major components that affect transit ridership: travel time reliability and rider retention. It has been recognized that transit travel time reliability may have a significant impact on attractiveness of transit to many ...

  3. Travel reliability inventory for Chicago.

    Science.gov (United States)

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from ...

  4. Pore REconstruction and Segmentation (PORES) method for improved porosity quantification of nanoporous materials

    Energy Technology Data Exchange (ETDEWEB)

    Van Eyndhoven, G., E-mail: geert.vaneyndhoven@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Kurttepeli, M. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van Oers, C.J.; Cool, P. [Laboratory of Adsorption and Catalysis, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica, Science Park 123, NL-1090 GB Amsterdam (Netherlands); Mathematical Institute, Universiteit Leiden, Niels Bohrweg 1, NL-2333 CA Leiden (Netherlands); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium)

    2015-01-15

    Electron tomography is currently a versatile tool to investigate the connection between the structure and properties of nanomaterials. However, a quantitative interpretation of electron tomography results is still far from straightforward. In particular, accurate quantification of pore space is hampered by artifacts introduced in all steps of the processing chain, i.e., acquisition, reconstruction, segmentation and quantification. Furthermore, most common approaches require subjective manual user input. In this paper, the PORES algorithm “POre REconstruction and Segmentation” is introduced; it is a tailor-made, integral approach for the reconstruction, segmentation, and quantification of porous nanomaterials. The PORES processing chain starts by calculating a reconstruction with a nanoporous-specific reconstruction algorithm: the Simultaneous Update of Pore Pixels by iterative REconstruction and Simple Segmentation algorithm (SUPPRESS). It classifies the interior region as pores during reconstruction, while reconstructing the remaining region by reducing the error with respect to the acquired electron microscopy data. The SUPPRESS reconstruction can be directly plugged into the remaining processing chain of the PORES algorithm, resulting in accurate individual pore quantification and full sample pore statistics. The proposed approach was extensively validated on both simulated and experimental data, indicating its ability to generate accurate statistics of nanoporous materials. - Highlights: • An electron tomography reconstruction/segmentation method for nanoporous materials. • The method exploits the porous nature of the scanned material. • Validated extensively on both simulation and real data experiments. • Results in increased image resolution and improved porosity quantification.
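
    The final quantification stage of such a chain can be sketched on a segmented (binary) volume; the volume below is synthetic noise, not a SUPPRESS reconstruction.

    ```python
    # Sketch: per-pore and whole-sample statistics from a binary pore mask.
    # The volume is synthetic, for illustration only.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    volume = rng.random((64, 64, 64)) < 0.08   # True = pore voxel (synthetic)

    labels, n_pores = ndimage.label(volume)    # connected-component labelling
    sizes = ndimage.sum_labels(volume, labels, index=range(1, n_pores + 1))

    print(f"{n_pores} pores, porosity = {100 * volume.mean():.1f}%")
    print(f"mean pore size = {sizes.mean():.1f} voxels, largest = {sizes.max():.0f}")
    ```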

  5. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center of Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using photoacoustic contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and enhanced conventional toxicology research by providing a direct correlation between uptake of GBNs at a single-cell level and cell viability status.

  6. An approach for assessing ALWR passive safety system reliability

    International Nuclear Information System (INIS)

    Hake, T.M.

    1991-01-01

    Many of the advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive rather than active systems to perform safety functions. Despite the reduced redundancy of the passive systems as compared to active systems in current plants, the assertion is that the overall safety of the plant is enhanced due to the much higher expected reliability of the passive systems. In order to investigate this assertion, a study is being conducted at Sandia National Laboratories to evaluate the reliability of ALWR passive safety features in the context of probabilistic risk assessment (PRA). The purpose of this paper is to provide a brief overview of the approach to this study. The quantification of passive system reliability is not as straightforward as for active systems, due to the lack of operating experience, and to the greater uncertainty in the governing physical phenomena. Thus, the adequacy of current methods for evaluating system reliability must be assessed, and alternatives proposed if necessary. For this study, the Westinghouse Advanced Passive 600 MWe reactor (AP600) was chosen as the advanced reactor for analysis, because of the availability of AP600 design information. This study compares the reliability of AP600 emergency cooling system with that of corresponding systems in a current generation reactor

  7. Direct measurement of strontium-90 and uranium-238 in soils on a real-time basis: 1994 summary report

    International Nuclear Information System (INIS)

    Schilk, A.J.; Hubbard, C.W.; Knopf, M.A.; Thompson, R.C.

    1995-04-01

    Traditional methodologies for quantitative characterization of radionuclide-contaminated soils over extended areas are often tedious, costly, and non-representative. A rapid characterization methodology was designed that provides reliable output with spatial resolution on the order of a few meters or less. It incorporates an innovative sensor of square plastic scintillating fibers that has been designed to be placed directly on or above a contaminated soil to detect and quantify high-energy beta particles associated with the decay chains of uranium and/or strontium. Under the direction and auspices of the DOE's Characterization, Monitoring, and Sensor Technology Integrated Program, Pacific Northwest Laboratory (PNL) constructed a high-energy beta scintillation sensor that was optimized for the detection and quantification of uranium and strontium contamination in surface soils (in the presence of potentially interfering natural and anthropogenic radionuclides), demonstrated and evaluated this detector in various field and laboratory scenarios, and provides this document in completion of the aforementioned requirements

  8. Recommendations for certification or measurement of reliability for reliable digital archival repositories with emphasis on access

    Directory of Open Access Journals (Sweden)

    Paula Regina Ventura Amorim Gonçalez

    2017-04-01

    Introduction: Considering the guidelines of ISO 16363:2012 (Space data and information transfer systems -- Audit and certification of trustworthy digital repositories) and the text of CONARQ Resolution 39 for certification of a Reliable Digital Archival Repository (RDC-Arq), this study verifies which technical recommendations should be used as the basis for a digital archival repository to be considered reliable. Objective: Identify requirements for the creation of Reliable Digital Archival Repositories, with emphasis on access to information, from ISO 16363:2012 and CONARQ Resolution 39. Methodology: The study consisted of an exploratory, descriptive and documentary theoretical investigation, since it is based on ISO 16363:2012 and CONARQ Resolution 39. From the perspective of the problem approach, the study is qualitative and quantitative, since the data were collected, tabulated, and analyzed through interpretation of their contents. Results: A checklist of recommendations for reliability measurement and/or certification of an RDC-Arq is presented, focused on the identification of requirements with emphasis on access to information. Conclusions: The right to information, as well as access to reliable information, is a premise for Digital Archival Repositories; the set of recommendations is therefore directed to archivists who work in Digital Repositories and wish to verify the requirements necessary to evaluate the reliability of a Digital Repository, and may also guide information professionals in collecting requirements for repository reliability certification.

  9. Reliability analysis of a two-span floor designed according

    African Journals Online (AJOL)

    user

    deterministic approach, considering both ultimate and serviceability limit states. Reliability analysis of the floor ... loading, strength and stiffness parameters, dimensions ... to show that there is a direct relation between the failure probability (Pf) ...

  10. Quantifications and Modeling of Human Failure Events in a Fire PSA

    International Nuclear Information System (INIS)

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol

    2014-01-01

    USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI). In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced, together with the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the preexisting internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified as risk-significant, detailed approaches (modeling and quantification) were used to incorporate fire situations into it. Multiple HFEs for a single human action were defined and separately quantified to incorporate the specific fire situations into them. From this study, we confirm that the modeling as well as the quantification of human actions is very important for treating them appropriately in PSA logic structures.

  11. Quantifications and Modeling of Human Failure Events in a Fire PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI). In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced, together with the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the preexisting internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified as risk-significant, detailed approaches (modeling and quantification) were used to incorporate fire situations into it. Multiple HFEs for a single human action were defined and separately quantified to incorporate the specific fire situations into them. From this study, we confirm that the modeling as well as the quantification of human actions is very important for treating them appropriately in PSA logic structures.

  12. Quantification of bronchial dimensions at MDCT using dedicated software

    International Nuclear Information System (INIS)

    Brillet, P.Y.; Fetita, C.I.; Saragaglia, A.; Perchet, D.; Preteux, F.; Beigelman-Aubry, C.; Grenier, P.A.

    2007-01-01

    This study aimed to assess the feasibility of quantification of bronchial dimensions at MDCT using dedicated software (BronCare). We evaluated the reliability of the software to segment the airways and defined criteria ensuring accurate measurements. BronCare was applied on two successive examinations in 10 mild asthmatic patients. Acquisitions were performed at pneumotachographically controlled lung volume (65% TLC), with reconstructions focused on the right lung base. Five validation criteria were imposed: (1) bronchus type: segmental and subsegmental; (2) lumen area (LA) > 4 mm²; (3) bronchus length (Lg) > 7 mm; (4) confidence index (CI) - giving the percentage of the bronchus not abutted by a vessel - >55% for validation of wall area (WA); and (5) a minimum of 10 contiguous cross-sectional images fulfilling the criteria. A complete segmentation procedure on both acquisitions made possible an evaluation of LA and WA in 174/223 (78%) and 171/174 (98%) of bronchi, respectively. The validation criteria were met for 56/69 (81%) and for 16/69 (23%) of segmental bronchi and for 73/102 (72%) and 58/102 (57%) of subsegmental bronchi, for LA and WA, respectively. In conclusion, BronCare is reliable for segmenting the airways in clinical practice. The proposed criteria seem appropriate for selecting candidate bronchi for measurement. (orig.)

  13. Robust sleep quality quantification method for a personal handheld device.

    Science.gov (United States)

    Shin, Hangsik; Choi, Byunghun; Kim, Doyoon; Cho, Jaegeol

    2014-06-01

    The purpose of this study was to develop and validate a novel method for sleep quality quantification using personal handheld devices. The proposed method used 3- or 6-axis signals, including acceleration and angular velocity, obtained from built-in sensors in a smartphone and applied a real-time wavelet denoising technique to minimize the nonstationary noise. Sleep or wake status was decided on each axis, and the totals were finally summed to calculate sleep efficiency (SE), regarded as sleep quality in general. A sleep experiment was carried out for performance evaluation of the proposed method, with 14 participating subjects. An experimental protocol was designed for comparative analysis. The activity during sleep was recorded not only by the proposed method but also simultaneously by well-known commercial applications; moreover, activity was recorded on different mattresses and locations to verify reliability in practical use. Every calculated SE was compared with the SE of a clinically certified medical device, the Philips (Amsterdam, The Netherlands) Actiwatch. In these experiments, the proposed method proved its reliability in quantifying sleep quality. Compared with the Actiwatch, the accuracy and average bias error of SE calculated by the proposed method were 96.50% and -1.91%, respectively. The proposed method outperformed the comparative applications by at least 11.41% in average accuracy and at least 6.10% in average bias; the average accuracy and average absolute bias error of the comparative applications were 76.33% and 17.52%, respectively.
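
    The sleep-efficiency computation can be sketched as epoch-wise thresholding of accelerometer activity; the threshold, epoch length and signal below are hypothetical, and the actual method additionally applies per-axis wavelet denoising.

    ```python
    # Sketch: SE from per-epoch activity counts derived from accelerometer
    # magnitude. Threshold and data are hypothetical illustration values.
    import numpy as np

    def sleep_efficiency(accel_xyz, fs=50, epoch_s=30, wake_threshold=0.05):
        """accel_xyz: (n_samples, 3) acceleration in g. Returns SE in percent."""
        magnitude = np.linalg.norm(accel_xyz, axis=1)
        activity = np.abs(np.diff(magnitude, prepend=magnitude[0]))
        n_epoch = fs * epoch_s
        n_full = len(activity) // n_epoch
        per_epoch = activity[: n_full * n_epoch].reshape(n_full, n_epoch).mean(axis=1)
        asleep = per_epoch < wake_threshold       # quiet epochs scored as sleep
        return 100.0 * asleep.mean()

    rng = np.random.default_rng(3)
    night = rng.normal(0.0, 0.02, size=(50 * 3600 * 8, 3)) + np.array([0, 0, 1.0])
    print(f"SE = {sleep_efficiency(night):.1f}%")
    ```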

  14. Reliable quantification of BOLD fMRI cerebrovascular reactivity despite poor breath-hold performance.

    Science.gov (United States)

    Bright, Molly G; Murphy, Kevin

    2013-12-01

    Cerebrovascular reactivity (CVR) can be mapped using BOLD fMRI to provide a clinical insight into vascular health that can be used to diagnose cerebrovascular disease. Breath-holds are a readily accessible method for producing the required arterial CO2 increases, but their implementation into clinical studies is limited by concerns that patients will demonstrate highly variable performance of breath-hold challenges. This study assesses the repeatability of CVR measurements despite poor task performance, to determine if and how robust results could be achieved with breath-holds in patients. Twelve healthy volunteers were scanned at 3 T. Six functional scans were acquired, each consisting of 6 breath-hold challenges (10, 15, or 20 s duration) interleaved with periods of paced breathing. These scans simulated the varying breath-hold consistency and ability levels that may occur in patient data. Uniform ramps, time-scaled ramps, and end-tidal CO2 data were used as regressors in a general linear model in order to measure CVR at the grey matter, regional, and voxelwise level. The intraclass correlation coefficient (ICC) quantified the repeatability of the CVR measurement for each breath-hold regressor type and scale of interest across the variable task performances. The ramp regressors did not fully account for variability in breath-hold performance and did not achieve acceptable repeatability (ICC < 0.4). Further analysis of intra-subject CVR variability across the brain (spatial ICC and voxelwise correlation) supported the use of end-tidal CO2 data to extract robust whole-brain CVR maps, despite variability in breath-hold performance. We conclude that the incorporation of end-tidal CO2 monitoring into scanning enables robust, repeatable measurement of CVR that makes breath-hold challenges suitable for routine clinical practice. © 2013.
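
    The repeatability statistic can be sketched as a one-way random-effects intraclass correlation, ICC(1,1); the paper's exact ICC form may differ, and the CVR values below are synthetic.

    ```python
    # Sketch: ICC(1,1) from a one-way random-effects ANOVA decomposition.
    # Data are synthetic repeated CVR measurements.
    import numpy as np

    def icc_oneway(data):
        """data: (n_subjects, k_repeats). Returns ICC(1,1)."""
        n, k = data.shape
        grand = data.mean()
        ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    rng = np.random.default_rng(5)
    subject_cvr = rng.normal(0.25, 0.05, size=(12, 1))         # true per-subject CVR
    scans = subject_cvr + rng.normal(0.0, 0.02, size=(12, 6))  # 6 repeated scans
    print(f"ICC = {icc_oneway(scans):.2f}")
    ```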

  15. Quantification of the genetic risk of environmental mutagens

    International Nuclear Information System (INIS)

    Ehling, U.H.

    1988-01-01

    Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimation depends on the assumption of persistence of the induced mutations and the ability to determine the current incidence of genetic diseases. The difficulties of improving the estimates of current incidences of genetic diseases or the persistence of the genes in the population led to the development of an alternative method, the direct estimation of the genetic risk. The direct estimation uses experimental data for the induced frequency of dominant mutations in mice. For the verification of these quantifications one can use the data of Hiroshima and Nagasaki. According to the estimation with the direct method, one would expect less than 1 radiation-induced dominant cataract in 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens.
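
    The direct method's extrapolation can be sketched with the figures quoted above; the multipliers below are placeholders standing in for the experimentally derived values.

    ```python
    # Sketch: direct-method scaling from one indicator trait (dominant
    # cataract) to all dominant, then recessive, mutations. Placeholder values.
    cataract_rate = 1.0 / 19000.0  # induced dominant cataracts per child (upper bound)
    dominant_multiplier = 22.5     # all dominant mutations per cataract mutation (20-25)
    recessive_factor = 10.0        # recessive mutations induced per dominant mutation

    children = 19000
    dominant_cases = cataract_rate * dominant_multiplier * children
    print(f"expected dominant mutations in {children} children: ~{dominant_cases:.0f}")
    print(f"expected recessive mutations: ~{dominant_cases * recessive_factor:.0f}")
    ```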

  16. 2017 NREL Photovoltaic Reliability Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-15

    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  17. AECL's reliability and maintainability program

    International Nuclear Information System (INIS)

    Wolfe, W.A.; Nieuwhof, G.W.E.

    1976-05-01

    AECL's reliability and maintainability program for nuclear generating stations is described. How the various resources of the company are organized to design and construct stations that operate reliably and safely is shown. Reliability and maintainability includes not only special mathematically oriented techniques, but also the technical skills and organizational abilities of the company. (author)

  18. Business of reliability

    Science.gov (United States)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease in reception equipment costs allows non-Remote Sensing organizations to access a technology until recently reserved for a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment conflicts with traditional low-volume data use for most applications. Constant access to data sources supposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low equipment costs is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.

  19. Quantitative quenching evaluation and direct intracellular metabolite analysis in Penicillium chrysogenum.

    Science.gov (United States)

    Meinert, Sabine; Rapp, Sina; Schmitz, Katja; Noack, Stephan; Kornfeld, Georg; Hardiman, Timo

    2013-07-01

    Sustained progress in metabolic engineering methodologies has stimulated new efforts toward optimizing fungal production strains such as through metabolite analysis of Penicillium chrysogenum industrial-scale processes. Accurate intracellular metabolite quantification requires sampling procedures that rapidly stop metabolism (quenching) and avoid metabolite loss via the cell membrane (leakage). When sampling protocols are validated, the quenching efficiency is generally not quantitatively assessed. For fungal metabolomics, quantitative biomass separation using centrifugation is a further challenge. In this study, P. chrysogenum intracellular metabolites were quantified directly from biomass extracts using automated sampling and fast filtration. A master/slave bioreactor concept was applied to provide industrial production conditions. Metabolic activity during sampling was monitored by 13C tracing. Enzyme activities were efficiently stopped and metabolite leakage was absent. This work provides a reliable method for P. chrysogenum metabolomics and will be an essential base for metabolic engineering of industrial processes. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Simultaneous quantification of 21 water soluble vitamin circulating forms in human plasma by liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Meisser Redeuil, Karine; Longet, Karin; Bénet, Sylvie; Munari, Caroline; Campos-Giménez, Esther

    2015-11-27

    This manuscript reports a validated analytical approach for the quantification of 21 water soluble vitamins and their main circulating forms in human plasma. Isotope dilution-based sample preparation consisted of protein precipitation using acidic methanol enriched with stable isotope labelled internal standards. Separation was achieved by reversed-phase liquid chromatography and detection performed by tandem mass spectrometry in positive electrospray ionization mode. Low instrumental limits of detection and quantification were achieved, and the method was applied to water soluble vitamins in human plasma single-donor samples. The present report provides a sensitive and reliable approach for the quantification of water soluble vitamins and their main circulating forms in human plasma. In the future, the application of this analytical approach will give more confidence to provide a comprehensive assessment of water soluble vitamin nutritional status and bioavailability studies in humans. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low frequency resonant modes, density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz frequency motions were determined by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of penicillamine enantiomer mixtures was demonstrated by a THz spectra fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)
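
    A spectra-fitting quantification of this kind can be sketched as non-negative least-squares unmixing against pure reference spectra; the spectra below are synthetic Gaussian peaks, not measured THz data.

    ```python
    # Sketch: estimate mixture fractions by fitting a measured spectrum as a
    # non-negative combination of pure L- and D- reference spectra (synthetic).
    import numpy as np
    from scipy.optimize import nnls

    freq = np.linspace(0.2, 2.5, 200)  # THz
    def peak(center, width=0.08):
        return np.exp(-((freq - center) ** 2) / (2 * width ** 2))

    l_ref = peak(0.9) + 0.6 * peak(1.7)  # "pure L" reference (synthetic)
    d_ref = peak(1.1) + 0.8 * peak(2.0)  # "pure D" reference (synthetic)

    true_fractions = np.array([0.3, 0.7])
    mixture = true_fractions @ np.vstack([l_ref, d_ref])
    mixture += np.random.default_rng(4).normal(0, 0.01, freq.size)  # noise

    coeffs, _ = nnls(np.column_stack([l_ref, d_ref]), mixture)
    print("estimated L/D fractions:", np.round(coeffs / coeffs.sum(), 3))
    ```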

  2. Electronics reliability calculation and design

    CERN Document Server

    Dummer, Geoffrey W A; Hiller, N

    1966-01-01

    Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea

  3. Quantification of Water Flux in Vesicular Systems.

    Science.gov (United States)

    Hannesschläger, Christof; Barta, Thomas; Siligan, Christine; Horner, Andreas

    2018-06-04

    Water transport across lipid membranes is fundamental to all forms of life and plays a major role in health and disease. Not only do typical water facilitators like aquaporins mediate water flux; transporters, ion channels and receptors also represent potent water pathways. Efforts directed towards a mechanistic understanding of the determinants of water conductivity in transmembrane proteins, the development of water flow inhibitors, and the creation of biomimetic membranes with incorporated membrane proteins or artificial water channels all depend on reliable and accurate ways of quantifying water permeabilities (Pf). A conventional method is to subject vesicles to an osmotic gradient in a stopped-flow device: fast recordings of scattered light intensity are converted into the time course of vesicle volume change. Even though an analytical solution that accurately recovers Pf from scattered light intensities exists, approximations that can misjudge Pf by orders of magnitude are commonly used. By means of computational and experimental data we show that erroneous results, such as an apparent dependence of the single-channel water permeability pf on the osmotic gradient, are direct consequences of such approximations. Finally, we propose an empirical solution whose calculated permeability values closely match those of the analytical solution in the relevant range of parameters.
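
    The underlying kinetics can be sketched by integrating the osmotic shrinkage equation dV/dt = -Pf * A * Vw * (c_out - n_in/V); the parameter values below are hypothetical illustration values, and the surface area is held constant as a common approximation.

    ```python
    # Sketch: osmotic vesicle shrinkage after a stopped-flow mixing step.
    # All parameter values are hypothetical.
    import numpy as np
    from scipy.integrate import solve_ivp

    PF = 1e-2       # osmotic water permeability (cm/s)
    VW = 18.0       # partial molar volume of water (cm^3/mol)
    R0 = 100e-7     # vesicle radius (cm), i.e. 100 nm
    C_IN0 = 1e-4    # initial internal osmolarity (mol/cm^3), 100 mM
    C_OUT = 2e-4    # external osmolarity after mixing (mol/cm^3), 200 mM

    V0 = 4.0 / 3.0 * np.pi * R0**3
    A = 4.0 * np.pi * R0**2   # surface area held constant (common approximation)
    n_in = C_IN0 * V0         # moles of trapped osmolyte

    def dVdt(t, V):
        return [-PF * A * VW * (C_OUT - n_in / V[0])]

    sol = solve_ivp(dVdt, (0.0, 0.05), [V0])
    print("relative volume at 50 ms:", sol.y[0, -1] / V0)
    ```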

  4. Kinematic source inversions of teleseismic data based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, O.; McDougall, D.; Mai, P. M.; Babuska, I.

    2014-12-01

    One fundamental aspect of seismic hazard mitigation is gaining a better understanding of the rupture process. Because direct observation of the relevant parameters and properties is not possible, other means such as kinematic source inversions are used instead. By constraining the spatial and temporal evolution of fault slip during an earthquake, those inversion approaches may enable valuable insights into the physics of the rupture process. However, due to the underdetermined nature of this inversion problem (i.e., inverting a kinematic source model for an extended fault based on seismic data), the provided solutions are generally non-unique. Here we present a statistical (Bayesian) inversion approach based on an open-source library for uncertainty quantification (UQ) called QUESO that was developed at ICES (UT Austin). The approach has advantages over deterministic inversion approaches in that it provides not only a single (non-unique) solution but also uncertainty bounds with it. Those uncertainty bounds help to judge, qualitatively and quantitatively, how well constrained an inversion solution is and how much rupture complexity the data reliably resolve. The presented inversion scheme uses only tele-seismically recorded body waves, but future developments may lead us towards joint inversion schemes. After giving an insight into the inversion scheme itself (based on delayed rejection adaptive Metropolis, DRAM), we explore the method's resolution potential. For that, we synthetically generate tele-seismic data, add, for example, different levels of noise and/or change the fault plane parameterization, and then apply our inversion scheme in an attempt to recover the (known) kinematic rupture model. We conclude by inverting, as an example, real tele-seismic data of a recent large earthquake and comparing the results with deterministically derived kinematic source models provided by other research groups.
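
    The Bayesian core of such an inversion can be sketched with a plain random-walk Metropolis sampler (QUESO's DRAM adds delayed rejection and proposal adaptation on top of this basic loop); the forward model and data below are toy stand-ins, not a seismic kernel.

    ```python
    # Sketch: random-walk Metropolis sampling of a 2-parameter posterior
    # given noisy observations of a toy forward model.
    import numpy as np

    rng = np.random.default_rng(6)

    def forward(params, t):
        slip, rise_time = params
        return slip * (1.0 - np.exp(-t / rise_time))  # toy source time function

    t = np.linspace(0.1, 10.0, 50)
    observed = forward((2.0, 1.5), t) + rng.normal(0, 0.05, t.size)

    def log_post(params, sigma=0.05):
        if params[0] <= 0 or params[1] <= 0:          # flat prior on positive values
            return -np.inf
        residual = observed - forward(params, t)
        return -0.5 * np.sum(residual**2) / sigma**2

    samples, x = [], np.array([1.0, 1.0])
    lp = log_post(x)
    for _ in range(20000):
        prop = x + rng.normal(0, 0.05, 2)             # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:       # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    post = np.array(samples[5000:])                   # discard burn-in
    print("posterior mean:", post.mean(axis=0), " std:", post.std(axis=0))
    ```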

  5. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper

  6. Comparison of the quantification of acetaminophen in plasma, cerebrospinal fluid and dried blood spots using high-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Taylor, Rachel R; Hoffman, Keith L; Schniedewind, Björn; Clavijo, Claudia; Galinkin, Jeffrey L; Christians, Uwe

    2013-09-01

    Acetaminophen (paracetamol, N-(4-hydroxyphenyl) acetamide) is one of the most commonly prescribed drugs for the management of pain in children. Quantification of acetaminophen in pre-term and term neonates and small children requires the availability of highly sensitive assays in small-volume blood samples. We developed and validated an LC-MS/MS assay for the quantification of acetaminophen in human plasma, cerebrospinal fluid (CSF) and dried blood spots (DBS). Reconstitution in water (DBS only) and addition of a protein precipitation solution containing the deuterated internal standard were the only manual steps. Extracted samples were analyzed on a Kinetex 2.6 μm PFP column using an acetonitrile/formic acid gradient. The analytes were detected in positive multiple reaction monitoring mode. Alternatively, DBS were automatically processed using direct desorption in a sample card and preparation (SCAP) robotic autosampler in combination with online extraction. The range of reliable response was 3.05-20,000 ng/ml in plasma and CSF (r² > 0.99) and 27.4-20,000 ng/ml in DBS (r² > 0.99; manual extraction and automated direct desorption). Inter-day accuracy was always within 85-115%, and inter-day precision for plasma, CSF and manually extracted DBS was less than 15%. Deming regression analysis comparing 167 matching pairs of plasma and DBS samples showed a correlation coefficient of 0.98. Bland-Altman analysis indicated a 26.6% positive bias in DBS, most likely reflecting the blood:plasma distribution ratio of acetaminophen. DBS are a valid matrix for acetaminophen pharmacokinetic studies. Copyright © 2013 Elsevier B.V. All rights reserved.
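
    As a small illustration of the Bland-Altman comparison used above, the sketch below computes the mean percentage bias and limits of agreement for a handful of invented paired DBS/plasma concentrations; the numbers are not the study's data.

```python
# Minimal Bland-Altman sketch: percentage bias and limits of agreement
# between two matrices (e.g., DBS vs. plasma). Data are invented.
import numpy as np

plasma = np.array([120.0, 450.0, 980.0, 2300.0, 7600.0])   # ng/ml
dbs    = np.array([155.0, 570.0, 1240.0, 2900.0, 9700.0])  # ng/ml

# percentage difference of each pair relative to the pair mean
diff_pct = 100.0*(dbs - plasma)/((dbs + plasma)/2.0)
bias = diff_pct.mean()
loa = 1.96*diff_pct.std(ddof=1)        # 95% limits of agreement
print(f"bias = {bias:.1f}%, LoA = [{bias - loa:.1f}, {bias + loa:.1f}]%")
```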

  7. Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales

    Directory of Open Access Journals (Sweden)

    J. Ellen Blue

    2008-05-01

    We assess the effectiveness of applying information theory to the characterization and quantification of the effects of anthropogenic vessel noise on humpback whale (Megaptera novaeangliae) vocal behavior in and around Glacier Bay, Alaska. Vessel noise has the potential to interfere with the complex vocal behavior of these humpback whales, which could have direct consequences for their feeding behavior and thus ultimately for their health and reproduction. Humpback whale feeding calls recorded during conditions of high vessel-generated noise and lower levels of background noise are compared for differences in acoustic structure, use, and organization using information-theoretic measures. We apply information theory in a self-referential manner (i.e., orders of entropy) to quantify the changes in signaling behavior. We then compare this with the reduction in channel capacity due to noise in Glacier Bay itself, treating it as a (Gaussian) noisy channel. We find that high vessel noise is associated with an increase in the rate and repetitiveness of sequential use of feeding call types in our averaged sample of humpback whale vocalizations, indicating that vessel noise may be modifying the patterns of use of feeding calls by the endangered humpback whales in Southeast Alaska. The information-theoretic approach suggested herein can provide a reliable quantitative measure of such relationships and may also be adapted for wider application to many species where environmental noise is thought to be a problem.
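
    The sketch below illustrates the two quantities the study combines: low-order entropies of a symbolic call sequence and the Shannon capacity of a Gaussian noisy channel, C = B log2(1 + S/N). The call sequence, bandwidth, and signal-to-noise ratios are invented for illustration.

```python
# Minimal sketch: zeroth/first-order entropies of a call sequence, plus
# Gaussian channel capacity under two noise conditions. Values invented.
import numpy as np
from collections import Counter

def entropy(seq):
    counts = np.array(list(Counter(seq).values()), float)
    p = counts/counts.sum()
    return -np.sum(p*np.log2(p))

def first_order_entropy(seq):
    # conditional entropy H(X_{t+1} | X_t) from bigram statistics
    bigrams = Counter(zip(seq, seq[1:]))
    total = sum(bigrams.values())
    H = 0.0
    for (a, b), nab in bigrams.items():
        p_ab = nab/total
        p_b_given_a = nab/sum(n for (x, _), n in bigrams.items() if x == a)
        H -= p_ab*np.log2(p_b_given_a)
    return H

calls = list("ABABCABABABCAABAB")            # toy feeding-call sequence
print(entropy(calls), first_order_entropy(calls))

# Shannon capacity of a Gaussian noisy channel: C = B*log2(1 + S/N)
B, snr_quiet, snr_noisy = 1000.0, 100.0, 10.0   # Hz, linear SNRs (assumed)
print(B*np.log2(1 + snr_quiet), B*np.log2(1 + snr_noisy))
```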

  8. Mathematical reliability: an expository perspective

    CERN Document Server

    Mazzuchi, Thomas; Singpurwalla, Nozer

    2004-01-01

    In this volume, consideration was given to more advanced theoretical approaches and novel applications of reliability, to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics, were purposefully included to make this collection different from existing books on reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as a relevant academic discipline. The seven parts are: networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics; and reliability in finance and forensics. Embedded within the above are other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  9. Probabilistic safety assessment of Tehran Research Reactor using systems analysis programs for hands-on integrated reliability evaluations

    International Nuclear Information System (INIS)

    Hosseini, M.H.; Nematollahi, M.R.; Sepanloo, K.

    2004-01-01

    Probabilistic safety assessment is found to be a practical tool for research reactor safety due to the intense involvement of human interactions in an experimental facility. In this document the application of probabilistic safety assessment to the Tehran Research Reactor is presented. The Level 1 probabilistic safety assessment involved: familiarization with the plant, selection of accident initiators, mitigating functions and system definitions, event tree construction and quantification, fault tree construction and quantification, human reliability analysis, component failure database development, and dependent failure analysis. Each of the steps of the analysis given above is discussed with highlights from selected results. Quantification of the constructed models is done using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software.
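
    As a minimal illustration of the fault tree quantification step, the sketch below combines basic-event probabilities through AND/OR gates under an independence assumption; the gate structure and probabilities are invented, not taken from the Tehran Research Reactor study.

```python
# Minimal fault-tree quantification sketch with AND/OR gates, assuming
# independent basic events (all probabilities illustrative).
def AND(*p):       # all inputs must fail
    out = 1.0
    for q in p:
        out *= q
    return out

def OR(*p):        # at least one input fails (exact, not rare-event approx.)
    out = 1.0
    for q in p:
        out *= (1.0 - q)
    return 1.0 - out

pump_a, pump_b, valve, power = 1e-3, 1e-3, 5e-4, 1e-5
cooling_fail = OR(AND(pump_a, pump_b), valve)   # hypothetical structure
top = OR(cooling_fail, power)
print(f"top event probability: {top:.3e}")
```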

  10. Capturing cognitive causal paths in human reliability analysis with Bayesian network models

    International Nuclear Information System (INIS)

    Zwirglmaier, Kilian; Straub, Daniel; Groth, Katrina M.

    2017-01-01

    In the last decade, Bayesian networks (BNs) have been identified as a powerful tool for human reliability analysis (HRA), with multiple advantages over traditional HRA methods. In this paper we illustrate how BNs can be used to include additional, qualitative causal paths to provide traceability. The proposed framework provides the foundation to resolve several needs frequently expressed by the HRA community. First, the developed extended BN structure reflects the causal paths found in the cognitive psychology literature, thereby addressing the need for causal traceability and a strong scientific basis in HRA. Second, the use of node reduction algorithms allows the BN to be condensed to a level of detail at which quantification is as straightforward as the techniques used in existing HRA. We illustrate the framework by developing a BN version of the critical data misperceived crew failure mode in the IDHEAS HRA method, which is currently under development at the US NRC. We illustrate how the model could be quantified with a combination of expert probabilities and information from operator performance databases such as SACADA. This paper lays the foundations necessary to expand the cognitive and quantitative foundations of HRA. - Highlights: • A framework for building traceable BNs for HRA, based on cognitive causal paths. • A qualitative BN structure directly showing these causal paths is developed. • Node reduction algorithms are used to make the BN structure quantifiable. • The BN is quantified through expert estimates and observed data (Bayesian updating). • The framework is illustrated for a crew failure mode of IDHEAS.
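
    A minimal sketch of the quantification idea follows, assuming a toy three-node BN (performance-shaping factor → misperception → crew failure) evaluated by brute-force enumeration; the structure and CPT values are invented and are not from IDHEAS or SACADA.

```python
# Minimal sketch: quantifying a human failure event with a tiny discrete
# Bayesian network by enumeration. Structure and numbers are invented.
from itertools import product

# P(stress), P(misperceive | stress), P(crew_failure | misperceive)
p_stress = {True: 0.2, False: 0.8}
p_misperceive = {True: {True: 0.15, False: 0.85},
                 False: {True: 0.02, False: 0.98}}
p_failure = {True: {True: 0.5, False: 0.5},
             False: {True: 0.001, False: 0.999}}

# P(failure) = sum over s, m of P(s) * P(m|s) * P(failure|m)
p_fail = 0.0
for s, m in product([True, False], repeat=2):
    p_fail += p_stress[s]*p_misperceive[s][m]*p_failure[m][True]
print(f"P(crew failure) = {p_fail:.4f}")
```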

  11. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  12. Quantification of uranyl in the presence of citric acid

    Energy Technology Data Exchange (ETDEWEB)

    Garcia G, N.; Barrera D, C.E. [UAEM, Facultad de Quimica, 50000 Toluca, Estado de Mexico (Mexico); Ordonez R, E. [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)]. e-mail: nidgg@yahoo.com.mx

    2007-07-01

    To determine the influence of soil organic matter on uranyl sorption onto solids, a detection and quantification technique for uranyl is needed that is reliable and sufficiently quick in producing results. In this work we therefore propose to quantify uranyl in the presence of citric acid by modifying the UV-Vis radiation-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that brings out the fluorescence of the uranyl ion while avoiding the quenching produced by organic acids. (Author)

  13. A simple method to improve the quantification accuracy of energy-dispersive X-ray microanalysis

    International Nuclear Information System (INIS)

    Walther, T

    2008-01-01

    Energy-dispersive X-ray spectroscopy in a transmission electron microscope is a standard tool for chemical microanalysis and routinely provides qualitative information on the presence of all major elements above Z=5 (boron) in a sample. Spectrum quantification relies on suitable corrections for absorption and fluorescence, in particular for thick samples and soft X-rays. A brief presentation is given of an easy way to improve quantification accuracy by evaluating the intensity ratio of two measurements acquired at different detector take-off angles. As the take-off angle determines the effective sample thickness seen by the detector, this method corresponds to taking two measurements from the same position at two different thicknesses, which allows absorption and fluorescence to be corrected more reliably. An analytical solution for determining the depth of a feature embedded in the specimen foil is also provided.
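
    The sketch below illustrates the underlying geometry with a simplified point-emitter absorption model, I(θ) ∝ exp(-(μ/ρ)ρz/sin θ): the ratio of intensities at two take-off angles can be inverted for the emitter depth z. The absorption coefficient, density, and angles are illustrative assumptions, and this is not the paper's analytical solution.

```python
# Minimal sketch of the two-take-off-angle idea: the absorption path length
# scales with 1/sin(theta), so the intensity ratio at two angles yields the
# depth of an emitting feature. All numbers are illustrative.
import numpy as np

mu_rho = 2500.0        # mass absorption coefficient, cm^2/g (assumed)
rho = 5.3              # density, g/cm^3 (assumed)
theta1, theta2 = np.radians(20.0), np.radians(70.0)
z_true = 50e-7         # emitter depth: 50 nm, in cm

# simulated intensities I(theta) = I0 * exp(-mu_rho*rho*z/sin(theta))
I1 = np.exp(-mu_rho*rho*z_true/np.sin(theta1))
I2 = np.exp(-mu_rho*rho*z_true/np.sin(theta2))

# invert the intensity ratio for depth
z_est = np.log(I2/I1)/(mu_rho*rho*(1/np.sin(theta1) - 1/np.sin(theta2)))
print(f"recovered depth: {z_est*1e7:.1f} nm")   # ~50 nm
```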

  14. Of plants and reliability

    International Nuclear Information System (INIS)

    Schneider Horst

    2009-01-01

    Behind the political statements made about the transformer event at the Kruemmel nuclear power station (KKK) in the summer of 2009 lie fundamental issues of atomic law. Pursuant to Articles 20 and 28 of its Basic Law, Germany is a state in which the rule of law applies. Consequently, the aspects of atomic law associated with the incident merit a closer look, all the more so as the issues concerned have been known for many years. Important aspects in the debate about the Kruemmel nuclear power plant are, on the one hand, the fact that the transformer is considered part of the nuclear power station under atomic law and thus a "plant" subject to surveillance by the nuclear regulatory agencies, and, on the other hand, the reliability under atomic law of the operator and the responsible executive personnel. Both "plant" and "reliability" are terms focusing on nuclear safety. Hence the question to what extent safety was affected by the Kruemmel incident. The classification of the event as Level 0 (no or only a very slight safety impact) on the INES scale (International Nuclear Event Scale) should not be used to put the safety issue aside once and for all. Points of fact and their technical significance must be considered prior to any legal assessment. Legal assessments and regulations are tied to facts and circumstances; any legal examination is based on the facts as determined and elucidated. Any other procedure would be tantamount to an inadmissible legal prejudgment. What, then, is the position of political statements, i.e. political assessments and political responsibility? If everything is done the correct way, they come at the end, after exploration of the facts and evaluation under applicable law. Sometimes things are handled differently, with consequences that are not very helpful. In the light of the provisions about the rule of law laid down in the Basic Law, the new federal government should be made to observe the proper sequence of

  15. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    Science.gov (United States)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need to undertake a reliable mission. These objectives have been structured to lead mission planning through construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it also holds the opportunity to enable new directions as NASA moves forward in tackling the challenges of space exploration.

  16. Individual Differences in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    While human reliability analysis (HRA) methods include uncertainty in quantification, the nominal model of human error in HRA typically assumes that operator performance does not vary significantly when they are given the same initiating event, indicators, procedures, and training, and that any differences in operator performance are simply aleatory (i.e., random). While this assumption generally holds true when performing routine actions, variability in operator response has been observed in multiple studies, especially in complex situations that go beyond training and procedures. As such, complexity can lead to differences in operator performance (e.g., operator understanding and decision-making). Furthermore, psychological research has shown that there are a number of known antecedents (i.e., attributable causes) that consistently contribute to observable and systematically measurable (i.e., not random) differences in behavior. This paper reviews examples of individual differences taken from operational experience and the psychological literature. The impact of these differences in human behavior and their implications for HRA are then discussed. We propose that individual differences should not be treated as aleatory, but rather as epistemic. Ultimately, by understanding the sources of individual differences, it is possible to remove some epistemic uncertainty from analyses.

  17. Spectroscopic quantification of 5-hydroxymethylcytosine in genomic DNA.

    Science.gov (United States)

    Shahal, Tamar; Gilat, Noa; Michaeli, Yael; Redy-Keisar, Orit; Shabat, Doron; Ebenstein, Yuval

    2014-08-19

    5-Hydroxymethylcytosine (5hmC), a modified form of the DNA base cytosine, is an important epigenetic mark linked to the regulation of gene expression in development and tumorigenesis. We have developed a spectroscopic method for global quantification of 5hmC in genomic DNA. The assay is performed within a multiwell plate, which allows simultaneous recording of up to 350 samples. Our quantification procedure for 5hmC is direct, simple, and rapid. It relies on a two-step protocol that consists of enzymatic glucosylation of 5hmC with an azide-modified glucose, followed by a "click reaction" with an alkyne fluorescent tag. The fluorescence intensity recorded from the DNA sample is proportional to its 5hmC content and can be quantified by a simple plate-reader measurement. This labeling technique is specific and highly sensitive, allowing detection of 5hmC down to 0.002% of the total nucleotides. Our results reveal significant variations in the 5hmC content obtained from different mouse tissues, in agreement with previously reported data.

  18. Complex Empiricism and the Quantification of Uncertainty in Paleoclimate Reconstructions

    Science.gov (United States)

    Brumble, K. C.

    2014-12-01

    Because the global climate cannot be observed directly, and because of vast and noisy data sets, climate science is a rich field in which to study how computational statistics informs what it means to do empirical science. Traditionally held virtues of empirical science and empirical methods, like reproducibility, independence, and straightforward observation, are complicated by the representational choices involved in statistical modeling and data handling. Examining how climate reconstructions instantiate complicated empirical relationships between model, data, and predictions reveals that the path from data to prediction does not match traditional conceptions of empirical inference either. Rather, the empirical inferences involved are "complex" in that they require the articulation of a good deal of statistical processing wherein assumptions are adopted and representational decisions made, often in the face of substantial uncertainties. Proxy reconstructions are both statistical and paleoclimate science activities aimed at using a variety of proxies to reconstruct past climate behavior. Paleoclimate proxy reconstructions also involve complex data handling and statistical refinement, leading to the current emphasis in the field on the quantification of uncertainty in reconstructions. In this presentation I explore how the processing needed to correlate diverse, large, and messy data sets necessitates the explicit quantification of the uncertainties that stem from wrangling proxies into manageable suites. I also address how semi-empirical pseudo-proxy methods allow for the exploration of signal detection in data sets and serve as intermediary steps for statistical experimentation.

  19. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of software failure/correction processes. MERF was later approximated by a much simpler exponential reliability function (EARF), which retains most of MERF's mathematical properties, so the two functions together make up a single reliability model. The MERF/EARF reliability model treats the software failure process as a non-homogeneous Poisson process (NHPP) and the repair (correction) process as a multinomial distribution. The model assumes that the two processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties, and its application to software reliability. Applications of the model to the inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the application of the model to a software reliability analysis.
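
    Since MERF's exact form is not reproduced in this abstract, the sketch below illustrates the generic NHPP reliability calculation it builds on, using the Goel-Okumoto mean-value function as a stand-in; all parameters are illustrative.

```python
# Minimal NHPP software-reliability sketch with the Goel-Okumoto
# mean-value function m(t) = a*(1 - exp(-b*t)) standing in for MERF/EARF.
import numpy as np

a, b = 100.0, 0.05      # expected total faults, detection rate (assumed)

def m(t):               # expected cumulative failures by time t
    return a*(1.0 - np.exp(-b*t))

def reliability(x, t):  # P(no failure in (t, t+x]) for an NHPP
    return np.exp(-(m(t + x) - m(t)))

# reliability over the next 10 time units after testing up to t = 50
print(reliability(10.0, 50.0))
```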

  20. Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.

    Science.gov (United States)

    Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P

    2012-08-01

    The solid-state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with a 1:1 (w/w) mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R² = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R² = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
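
    A minimal sketch of the calibration-curve arithmetic behind such a method follows, using invented peak areas and the common ICH-style estimates LOD = 3.3σ/S and LOQ = 10σ/S (σ = residual standard deviation, S = slope); the values do not reproduce the IND-SAC data.

```python
# Minimal sketch: linear PXRD calibration with ICH-style LOD/LOQ.
# Peak areas are invented for illustration.
import numpy as np

frac = np.array([0.0, 10.0, 25.0, 50.0, 75.0, 100.0])        # % w/w cocrystal
peak = np.array([2.0, 498.0, 1253.0, 2485.0, 3760.0, 5010.0])  # peak area

S, intercept = np.polyfit(frac, peak, 1)      # slope and intercept
resid = peak - (S*frac + intercept)
sigma = np.std(resid, ddof=2)                 # residual standard deviation

print(f"LOD = {3.3*sigma/S:.2f} %w/w, LOQ = {10*sigma/S:.2f} %w/w")
r2 = 1 - np.sum(resid**2)/np.sum((peak - peak.mean())**2)
print(f"R^2 = {r2:.4f}")
```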

  1. Seeking high reliability in primary care: Leadership, tools, and organization.

    Science.gov (United States)

    Weaver, Robert R

    2015-01-01

    organization. Progress toward a reliability-seeking, system-oriented approach to care remains ongoing, and movement in that direction requires deliberate and sustained effort by committed leaders in health care.

  2. Reliability of Oronasal Fistula Classification.

    Science.gov (United States)

    Sitzman, Thomas J; Allori, Alexander C; Matic, Damir B; Beals, Stephen P; Fisher, David M; Samson, Thomas D; Marcus, Jeffrey R; Tse, Raymond W

    2018-01-01

    Objective: Oronasal fistula is an important complication of cleft palate repair that is frequently used to evaluate surgical quality, yet the reliability of fistula classification has never been examined. The objective of this study was to determine the reliability of oronasal fistula classification, both within individual surgeons and between multiple surgeons. Design: Using intraoral photographs of children with repaired cleft palate, surgeons rated the location of palatal fistulae using the Pittsburgh Fistula Classification System. Intrarater and interrater reliability scores were calculated for each region of the palate. Participants: Eight cleft surgeons rated photographs obtained from 29 children. Results: Within individual surgeons, reliability for each region of the Pittsburgh classification ranged from moderate to almost perfect (κ = .60-.96). By contrast, reliability between surgeons was lower, ranging from fair to substantial (κ = .23-.70). Between-surgeon reliability was lowest for the junction of the soft and hard palates (κ = .23). Within-surgeon and between-surgeon reliability were almost perfect for the more general classification of fistula in the secondary palate (κ = .95 and κ = .83, respectively). Conclusions: This is the first reliability study of fistula classification. We show that the Pittsburgh Fistula Classification System is reliable when used by an individual surgeon, but less reliable when used among multiple surgeons. Comparisons of fistula occurrence among surgeons may be subject to less bias if they use the more general classification of "presence or absence of fistula of the secondary palate" rather than the Pittsburgh Fistula Classification System.

  3. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black-and-white densitometry (0-256 levels of intensity), the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
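
    A minimal sketch of why hue-based thresholding can separate stains that a grey-level threshold cannot: two synthetic stains with similar intensity but different hue are counted separately in HSV space. The image values and hue bounds are illustrative assumptions.

```python
# Minimal colour-thresholding sketch on a synthetic two-stain image.
import numpy as np
import colorsys

# synthetic 100x100 "section": left half reddish, right half brownish,
# both with similar brightness (so grey densitometry cannot separate them)
img = np.zeros((100, 100, 3))
img[:, :50] = (0.55, 0.25, 0.25)   # reddish stain
img[:, 50:] = (0.45, 0.35, 0.20)   # brownish stain

hsv = np.apply_along_axis(lambda p: colorsys.rgb_to_hsv(*p), 2, img)
hue = hsv[..., 0]*360.0

red_mask = (hue < 15) | (hue > 345)        # hue band for "red" (assumed)
print("red-stained pixels:", int(red_mask.sum()))      # left half only

grey = img.mean(axis=2)                    # monochrome densitometry
print("dark pixels (grey threshold):", int((grey < 0.4).sum()))  # both halves
```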

  4. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

    Recurrence quantification analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor, which exists when the control parameter is zero.
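
    The sketch below computes two standard recurrence quantification measures, recurrence rate (RR) and determinism (DET), from a scalar series; a logistic-map series stands in for the Liu system, and the threshold and minimum line length are illustrative.

```python
# Minimal recurrence quantification sketch: RR and DET for a scalar series
# (logistic map as a stand-in for the Liu system; parameters illustrative).
import numpy as np

x = np.empty(400); x[0] = 0.4
for i in range(399):
    x[i+1] = 4.0*x[i]*(1.0 - x[i])           # chaotic logistic map

eps = 0.1
R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)   # recurrence matrix
n = len(x)

rr = R.sum()/R.size                           # recurrence rate

# determinism: fraction of recurrent points lying on diagonals of length >= 2
det_points = 0
for k in range(1, n):                         # upper-triangle diagonals
    run = 0
    for v in np.append(np.diagonal(R, offset=k), 0):
        if v:
            run += 1
        else:
            if run >= 2:
                det_points += run
            run = 0
det = 2*det_points/max(R.sum() - n, 1)        # exclude the main diagonal
print(f"RR = {rr:.3f}, DET = {det:.3f}")
```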

  5. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  6. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  8. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
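
    A minimal sketch of the correlation-based band selection step follows, run on synthetic spectra in which an assumed absorption feature near 695 nm responds to THC content; all spectra and THC values are invented.

```python
# Minimal sketch: pick the reflectance band most correlated with THC
# content (the correlation-analysis step). Spectra and THC values invented.
import numpy as np

rng = np.random.default_rng(1)
wavelengths = np.arange(400, 1000, 5)          # nm
n_samples = 30
thc = rng.uniform(0.2, 5.0, n_samples)         # % THC (fiber- to drug-type)

# synthetic spectra: a band near 695 nm responds to THC, plus noise
spectra = 0.3 + 0.02*rng.standard_normal((n_samples, wavelengths.size))
band = np.argmin(np.abs(wavelengths - 695))
spectra[:, band-2:band+3] -= 0.01*thc[:, None]

corr = np.array([np.corrcoef(spectra[:, j], thc)[0, 1]
                 for j in range(wavelengths.size)])
best = int(np.argmax(np.abs(corr)))
print(f"best band: {wavelengths[best]} nm, r = {corr[best]:.2f}")
```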

  9. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations by validated Reverse Phase HPTLC method. Materials and Methods: RP-HPTLC Method was carried out using glass coated with RP-18 ...

  10. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  11. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  12. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  13. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where the leading failure mechanism(s) is described by physics-of-failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure-effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena ...) ... identification. Application of the proposed method can be found in many real-world systems.

  14. Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.

    Science.gov (United States)

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, traditional RNA quantification methods such as UV spectrophotometry suffer from relatively low sensitivity and large sample consumption, and even the much more sensitive fluorescence-based assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as the RNA-specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with original concentrations as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
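
    The arithmetic of the spike-in modification is simple; the sketch below shows it under the assumption that the instrument reading is the in-tube concentration, with invented readings and volumes.

```python
# Minimal sketch of the spike-in arithmetic: the sample concentration is
# read as the increase over the spike-in baseline, scaled back by the
# assay dilution. All readings and volumes are invented, and the in-tube
# reading convention is an assumption, not a statement about the Qubit UI.
spike_in_reading = 25.0     # pg/uL, baseline from spike-in RNA alone
combined_reading = 31.5     # pg/uL, spike-in + trace sample
sample_vol_ul = 1.0         # sample volume added to the assay tube
assay_vol_ul = 200.0        # total assay volume (working solution + sample)

increase = combined_reading - spike_in_reading      # in-tube increase
original_conc = increase*assay_vol_ul/sample_vol_ul
print(f"sample RNA: {original_conc:.0f} pg/uL")     # 1300 pg/uL
```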

  15. Reliability of reactor materials

    International Nuclear Information System (INIS)

    Toerroenen, K.; Aho-Mantila, I.

    1986-05-01

    This report is the final technical report of the fracture mechanics part of the Reliability of Reactor Materials Programme, which was carried out at the Technical Research Centre of Finland (VTT) through the years 1981 to 1983. Research and development work was carried out in five major areas, viz. statistical treatment and modelling of cleavage fracture, crack arrest, ductile fracture, instrumented impact testing, and comparison of numerical and experimental elastic-plastic fracture mechanics. In the area of cleavage fracture, the critical variables affecting the fracture of steels are considered within the framework of a statistical model, the so-called WST model. Comparison of fracture toughness values predicted by the model with corresponding experimental values shows excellent agreement for a variety of microstructures. Different possibilities for using the model are discussed. The development work in the area of crack arrest testing concentrated on crack starter properties, the test arrangement, and computer control. A computerized elastic-plastic fracture testing method, covering a variety of test specimen geometries over a large temperature range, was developed to a routine stage. Ductile fracture characteristics of the reactor pressure vessel steel A533B and comparable weld material are given. The features of a new, patented instrumented impact tester are described. Experimental and theoretical comparisons between the new and conventional testers clearly indicated the improvements achieved with the new tester. A comparison of numerical and experimental elastic-plastic fracture mechanics capabilities at VTT was carried out. The comparison consisted of two-dimensional linear elastic as well as elastic-plastic finite element analyses of four specimen geometries and equivalent experimental tests. (author)

  16. Field reliability of electronic systems

    International Nuclear Information System (INIS)

    Elm, T.

    1984-02-01

    This report investigates, through several examples from the field, the reliability of electronic units in a broader sense. That is, it treats not just random parts failure, but also inadequate reliability design and (externally and internally) induced failures. The report is meant not merely as an indication of the state of the art of the reliability prediction methods we know, but also as a contribution to the investigation of the man-machine interplay in the operation and repair of electronic equipment. The report firmly links electronics reliability to safety and risk analysis approaches, with a broader, system-oriented view of reliability prediction and with post-failure stress analysis. It is intended to reveal, in a qualitative manner, the existence of symptom and cause patterns. It provides a background for further investigations to identify the detailed mechanisms of the faults and the remedial actions and precautions needed for achieving cost-effective reliability. (author)

  17. Reliability Assessment Of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2014-01-01

    Reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability but also not be too costly (and too safe). This paper presents models for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades, substructure and foundation. But since the function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system, reliability aspects of these components are also discussed, and it is described how their reliability influences the reliability of the structural components. Two illustrative examples are presented considering uncertainty modeling, reliability assessment and calibration of partial safety factors for structural wind turbine components exposed...

  18. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2014-01-01

    This book shows how to build in, evaluate, and demonstrate the reliability and availability of components, equipment, and systems. It presents the state of the art of reliability engineering, both in theory and practice, and is based on the author's more than 30 years of experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New are, in particular, a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, and a refinement of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of problems for homework has been extended. Methods and tools are given in a way that they can be tailored to cover different reliability requirement levels and be used for safety analysis. Because of the Appendice...

  19. Reliability of Wireless Sensor Networks

    Science.gov (United States)

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs that considers the battery level as a key factor. Moreover, this model is based on the routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of power consumption on the reliability of WSNs. PMID:25157553
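
    A minimal sketch of a battery-aware multipath reliability calculation in the spirit described: each path delivers only if all of its nodes work, and node reliability is assumed to degrade linearly with battery level. The topology, base reliabilities, and battery model are invented, not the paper's model.

```python
# Minimal sketch: reliability of multipath routing with battery-dependent
# node reliabilities (all numbers and the battery penalty are assumptions).
def path_reliability(node_rels):
    # a path works only if every node on it works (independence assumed)
    r = 1.0
    for x in node_rels:
        r *= x
    return r

def multipath_reliability(paths):
    # the packet is delivered if at least one path works
    fail = 1.0
    for p in paths:
        fail *= (1.0 - path_reliability(p))
    return 1.0 - fail

def node_rel(base, battery):   # battery in [0,1]; assumed linear penalty
    return base*(0.5 + 0.5*battery)

paths = [[node_rel(0.99, b) for b in (0.9, 0.8, 0.7)],
         [node_rel(0.99, b) for b in (0.6, 0.9, 0.95)]]
print(f"single path: {path_reliability(paths[0]):.4f}")
print(f"two paths:   {multipath_reliability(paths):.4f}")
```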

  20. EDF/EPRI collaborative program on operator reliability experiments

    International Nuclear Information System (INIS)

    Villemeur, A.; Meslin, T.; Mosneron, F.; Worledge, D.H.; Joksimovich, V.; Spurgin, A.J.

    1988-01-01

    Electricite de France (EDF) and the Electric Power Research Institute (EPRI) have been involved in human reliability studies over the last few years, in the context of improvements in human reliability assessment (HRA) methodologies, and have been following a systematic process since 1982 which consists of addressing the following five ingredients: - First, classify human interactions into a limited number of classes. - Second, introduce an acceptable framework to organize the application of HRA to PRA studies. - Third, select approach(es) to quantification. - Fourth, test promising models. - Fifth, establish an appropriate database for tested model(s) with regard to specific applications. EPRI has just recently completed Phase I of the fourth topic. This primarily focused on testing the fundamental hypotheses behind the human cognitive reliability (HCR) correlation, using power plant simulators. EDF has been carrying out simulator studies since 1980, both for man-machine interface validation and for HRA data collection. This background of experience provided a stepping stone for the EPRI project. On the other hand, before 1986, EDF had mainly been concentrating on obtaining qualitative insights from the tests and lacked experience in quantitative analysis and modeling, while EPRI had made advances in this latter area. Before the EPRI Operator Reliability Experiments (ORE) project was initiated, it was abundantly clear to EPRI and EDF that cooperation between the two could be useful and that both parties could gain from the cooperation.