WorldWideScience

Sample records for reliable bioanalytical method

  1. Bioanalytical method transfer considerations of chromatographic-based assays.

    Science.gov (United States)

    Williard, Clark V

    2016-07-01

    Bioanalysis is an important part of the modern drug development process. The business practice of outsourcing and transferring bioanalytical methods from laboratory to laboratory has increasingly become a crucial strategy for successful and efficient delivery of therapies to the market. This chapter discusses important considerations when transferring various types of chromatographic-based assays in today's pharmaceutical research and development environment.

  2. The 10th Annual Bioassays and Bioanalytical Method Development Conference.

    Science.gov (United States)

    Ma, Mark; Tudan, Christopher; Koltchev, Dolly

    2015-01-01

    The 10th Annual Bioassays and Bioanalytical Method Development Conference was hosted in Boston, MA, USA on 20-22 October 2014. This meeting brought together scientists from the biopharmaceutical and life sciences industries, regulatory agencies and academia to share and discuss current trends in cell-based assays and bioanalysis, along with challenges and ideas for the future of bioassays and bioanalytical method development. Experiences with new and innovative technologies were evaluated, as well as their impact on current bioassay methodologies and the bioanalysis workflow, including quality, feasibility, outsourcing strategies and challenges, productivity and compliance. Several presentations were given by members of the US FDA, sharing both scientific and regulatory paradigms, including the most recent update on the FDA's position on specific aspects of the draft Bioanalytical Method Validation guidance following its review of the industry's responses. The meeting coincided with the 15th Annual Immunogenicity for Biotherapeutics meeting, allowing attendees to also familiarize themselves with new and emerging approaches to overcome the effects of immunogenicity, in addition to investigative strategies.

  3. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    OpenAIRE

    Datar, Prasanna A.

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and qua...

  4. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    Directory of Open Access Journals (Sweden)

    Prasanna A. Datar

    2015-08-01

    Bioanalytical methods are widely used for the quantitative estimation of drugs and their metabolites in physiological matrices. These methods can be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article surveys the reported methods to help analysts choose crucial parameters for new method development for carbamazepine and its derivatives, and also enumerates the metabolites and impurities reported so far. Keywords: Carbamazepine, HPLC, LC–MS/MS, HPTLC, RP-UFLC, Micellar electrokinetic chromatography

  5. Bioanalytical LC-MS/MS of protein-based biopharmaceuticals

    NARCIS (Netherlands)

    Broek, I. van den; Niessen, W.M.A.; Dongen, W.D. van

    2013-01-01

    Biotechnology increasingly delivers highly promising protein-based biopharmaceutical candidates to the drug development funnel. For successful biopharmaceutical drug development, reliable bioanalytical methods enabling quantification of drugs in biological fluids (plasma, urine, tissue, etc.) are essential.

  6. Bioanalytical HPTLC Method for Estimation of Zolpidem Tartrate from Human Plasma

    OpenAIRE

    Abhay R. Shirode; Bharti G. Jadhav; Vilasrao J. Kadam

    2016-01-01

    A simple and selective high-performance thin-layer chromatographic (HPTLC) method was developed and validated for the estimation of zolpidem tartrate from human plasma using eperisone hydrochloride as an internal standard (IS). The analyte and IS were extracted from human plasma by a liquid-liquid extraction (LLE) technique. A Camag HPTLC system, operated with winCATS software (ver. 1.4.1.8), was used for the proposed bioanalytical work. Planar chromatographic development was carried out with the h...

  7. A Simple HPLC Bioanalytical Method for the Determination of ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, accurate, and precise high-performance liquid chromatography (HPLC) method with spectrophotometric detection for the determination of doxorubicin hydrochloride in rat plasma. Methods: Doxorubicin hydrochloride and daunorubicin hydrochloride (internal standard, IS) were separated on a C18 ...

  8. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a user-friendly spiked plasma method for the extraction of diclofenac potassium that reduces the number of treatments of the plasma sample, in order to minimize human error. Method: Instead of a solvent evaporation technique, the spiked plasma sample was modified with H2SO4 and NaCl, ...

  9. Impact of Chiral Bioanalytical Methods on the Bioequivalence of Ibuprofen Products Containing Ibuprofen Lysinate and Ibuprofen Base.

    Science.gov (United States)

    García-Arieta, Alfredo; Ferrero-Cafiero, Juan Manuel; Puntes, Montse; Gich, Ignasi; Morales-Alcelay, Susana; Tarré, Maite; Font, Xavier; Antonijoan, Rosa Maria

    2016-05-01

    The purpose was to assess the impact of the use of a chiral bioanalytical method on the conclusions of a bioequivalence study that compared two ibuprofen suspensions with different rates of absorption. A comparison of the conclusion of bioequivalence between a chiral method and an achiral approach was made. Plasma concentrations of R-ibuprofen and S-ibuprofen were determined using a chiral bioanalytical method; bioequivalence was tested for R-ibuprofen and S-ibuprofen separately, and for the sum of both enantiomers as an approach for an achiral bioanalytical method. The 90% confidence interval (90% CI) that would have been obtained with an achiral bioanalytical method (90% CI: Cmax: 117.69-134.46; AUC0-t: 104.75-114.45) would have precluded the conclusion of bioequivalence. This conclusion cannot be generalized to the active enantiomer (90% CI: Cmax: 103.36-118.38; AUC0-t: 96.52-103.12), for which bioequivalence can be concluded, and/or the distomer (90% CI: Cmax: 132.97-151.33; AUC0-t: 115.91-135.77), for which a larger difference was observed. Chiral bioanalytical methods should be required when 1) the enantiomers exhibit different pharmacodynamics and 2) the exposure (AUC or Cmax) ratio of the enantiomers is modified by a difference in the rate of absorption. Furthermore, the bioequivalence conclusion should be based on all enantiomers, since the distomer(s) might not be completely inert, in contrast to what is required in the current regulatory guidelines. In those cases where it is unknown whether the ratio between enantiomers is modified by changing the rate of absorption, chiral bioanalytical methods should be employed unless the enantiomers exhibit the same pharmacodynamics. Chirality 28:429-433, 2016. © 2016 Wiley Periodicals, Inc.
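
    A minimal sketch of the 90% CI computation on which such bioequivalence conclusions rest, applied to hypothetical within-subject test/reference ratios (all values illustrative, not the study's data):

      import numpy as np
      from scipy import stats

      # Hypothetical within-subject test/reference Cmax ratios from a 2x2
      # crossover; all values are illustrative.
      ratios = np.array([1.21, 1.35, 1.18, 1.27, 1.30, 1.22,
                         1.15, 1.41, 1.25, 1.19, 1.33, 1.28])
      d = np.log(ratios)                      # analysis is done on the log scale
      se = d.std(ddof=1) / np.sqrt(d.size)
      t90 = stats.t.ppf(0.95, df=d.size - 1)  # 90% CI = two one-sided tests (TOST)
      lo, hi = np.exp(d.mean() - t90 * se), np.exp(d.mean() + t90 * se)
      print(f"90% CI: {100 * lo:.2f}-{100 * hi:.2f}%")
      print("bioequivalent" if lo >= 0.80 and hi <= 1.25 else "not bioequivalent")

    With these illustrative ratios the interval lands near 122-130%, outside the 80.00-125.00% acceptance range, mirroring the achiral Cmax result above.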

  10. Statistical approach for selection of regression model during validation of bioanalytical method

    Directory of Open Access Journals (Sweden)

    Natalija Nakov

    2014-06-01

    The selection of an adequate regression model is the basis for obtaining accurate and reproducible results during bioanalytical method validation. Given the wide concentration range frequently present in bioanalytical assays, heteroscedasticity of the data may be expected. Several weighted linear and quadratic regression models were evaluated during the selection of the adequate curve fit using nonparametric statistical tests: the one-sample rank test and the Wilcoxon signed rank test for two independent groups of samples. The results obtained with the one-sample rank test could not give statistical justification for the selection of linear vs. quadratic regression models, because only slight differences between the errors (expressed through the relative residuals, RR) were obtained. Estimation of the significance of the differences in the RR was achieved using the Wilcoxon signed rank test, where the linear and quadratic regression models were treated as two independent groups. The application of this simple nonparametric statistical test provides statistical confirmation of the choice of an adequate regression model.
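
    As an illustration of the record's approach, the sketch below fits weighted linear and quadratic models to hypothetical heteroscedastic calibration data and compares their absolute relative residuals with SciPy's paired Wilcoxon signed-rank test (a stand-in for the paper's exact procedure; all data are invented):

      import numpy as np
      from scipy import stats

      # Hypothetical calibration standards spanning a wide range (illustrative).
      x = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])       # nominal conc
      y = np.array([0.055, 0.098, 0.21, 0.49, 1.04, 2.58, 5.30, 10.4])  # response

      w = 1.0 / x          # polyfit squares these weights -> 1/x**2 weighting
      lin = np.polyfit(x, y, 1, w=w)
      quad = np.polyfit(x, y, 2, w=w)

      def abs_rel_resid(coeffs):
          """Absolute percent relative residuals of the fitted responses."""
          return np.abs(100.0 * (np.polyval(coeffs, x) - y) / y)

      rr_lin, rr_quad = abs_rel_resid(lin), abs_rel_resid(quad)
      stat, p = stats.wilcoxon(rr_lin, rr_quad)   # paired comparison of |RR|
      print(f"median |RR|: linear {np.median(rr_lin):.2f}%, "
            f"quadratic {np.median(rr_quad):.2f}%, Wilcoxon p = {p:.3f}")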

  11. Pharmacokinetic-pharmacodynamic correlation of imipenem in pediatric burn patients using a bioanalytical liquid chromatographic method

    Directory of Open Access Journals (Sweden)

    Silvia Regina Cavani Jorge Santos

    2015-06-01

    A bioanalytical method was developed and applied to quantify free imipenem concentrations for pharmacokinetic and PK/PD correlation studies of the dose adjustments required to maintain antimicrobial effectiveness in pediatric burn patients. A reversed-phase Supelcosil LC18 column (250 x 4.6 mm, 5 µm) was used with a binary mobile phase consisting of 0.01 M, pH 7.0 phosphate buffer and acetonitrile (99:1, v/v) at a flow rate of 0.8 mL/min. The method showed good absolute recovery (above 90%), good linearity (0.25-100.0 µg/mL, r2 = 0.999), good sensitivity (LLOQ: 0.25 µg/mL; LLOD: 0.12 µg/mL) and acceptable stability. Inter/intraday precision values were 7.3/5.9%, and mean accuracy was 92.9%. The method was applied to quantify free drug concentrations in children with burns. Six pediatric burn patients (median 7.0 years old, 27.5 kg, normal renal function, and 33% total burn surface area) were prospectively investigated; inhalation injuries were present in 4/6 (67%) of the patients. Plasma monitoring and PK assessments were performed using serial blood sample collection for each set, totaling 10 sets. The PK/PD target (40%T>MIC) was attained for each minimum inhibitory concentration (MIC: 0.5, 1.0, 2.0, 4.0 mg/L) in more than 80% of the sets investigated, and in 100% after dose adjustment. In conclusion, purification of plasma samples by an ultrafiltration technique followed by quantification of imipenem plasma levels using the LC method is simple and useful, and only a small amount of plasma (0.25 mL) is needed to guarantee monitoring of drug effectiveness in pediatric burn patients. There is also a low risk of neurotoxicity, which is important because pharmacokinetics are unpredictable in these critical patients with severe hospital infection. Finally, the PK/PD target was attained for imipenem in the control of sepsis in pediatric patients with burns.
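
    The 40%T>MIC target used above can be estimated from a free-concentration profile by simple interpolation; a minimal sketch with hypothetical values (not the patients' data):

      import numpy as np

      # Hypothetical free-imipenem profile over one 6 h dosing interval.
      t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])    # h after dose
      c = np.array([0.3, 18.0, 12.0, 6.0, 1.5, 0.4])  # free conc, mg/L

      def pct_time_above_mic(t, c, mic, n=100_000):
          """%T>MIC over the dosing interval via dense linear interpolation."""
          tt = np.linspace(t[0], t[-1], n)
          return 100.0 * np.mean(np.interp(tt, t, c) > mic)

      for mic in (0.5, 1.0, 2.0, 4.0):
          pta = pct_time_above_mic(t, c, mic)
          print(f"MIC {mic} mg/L: {pta:.0f}%T>MIC -> "
                f"{'target met' if pta >= 40.0 else 'target not met'}")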

  12. Synergic development of pharmacokinetics and bioanalytical methods as support of pharmaceutical research.

    Science.gov (United States)

    Marzo, M; Ciccarelli, R; Di Iorio, P; Giuliani, P; Caciagli, F; Marzo, A

    2016-06-01

    The development of pharmacokinetics has allowed this science to achieve a relevant role in the investigation of new chemical entities for therapeutic application, and has enabled a series of useful new realizations for off-patent drugs, such as prolonged-release and delayed-release formulations, therapeutic delivery systems (TDS) for drugs intended to be active in the systemic circulation while avoiding the first-pass effect, orodispersible and effervescent formulations, intramuscular and subcutaneous depot formulations acting over a long period, oral inhalatory systems, and fixed-dose drug combinations. These applications had pharmacokinetics as protagonist and required support from bioanalytical methods to assay drug concentrations, even at pg/mL levels in plasma, which have paralleled the synergic development of pharmacokinetics. The complexity of these realizations required specific guidelines from the regulatory authorities, mainly the US FDA and EU EMA, which have normalized and, in most cases, simplified these applications by admitting some waivers of in vivo bioequivalence. However, this review highlights some critical points, not yet covered by the operating guidelines, which need to be clarified by the regulatory authorities. One of the most relevant issues concerns the planning and conduct of bioavailability and bioequivalence trials with endogenous substances, which possess their own homeostatic equilibria with fluctuations, in some cases with specific rhythms, as with melatonin and the female sex hormones. The baseline subtraction required by the guidelines to define the net contribution of the exogenously absorbed drug is in most cases an unsolvable problem. © The Author(s) 2015.

  13. Calculations for Adjusting Endogenous Biomarker Levels During Analytical Recovery Assessments for Ligand-Binding Assay Bioanalytical Method Validation.

    Science.gov (United States)

    Marcelletti, John F; Evans, Cindy L; Saxena, Manju; Lopez, Adriana E

    2015-07-01

    It is often necessary to adjust for detectable endogenous biomarker levels in spiked validation samples (VS) and in selectivity determinations during bioanalytical method validation for ligand-binding assays (LBA) with a matrix like normal human serum (NHS). Described herein are case studies of biomarker analyses using multiplex LBA which highlight the challenges associated with such adjustments when calculating percent analytical recovery (%AR). The LBA test methods were the Meso Scale Discovery V-PLEX® proinflammatory and cytokine panels with NHS as the test matrix. The NHS matrix blank exhibited varied endogenous content of the 20 individual cytokines before spiking, ranging from undetectable to readily quantifiable. Both addition and subtraction methods for adjusting endogenous cytokine levels in %AR calculations are used in the bioanalytical field. The two methods were compared in %AR calculations following spiking and analysis of VS for cytokines having detectable endogenous levels in NHS. Calculations for %AR obtained by subtracting quantifiable endogenous biomarker concentrations from the respective total analytical VS values yielded reproducible and credible conclusions. The addition method, in contrast, yielded %AR conclusions that were frequently unreliable and discordant with the values obtained with the subtraction adjustment method. It is shown that subtraction of the assay signal attributable to matrix is a feasible alternative when endogenous biomarker levels are below the limit of quantitation but above the limit of detection. These analyses confirm that the subtraction method is preferable to the addition method for adjusting detectable endogenous biomarker levels when calculating %AR for biomarker LBA.
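
    The two adjustment methods contrasted in this record reduce to two different %AR formulas; a small worked sketch with hypothetical cytokine values:

      # Hypothetical values (pg/mL); both %AR adjustment methods for comparison.
      endogenous = 45.0   # measured in the unspiked NHS matrix blank
      spiked = 100.0      # nominal spike concentration
      measured = 150.0    # total measured in the spiked validation sample

      # Subtraction method: recovery of the spike after removing the
      # endogenous contribution from the measured total.
      ar_subtraction = 100.0 * (measured - endogenous) / spiked

      # Addition method: measured total versus expected endogenous-plus-spike.
      ar_addition = 100.0 * measured / (endogenous + spiked)

      print(f"%AR, subtraction method: {ar_subtraction:.1f}%")  # 105.0%
      print(f"%AR, addition method:    {ar_addition:.1f}%")     # 103.4%

    When the endogenous level dominates the spike, the addition formula becomes insensitive to poor spike recovery, whereas the subtraction formula isolates the spiked increment; this is consistent with the subtraction method's more reliable behavior reported above.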

  14. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and the nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
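
    As a flavor of the course material, a minimal conjugate Bayesian update for a failure rate, in the spirit of the gas pipeline leak-rate example (prior and data are hypothetical):

      from scipy import stats

      # Gamma prior on a failure rate lambda (events/year), updated with
      # k observed failures over T years of exposure (Poisson likelihood).
      a0, b0 = 2.0, 10.0   # hypothetical prior shape/rate: mean a0/b0 = 0.2/yr
      k, T = 1, 8.0        # hypothetical observations: 1 failure in 8 years

      a, b = a0 + k, b0 + T                 # conjugate Gamma posterior
      post = stats.gamma(a, scale=1.0 / b)
      print(f"posterior mean rate: {post.mean():.3f}/yr, "
            f"95% credible interval: {post.ppf(0.025):.3f}-{post.ppf(0.975):.3f}/yr")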

  15. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure, given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that captures the most essential features of the nature of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text.
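
    In the simplest setting the book's central quantities admit a closed form; a sketch for a linear limit state g = R - S with independent normal resistance and load (illustrative values):

      import numpy as np
      from scipy import stats

      # Hypothetical independent normal resistance R and load effect S.
      mu_R, sd_R = 350.0, 35.0   # e.g. kN
      mu_S, sd_S = 200.0, 40.0   # e.g. kN

      beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)   # reliability index
      pf = stats.norm.cdf(-beta)                    # probability of failure
      print(f"beta = {beta:.2f}, Pf = {pf:.2e}")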

  16. Verification of Bioanalytical Method for Quantification of Exogenous Insulin (Insulin Aspart) by the Analyser Advia Centaur® XP.

    Science.gov (United States)

    Mihailov, Rossen; Stoeva, Dilyana; Pencheva, Blagovesta; Pentchev, Eugeni

    2018-03-01

    In a number of cases the monitoring of patients with type I diabetes mellitus requires measurement of exogenous insulin levels. For the purpose of a clinical investigation of the efficacy of a medical device for administration of exogenous insulin aspart, verification of the method for measurement of this synthetic analogue of the hormone was needed. The information in the available medical literature on the measurement of the different exogenous insulin analogues is insufficient; thus, verification was required to be in compliance with the active standards in the Republic of Bulgaria. A manufacturer's method developed for the ADVIA Centaur XP immunoassay system (Siemens Healthcare) was used, which we verified using standard solutions and a patient serum pool spiked with the appropriate quantity of exogenous insulin aspart. The method was verified in accordance with the bioanalytical method verification criteria and the regulatory requirements for using a standard method: the CLIA chemiluminescence immunoassay on the ADVIA Centaur® XP. The following parameters were determined and monitored: intra-day precision and accuracy, inter-day precision and accuracy, limit of detection and lower limit of quantification, linearity, and analytical recovery. The routine application of the method for measurement of immunoreactive insulin using the ADVIA Centaur® XP analyzer is directed at the measurement of endogenous insulin. The method is applicable for measuring different types of exogenous insulin, including insulin aspart.

  17. Bioanalytical methods for food allergy diagnosis, allergen detection and new allergen discovery

    OpenAIRE

    Gasilova, Natalia; Girault, Hubert H

    2015-01-01

    For effective monitoring and prevention of food allergy, one of today's emerging health problems, existing diagnostic procedures and allergen detection techniques are constantly being improved. Meanwhile, new methods are also being developed, and more and more putative allergens are discovered. This review describes traditional methods and summarizes recent advances in the fast-evolving field of in vitro food allergy diagnosis, allergen detection in food products and discovery of new all...

  18. Bioanalytical methods for food allergy diagnosis, allergen detection and new allergen discovery.

    Science.gov (United States)

    Gasilova, Natalia; Girault, Hubert H

    2015-01-01

    For effective monitoring and prevention of food allergy, one of today's emerging health problems, existing diagnostic procedures and allergen detection techniques are constantly being improved. Meanwhile, new methods are also being developed, and more and more putative allergens are discovered. This review describes traditional methods and summarizes recent advances in the fast-evolving field of in vitro food allergy diagnosis, allergen detection in food products and discovery of new allergenic molecules. Special attention is paid to new diagnostic methods under laboratory development, such as various immuno- and aptamer-based assays, including immunoaffinity capillary electrophoresis. The latter technique shows the importance of MS not only for allergen detection but also for allergy diagnosis.

  19. A validated bioanalytical HPLC method for pharmacokinetic evaluation of 2-deoxyglucose in human plasma.

    Science.gov (United States)

    Gounder, Murugesan K; Lin, Hongxia; Stein, Mark; Goodin, Susan; Bertino, Joseph R; Kong, Ah-Ng Tony; DiPaola, Robert S

    2012-05-01

    2-Deoxyglucose (2-DG), an analog of glucose, is widely used to interfere with glycolysis in tumor cells and is studied as a therapeutic approach in clinical trials. To evaluate the pharmacokinetics of 2-DG, we describe the development and validation of a sensitive HPLC fluorescence method for the quantitation of 2-DG in plasma. Plasma samples were deproteinized with methanol and the supernatant was dried at 45°C. The residues were dissolved in methanolic sodium acetate-boric acid solution. 2-DG and other monosaccharides were derivatized to 2-aminobenzoic acid derivatives in a single step in the presence of sodium cyanoborohydride at 80°C for 45 min. The analytes were separated on a YMC ODS C₁₈ reversed-phase column using gradient elution. The excitation and emission wavelengths were set at 360 and 425 nm, respectively. The 2-DG calibration curves were linear over the range of 0.63-300 µg/mL with a limit of detection of 0.5 µg/mL. The assay provided satisfactory intra-day and inter-day precision with RSD less than 9.8%, and accuracy ranged from 86.8 to 110.0%. The HPLC method is reproducible and suitable for the quantitation of 2-DG in plasma. The method was successfully applied to characterize the pharmacokinetic profile of 2-DG in patients with advanced solid tumors. Copyright © 2011 John Wiley & Sons, Ltd.
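
    Precision and accuracy figures like those reported here come from QC replicates; a minimal sketch on hypothetical 2-DG QC data (the pooled inter-day RSD below is a simplification of the usual ANOVA-based estimate):

      import numpy as np

      # Hypothetical QC measurements (ug/mL) at a 10 ug/mL level:
      # three runs of five replicates, all values invented.
      runs = np.array([[9.1, 9.8, 10.3, 9.5, 10.9],
                       [8.9, 9.6, 10.8, 9.3, 10.1],
                       [9.7, 10.4, 9.0, 9.9, 10.6]])
      nominal = 10.0

      intra_rsd = 100.0 * runs.std(axis=1, ddof=1) / runs.mean(axis=1)
      inter_rsd = 100.0 * runs.std(ddof=1) / runs.mean()   # pooled, simplified
      accuracy = 100.0 * runs.mean() / nominal
      print(f"intra-day RSD per run: {np.round(intra_rsd, 1)} %")
      print(f"inter-day RSD: {inter_rsd:.1f}%, accuracy: {accuracy:.1f}%")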

  20. Bioanalytical methods for the determination of cocaine and metabolites in human biological samples.

    Science.gov (United States)

    Barroso, M; Gallardo, E; Queiroz, J A

    2009-08-01

    Determination of cocaine and its metabolites in biological specimens is of great importance, not only in clinical and forensic toxicology, but also in workplace drug testing. These compounds are normally screened for using sensitive immunological methods. However, screening methods are unspecific and, therefore, subsequent confirmation of presumably positive samples by a specific technique is mandatory. Although GC-MS-based techniques are still the most commonly used for confirmation of cocaine and its metabolites in biological specimens, the advent of LC-MS and LC-MS/MS has enabled the detection of even lower amounts of these drugs, which assumes particular importance when the available sample volume is small, as frequently occurs with oral fluid. This paper reviews recently published papers describing procedures for the detection of cocaine and metabolites, not only in the most commonly used specimens, such as blood and urine, but also in other 'alternative' matrices (e.g., oral fluid and hair), with a special focus on sample preparation and chromatographic analysis.

  1. Rapid and label-free bioanalytical method of alpha fetoprotein detection using LSPR chip

    Science.gov (United States)

    Kim, Dongjoo; Kim, Jinwoon; Kwak, Cheol Hwan; Heo, Nam Su; Oh, Seo Yeong; Lee, Hoomin; Lee, Go-Woon; Vilian, A. T. Ezhil; Han, Young-Kyu; Kim, Woo-Sik; Kim, Gi-bum; Kwon, Soonjo; Huh, Yun Suk

    2017-07-01

    Alpha fetoprotein (AFP) is a cancer marker, particularly for hepatocellular carcinoma (HCC). Normal levels of AFP are less than 20 ng/mL; however, levels can exceed 400 ng/mL in patients with HCC. Enzyme-linked immunosorbent assay (ELISA) and radioimmunoassay (RIA) have been employed for clinical diagnosis of AFP; however, these methods are time consuming and labor intensive. In this study, we developed a localized surface plasmon resonance (LSPR) based biosensor for simple and rapid detection of AFP. This biosensor consists of a UV-Vis spectrometer, a cuvette cell, and a biosensor chip nanopatterned with gold nanoparticles (AuNPs). In our LSPR biosensor, binding of AFP to the surface of the sensor chip led to an increase in the magnitude of the LSPR signals, which was measured by an ultraviolet-visible (UV-Vis) spectrometer. Our LSPR biosensor showed sufficient detectability of AFP at concentrations of 1 ng/mL to 1 μg/mL. Moreover, the overall procedure for detection of AFP was completed within 20 min. This biosensor could also be utilized for point-of-care testing (POCT) by employing a portable UV-Vis spectrometer. Owing to the simplicity and rapidity of the detection process, our LSPR biosensor is expected to replace traditional diagnostic methods for the early detection of diseases.

  2. Bioanalytical methods for determination of tamoxifen and its phase I metabolites: A review

    International Nuclear Information System (INIS)

    Teunissen, S.F.; Rosing, H.; Schinkel, A.H.; Schellens, J.H.M.; Beijnen, J.H.

    2010-01-01

    The selective estrogen receptor modulator tamoxifen is used in the treatment of early and advanced breast cancer and, in selected cases, for breast cancer prevention in high-risk subjects. The cytochrome P450 enzyme system and flavin-containing monooxygenase are responsible for the extensive metabolism of tamoxifen into several phase I metabolites that vary in toxicity and potency towards estrogen receptor (ER) alpha and ER beta. An extensive overview of publications on the determination of tamoxifen and its phase I metabolites in biological samples is presented. In these publications, techniques were used such as capillary electrophoresis and liquid, gas and thin-layer chromatography coupled with various detection techniques (mass spectrometry, ultraviolet or fluorescence detection, liquid scintillation counting and nuclear magnetic resonance spectroscopy). A trend is seen towards the use of liquid chromatography coupled to mass spectrometry (LC-MS). State-of-the-art LC-MS equipment allowed for identification of unknown metabolites and quantification of known metabolites, reaching lower limits of quantification in the sub-pg/mL range. Although tamoxifen is also metabolized into phase II metabolites, the number of publications reporting on phase II metabolism of tamoxifen is scarce. Therefore the focus of this review is on the phase I metabolites of tamoxifen. We conclude that in the past decades tamoxifen metabolism has been studied extensively and numerous metabolites have been identified. Assays have been developed for both the identification and quantification of tamoxifen and its metabolites in an array of biological samples. This review can be used as a resource for method transfer and development of analytical methods used to support pharmacokinetic and pharmacodynamic studies of tamoxifen and its phase I metabolites.

  3. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Science.gov (United States)

    Burckhardt, Bjoern B.; Laeer, Stephanie

    2015-01-01

    In the USA and Europe, medicines agencies are pushing the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss allowed in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects. The latter can compromise precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances made, and experiences gained in solid-phase extraction are presented by way of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method, comprising sample extraction by solid-phase extraction, was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers. PMID:25873972

  4. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Directory of Open Access Journals (Sweden)

    Bjoern B. Burckhardt

    2015-01-01

    In the USA and Europe, medicines agencies are pushing the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss allowed in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects. The latter can compromise precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances made, and experiences gained in solid-phase extraction are presented by way of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method, comprising sample extraction by solid-phase extraction, was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers.

  5. 11th GCC Closed Forum: cumulative stability; matrix stability; immunogenicity assays; laboratory manuals; biosimilars; chiral methods; hybrid LBA/LCMS assays; fit-for-purpose validation; China Food and Drug Administration bioanalytical method validation.

    Science.gov (United States)

    Islam, Rafiq; Briscoe, Chad; Bower, Joseph; Cape, Stephanie; Arnold, Mark; Hayes, Roger; Warren, Mark; Karnik, Shane; Stouffer, Bruce; Xiao, Yi Qun; van der Strate, Barry; Sikkema, Daniel; Fang, Xinping; Tudoroniu, Ariana; Tayyem, Rabab; Brant, Ashley; Spriggs, Franklin; Barry, Colin; Khan, Masood; Keyhani, Anahita; Zimmer, Jennifer; Caturla, Maria Cruz; Couerbe, Philippe; Khadang, Ardeshir; Bourdage, James; Datin, Jim; Zemo, Jennifer; Hughes, Nicola; Fatmi, Saadya; Sheldon, Curtis; Fountain, Scott; Satterwhite, Christina; Colletti, Kelly; Vija, Jenifer; Yu, Mathilde; Stamatopoulos, John; Lin, Jenny; Wilfahrt, Jim; Dinan, Andrew; Ohorodnik, Susan; Hulse, James; Patel, Vimal; Garofolo, Wei; Savoie, Natasha; Brown, Michael; Papac, Damon; Buonarati, Mike; Hristopoulos, George; Beaver, Chris; Boudreau, Nadine; Williard, Clark; Liu, Yansheng; Ray, Gene; Warrino, Dominic; Xu, Allan; Green, Rachel; Hayward-Sewell, Joanne; Marcelletti, John; Sanchez, Christina; Kennedy, Michael; Charles, Jessica St; Bouhajib, Mohammed; Nehls, Corey; Tabler, Edward; Tu, Jing; Joyce, Philip; Iordachescu, Adriana; DuBey, Ira; Lindsay, John; Yamashita, Jim; Wells, Edward

    2018-04-01

    The 11th Global CRO Council Closed Forum was held in Universal City, CA, USA on 3 April 2017. Representatives from international CRO members offering bioanalytical services attended in order to discuss scientific and regulatory issues specific to bioanalysis. The second CRO-Pharma Scientific Interchange Meeting was held on 7 April 2017, at which Pharma representatives shared their perspectives on the topics discussed earlier in the week with the CRO members. The issues discussed at the meetings included cumulative stability evaluations, matrix stability evaluations, the 2016 US FDA Immunogenicity Guidance and recent and unexpected FDA Form 483s on immunogenicity assays, the bioanalytical laboratory's role in writing PK sample collection instructions, biosimilars, CRO perspectives on the use of chiral versus achiral methods, hybrid LBA/LCMS assays, applications of fit-for-purpose validation and, at the Global CRO Council Closed Forum only, the status and trend of current regulated bioanalytical practice in China under the CFDA's new BMV policy. Conclusions from the discussions of these topics at both meetings are included in this report.

  6. Quantitative evaluation of the matrix effect in bioanalytical methods based on LC-MS: A comparison of two approaches.

    Science.gov (United States)

    Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna

    2018-06-05

    Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for quantifying the matrix effect. The CVs (%) of the internal-standard-normalized matrix factors recommended by the European Medicines Agency were evaluated against the internal-standard-normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences between the results obtained with the two calculation methods. After normalization with the internal standard, the CV (%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative for the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
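
    A short sketch of the EMA-style calculation referenced here, computing the IS-normalized matrix factor per matrix lot and its CV(%) from hypothetical peak areas:

      import numpy as np

      # Hypothetical peak areas from six matrix lots (post-extraction spiked)
      # and from neat solution at the same concentration; all values invented.
      analyte_matrix = np.array([9800., 10150., 9650., 10400., 9900., 10050.])
      is_matrix = np.array([50200., 51800., 49400., 53100., 50600., 51200.])
      analyte_neat, is_neat = 10000.0, 51000.0

      # IS-normalized matrix factor per lot: MF(analyte) / MF(IS)
      mf = (analyte_matrix / analyte_neat) / (is_matrix / is_neat)
      cv = 100.0 * mf.std(ddof=1) / mf.mean()
      print(f"IS-normalized MF per lot: {np.round(mf, 3)}")
      print(f"CV = {cv:.1f}%  (EMA guideline expectation: <= 15%)")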

  7. Bio-analytical method development and validation of Rasagiline by high performance liquid chromatography tandem mass spectrometry detection and its application to pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Ravi Kumar Konda

    2012-10-01

    A suitable bioanalytical method based on liquid–liquid extraction has been developed and validated for the quantification of Rasagiline in human plasma. Rasagiline-13C3 mesylate was used as the internal standard for Rasagiline. A Zorbax Eclipse Plus C18 (2.1 mm × 50 mm, 3.5 μm) column provided chromatographic separation of the analyte, followed by detection with mass spectrometry. The method involved a simple isocratic chromatographic condition and mass spectrometric detection in positive ionization mode using an API-4000 system. The total run time was 3.0 min. The proposed method has been validated over the linear range of 5–12000 pg/mL for Rasagiline. The intra-run and inter-run precision values were within 1.3%–2.9% and 1.6%–2.2%, respectively, for Rasagiline. The overall recovery for Rasagiline and the Rasagiline-13C3 mesylate analog was 96.9% and 96.7%, respectively. This validated method was successfully applied to a bioequivalence and pharmacokinetic study in human volunteers under fasting conditions. Keywords: High performance liquid chromatography, Mass spectrometry, Rasagiline, Liquid–liquid extraction

  8. Simple and Selective HPLC-UV/Vis Bioanalytical Method to Determine Aluminum Phthalocyanine Chloride in Skin Permeation Studies

    Directory of Open Access Journals (Sweden)

    Thaiene Avila Reis

    2018-01-01

    Considering the feasibility of aluminum phthalocyanine chloride (AlPcCl) application in topical photodynamic therapy of cutaneous tumors and the lack of HPLC methods capable of supporting skin permeation experiments with this compound, the aim of this study was to obtain a simple and selective chromatographic method for AlPcCl determination in skin matrices. A HPLC-UV/Vis method was developed using a normal-phase column operating at 30°C, an isocratic mobile phase of methanol : phosphoric acid (0.01 M) at 1.5 mL/min, and detection at 670 nm. The method exhibited (i) selectivity against various contaminants found in the different skin layers, (ii) high drug extraction capacity from the hair follicles (>70%) and remaining skin (>80%), and (iii) low limits of detection and quantification (0.03 and 0.09 μg/mL, respectively). The method was also linear in the range from 0.1 to 5.0 µg/mL (r = 0.9994) and demonstrated robustness with regard to experimental chromatographic parameters according to a factorial design. Lastly, the developed method was successfully tested in in vitro skin permeation studies of AlPcCl, proving its usefulness in the development of pharmaceutical delivery systems containing this drug for topical photodynamic therapy of skin cancers.

  9. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro has established a reliability program in support of its substantial nuclear program. Application of the reliability program to achieve both production and safety goals is described. The value of such a reliability program is evident in the record of Ontario Hydro's operating nuclear stations. The factors which have contributed to the success of the reliability program are identified as line management's commitment to reliability; selective and judicious application of reliability methods; establishment of performance goals and monitoring of in-service performance; and collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  10. Development and validation of a bioanalytical LC-MS method for the quantification of GHRP-6 in human plasma.

    Science.gov (United States)

    Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier

    2012-02-23

    Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW = 872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs such as the small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide with a high yield. Quantification was achieved by LC-MS in positive full-scan mode on a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a calibration range from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to allow analysis of samples at higher concentrations of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and its IS was demonstrated under all conditions the samples would experience during real study analyses. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Bioanalytical method development and validation for determination of metoprolol tartarate and hydrochlorothiazide using HPTLC in human plasma

    Directory of Open Access Journals (Sweden)

    Ambadas Ranganath Rote

    2013-12-01

    A simple, sensitive, rapid and economical chromatographic method has been developed for the determination of metoprolol tartrate and hydrochlorothiazide in human plasma using paracetamol as an internal standard. The analytical technique used for method development was high-performance thin-layer chromatography (HPTLC). A Camag HPTLC system with precoated silica gel 60F254 plates (20 cm × 10 cm, 250 µm thickness; E. Merck, Darmstadt, Germany) was used as the stationary phase. The mobile phase consisted of chloroform : methanol : ammonia (9:1:0.5, v/v/v). Densitometric analysis was carried out at a wavelength of 239 nm. The Rf values for hydrochlorothiazide, paracetamol and metoprolol tartrate were 0.13±0.04, 0.28±0.05 and 0.48±0.04, respectively. Plasma samples were extracted by protein precipitation with methanol. Concentrations of 200, 400, 600, 800, 1000 and 1200 ng/mL for hydrochlorothiazide and 2000, 4000, 6000, 8000, 10000 and 12000 ng/mL for metoprolol tartrate were used in plasma for the calibration curves. The percent recovery of metoprolol tartrate and hydrochlorothiazide was found to be 77.30% and 77.02%, respectively. The stability of metoprolol tartrate and hydrochlorothiazide in plasma was confirmed over three freeze-thaw cycles (-20 °C), on the bench for 24 hours, and post-preparatively for 48 hours. The proposed method was validated statistically and proved suitable for the determination of metoprolol tartrate and hydrochlorothiazide in human plasma.

  12. Advanced bioanalytics for precision medicine.

    Science.gov (United States)

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics and new applications of imaging techniques suitable for implementing precision medicine. Graphical abstract: Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  13. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare for reliability concepts to be used explicitly in the design of structures, but the problems of structural engineering are better understood through them. Some of the main methods for the estimation of the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo simulation is used in this paper because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer-aided simulation of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
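
    A minimal Monte Carlo sketch of the quantities discussed: the failure probability of a hypothetical pier limit state g = R - S and the corresponding reliability index (distributions and parameters are illustrative, not the paper's model):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n = 1_000_000   # increase if too few failures are observed

      # Hypothetical lognormal resistance R and Gumbel load effect S.
      R = rng.lognormal(mean=np.log(350.0), sigma=0.10, size=n)
      S = rng.gumbel(loc=180.0, scale=20.0, size=n)

      pf = np.mean(R - S < 0.0)      # Monte Carlo estimate of P(failure)
      beta = -stats.norm.ppf(pf)     # corresponding index of reliability
      print(f"Pf ~ {pf:.2e}, beta ~ {beta:.2f}")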

  14. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating software reliability for nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), where the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects from a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is shown that this method is capable of accurately estimating the remaining number of software defects of the on-demand type that directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed.
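
    A sketch of fitting the simplest model in this family, the Goel-Okumoto NHPP with mean value function m(t) = a(1 - e^(-bt)), by maximum likelihood on hypothetical failure times (the record's Bayesian treatment would instead place priors on a and b):

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical cumulative failure times (test-hours); values invented.
      times = np.array([12.0, 30.0, 55.0, 90.0, 140.0, 210.0, 320.0, 480.0])
      T = 500.0   # total test exposure

      def neg_log_lik(params):
          """Negative NHPP log-likelihood for m(t) = a * (1 - exp(-b t))."""
          a, b = np.exp(params)   # optimize on log scale to keep a, b > 0
          n = len(times)
          return -(n * np.log(a) + n * np.log(b)
                   - b * times.sum() - a * (1.0 - np.exp(-b * T)))

      fit = minimize(neg_log_lik, x0=np.log([10.0, 0.01]), method="Nelder-Mead")
      a, b = np.exp(fit.x)
      print(f"estimated total defects a = {a:.1f}, "
            f"estimated remaining defects = {a - len(times):.1f}")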

  15. Bioanalytical development and validation of liquid chromatographic–tandem mass spectrometric methods for the quantification of total and free cefazolin in human plasma and cord blood

    Directory of Open Access Journals (Sweden)

    Christopher A. Crutchfield

    2015-04-01

    Objectives: Cefazolin is a commonly prescribed β-lactam antibiotic for prophylaxis against skin infections following surgery, including caesarean sections. Assessment of maternal and neonatal exposure is important for correlating drug concentrations to clinical outcomes. Thus, bioanalytical methods for the quantification of both total and free cefazolin in maternal plasma and cord blood can assist in the comprehensive evaluation of cefazolin exposure. Design and methods: Specimen preparation for the measurement of total cefazolin was performed via protein precipitation with acetonitrile containing the internal standard cloxacillin. Ultrafiltration was used to isolate free cefazolin. Processed samples were analyzed on a Prelude SPLC system coupled to a TSQ Vantage triple quadrupole mass spectrometer. The methods were validated following FDA bioanalytical guidelines. Results: The analytical measuring ranges of these methods were 0.48–480 µg/mL and 0.048–48 µg/mL for total and free drug, respectively. Calibration curves were generated using 1/x2-weighted linear regression analysis. Total cefazolin demonstrated inter- and intra-assay precision of ≤20% at the LLOQ and ≤11.2% at other levels. Free cefazolin demonstrated inter- and intra-assay precision of ≤18.5% at the LLOQ and ≤12.6% at other levels. Accuracy (%DEV), carryover, matrix effects, recovery and stability studies were also acceptable based on FDA recommendations. Furthermore, it was demonstrated that samples prepared in cord blood can be accurately quantified from an adult plasma calibration curve, with recoveries ≤9.1% DIF and ≤11.9% DIF for total and free cefazolin, respectively. Conclusions: The described LC–MS/MS methods allow for the measurement of total and free cefazolin in both plasma and cord blood. Keywords: Mass spectrometry, Method validation, Cefazolin, Antibiotic, Ultrafiltration

  16. Outsourcing bioanalytical services at Janssen Research and Development: the sequel anno 2017.

    Science.gov (United States)

    Dillen, Lieve; Verhaeghe, Tom

    2017-08-01

    The strategy of outsourcing bioanalytical services at Janssen has been evolving over the last few years, and an update is given on the recent changes in our processes. In 2016, all internal GLP-related activities were phased out, and this decision led to a re-orientation of the in-house bioanalytical activities. As a consequence, in-depth experience with the validated bioanalytical assays for new drug candidates is currently gained together with the external partner, since development and validation of the assay and execution of GLP preclinical studies are now transferred to the CRO. The move to externalize more bioanalytical support has created opportunities to build even stronger partnerships with the CROs and to refocus internal resources. Case studies are presented illustrating challenges encountered during method development and validation at preferred partners when limited internal experience is available or when new technology is introduced.

  17. Reliability of Estimation Pile Load Capacity Methods

    Directory of Open Access Journals (Sweden)

    Yudhi Lastiasih

    2014-04-01

    For none of the numerous existing methods for predicting pile capacity is it known how accurate they are when compared with the actual ultimate capacity of piles tested to failure. The authors of the present paper have conducted such an analysis, based on 130 data sets from field loading tests. Out of these 130 data sets, only 44 could be analysed, of which 15 were tests conducted until the piles actually reached failure. The pile prediction methods used were: Brinch Hansen's method (1963), Chin's method (1970), Decourt's extrapolation method (1999), Mazurkiewicz's method (1972), Van der Veen's method (1953), and the quadratic hyperbolic method proposed by Lastiasih et al. (2012). It was found that all the above methods were sufficiently reliable when applied to data from pile loading tests loaded to failure. However, when applied to data from pile loading tests that did not reach failure, the methods that yielded lower values for the correction factor N are more recommended. Finally, the empirical method of Reese and O'Neill (1988) was found to be reliable enough to be used to estimate the Qult of a pile foundation based on soil data only.
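
    As an illustration of one of the extrapolation methods listed, a minimal sketch of Chin's (1970) method on hypothetical load-test data:

      import numpy as np

      # Hypothetical pile load test: settlement s (mm) vs applied load Q (kN).
      s = np.array([2.0, 4.5, 8.0, 13.0, 20.0, 30.0])
      Q = np.array([400.0, 640.0, 820.0, 950.0, 1040.0, 1100.0])

      # Chin's method: s/Q plotted against s approaches a straight line
      # s/Q = m*s + c near failure, and Qult is estimated as 1/m.
      m, c = np.polyfit(s, s / Q, 1)
      print(f"Chin extrapolated ultimate capacity: {1.0 / m:.0f} kN")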

  18. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  19. A novel liquid chromatography/tandem mass spectrometry (LC-MS/MS) based bioanalytical method for quantification of ethyl esters of Eicosapentaenoic acid (EPA) and Docosahexaenoic acid (DHA) and its application in pharmacokinetic study.

    Science.gov (United States)

    Viswanathan, Sekarbabu; Verma, P R P; Ganesan, Muniyandithevar; Manivannan, Jeganathan

    2017-07-15

    Omega-3 fatty acids are clinically useful, and the two marine omega-3 fatty acids eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) are prevalent in fish and fish oils. Omega-3 fatty acid formulations must undergo a rigorous regulatory process in order to obtain United States Food and Drug Administration (USFDA) approval as prescription drugs. In this connection, beyond quantifying the EPA and DHA fatty acids themselves, there is a need to quantify the levels of their ethyl esters in biological samples. In this study, we used reverse-phase high-performance liquid chromatography coupled with mass spectrometry (RP-HPLC-MS) for the method development. We developed a novel multiple reaction monitoring method, along with optimized parameters, for the quantification of EPA and DHA as ethyl esters. Additionally, we validated the bioanalytical method by conducting sensitivity, selectivity, precision and accuracy, carryover and matrix stability experiments. Furthermore, we applied the validated method to the evaluation of the pharmacokinetics of omega-3 fatty acid ethyl ester formulations. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Optimization and validation of bioanalytical SPE – HPLC method for the simultaneous determination of carbamazepine and its main metabolite, carbamazepine-10, 11-epoxide, in plasma

    Directory of Open Access Journals (Sweden)

    Jasmina Tonic – Ribarska

    2012-03-01

    Carbamazepine is widely used as an antiepileptic drug in the treatment of partial and generalized tonic-clonic seizures. Carbamazepine-10,11-epoxide is the most important metabolite of carbamazepine, because it is a pharmacologically active compound with anticonvulsant properties. Accordingly, routine analysis of carbamazepine-10,11-epoxide along with carbamazepine may provide optimal therapeutic monitoring of carbamazepine treatment. The aim of this study was to optimize and validate a simple and reliable solid-phase extraction method followed by RP-HPLC for the simultaneous determination of plasma levels of carbamazepine and carbamazepine-10,11-epoxide, in order to enable implementation of the method for therapeutic monitoring. The extraction of the analytes from the plasma samples was performed by means of a solid-phase extraction procedure. The separation was carried out on a reversed-phase column using isocratic elution with acetonitrile and water (35:65, v/v) as the mobile phase. The temperature was 30°C and UV detection was set at 220 nm. The extraction yield was more than 98% for all analytes, measured at four concentration levels across the linear concentration range. The method displayed excellent selectivity, sensitivity, linearity, precision and accuracy. Stability studies indicated that stock solutions and plasma samples were stable under different storage conditions, at least during the observed period. The method was successfully applied to determine carbamazepine and carbamazepine-10,11-epoxide in the plasma of epileptic patients treated with carbamazepine as monotherapy and in polytherapy. In conclusion, the proposed method is suitable for application in the therapeutic drug monitoring of epileptic patients undergoing treatment with carbamazepine.

  1. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  2. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module, NESSUS/FEM, is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A fast probability integration module, NESSUS/FPI, estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  3. An Evaluation Method of Equipment Reliability Configuration Management

    Science.gov (United States)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development, but due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. The evaluation method for equipment reliability configuration management determines the reliability management capabilities of an equipment development company. Reliability is not only designed in, but also managed into being. This paper evaluates reliability management capabilities using the reliability configuration capability maturity model (RCM-CMM) evaluation method.

  4. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only to assessing classification accuracy and disregard the reliability of land cover products. Reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation scheme to evaluate the reliability of land cover products, comprising two methods: result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved for the reliability analysis of the data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning than with traditional accuracy assessment methods. Process reliability evaluation, which requires no reference data, can facilitate validation and reflect trends in reliability to some extent.

  5. Validação em métodos cromatográficos para análises de pequenas moléculas em matrizes biológicas Chromatographic methods validation for analysis of small molecules in biological matrices

    OpenAIRE

    Neila Maria Cassiano; Juliana Cristina Barreiro; Lúcia Regina Rocha Martins; Regina Vincenzi Oliveira; Quezia Bezerra Cass

    2009-01-01

    Chromatographic methods are commonly used for analysis of small molecules in different biological matrices. An important step to be considered upon a bioanalytical method's development is the capacity to yield reliable and reproducible results. This review discusses validation procedures adopted by different governmental agencies, such as Food and Drug Administration (USA), European Union (EU) and Agência Nacional de Vigilância Sanitária (BR) for quantification of small molecules by bioanalyt...

  6. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine

    DEFF Research Database (Denmark)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon

    2013-01-01

    as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine...

  7. 3D-printed Bioanalytical Devices

    Science.gov (United States)

    Bishop, Gregory W; Satterwhite-Warden, Jennifer E; Kadimisetty, Karteek; Rusling, James F

    2016-01-01

    While 3D printing technologies first appeared in the 1980s, prohibitive costs, limited materials, and the relatively small number of commercially available printers confined applications mainly to prototyping for manufacturing purposes. As technologies, printer cost, materials, and accessibility continue to improve, 3D printing has found widespread implementation in research and development in many disciplines due to ease-of-use and relatively fast design-to-object workflow. Several 3D printing techniques have been used to prepare devices such as milli- and microfluidic flow cells for analyses of cells and biomolecules as well as interfaces that enable bioanalytical measurements using cellphones. This review focuses on preparation and applications of 3D-printed bioanalytical devices. PMID:27250897

  8. Development and validation of bioanalytical UHPLC-UV method for simultaneous analysis of unchanged fenofibrate and its metabolite fenofibric acid in rat plasma: Application to pharmacokinetics

    Directory of Open Access Journals (Sweden)

    Rayan G. Alamri

    2017-01-01

    Full Text Available A simple, precise, selective and fast ultra-high performance liquid chromatography (UHPLC) method with UV detection has been developed and validated for the simultaneous determination of the lipid-regulating agent fenofibrate and its metabolite fenofibric acid in rat plasma. The chromatographic separation was carried out on a reversed-phase Acquity® BEH C18 column using methanol–water (65:35, v/v) as the mobile phase. The isocratic flow was 0.3 ml/min with a rapid run time of 2.5 min, and UV detection was at 284 nm. The method was validated over a concentration range of 100–10,000 ng/ml (r² ⩾ 0.9993). The selectivity, specificity, recovery, accuracy and precision were validated for the determination of fenofibrate/fenofibric acid in rat plasma. The lower limits of detection and quantitation of the method were 30 and 90 ng/ml for fenofibrate and 40 and 100 ng/ml for fenofibric acid, respectively. The within- and between-day coefficients of variation were less than 5%. The validated method has been successfully applied to measure plasma concentrations in a pharmacokinetic study of fenofibrate in an animal model, illustrating the scope and application of the method.
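
    Limits of detection and quantitation of the kind reported here are commonly derived from the calibration fit as 3.3·σ/S and 10·σ/S, with σ the residual standard deviation and S the slope; a minimal sketch using invented calibration points over the 100–10,000 ng/ml range:

```python
# Least-squares calibration fit with ICH-style LOD/LOQ estimates.
# Concentrations and responses are illustrative, not the study's data.
import numpy as np

conc = np.array([100, 250, 500, 1000, 2500, 5000, 10000], float)  # ng/ml
resp = np.array([1.9, 5.2, 10.3, 20.8, 51.5, 103.0, 207.5])       # peak area

slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
resid_sd = np.sqrt(np.sum((resp - pred) ** 2) / (len(conc) - 2))
r2 = 1 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)

lod = 3.3 * resid_sd / slope   # limit of detection, ng/ml
loq = 10.0 * resid_sd / slope  # limit of quantitation, ng/ml
print(f"r^2 = {r2:.4f}, LOD ~ {lod:.0f} ng/ml, LOQ ~ {loq:.0f} ng/ml")
```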

  9. Bioanalytical method development and validation for the determination of glycine in human cerebrospinal fluid by ion-pair reversed-phase liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Jiang, Jian; James, Christopher A; Wong, Philip

    2016-09-05

    An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100–10,000 ng/mL and ¹³C₂,¹⁵N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with surrogate matrix and standard addition methods. The mean endogenous glycine concentration in a pooled human CSF determined on three days by using artificial CSF as a surrogate matrix and by the method of standard addition was found to be 748 ± 30.6 and 768 ± 18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine the endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated the reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
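
    The standard-addition cross-check described above amounts to regressing the response of spiked pools on the amount added and reading the endogenous level off the x-intercept; a minimal sketch with hypothetical area ratios (the abstract's 748 ng/mL figure is reused only for the comparison step):

```python
# Standard addition: regress response on added concentration; the
# endogenous level is the magnitude of the x-intercept (intercept/slope).
# Spike levels and area ratios are illustrative only.
import numpy as np

added = np.array([0, 500, 1000, 2000, 4000], float)   # ng/mL glycine spiked
response = np.array([0.75, 1.26, 1.74, 2.76, 4.77])   # analyte/IS area ratio

slope, intercept = np.polyfit(added, response, 1)
endogenous = intercept / slope          # ng/mL in the pooled CSF
print(f"endogenous glycine ~ {endogenous:.0f} ng/mL")

# Percent difference against a surrogate-matrix calibration estimate,
# as reported above (e.g., 748 vs 768 ng/mL -> about -2.6%).
surrogate_estimate = 748.0
print(f"difference: {100*(surrogate_estimate - endogenous)/endogenous:+.1f}%")
```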

  10. Extrapolation Method for System Reliability Assessment

    DEFF Research Database (Denmark)

    Qin, Jianjun; Nishijima, Kazuyoshi; Faber, Michael Havbro

    2012-01-01

    The present paper presents a new scheme for probability integral solution for system reliability analysis, which takes basis in the approaches by Naess et al. (2009) and Bucher (2009). The idea is to evaluate the probability integral by extrapolation, based on a sequence of MC approximations of integrals with scaled domains. The performance of this class of approximation depends on the approach applied for the scaling and the functional form utilized for the extrapolation. A scheme for this task is derived here taking basis in the theory of asymptotic solutions to multinormal probability integrals. Numerical examples illustrate that the proposed scheme is efficient and adds to generality for this class of approximations for probability integrals.
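
    A toy version of the extrapolation idea: estimate the failure probability by crude Monte Carlo on a sequence of scaled-down (easier) thresholds, fit a parametric form to log p(λ), and extrapolate to the target λ = 1. The limit state and the quadratic tail form below are simplifying assumptions for illustration, not the scheme derived in the paper:

```python
# Reliability estimation by extrapolation of Monte Carlo estimates on
# scaled failure domains. For a standard normal margin, log p(lambda) is
# close to quadratic in lambda, so a polynomial fit extrapolates the cheap
# estimates at small lambda to the rare-event target at lambda = 1.
# Expect order-of-magnitude agreement, not exactness, in this toy setup.
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
beta = 4.0                                  # target: p_f = Phi(-4) ~ 3.2e-5
n = 400_000
z = rng.standard_normal(n)                  # standardized safety margin

lams = np.array([0.3, 0.4, 0.5, 0.6])       # scaled (easier) thresholds
p_hat = np.array([(z >= lam * beta).mean() for lam in lams])

coef = np.polyfit(lams, np.log(p_hat), 2)   # quadratic fit of log p(lambda)
p_extrap = np.exp(np.polyval(coef, 1.0))    # extrapolate to the target

p_exact = 0.5 * (1 - erf(beta / sqrt(2)))
print(f"extrapolated p_f ~ {p_extrap:.2e}, exact {p_exact:.2e}")
```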

  11. Development and Validation of a Bioanalytical Method for Quantification of 2,6-Bis-(4-hydroxy-3-methoxybenzylidene)-cyclohexanone (BHMC) in Rat Plasma

    Directory of Open Access Journals (Sweden)

    Yu Zhao Lee

    2012-12-01

    Full Text Available A sensitive and accurate high performance liquid chromatography with ultraviolet/visible light detection (HPLC-UV/VIS) method for the quantification of 2,6-bis-(4-hydroxy-3-methoxybenzylidene)-cyclohexanone (BHMC) in rat plasma was developed and validated. BHMC and the internal standard, harmaline, were extracted from plasma samples by a simple liquid–liquid extraction using 95% ethyl acetate and 5% methanol. Plasma concentrations of BHMC and the internal standard were analyzed by reversed-phase chromatography using a C18 column (150 × 4.6 mm I.D., particle size 5 µm) and elution with a gradient mobile phase of water and methanol at a flow rate of 1.0 mL/min. Detection of BHMC and the internal standard was done at a wavelength of 380 nm. The limit of quantification was 0.02 µg/mL. The calibration curve was linear (R² > 0.999) over the concentration range of 0.02–2.5 µg/mL. Intra- and inter-day precision were less than 2% coefficient of variation. The validated method was then applied to a pharmacokinetic study in rats by intravenous administration of BHMC at a single dose of 10 mg/kg. Pharmacokinetic parameters such as half-life, maximum plasma concentration, volume of distribution, clearance and elimination rate constant for BHMC were calculated.
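
    The parameters listed at the end are standard non-compartmental quantities for an intravenous bolus; a minimal sketch of their computation, with invented plasma concentrations and a hypothetical 0.25 kg rat:

```python
# Non-compartmental PK for an IV bolus: elimination rate constant from the
# terminal log-linear phase, then half-life, AUC (trapezoidal), clearance
# and volume of distribution. All data are invented for illustration.
import numpy as np

dose = 10.0 * 0.25           # mg, hypothetical 0.25 kg rat dosed at 10 mg/kg
t = np.array([0.083, 0.25, 0.5, 1, 2, 4, 6, 8], float)        # h
c = np.array([2.2, 1.9, 1.6, 1.2, 0.65, 0.20, 0.062, 0.019])  # ug/mL

# Terminal phase (last four points): ln C = ln C0 - kel * t
kel = -np.polyfit(t[-4:], np.log(c[-4:]), 1)[0]    # 1/h
t_half = np.log(2) / kel                           # h

auc_0t = float(np.sum((c[1:] + c[:-1]) / 2 * np.diff(t)))  # ug*h/mL
auc_inf = auc_0t + c[-1] / kel            # extrapolated to infinity
cl = dose * 1000 / auc_inf                # mL/h (dose in ug / AUC)
vd = cl / kel                             # mL

print(f"kel={kel:.2f}/h  t1/2={t_half:.2f} h  AUCinf={auc_inf:.2f} ug*h/mL  "
      f"CL={cl:.0f} mL/h  Vd={vd:.0f} mL")
```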

  12. Reliability of non-destructive testing methods

    International Nuclear Information System (INIS)

    Broekhoven, M.J.G.

    1988-01-01

    This contribution regards the results of an evaluation of the reliability of radiography (X-rays and gamma-rays) and of manual and mechanized/automated ultrasonic examination by generally accepted codes/rules, with respect to the detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates of 30 and 50 mm thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical of steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. In addition, some results are presented obtained from research on advanced UT techniques, viz. the time-of-flight diffraction and flaw-tip deflection techniques. (author)

  13. Reliability of non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Broekhoven, M J.G. [Ministry of Social Affairs, (Netherlands)

    1988-12-31

    This contribution regards the results of an evaluation of the reliability of radiography (X-rays and gamma-rays) and of manual and mechanized/automated ultrasonic examination by generally accepted codes/rules, with respect to the detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates of 30 and 50 mm thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical of steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. In addition, some results are presented obtained from research on advanced UT techniques, viz. the time-of-flight diffraction and flaw-tip deflection techniques. (author). 4 refs.

  14. A fully validated bioanalytical method using an UHPLC-MS/MS system for quantification of DNA and RNA oxidative stress biomarkers.

    Science.gov (United States)

    Cervinkova, Barbora; Krcmova, Lenka Kujovska; Sestakova, Veronika; Solichova, Dagmar; Solich, Petr

    2017-05-01

    A new, rapid and effective ultra-high-performance liquid chromatography method with mass spectrometry detection is described for the separation and quantification of 8-hydroxy-2′-deoxyguanosine, 8-hydroxyguanosine and creatinine in human urine. The present study uses an isotope-labelled internal standard ([¹⁵N]₅-8-hydroxy-2′-deoxyguanosine), a BIO core-shell stationary phase and an isocratic elution of methanol and water. Sample preparation of human urine was performed by solid-phase extraction (SPE) on Oasis HLB cartridges with methanol/water 50:50 (v/v) elution. Extraction recoveries ranged from 98.1% to 109.2%. Biological extracts showed high short-term stability. Several aspects of this procedure make it suitable for both clinical and research purposes: a short elution time of less than 3.2 min, an intra-day precision of 2.5–8.9%, an inter-day precision of 3.4–8.7% and low limits of quantification (27.7 nM for 8-hydroxyguanosine, 6.0 nM for 8-hydroxy-2′-deoxyguanosine). Finally, simultaneous analysis of DNA and RNA oxidative stress biomarkers is a useful tool for monitoring disease progression in neurodegenerative disorders and cancer. Graphical abstract: UHPLC-MS/MS analysis of DNA and RNA oxidative stress biomarkers.

  15. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital instrumentation and control (I&C) applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes an evaluation method based on the defect density characteristics found by verification and validation (V&V) at each stage of the software development life cycle, by which the operating reliability level of the software can be predicted before delivery. This helps to improve the reliability of NPP safety-important software. (authors)

  16. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  17. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.

  18. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    Full Text Available A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. Available first- or second-order reliability methods (FORM/SORM) will find it challenging to estimate the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes, is proposed. The method is clarified with the help of several numerical examples.

  19. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how of the reliability and safety design and analysis techniques of VTT has been established over several years of analyzing the reliability of the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, the conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is continuing in a number of research and development projects.

  20. Cellphone-based devices for bioanalytical sciences

    Science.gov (United States)

    Vashist, Sandeep Kumar; Mudanyali, Onur; Schneider, E.Marion; Zengerle, Roland; Ozcan, Aydogan

    2014-01-01

    During the last decade, there has been a rapidly growing trend toward the use of cellphone-based devices (CBDs) in bioanalytical sciences. For example, they have been used for digital microscopy, cytometry, read-out of immunoassays and lateral flow tests, electrochemical and surface plasmon resonance based bio-sensing, colorimetric detection and healthcare monitoring, among others. The cellphone can be considered one of the most promising devices for the development of next-generation point-of-care (POC) diagnostics platforms, enabling mobile healthcare delivery and personalized medicine. With more than 6.5 billion cellphone subscribers worldwide and approximately 1.6 billion new devices being sold each year, cellphone technology is also creating new business and research opportunities. Many cellphone-based devices, such as those targeted for diabetic management, weight management, and monitoring of blood pressure and pulse rate, have already become commercially available in recent years. In addition to such monitoring platforms, several other CBDs are also being introduced, targeting e.g., microscopic imaging and sensing applications for medical diagnostics using novel computational algorithms and components already embedded on cellphones. This manuscript aims to review these recent developments in CBDs for bioanalytical sciences along with some of the challenges involved and the future opportunities. PMID:24287630

  1. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  2. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2010-01-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  3. A method to assign failure rates for piping reliability assessments

    International Nuclear Information System (INIS)

    Gamble, R.M.; Tagart, S.W. Jr.

    1991-01-01

    This paper reports on a simplified method that has been developed to assign failure rates that can be used in reliability and risk studies of piping. The method can be applied on a line-by-line basis by identifying line and location specific attributes that can lead to piping unreliability from in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability also was performed. The data from this survey provides a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies

  4. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been carried out. The influence of the physical dimensions of the materials and of the sample size on the reliability level of the results was investigated. The objective was to optimize the sample size, in order to obtain reliable results while optimizing computation time. (author). 5 refs, 8 figs
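
    The sample-size/reliability trade-off described here follows from the 1/√N behaviour of the Monte Carlo standard error; a toy sketch, assuming a simple exponential-attenuation transmission problem (no scattering, invented cross-section and slab thickness):

```python
# Monte Carlo estimate of the probability that a neutron penetrates a slab
# (toy exponential attenuation, no scattering). The relative standard error
# of the tally shrinks like 1/sqrt(N), which is what drives the choice of
# sample size versus computing time.
import numpy as np

rng = np.random.default_rng(7)
sigma_t = 1.0        # total macroscopic cross-section (1/cm), assumed
thickness = 5.0      # slab thickness (cm), assumed; exact answer exp(-5)

for n in (1_000, 10_000, 100_000, 1_000_000):
    free_paths = rng.exponential(1.0 / sigma_t, size=n)
    p_hat = (free_paths > thickness).mean()
    rel_err = np.sqrt((1 - p_hat) / (p_hat * n)) if p_hat > 0 else float("inf")
    print(f"N={n:>9,}  p={p_hat:.5f}  relative std. error ~ {rel_err:.1%}")
print(f"exact: {np.exp(-thickness):.5f}")
```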

  5. Modifying nodal pricing method considering market participants optimality and reliability

    Directory of Open Access Journals (Sweden)

    A. R. Soofiabadi

    2015-06-01

    Full Text Available This paper develops a method for nodal pricing and a market clearing mechanism that consider the reliability of the system. The effects of component reliability on electricity price, market participants' profit and system social welfare are considered. This paper considers reliability both for evaluating market participants' optimality and for fair pricing and market clearing. To achieve fair pricing, the nodal price is obtained through a two-stage optimization problem, and to achieve a fair market clearing mechanism, comprehensive criteria are introduced for evaluating the optimality of market participants. The social welfare of the system and system efficiency are increased under the proposed modified nodal pricing method.

  6. A reliable and validated LC-MS/MS method for the simultaneous quantification of 4 cannabinoids in 40 consumer products.

    Directory of Open Access Journals (Sweden)

    Qingfang Meng

    Full Text Available In the past 50 years, Cannabis sativa (C. sativa) has gone from a substance essentially prohibited worldwide to one that is gaining acceptance both culturally and legally in many countries for medicinal and recreational use. As additional jurisdictions legalize Cannabis products and the variety and complexity of these products surpass the classical dried plant material, appropriate methods for measuring the biologically active constituents are paramount to ensure safety and regulatory compliance. While there are numerous active compounds in C. sativa, the primary cannabinoids of regulatory and safety concern are (−)-Δ⁹-tetrahydrocannabinol (THC), cannabidiol (CBD), and their respective acidic forms, THCA-A and CBDA. Using the US Food and Drug Administration (FDA) bioanalytical method validation guidelines, we developed a sensitive, selective, and accurate method for the simultaneous analysis of CBD, CBDA, THC, and THCA-A in oils, and of THC and CBD in more complex matrices. This HPLC-MS/MS method was simple and reliable, using standard sample dilution and homogenization, an isocratic chromatographic separation, and a triple quadrupole mass spectrometer. The lower limit of quantification (LLOQ) for the analytes was 0.195 ng/mL over a 0.195–50.0 ng/mL range of quantification, with a coefficient of correlation of >0.99. Average intra-day and inter-day accuracies were 94.2–112.7% and 97.2–110.9%, respectively. This method was used to quantify CBD, CBDA, THC, and THCA-A in 40 commercial hemp products representing a variety of matrices, including oils, plant materials, and creams/cosmetics. All products tested met the federal regulatory restrictions on THC content in Canada, with the exception of one oil-based product. Overall, the method proved amenable to the analysis of various commercial products including oils, creams, and plant material, and may be diagnostically indicative of adulteration with non-hemp C. sativa, specialized hemp cultivars, or unique manufacturing methods.

  7. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach for obtaining a software reliability value is proposed in this paper.
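
    A non-homogeneous Poisson process SRGM can be made concrete with the classic Goel-Okumoto mean-value function m(t) = a(1 − e^(−bt)); a minimal sketch fitting it to invented failure times by maximum likelihood (a simpler stand-in for the paper's Bayesian covariate scheme):

```python
# Goel-Okumoto NHPP software reliability growth model, m(t) = a(1-exp(-bt)),
# fitted by maximum likelihood to cumulative failure times observed up to T.
# Failure-time data are invented for illustration.
import numpy as np

times = np.array([ 8, 21, 33, 57, 80, 110, 141, 180, 240, 320,
                  420, 550, 700, 910], float)   # hours of testing, invented
T, n, s = times.max(), len(times), times.sum()

def score(b):
    # derivative of the profile log-likelihood in b (a is profiled out)
    return n / b - s - n * T * np.exp(-b * T) / (1 - np.exp(-b * T))

lo, hi = 1e-6, 1.0                  # bracket for bisection (score changes sign)
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if score(mid) > 0 else (lo, mid)
b = 0.5 * (lo + hi)
a = n / (1 - np.exp(-b * T))        # expected total number of defects

print(f"a = {a:.1f} total defects, b = {b:.5f} /h, "
      f"remaining ~ {a - n:.1f} defects")
```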

  8. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and verify their applicability to early design stages. Several methods were evaluated on their support to synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  9. Bioanalytical outsourcing strategy at Janssen Research and Development.

    Science.gov (United States)

    Verhaeghe, Tom

    2014-05-01

    The times when all bioanalytical work was supported in-house are long behind us. In the modern bioanalytical laboratory, workload is divided between in-house support and outsourcing to contract research organizations. This paper outlines the outsourcing strategy of the Janssen-regulated bioanalytical group. Keeping the knowledge of the assay and the compound internally is a cornerstone of this strategy and is a driver for balancing the workload between the internal laboratory and contract laboratories. The number of contract laboratories that are being used is limited and criteria for selecting laboratories are discussed. Special attention is paid to the experience with outsourcing clinical studies to China.

  10. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problems posed by the variety of uncertainty variables that coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  11. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part or in material properties, or due to a lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions in the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in a high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the sequential optimization and reliability assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment, which are decoupled within each cycle. This leads to quick improvement of the design from one cycle to the next and an increase in computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations.

  12. Sample preparation for large-scale bioanalytical studies based on liquid chromatographic techniques.

    Science.gov (United States)

    Medvedovici, Andrei; Bacalum, Elena; David, Victor

    2018-01-01

    The quality of the analytical data obtained in large-scale and long-term bioanalytical studies based on liquid chromatography depends on a number of experimental factors, including the choice of sample preparation method. This review discusses this tedious part of bioanalytical studies, as applied to large-scale samples and using liquid chromatography coupled with different detector types as the core analytical technique. The main sample preparation methods covered in this paper are protein precipitation, liquid-liquid extraction, solid-phase extraction, derivatization and their variants. They are discussed in terms of analytical performance, fields of application, advantages and disadvantages. The cited literature covers mainly the analytical achievements of the last decade, although several earlier papers that have become more valuable over time are also included in this review. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  14. New practical algorithm for modelling analyte recovery in bioanalytical reversed phase and mixed-mode solid phase extraction

    NARCIS (Netherlands)

    Hendriks, G.; Uges, D. R. A.; Franke, J. P.

    2008-01-01

    Solid phase extraction (SPE) is a widely used method for sample cleanup and sample concentration in bioanalytical sample preparation. A few methods to model the retention behaviour on SPE cartridges have been described previously but they are either not applicable to ionised species or are not

  15. Level III Reliability methods feasible for complex structures

    NARCIS (Netherlands)

    Waarts, P.H.; Boer, A. de

    2001-01-01

    The paper describes a comparison between three types of reliability methods: the code-type level I method used by a designer, a full level I method and a level III method. Two cases that are typical of civil engineering practice are considered: a cable-stayed bridge subjected to traffic load and the installation of a soil retaining sheet

  16. Developing a reliable signal wire attachment method for rail.

    Science.gov (United States)

    2014-11-01

    The goal of this project was to develop a better attachment method for rail signal wires to improve the reliability of signaling : systems. EWI conducted basic research into the failure mode of current attachment methods and developed and tested a ne...

  17. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonate vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  18. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative
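
    FORM itself reduces to locating the most probable point (MPP) in standard normal space and reading the reliability index β off its distance from the origin; a minimal Hasofer-Lind/Rackwitz-Fiessler sketch for a toy limit state (not the pultrusion model):

```python
# First-order reliability method via the HL-RF iteration on a toy limit
# state g(u) in standard normal space. The limit state below is an
# assumption for illustration only; failure corresponds to g(u) <= 0.
import numpy as np
from math import erf, sqrt

def g(u):
    return 4.0 - u[0] - u[1] + 0.05 * u[0] ** 2

def grad_g(u):
    return np.array([-1.0 + 0.1 * u[0], -1.0])

u = np.zeros(2)
for _ in range(100):
    gr = grad_g(u)
    # HL-RF step: move to the foot of the linearized limit state
    u_new = (gr @ u - g(u)) * gr / (gr @ gr)
    if np.linalg.norm(u_new - u) < 1e-10:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)                 # Hasofer-Lind reliability index
pf = 0.5 * (1 - erf(beta / sqrt(2)))     # first-order failure probability
print(f"u* = {np.round(u, 3)}, beta = {beta:.3f}, pf ~ {pf:.2e}")
```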

  19. Reliably detectable flaw size for NDE methods that use calibration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-04-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have artificial flaws of known size, such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for the detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from artificial flaws used in the calibration process to determine the reliably detectable flaw size.
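
    A common POD model of the kind covered by MIL-HDBK-1823 regresses log signal response on log flaw size ("â versus a"), converts the decision threshold into a POD curve, and reads off a90; a minimal sketch with invented data and threshold (a real analysis would also construct the 95% confidence bound, a90/95):

```python
# "a-hat versus a" POD model: linear fit of log signal vs. log flaw size,
# then POD(a) = Phi((ln a - mu) / sigma) from the detection threshold.
# Flaw sizes, responses and threshold are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
a = np.linspace(0.2, 2.0, 40)                    # flaw size, mm (invented)
log_ahat = 1.0 + 1.5 * np.log(a) + rng.normal(0, 0.25, a.size)

b1, b0 = np.polyfit(np.log(a), log_ahat, 1)      # signal-response fit
resid = log_ahat - (b0 + b1 * np.log(a))
sigma_eps = resid.std(ddof=2)

log_thresh = 0.8                                  # detection threshold (log signal)
mu = (log_thresh - b0) / b1                       # median-detectable log size
sigma = sigma_eps / b1

z90 = 1.2816                                      # standard normal 90% quantile
a90 = np.exp(mu + z90 * sigma)
print(f"a50 = {np.exp(mu):.3f} mm, a90 = {a90:.3f} mm")
```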

  20. HUMAN RELIABILITY ANALYSIS DENGAN PENDEKATAN COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Full Text Available Workplace accidents in grinding and welding have ranked highest over the last five years at PT. X. These accidents are caused by human error, which arises from the influence of the physical and non-physical work environment. This study uses scenarios to predict and reduce the likelihood of human error using the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method for obtaining the Cognitive Failure Probability (CFP), which can be determined in two ways: the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of errors in grinding and welding work are the adequacy of the organization, the adequacy of the man-machine interface (MMI) and operational support, the availability of procedures/plans, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning, with a CFP of 0.3; in welding work it is execution, with a CFP of 0.18. To reduce cognitive error values in grinding and welding work, the recommendations are routine training, more detailed work instructions, and familiarization with the equipment. Keywords: CREAM (Cognitive Reliability and Error Analysis Method), HRA (human reliability analysis), cognitive error

  1. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  2. An overview of reliability methods in mechanical and structural design

    Science.gov (United States)

    Wirsching, P. H.; Ortiz, K.; Lee, S. J.

    1987-01-01

    An evaluation is made of modern methods of fast probability integration and Monte Carlo treatment for the assessment of structural systems' and components' reliability. Fast probability integration methods are noted to be more efficient than Monte Carlo ones. This is judged to be an important consideration when several point probability estimates must be made in order to construct a distribution function. An example illustrating the relative efficiency of the various methods is included.

  3. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick-section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments and then, independently, we assess the likely variability in team-to-team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick-section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)

  4. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    There have been many attempts to estimate a reliability function. In the 20th ESReDA seminar, a new nonparametric method was proposed. The major point of that paper is how to use censored data efficiently. Generally there are three kinds of approaches to estimating a reliability function in a nonparametric way, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. These three methods have some limitations, so we suggest an advanced method that reflects censored information more efficiently. In many instances there will be a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by the process of differentiation. It is well known that the three methods generally used to estimate a reliability function in a nonparametric way have maximum likelihood estimators that exist uniquely. The MLE of the new method is therefore derived in this study. The procedure to calculate the MLE is similar to that of the PL estimator. The difference between the two is that in the new method the mass (or weight) of each observation influences the others, whereas in the PL estimator it does not.
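
    The product-limit estimator that the comparison builds on multiplies conditional survival fractions at each observed failure time, with censored units simply leaving the risk set; a minimal sketch with invented right-censored lifetimes:

```python
# Product-limit (Kaplan-Meier) estimate of the reliability function from
# right-censored lifetime data. Times and censoring flags are invented.
import numpy as np

time = np.array([ 5,  8, 12, 12, 17, 23, 23, 30, 34, 41], float)
fail = np.array([ 1,  1,  1,  0,  1,  1,  0,  1,  0,  1])  # 0 = censored

order = np.argsort(time)
time, fail = time[order], fail[order]

r_hat, at_risk = 1.0, len(time)
print(" t    R(t)")
for t in np.unique(time):
    d = int(((time == t) & (fail == 1)).sum())   # failures at time t
    if d:
        r_hat *= 1 - d / at_risk                 # conditional survival
        print(f"{t:4.0f}  {r_hat:.3f}")
    at_risk -= int((time == t).sum())            # remove failed + censored
```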

  5. Reliability methods in nuclear power plant ageing management

    International Nuclear Information System (INIS)

    Simola, K.

    1999-01-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant- specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  6. Reliability methods in nuclear power plant ageing management

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K. [VTT Automation, Espoo (Finland). Industrial Automation

    1999-07-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant- specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  7. Selected Methods For Increases Reliability The Of Electronic Systems Security

    Directory of Open Access Journals (Sweden)

    Paś Jacek

    2015-11-01

    Full Text Available The article presents issues related to different methods of increasing the reliability of electronic security systems (ESS), for example a fire alarm system (SSP). The reliability of the SSP, in the descriptive sense, is its capacity to carry out its preset function (e.g. fire protection of an airport, a port, a logistics base, etc.) at a certain time and under certain conditions (e.g. environmental), despite possible non-compliance by a specific subset of elements of the system. An analysis of the available literature on ESS-SSP shows no studies on methods of increasing reliability (several works on similar topics exist, but they concern intrusion, i.e. burglary and robbery). The analysis is based on the set of all paths in the system that determine the suitability of the SSP for the fire-event scenario (devices critical for security).

  8. A method of predicting the reliability of CDM coil insulation

    International Nuclear Information System (INIS)

    Kytasty, A.; Ogle, C.; Arrendale, H.

    1992-01-01

    This paper presents a method of predicting the reliability of the Collider Dipole Magnet (CDM) coil insulation design. The method proposes a probabilistic treatment of electrical test data, stress analysis, material properties variability and loading uncertainties to give the reliability estimate. The approach taken to predict reliability of design related failure modes of the CDM is to form analytical models of the various possible failure modes and their related mechanisms or causes, and then statistically assess the contributions of the various contributing variables. The probability of the failure mode occurring is interpreted as the number of times one would expect certain extreme situations to combine and randomly occur. One of the more complex failure modes of the CDM will be used to illustrate this methodology

  9. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from the safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  10. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  11. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  12. Planning of operation & maintenance using risk and reliability based methods

    DEFF Research Database (Denmark)

    Florian, Mihai; Sørensen, John Dalsgaard

    2015-01-01

    Operation and maintenance (OM) of offshore wind turbines contributes with a substantial part of the total levelized cost of energy (LCOE). The objective of this paper is to present an application of risk- and reliability-based methods for planning of OM. The theoretical basis is presented...

  13. Assessment of reliability of Greulich and Pyle (gp) method for ...

    African Journals Online (AJOL)

    Background: Greulich and Pyle standards are the most widely used age estimation standards all over the world. The applicability of the Greulich and Pyle standards to populations which differ from their reference population is often questioned. This study aimed to assess the reliability of Greulich and Pyle (GP) method for ...

  14. Statistical Bayesian method for reliability evaluation based on ADT data

    Science.gov (United States)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict a product's reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, some limitations remain, such as an imprecise solution process and imprecise estimation of the degradation ratio, which may affect the accuracy of the acceleration model and the extrapolated values. Moreover, the usual solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, the Wiener process and an acceleration model are chosen; second, the initial values of the degradation model and the parameters of the prior and posterior distributions under each stress level are calculated, with updating and iteration of the estimated values; third, the lifetime and reliability values are estimated on the basis of the estimated parameters; finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is quite effective and accurate in estimating the lifetime and reliability of a product.
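
    For a Wiener degradation process X(t) = μt + σB(t), the drift and diffusion have closed-form MLEs from the increments, and the lifetime at a failure threshold D follows the inverse-Gaussian first-passage law; a minimal single-stress sketch with simulated data (the paper additionally links parameters across ADT stress levels via an acceleration model and Bayesian updating):

```python
# Wiener-process degradation model: MLEs of drift/diffusion from equally
# spaced increments, then reliability R(t) = P(first passage of D > t).
# All parameters and data below are assumed values for illustration.
import numpy as np
from math import erf, exp, sqrt

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

rng = np.random.default_rng(11)
mu_true, sigma_true, dt, n = 0.5, 0.8, 1.0, 200
dx = mu_true * dt + sigma_true * sqrt(dt) * rng.standard_normal(n)

mu = dx.mean() / dt                          # drift MLE
sigma2 = ((dx - mu * dt) ** 2).mean() / dt   # diffusion MLE

D = 40.0                                     # failure threshold, assumed
def reliability(t):
    s = sqrt(sigma2 * t)
    return Phi((D - mu * t) / s) - exp(2 * mu * D / sigma2) * Phi(-(D + mu * t) / s)

for t in (40, 60, 80, 100):
    print(f"R({t}) = {reliability(t):.3f}")
```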

  15. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including their scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  16. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  17. BIOANALYTICAL STANDARDIZING FOR SEROLOGICAL DIAGNOSTIC MEDICAL DEVICES

    Directory of Open Access Journals (Sweden)

    A. Yu. Galkin

    2015-04-01

    In this article we analyzed national and international regulations concerning the quality and safety of medical devices for in vitro diagnostics and discussed the possibility of partially applying the recommendations of the State Pharmacopoeia of Ukraine to this type of product. The main guiding regulatory documents establishing quality and safety requirements for serological diagnostic products are the Technical Regulation on medical devices for in vitro diagnostics, DSTU ISO 13485 "Medical devices. Quality management systems. Regulatory requirements", and DSTU ISO/IEC 17025 "General requirements for the competence of testing and calibration laboratories". The corresponding requirements of the State Pharmacopoeia of Ukraine used for drug standardization cannot be applied directly to medical devices for in vitro diagnostics owing to a number of features: serological diagnostic products are designed to determine an unknown concentration of a particular analyte in biological material, and diagnostic kits have to include control samples (internal standards) that need to be calibrated. The following parameters of bioanalytical standardization and validation were determined for qualitative (semi-quantitative) test kits for serological diagnosis: precision (repeatability, intra-laboratory precision and reproducibility), diagnostic and analytical specificity, and diagnostic sensitivity. For quantitative test kits, additional parameters such as accuracy, linearity, analytical sensitivity and range must also be inspected.

  18. Survey of methods used to assess human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  19. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) measurements in detecting plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with the FC and a digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare the different conditions of the samples and to assess inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of the area covered by disclosed plaque in the FC images presented the highest discriminatory power. CONCLUSION: The Turesky index and FC images with disclosing present good reliability and discriminatory power in quantifying dental plaque.

  20. A generic method for estimating system reliability using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.
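
    As a toy illustration of the end product of such a construction (a hand-specified sketch, not the K2 structure-learning step), the reliability of a small series-parallel system can be read off a three-node model by enumerating component states; failure probabilities are hypothetical:

        from itertools import product

        # Hypothetical 3-component system: C in series with the parallel pair A||B.
        # In the paper the BN structure/CPTs would come from the K2 search over
        # historical component data; here they are specified by hand.
        p_fail = {"A": 0.05, "B": 0.10, "C": 0.02}

        def system_fails(state):
            # state[x] is True when component x has failed.
            return (state["A"] and state["B"]) or state["C"]

        p_system_fail = 0.0
        for bits in product([True, False], repeat=3):
            state = dict(zip("ABC", bits))
            weight = 1.0
            for name, failed in state.items():
                weight *= p_fail[name] if failed else 1.0 - p_fail[name]
            if system_fails(state):
                p_system_fail += weight

        print(f"system reliability = {1.0 - p_system_fail:.4f}")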

  1. A generic method for estimating system reliability using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Doguc, Ozge [Stevens Institute of Technology, Hoboken, NJ 07030 (United States); Ramirez-Marquez, Jose Emmanuel [Stevens Institute of Technology, Hoboken, NJ 07030 (United States)], E-mail: jmarquez@stevens.edu

    2009-02-15

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.

  2. Limitations in simulator time-based human reliability analysis methods

    International Nuclear Information System (INIS)

    Wreathall, J.

    1989-01-01

    Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in the data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failing to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical.

  3. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, a basic rule for solving simultaneous Boolean equations is presented. Next, the analysis procedure for a three-component system with external supports is shown. Third, a more detailed discussion is given of how a logical loop relation is established. Finally, two typical structures that include more than one logical loop are taken up; their analysis results and the corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.
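
    The paper's exact procedure manipulates the simultaneous Boolean equations symbolically; as a rough illustration of the underlying problem, a loop such as A = a OR B, B = b AND A can also be resolved numerically by iterating to the least fixed point from the all-false state (a sketch with hypothetical support inputs a and b):

        # Two mutually dependent signals (a logical loop):
        #   A = a or B
        #   B = b and A
        # Starting from all-False and iterating to the least fixed point
        # resolves the loop for given external supports a, b (hypothetical).
        def solve_loop(a: bool, b: bool) -> tuple[bool, bool]:
            A = B = False
            while True:
                A_new = a or B
                B_new = b and A_new
                if (A_new, B_new) == (A, B):
                    return A, B
                A, B = A_new, B_new

        for a in (False, True):
            for b in (False, True):
                print(f"a={a!s:5} b={b!s:5} -> (A, B) = {solve_loop(a, b)}")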

  4. COMPOSITE METHOD OF RELIABILITY RESEARCH FOR HIERARCHICAL MULTILAYER ROUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. B. Tregubov

    2016-09-01

    The paper presents a research method for hierarchical multilayer routing systems. The method is a composition of methods from graph theory, reliability theory, probability theory, etc. These methods are applied to the solution of different analysis and optimization subtasks and are systemically connected and coordinated with each other through a uniform set-theoretic representation of the object of research. Hierarchical multilayer routing systems are considered as infrastructure facilities (gas and oil pipelines, automobile and railway networks, power supply and communication systems) that distribute material resources, energy or information using hierarchically nested routing functions. For illustration, the theoretical constructions are applied to determining the probability of the up state of a specific infocommunication system. The author shows that a graph representation of the structure of the object of research can be constructively combined with a logical-probabilistic method for analysing its reliability indices through a uniform set-theoretic representation of its elements and of the processes proceeding in them.

  5. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system taking into account uncertainties in the design variables, and common results are estimates of a response density, which also implies estimates of its parameters. Common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses for each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which results in one value of the response out of the many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response depends on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. These are two of the 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of the analyses possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been
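
    A minimal sketch of the LHS idea referred to above, under assumed inputs and a toy response function standing in for a finite element run: one uniform draw is taken per equal-probability stratum in each dimension, and the strata are paired randomly across dimensions.

        import numpy as np

        def latin_hypercube(n_samples: int, n_dims: int, rng) -> np.ndarray:
            # One uniform draw per equal-probability stratum in each dimension,
            # then an independent random pairing across dimensions.
            u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
            for j in range(n_dims):
                u[:, j] = rng.permutation(u[:, j])
            return u

        rng = np.random.default_rng(3)
        u = latin_hypercube(200, 2, rng)

        # Hypothetical stand-in for an FEA response: the margin between a
        # uniformly distributed strength and a scaled uniformly distributed load.
        load = 90.0 + 20.0 * u[:, 0]          # assumed range [90, 110]
        strength = 240.0 + 20.0 * u[:, 1]     # assumed range [240, 260]
        margin = strength - 2.3 * load
        print(f"mean margin = {margin.mean():.2f}, P(margin < 0) ~ {(margin < 0).mean():.3%}")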

  6. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  7. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification for safety-related systems as a necessary procedure in the safety life cycle. The average probability of failure on demand (PFD_avg) must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFD_avg calculations for its examples, it is difficult for reliability or safety engineers to understand when they use the standard as guidance in practice. A method using reliability block diagrams (RBDs) is investigated in this study in order to provide a clear and feasible way of calculating PFD_avg and to help those who take IEC 61508-6 as their guidance. The method first finds the mean down times (MDTs) of both the channel and the voted group, and then PFD_avg. The calculated results for various voted groups are compared with those in IEC 61508-6 and in Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components-applying Markov model to IEC 61508-6. Reliab Eng Syst Saf 2003;80(2):133-41], from which an interesting outcome can be observed: although differences in the MDT of voted groups exist between IEC 61508-6 and this paper, the PFD_avg values of the voted groups are comparatively close. With the detailed description given, the RBD method can be applied to quantitative SIL verification, showing its similarity to the method in IEC 61508-6.
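
    For a rough feel of the arithmetic involved (these are the widely quoted simplified low-demand approximations, not the RBD/MDT derivation of the paper), the following sketch evaluates PFD_avg for 1oo1 and 1oo2 architectures with hypothetical failure rate, proof-test interval and common-cause fraction:

        # Simplified low-demand PFD_avg approximations (textbook formulas,
        # not the RBD/MDT derivation of the paper); parameters are assumed.
        LAMBDA_DU = 2.0e-6   # dangerous undetected failure rate [1/h] (assumed)
        TI = 8760.0          # proof-test interval [h] (assumed: 1 year)
        BETA = 0.1           # common-cause fraction for the 1oo2 group (assumed)

        def pfd_avg_1oo1(lam_du, ti):
            # Single channel: average unavailability over the proof-test interval.
            return lam_du * ti / 2.0

        def pfd_avg_1oo2(lam_du, ti, beta):
            # Two-channel voted group: independent part plus common-cause part.
            independent = ((1.0 - beta) * lam_du * ti) ** 2 / 3.0
            common_cause = beta * lam_du * ti / 2.0
            return independent + common_cause

        print(f"1oo1 PFD_avg = {pfd_avg_1oo1(LAMBDA_DU, TI):.2e}")
        print(f"1oo2 PFD_avg = {pfd_avg_1oo2(LAMBDA_DU, TI, BETA):.2e}")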

  8. A reliable and validated LC-MS/MS method for the simultaneous quantification of 4 cannabinoids in 40 consumer products.

    Science.gov (United States)

    Meng, Qingfang; Buchanan, Beth; Zuccolo, Jonathan; Poulin, Mathieu-Marc; Gabriele, Joseph; Baranowski, David Charles

    2018-01-01

    In the past 50 years, Cannabis sativa (C. sativa) has gone from a substance essentially prohibited worldwide to one that is gaining acceptance both culturally and legally in many countries for medicinal and recreational use. As additional jurisdictions legalize Cannabis products and the variety and complexity of these products surpass the classical dried plant material, appropriate methods for measuring the biologically active constituents are paramount to ensure safety and regulatory compliance. While there are numerous active compounds in C. sativa, the primary cannabinoids of regulatory and safety concern are (-)-Δ⁹-tetrahydrocannabinol (THC), cannabidiol (CBD), and their respective acidic forms THCA-A and CBDA. Using the US Food and Drug Administration (FDA) bioanalytical method validation guidelines, we developed a sensitive, selective, and accurate method for the simultaneous analysis of CBD, CBDA, THC, and THCA-A in oils, and of THC and CBD in more complex matrices. This HPLC-MS/MS method was simple and reliable, using standard sample dilution and homogenization, an isocratic chromatographic separation, and a triple quadrupole mass spectrometer. The lower limit of quantification (LLOQ) for the analytes was 0.195 ng/mL over a 0.195-50.0 ng/mL range of quantification, with a coefficient of correlation of >0.99. Average intra-day and inter-day accuracies were 94.2-112.7% and 97.2-110.9%, respectively. This method was used to quantify CBD, CBDA, THC, and THCA-A in 40 commercial hemp products representing a variety of matrices including oils, plant materials, and creams/cosmetics. All products tested met the federal regulatory restrictions on THC content in Canada. With respect to CBD, the majority of the analyzed products contained low CBD levels and a low CBD:CBDA ratio, while one oil-based product contained a CBD:CBDA ratio of >1,000. Overall, the method proved amenable to the analysis of various commercial products including oils, creams, and plant material and may be diagnostically indicative of

  9. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute is conducting a developmental study of the GO-FLOW method, a system reliability analysis method that occupies a central part of PSA (Probabilistic Safety Assessment), with the aim of adding various advanced capabilities. The work seeks to upgrade the functionality of the GO-FLOW method, to develop an analysis capability that integrates dynamic behaviour analysis, physical behaviour and probable subject transfer, and to prepare a function for picking out the main accident sequences. In fiscal year 1997, a function for handling dependency between headings was added to the dynamic event-tree analysis system. In the accident-sequence simulation function, the main accident sequences of the MRX improved marine propulsion reactor can now be covered completely. In addition, a function was prepared that allows the analysis operator to set up input data easily. (G.K.)

  10. Chromatographic methods validation for analysis of small molecules in biological matrices

    Directory of Open Access Journals (Sweden)

    Neila Maria Cassiano

    2009-01-01

    Chromatographic methods are commonly used for the analysis of small molecules in different biological matrices. An important consideration in the development of a bioanalytical method is its capacity to yield reliable and reproducible results. This review discusses validation procedures adopted by different governmental agencies, such as the Food and Drug Administration (USA), the European Union (EU) and the Agência Nacional de Vigilância Sanitária (Brazil), for the quantification of small molecules by bioanalytical chromatographic methods. The main parameters addressed in this review are: selectivity, linearity, precision, accuracy, quantification and detection limits, recovery, dilution integrity, stability and robustness. The acceptance criteria are also clearly specified.

  11. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as a part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHEANA, which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator's cognitive process. This report summarizes the outcomes of the improvement of the HRA method, in which enhancements were made to evaluate how a degraded plant condition affects the operator's cognitive process and to evaluate human error probabilities (HEPs) that correspond to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences conducted to investigate the applicability of the HRA method developed. HEPs of the same accident sequences were also estimated using the THERP method, the most widely used HRA method, and the results obtained with the two methods were compared to depict their differences and the issues to be solved. The important conclusions are as follows: (1) Improvement of the HRA method using an operator cognitive action model. The factors to be considered in the evaluation of human errors were clarified, degraded plant safety conditions were incorporated into the HRA, and HEPs affected by the contents of operator tasks were investigated in order to improve the HRA method, which integrates an operator cognitive action model into the ATHEANA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  12. Comparison of two-concentration with multi-concentration linear regressions: Retrospective data analysis of multiple regulated LC-MS bioanalytical projects.

    Science.gov (United States)

    Musuku, Adrien; Tan, Aimin; Awaiye, Kayode; Trabelsi, Fethi

    2013-09-01

    Linear calibration is usually performed using eight to ten calibration concentration levels in regulated LC-MS bioanalysis because a minimum of six are specified in regulatory guidelines. However, we have previously reported that two-concentration linear calibration is as reliable as or even better than using multiple concentrations. The purpose of this research is to compare two-concentration with multiple-concentration linear calibration through retrospective data analysis of multiple bioanalytical projects that were conducted in an independent regulated bioanalytical laboratory. A total of 12 bioanalytical projects were randomly selected: two validations and two studies for each of the three most commonly used types of sample extraction methods (protein precipitation, liquid-liquid extraction, solid-phase extraction). When the existing data were retrospectively linearly regressed using only the lowest and the highest concentration levels, no extra batch failure/QC rejection was observed and the differences in accuracy and precision between the original multi-concentration regression and the new two-concentration linear regression are negligible. Specifically, the differences in overall mean apparent bias (square root of mean individual bias squares) are within the ranges of -0.3% to 0.7% and 0.1-0.7% for the validations and studies, respectively. The differences in mean QC concentrations are within the ranges of -0.6% to 1.8% and -0.8% to 2.5% for the validations and studies, respectively. The differences in %CV are within the ranges of -0.7% to 0.9% and -0.3% to 0.6% for the validations and studies, respectively. The average differences in study sample concentrations are within the range of -0.8% to 2.3%. With two-concentration linear regression, an average of 13% of time and cost could have been saved for each batch together with 53% of saving in the lead-in for each project (the preparation of working standard solutions, spiking, and aliquoting). Furthermore
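
    To make the comparison concrete, here is a small sketch on simulated data (not the paper's study data) contrasting an eight-level least-squares calibration line with a two-concentration line through the lowest and highest levels:

        import numpy as np

        rng = np.random.default_rng(7)
        true_slope, true_intercept = 0.05, 0.002
        # Hypothetical 8-level calibration curve (ng/mL) with 2% proportional noise.
        levels = np.array([1, 2, 5, 10, 50, 100, 500, 1000], dtype=float)
        response = true_slope * levels + true_intercept
        response *= 1.0 + rng.normal(0.0, 0.02, size=levels.size)

        # Multi-concentration regression: least squares over all eight levels.
        slope_multi, icept_multi = np.polyfit(levels, response, 1)

        # Two-concentration regression: line through the lowest and highest levels.
        slope_two = (response[-1] - response[0]) / (levels[-1] - levels[0])
        icept_two = response[0] - slope_two * levels[0]

        # Back-calculate a mid-range QC sample (250 ng/mL) with both calibrations.
        qc_resp = true_slope * 250.0 + true_intercept
        for name, (m, b) in (("8-point", (slope_multi, icept_multi)),
                             ("2-point", (slope_two, icept_two))):
            conc = (qc_resp - b) / m
            print(f"{name}: QC bias = {100.0 * (conc - 250.0) / 250.0:+.2f}%")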

  13. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    Directory of Open Access Journals (Sweden)

    Hai An

    2016-08-01

    To address the coexistence of a variety of uncertain variables in engineering structural reliability analysis, a new hybrid reliability index for evaluating structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article, together with a convergent solving method. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index is defined on the basis of the random–fuzzy–interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and the index is solved using the modified limit-step-length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. In the end, a numerical example demonstrates that the hybrid reliability index is applicable to the wear reliability assessment of mechanisms, where truncated random variables, fuzzy random variables, and interval variables coexist. The demonstration also shows the good convergence of the iterative algorithm proposed in this article.
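
    The abstract does not spell out the iteration, but reliability indices of this kind are usually computed with HL-RF-style fixed-point iterations in standard normal space; a minimal sketch, assuming a purely random (non-fuzzy, non-interval) limit state for illustration:

        import numpy as np

        def grad(g, u, h=1e-6):
            # Central-difference gradient of the limit-state function.
            return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                             for e in np.eye(u.size)])

        def hlrf_beta(g, n_dims, tol=1e-8, max_iter=100):
            # HL-RF iteration for the reliability index beta = ||u*||, where u*
            # is the most probable failure point in standard normal space.
            u = np.zeros(n_dims)
            for _ in range(max_iter):
                a = grad(g, u)
                u_new = (a @ u - g(u)) * a / (a @ a)
                if np.linalg.norm(u_new - u) < tol:
                    u = u_new
                    break
                u = u_new
            return np.linalg.norm(u)

        # Example limit state (hypothetical): g(u) = 3 - u1 - 0.2*u2**2.
        beta = hlrf_beta(lambda u: 3.0 - u[0] - 0.2 * u[1] ** 2, n_dims=2)
        print(f"reliability index beta = {beta:.4f}")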

  14. Label-Free Bioanalyte Detection from Nanometer to Micrometer Dimensions—Molecular Imprinting and QCMs †

    Directory of Open Access Journals (Sweden)

    Adnan Mujahid

    2018-06-01

    Modern diagnostic tools and immunoassay protocols call for direct analyte recognition based on the analyte's intrinsic behavior, without using any labeling indicator. This not only improves detection reliability, but also reduces the sample preparation time and complexity involved in the labeling step. Label-free biosensor devices are capable of monitoring analyte physicochemical properties such as binding sensitivity and selectivity, affinity constants and other dynamics of molecular recognition. The interface of a typical biosensor could range from natural antibodies to synthetic receptors, for example molecularly imprinted polymers (MIPs). The foremost advantages of using MIPs are their high binding selectivity, comparable to that of natural antibodies, straightforward synthesis in a short time, high thermal/chemical stability and compatibility with different transducers. Quartz crystal microbalance (QCM) resonators are leading acoustic devices that are extensively used for mass-sensitive measurements. Highlight features of QCM devices include low-cost fabrication, room-temperature operation and, most importantly, the ability to monitor extremely small mass shifts, making them potentially universal transducers. The combination of MIPs with QCM has turned out to be a prominent sensing system for label-free recognition of diverse bioanalytes. In this article, we cover the potential applications of MIP-QCM sensors, exclusively the label-free recognition of bacteria and virus species as representative micro- and nanosized bioanalytes.

  15. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  16. Structural reliability calculation method based on the dual neural network and direct integration method.

    Science.gov (United States)

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty has received wide attention from engineers and scholars because it reflects structural characteristics and actual bearing conditions. The direct integration method, which starts from the definition of reliability, is easy to understand, but mathematical difficulties remain in the calculation of the multiple integrals involved. Therefore, a dual neural network method is proposed in this paper for calculating these multiple integrals. The dual neural network consists of two neural networks: neural network A is used to learn the integrand function, and neural network B is used to simulate the original (integral) function. According to the derivative relationship between the network output and the network input, neural network B is derived from neural network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method and the mean-value first-order second-moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.
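
    The dual-network construction itself is not reproduced here, but the target it approximates is easy to state: the failure probability as a multiple integral of the joint density over the failure domain. A brute-force grid quadrature for a hypothetical 2-D case, with an analytic check:

        import numpy as np
        from math import erf

        # Direct integration of P_f over {g(x) < 0} for a 2-D standard normal
        # density and a linear limit state g(x) = b - x1 - x2 (hypothetical
        # example; the paper replaces such quadrature with the dual neural network).
        b = 3.0
        xs = np.linspace(-8.0, 8.0, 801)
        dx = xs[1] - xs[0]
        X1, X2 = np.meshgrid(xs, xs)
        pdf = np.exp(-(X1**2 + X2**2) / 2.0) / (2.0 * np.pi)
        pf_grid = pdf[(b - X1 - X2) < 0.0].sum() * dx * dx

        # Analytic check: X1 + X2 ~ N(0, 2), so P_f = 1 - Phi(b / sqrt(2)).
        pf_exact = 0.5 * (1.0 - erf(b / 2.0))
        print(f"grid integration: {pf_grid:.3e}, exact: {pf_exact:.3e}")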

  17. Collection of methods for reliability and safety engineering

    International Nuclear Information System (INIS)

    Fussell, J.B.; Rasmuson, D.M.; Wilson, J.R.; Burdick, G.R.; Zipperer, J.C.

    1976-04-01

    The document presented contains five reports, each describing a method of reliability and safety engineering. Report I provides a conceptual framework for the study of component malfunctions during system evaluations. Report II provides methods for locating groups of critical component failures such that all the component failures in a given group can be caused to occur by the occurrence of a single separate event; these groups of component failures are called common cause candidates. Report III provides a method for acquiring and storing system-independent component failure logic information. The information stored is influenced by the concepts presented in Report I and also includes information useful in locating common cause candidates. Report IV puts forth methods for analyzing situations that involve systems which change character in a predetermined time sequence. These phased-mission techniques are applicable to the hypothetical 'accident chains' frequently analyzed for nuclear power plants. Report V presents a unified approach to cause-consequence analysis, a method of analysis useful during risk assessments. This approach, as developed by the Danish Atomic Energy Commission, is modified to reflect the format and symbology conventionally used for other types of analysis of nuclear reactor systems

  18. A Reliability-Oriented Design Method for Power Electronic Converters

    DEFF Research Database (Denmark)

    Wang, Huai; Zhou, Dao; Blaabjerg, Frede

    2013-01-01

    Reliability is a crucial performance indicator of power electronic systems in terms of availability, mission accomplishment and life cycle cost. A paradigm shift in the research on reliability of power electronics is going on, from simple handbook-based calculations (e.g. models in the MIL-HDBK-217F handbook) ... and reliability prediction models are provided. A case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical IGBT modules.

  19. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare some methods for creating 'precision profiles' (PP) and to clarify their possibilities for determining the analytical reliability of RIA. The reproducibility in serums with concentrations of the determinable hormone covering the whole range of the calibration curve has been studied. The radioimmunoassay has been performed with a TSH-RIA set (ex East Germany), and comparative evaluations with commercial sets of HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (C.V., %) are used, and their corresponding profiles are compared: a preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory and that the method of its construction does not influence the course of the relationship. The PP allows determination of the concentration range giving stable results, which improves the efficiency of the analytical work. 16 refs., 4 figs
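
    A precision profile is simply the coefficient of variation plotted against concentration; a small sketch with hypothetical replicate measurements, flagging the working range where the CV stays below an assumed 10% limit:

        import numpy as np

        # Hypothetical replicate measurements (IU/l) at each calibrator level;
        # the precision profile is CV% as a function of concentration.
        replicates = {
            0.5:  [0.42, 0.55, 0.61, 0.47, 0.39],
            2.0:  [1.92, 2.10, 2.05, 1.88, 2.02],
            8.0:  [7.8, 8.3, 8.1, 7.9, 8.2],
            32.0: [30.5, 33.8, 31.2, 34.0, 29.9],
        }

        profile = {}
        for conc, vals in replicates.items():
            v = np.asarray(vals, dtype=float)
            cv = 100.0 * v.std(ddof=1) / v.mean()
            profile[conc] = cv
            print(f"{conc:>5} IU/l: CV = {cv:5.1f}%")

        # Working range: levels whose CV stays below an (assumed) 10% limit.
        usable = [c for c, cv in profile.items() if cv <= 10.0]
        print("usable range:", min(usable), "-", max(usable), "IU/l")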

  20. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
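
    As a toy illustration of forward Monte Carlo on a Markov reliability model (a plain analog sketch, not the adjoint or weighted schemes of the paper), the following simulates a single repairable component with exponential failure and repair times and estimates its mean unavailability over a mission interval; rates are hypothetical:

        import random

        LAMBDA = 1e-3   # failure rate [1/h] (assumed)
        MU = 1e-1       # repair rate [1/h] (assumed)
        T_MISSION = 1000.0
        N_TRIALS = 20000

        def downtime_one_history(rng):
            # Alternate up/down sojourns of a two-state Markov process.
            t, down = 0.0, 0.0
            up = True
            while t < T_MISSION:
                rate = LAMBDA if up else MU
                dt = min(rng.expovariate(rate), T_MISSION - t)
                if not up:
                    down += dt
                t += dt
                up = not up
            return down

        rng = random.Random(42)
        mean_unavail = sum(downtime_one_history(rng)
                           for _ in range(N_TRIALS)) / (N_TRIALS * T_MISSION)
        # Rough check: the steady-state unavailability is lambda / (lambda + mu).
        print(f"MC mean unavailability ~ {mean_unavail:.4e}, "
              f"steady state = {LAMBDA / (LAMBDA + MU):.4e}")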

  1. Development of reliability centered maintenance methods and tools

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Dubreuil-Chambardel, A.; Lannoy, A.; Monnier, B.

    1992-12-01

    This paper recalls the development of the RCM (Reliability Centered Maintenance) approach in the nuclear industry and describes the trial study implemented by EDF in the context of the OMF (RCM) Project. The approach developed is currently being applied to about thirty systems (Industrial Project). In parallel, R&D efforts are being maintained to improve the selectivity of the analysis methods. These methods use Probabilistic Safety Study models, thereby guaranteeing better selectivity in the identification of safety-critical elements and enhancing consistency between maintenance and safety studies. They also offer more detailed analysis of operating feedback, invoking for example Bayesian methods combining expert judgement and feedback data. Finally, they propose a functional and material representation of the plant. This dual representation describes both the functions assured by maintenance provisions and the material elements required for their implementation. In the final chapter, the targets of the future OMF workstation are summarized and the latter's insertion in the EDF information system is briefly described. (authors). 5 figs., 2 tabs., 7 refs

  2. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  3. The psychophysiological assessment method for pilot's professional reliability.

    Science.gov (United States)

    Zhang, L M; Yu, L S; Wang, K N; Jing, B S; Fang, C

    1997-05-01

    Previous research has shown that a pilot's professional reliability depends on two related factors: the pilot's functional state and the demands of the task workload. Psychophysiological Reserve Capacity (PRC) is defined as a pilot's ability to accomplish additional tasks without reducing performance on the primary task (the flight task). We hypothesized that the PRC mirrors the pilot's functional state. The purpose of this study was to probe a psychophysiological method for evaluating a pilot's professional reliability on a simulator. The PRC Comprehensive Evaluating System (PRCCES) used in the experiment included four subsystems: a) a quantitative evaluation system for pilot performance on the simulator; b) a secondary task display and quantitative estimating system; c) a multi-physiological data monitoring and statistical system; and d) a comprehensive evaluation system for pilot PRC. Two studies were performed. In study one, 63 healthy and 13 hospitalized pilots participated. Each pilot performed a double 180-degree circuit flight program with and without a secondary task (three-digit operation). The operator performance, secondary task score and cost of physiological effort were measured and compared by PRCCES in the two conditions. Then, each pilot's flight skill in training was subjectively scored by instructor pilot ratings. In study two, 7 healthy pilots volunteered to take part in an experiment on the effects of sleep deprivation on pilot PRC. Each participant had PRC tested pre- and post-8 h of sleep deprivation. The results show that the PRC values of healthy pilots were positively correlated with flexibility, operating and deviation-correcting ability, attention distribution, and accuracy of instrument flight in the air (r = 0.27-0.40, p < 0.05), and negatively correlated with emotional anxiety in flight (r = -0.40, p < 0.05). The values of PRC in healthy pilots (0.61 +/- 0.17) were significantly higher than those of hospitalized pilots

  4. Development on methods for evaluating structure reliability of piping components

    International Nuclear Information System (INIS)

    Schimpfke, T.; Grebner, H.; Peschke, J.; Sievers, J.

    2003-01-01

    In the frame of the German reactor safety research program of the Federal Ministry of Economics and Labour, GRS has started to develop an analysis code named PROST (PRObabilistic STructure analysis) for estimating the leak and break probabilities of piping systems in nuclear power plants. The development is based on the experience achieved with applications of the public available US code PRAISE 3.10 (Piping Reliability Analysis Including Seismic Events), which was supplemented by additional features regarding the statistical evaluation and the crack orientation. PROST is designed to be more flexible to changes and supplementations. Up to now it can be used for calculating fatigue problems. The paper mentions the main capabilities and theoretical background of the present PROST development and presents a parametric study on the influence by changing the method of stress intensity factor and limit load calculation and the statistical evaluation options on the leak probability of an exemplary pipe with postulated axial crack distribution. Furthermore the resulting leak probability of an exemplary pipe with postulated circumferential crack distribution is compared with the results of the modified PRAISE computer program. The intention of this investigation is to show trends. Therefore the resulting absolute values for probabilities should not be considered as realistic evaluations. (author)

  5. Methods for qualification of highly reliable software - international procedure

    International Nuclear Information System (INIS)

    Kersken, M.

    1997-01-01

    Despite the advantages of computer-assisted safety technology, some uneasiness is still to be observed with respect to the novel processes, resulting from the absence of a body of generally accepted and uncontentious qualification guides (regulatory provisions, standards) for the safety evaluation of the computer codes applied. Warranty of adequate protection of the population, operators or plant components is an essential aspect in this context, too - as it is in general with reliability and risk assessment of novel technology - so that, with appropriate legislation still missing, there currently is a licensing risk involved in the introduction of digital safety systems. Nevertheless, there is some extent of agreement within the international community and among utility operators about what standards and measures should be applied for the qualification of software of relevance to plant safety. The standard IEC 880 /IEC 86/ in particular, in its original version, or national documents based on this standard, are applied in all countries using or planning to install those systems. A novel supplement to this standard, document /IEC 96/, is in the process of finalization and defines the requirements to be met by modern methods of software engineering. (orig./DG) [de

  6. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

    Imprecise probabilities, developed during the last two decades, offer a considerably more general theory with many advantages that make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behavior of nuclear industry objects to be modeled more comprehensively, and that they give a possibility to solve some problems unsolved in the framework of the conventional approach. Furthermore, some specific examples are given from which the usefulness of the tool for solving some reliability tasks can be seen

  7. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and reliability analyst. The report outlines a set of procedures which implements a philosophy that requires increased involvement by the designer in reliability analysis. Discussions of each method reviewed include examples of its application

  8. Method of core thermodynamic reliability determination in pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ackermann, G.; Horche, W. (Ingenieurhochschule Zittau (German Democratic Republic). Sektion Kraftwerksanlagenbau und Energieumwandlung)

    1983-01-01

    A statistical model appropriate to determine the thermodynamic reliability and the power-limiting parameter of PWR cores is described for cases of accidental transients. The model is compared with the hot channel model hitherto applied.

  9. Method of core thermodynamic reliability determination in pressurized water reactors

    International Nuclear Information System (INIS)

    Ackermann, G.; Horche, W.

    1983-01-01

    A statistical model appropriate to determine the thermodynamic reliability and the power-limiting parameter of PWR cores is described for cases of accidental transients. The model is compared with the hot channel model hitherto applied. (author)

  10. Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Wind power converter systems are essential subsystems in both off-shore and on-shore wind turbines. It is the main interface between generator and grid connection. This system is affected by numerous stresses where the main contributors might be defined as vibration and temperature loadings....... The temperature variations induce time-varying stresses and thereby fatigue loads. A probabilistic model is used to model fatigue failure for an electrical component in the power converter system. This model is based on a linear damage accumulation and physics of failure approaches, where a failure criterion...... is defined by the threshold model. The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings. Structural Reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation by means of structural...

  11. Characteristics and application study of AP1000 NPPs equipment reliability classification method

    International Nuclear Information System (INIS)

    Guan Gao

    2013-01-01

    The AP1000 nuclear power plant applies an integrated approach to establish its equipment reliability classification, which includes probabilistic risk assessment techniques, Maintenance Rule administration, power production reliability classification and a functional equipment group bounding method, and eventually classifies equipment reliability into four levels. This classification process and its results are very different from those of classical RCM and streamlined RCM. The paper studies the characteristics of the AP1000 equipment reliability classification approach, argues that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommends using a combined RCM method to establish the future equipment reliability programs of AP1000 nuclear power plants. (authors)

  12. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de

  13. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed

  14. A Reliability Assessment Method for the VHTR Safety Systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok; Jae, Moo Sung; Kim, Yong Wan

    2011-01-01

    The passive safety system of the very high temperature reactor (VHTR), which has attracted worldwide attention, is introduced to improve the safety of next-generation nuclear power plant designs. A passive system's functionality does not rely on an external source of energy but on an intelligent use of natural phenomena, such as gravity, conduction and radiation, which are always present. Because of these features, it is difficult to evaluate passive safety with the existing risk analysis methodology, which considers active system failures; a new reliability methodology therefore has to be considered. In this study, a preliminary evaluation and conceptualization are attempted by applying the concept of load and capacity from the reliability physics model, designing a new passive system analysis methodology, and applying it on a trial basis to a paper plant.
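
    A minimal sketch of the load-capacity idea mentioned above: if load L and capacity C are treated as independent normal variables (assumed parameters below), the failure probability P(C < L) has a closed form via the reliability index and can be checked by Monte Carlo:

        import math
        import random

        # Assumed (hypothetical) distributions for thermal load and system capacity.
        MU_L, SD_L = 600.0, 40.0    # load, e.g. peak temperature demand [deg C]
        MU_C, SD_C = 750.0, 50.0    # capacity, e.g. allowable temperature [deg C]

        # Closed form: C - L ~ N(mu_C - mu_L, sd_C^2 + sd_L^2); failure if C - L < 0.
        beta = (MU_C - MU_L) / math.sqrt(SD_C**2 + SD_L**2)
        pf_exact = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))

        rng = random.Random(1)
        n = 200_000
        fails = sum(rng.gauss(MU_C, SD_C) < rng.gauss(MU_L, SD_L) for _ in range(n))
        print(f"beta = {beta:.3f}, P_f exact = {pf_exact:.3e}, MC = {fails / n:.3e}")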

  15. Reliability-Based Shape Optimization using Stochastic Finite Element Methods

    DEFF Research Database (Denmark)

    Enevoldsen, Ib; Sørensen, John Dalsgaard; Sigurdsson, G.

    1991-01-01

    stochastic fields (e.g. loads and material parameters such as Young's modulus and the Poisson ratio). In this case stochastic finite element techniques combined with FORM analysis can be used to obtain measures of the reliability of the structural systems, see Der Kiureghian & Ke (6) and Liu & Der Kiureghian...

  16. Survey of industry methods for producing highly reliable software

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.

    1994-11-01

    The Nuclear Reactor Regulation Office of the US Nuclear Regulatory Commission is charged with assessing the safety of new instrument and control designs for nuclear power plants which may use computer-based reactor protection systems. Lawrence Livermore National Laboratory has evaluated the latest techniques in software reliability for measurement, estimation, error detection, and prediction that can be used during the software life cycle as a means of risk assessment for reactor protection systems. One aspect of this task has been a survey of the software industry to collect information to help identify the design factors used to improve the reliability and safety of software. The intent was to discover what practices really work in industry and what design factors are used by industry to achieve highly reliable software. The results of the survey are documented in this report. Three companies participated in the survey: Computer Sciences Corporation, International Business Machines (Federal Systems Company), and TRW. Discussions were also held with NASA Software Engineering Lab/University of Maryland/CSC, and the AIAA Software Reliability Project

  17. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.
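
    The abstract does not name the statistic, but utterance-level agreement between two coders is commonly summarized with Cohen's kappa; a small illustrative sketch with made-up MISC-style codes:

        from collections import Counter

        def cohens_kappa(codes_a, codes_b):
            # Observed agreement vs agreement expected from marginal frequencies.
            assert len(codes_a) == len(codes_b)
            n = len(codes_a)
            p_obs = sum(a == b for a, b in zip(codes_a, codes_b)) / n
            freq_a, freq_b = Counter(codes_a), Counter(codes_b)
            labels = set(codes_a) | set(codes_b)
            p_exp = sum(freq_a[l] * freq_b[l] for l in labels) / n**2
            return (p_obs - p_exp) / (1 - p_exp)

        # Hypothetical utterance-level codes from two raters.
        rater1 = ["OQ", "RES", "REC", "GI", "OQ", "RES", "CQ", "GI", "REC", "RES"]
        rater2 = ["OQ", "REC", "REC", "GI", "OQ", "RES", "CQ", "GI", "RES", "RES"]
        print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")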

  18. Usefulness of the Monte Carlo method in reliability calculations

    International Nuclear Information System (INIS)

    Lanore, J.M.; Kalli, H.

    1977-01-01

    Three examples of reliability Monte Carlo programs developed in the LEP (Laboratory for Radiation Shielding Studies in the Nuclear Research Center at Saclay) are presented. First, an uncertainty analysis is given for a simplified spray system; a Monte Carlo program PATREC-MC has been written to solve the problem with the system components given in the fault tree representation. The second program MONARC 2 has been written to solve the problem of complex systems reliability by the Monte Carlo simulation, here again the system (a residual heat removal system) is in the fault tree representation. Third, the Monte Carlo program MONARC was used instead of the Markov diagram to solve the simulation problem of an electric power supply including two nets and two stand-by diesels
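
    As an illustration of the general technique (not of the PATREC-MC or MONARC codes themselves), the following minimal Python sketch estimates a fault-tree top-event probability by sampling independent component failures; all component names and probabilities are hypothetical.

    ```python
    import numpy as np

    # Hypothetical per-demand failure probabilities of four basic events.
    rng = np.random.default_rng(1)
    n = 1_000_000
    pump_a = rng.random(n) < 0.02
    pump_b = rng.random(n) < 0.02
    valve = rng.random(n) < 0.005
    power = rng.random(n) < 0.001

    # Fault-tree logic: top event occurs if both pumps fail (AND gate),
    # or if the valve or the power supply fails (OR gate).
    top = (pump_a & pump_b) | valve | power

    print(f"estimated top-event probability: {top.mean():.2e}")
    # Analytic check: 1 - (1 - 0.02**2)*(1 - 0.005)*(1 - 0.001) ~ 6.4e-3
    ```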

  19. Reliability improvement methods for sapphire fiber temperature sensors

    Science.gov (United States)

    Schietinger, C.; Adams, B.

    1991-08-01

    Mechanical, optical, electrical, and software design improvements can be brought to bear to enhance the reliability of fiber-optic sapphire-fiber temperature measurement tools in harsh environments. The optical fiber thermometry (OFT) equipment discussed is used in numerous process industries and generally involves a sapphire sensor, an optical transmission cable, and a microprocessor-based signal analyzer. OFT technology incorporating sensors for corrosive environments, hybrid sensors, and two-wavelength measurements is discussed.

  20. Multifunctional Nanomaterials Utilizing Hybridization Chain Reaction for Molecular Diagnostics and Bioanalytical Applications

    Science.gov (United States)

    Rana, Md. Muhit

    DNA nanotechnology has shown great promise in molecular diagnostic, bioanalytical and biomedical applications. The great challenge in detecting target analytes, biomarkers and small molecules in molecular diagnostics is low detection sensitivity. Different nanomaterials have long been used to address this challenge, yet to date there is no bioanalytical technique that can detect these target biomarkers (DNA, RNA, circulating DNA/miRNA) or environmental heavy metal ions (Hg2+ and Ag+) in a cost-effective and efficient manner. Herein, we initially discuss two possible bioanalytical detection methods, (a) colorimetric and (b) fluorometric assays, which are very popular nowadays due to their distinctive spectroscopic properties. We then report a promising colorimetric assay using a novel DNA-based amplification strategy known as hybridization chain reaction (HCR) for potential application in the visual detection of low copies of biomarkers (miRNAs as little as 20 femtomoles in an RNA pool and cell extracts in seven different combinations, and Ebola virus DNA as low as 400 attomoles in liquid biopsy mimics in sixteen different combinations) and of environmental and biological heavy metal ions (mercury and silver concentrations as low as 10 pM in water, soil and urine samples); the assay was also successfully applied to a molecular logic gate operation to distinguish OR and AND logic gates. No results showed any false-positive or false-negative information. We also discuss the future possibilities of HCR amplification technology, which is very promising for fluorometric bioanalysis. The HCR-based nanoprobe technology has numerous remarkable advantages over other methods: it is re-programmable, simple, inexpensive, easy to assemble and operate, and can be performed with visual and spectroscopic read-outs upon recognition of the target analytes. This rapid, specific and sensitive approach for biomarker and heavy metal ion detection generates ...

  1. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in essence it is a statistical method. The probability method developed takes into account the probability distributions of the permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was used for the thermal safety analysis of a reactor system. This analysis makes it possible to study the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  2. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    Directory of Open Access Journals (Sweden)

    Xuyong Chen

    2017-01-01

    Full Text Available Due to many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy, oscillating iteration process and makes the nonprobabilistic reliability index difficult to solve. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic reliability index and to narrow its range. If the range of the reliability index reduces to an acceptable accuracy, the solution is considered convergent and the nonprobabilistic reliability index is obtained. The case study indicates that the proposed method avoids an oscillating iteration process, makes the iteration stable and convergent, reduces the number of iteration steps significantly, and improves computational efficiency and precision compared with the traditional nonprobabilistic response surface method. Finally, a nonprobabilistic reliability evaluation process for bridges is built by evaluating the reliability of a three-span PC continuous rigid-frame bridge with the proposed method, which proves simpler and more dependable when samples and parameters are lacking in the nonprobabilistic reliability evaluation of bridges.

  3. Luminescent lanthanide reporters: new concepts for use in bioanalytical applications

    International Nuclear Information System (INIS)

    Vuojola, Johanna; Soukka, Tero

    2014-01-01

    Lanthanides represent the chemical elements from lanthanum to lutetium. They intrinsically exhibit some very exciting photophysical properties, which can be further enhanced by incorporating the lanthanide ion into organic or inorganic sensitizing structures. A very popular approach is to conjugate the lanthanide ion to an organic chromophore structure forming lanthanide chelates. Another approach, which has quickly gained interest, is to incorporate the lanthanide ions into nanoparticle structures, thus attaining improved specific activity and a large surface area for biomolecule immobilization. Lanthanide-based reporters, when properly shielded from the quenching effects of water, usually express strong luminescence emission, multiple narrow emission lines covering a wide wavelength range, and exceptionally long excited state lifetimes enabling time-gated luminescence detection. Because of these properties, lanthanide-based reporters have found widespread applications in various fields of life. This review focuses on the field of bioanalytical applications. Luminescent lanthanide reporters and assay formats utilizing these reporters pave the way for increasingly sensitive, simple, and easily automated bioanalytical applications. (topical review)

  4. Developing a strategy for a regulated electronic bioanalytical laboratory.

    Science.gov (United States)

    McDowall, R D

    2014-01-01

    This perspective article considers the strategy, design and implementation of an electronic bioanalytical laboratory working to GLP and/or GCP regulations. There are a range of available automated systems and laboratory informatics that could be implemented and integrated to make an electronic laboratory. However, which are the appropriate ones to select and what is realistic and cost-effective for an individual laboratory? The answer is to develop an overall automation strategy that is updated periodically after each system or application has been implemented to assess if the strategy is still valid or needs to be changed. As many laboratory informatics applications have functional overlap or convergence, for example, Laboratory Information Management System, Electronic Laboratory Notebook, and Instrument and Chromatography Data Systems, the decision of which application performs a specific task needs to be carefully considered in the overall strategy. Ensuring data integrity and regulatory compliance, especially in light of a number of recent falsification cases, is a mandatory consideration for the overall strategy for an electronic bioanalytical laboratory submitting data to regulatory authorities.

  5. Assessment and Improving Methods of Reliability Indices in Bakhtar Regional Electricity Company

    Directory of Open Access Journals (Sweden)

    Saeed Shahrezaei

    2013-04-01

    Full Text Available Reliability of a system is its ability to perform expected duties in the future, that is, the probability of desirable operation in performing predetermined duties. Failure data of power system elements are the main input for reliability assessment of a network. The goal of reliability assessment is to determine characteristic parameters from system history data; these parameters help to identify weak points of the system. In other words, the goal of reliability assessment is to improve operation and to decrease failures and power outages. This paper assesses the reliability indices of Bakhtar Regional Electricity Company up to the year 1393 (Iranian calendar), together with improvement methods and their effects on the reliability indices of this network. DIgSILENT PowerFactory software is employed for simulation. Simulation results show the positive effect of the improvement methods on the reliability indices of Bakhtar Regional Electricity Company.
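
    Distribution reliability indices of the kind assessed here are conventionally computed from load-point failure rates and outage durations; the sketch below shows the standard SAIFI/SAIDI/CAIDI definitions with purely illustrative data (it does not reproduce the DIgSILENT model used in the paper).

    ```python
    # Load-point data: (customers served, failures/yr, annual outage hours).
    # All numbers are illustrative.
    load_points = [
        (1200, 0.25, 0.9),
        (800, 0.40, 1.5),
        (500, 0.10, 0.3),
    ]

    total = sum(n for n, _, _ in load_points)
    saifi = sum(n * lam for n, lam, _ in load_points) / total  # interruptions/customer/yr
    saidi = sum(n * u for n, _, u in load_points) / total      # hours/customer/yr
    caidi = saidi / saifi                                      # hours/interruption

    print(f"SAIFI = {saifi:.3f}, SAIDI = {saidi:.3f} h, CAIDI = {caidi:.3f} h")
    ```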

  6. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    Science.gov (United States)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

    Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and in the conduct of clinical studies. Several reliability studies have been conducted in Western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage, while reliability studies in Ayurveda are at a preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implications for practice, education, and training. An introduction to reliability estimates and different study designs and statistical analyses is given for future studies in Ayurveda. PMID:23930037

  7. Real Time Analysis of Bioanalytes in Healthcare, Food, Zoology and Botany.

    Science.gov (United States)

    Wang, Tianqi; Ramnarayanan, Ashwin; Cheng, Huanyu

    2017-12-21

    The growing demand for real-time analysis of bioanalytes has spurred development in the field of wearable technology, which offers non-invasive data collection at low cost. The manufacturing processes for creating these sensing systems vary significantly with the material used, the type of sensors needed and the subject of study. The methods predominantly involve stretchable electronic sensors to monitor targets and transmit data, mainly through flexible wires or short-range wireless communication devices. Capable of conformal contact, wearable technology finds application beyond healthcare in the fields of food, zoology and botany. With this brief review of wearable technology and its applications, we believe this mini review will be of interest to readers in the broad fields of materials and sensor development, and in areas where wearable sensors can provide data that are not available elsewhere.

  8. Method for assessing reliability of a network considering probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, M.

    2005-01-01

    A method for assessing the reliability of a network is developed which uses the features of fault tree analysis. The method is developed in such a way that growth of the network under consideration does not require a significant increase in the size of the model. The method is applied to small examples of networks consisting of a small number of nodes and a small number of connections. The results give the network reliability. They identify equipment that has to be carefully maintained so that network reliability is not reduced, and equipment that is a candidate for redundancy, as this would improve network reliability significantly. (author)
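
    A minimal sketch of the fault-tree flavour of this evaluation: gates are evaluated recursively under an assumption of independent basic events. The tree and probabilities are hypothetical, not the paper's network model.

    ```python
    # Gates are nested tuples ("AND"|"OR", child, ...); leaves are failure
    # probabilities of independent basic events.
    def prob(node):
        if isinstance(node, float):
            return node
        gate, *children = node
        ps = [prob(child) for child in children]
        out = 1.0
        if gate == "AND":                    # all children must fail
            for p in ps:
                out *= p
            return out
        if gate == "OR":                     # any failing child fails the gate
            for p in ps:
                out *= 1.0 - p
            return 1.0 - out
        raise ValueError(gate)

    # Two redundant links in parallel, in series with a shared node.
    tree = ("OR", ("AND", 0.05, 0.05), 0.002)
    print(f"network failure probability: {prob(tree):.6f}")   # 0.004495
    ```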

  9. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model-based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better accommodate animal models. The study data is extracted from ... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was, on the other hand, lighter than the single-step method.

  10. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment across adjacent unit-system levels is the most important element in the multi-level reliability synthesis of complex systems. Introducing information theory into system reliability assessment, and using the additive property of information quantity together with the principle of equivalence of information quantity, an entropy method of data information conversion is presented for systems composed of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information quantity equivalence. General models of entropy-method synthesis assessment for approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. The applications of the entropy method are discussed by way of practical examples. Compared with the traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.

  11. The Reliability, Impact, and Cost-Effectiveness of Value-Added Teacher Assessment Methods

    Science.gov (United States)

    Yeh, Stuart S.

    2012-01-01

    This article reviews evidence regarding the intertemporal reliability of teacher rankings based on value-added methods. Value-added methods exhibit low reliability, yet are broadly supported by prominent educational researchers and are increasingly being used to evaluate and fire teachers. The article then presents a cost-effectiveness analysis…

  12. Reliability testing of tendon disease using two different scanning methods in patients with rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bruyn, George A W; Möller, Ingrid; Garrido, Jesus

    2012-01-01

    To assess the intra- and interobserver reliability of musculoskeletal ultrasonography (US) in detecting inflammatory and destructive tendon abnormalities in patients with RA using two different scanning methods.

  13. Model correction factor method for reliability problems involving integrals of non-Gaussian random fields

    DEFF Research Database (Denmark)

    Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der

    2002-01-01

    The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals ...
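
    For reference, the FORM ingredient of this approach is the standard Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration in standard normal space. The sketch below applies it to a deliberately linear, made-up limit state so the result can be checked by hand; it illustrates the FORM step only, not the model correction factor construction itself.

    ```python
    import numpy as np
    from scipy.stats import norm

    def g(u):
        """Illustrative linear limit state: exact beta = 4/sqrt(1**2 + 2**2)."""
        return u[0] + 2.0 * u[1] + 4.0

    def grad_g(u, h=1e-6):
        """Central-difference gradient of g."""
        return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                         for e in np.eye(len(u))])

    u = np.zeros(2)
    for _ in range(100):
        gr = grad_g(u)
        u_new = gr * (gr @ u - g(u)) / (gr @ gr)   # HL-RF update
        if np.linalg.norm(u_new - u) < 1e-8:
            u = u_new
            break
        u = u_new

    beta = np.linalg.norm(u)
    print(f"beta = {beta:.4f}, Pf = {norm.cdf(-beta):.3e}")
    # Expected: beta = 1.7889, Pf ~ 3.67e-2
    ```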

  14. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull and Beta distributions are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on production cost simulation, probability discretization and linearized power flow, an optimal power flow minimizing the cost of conventional power generation is solved, so that the reliability assessment of the distribution grid is carried out quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices. A MATLAB simulation of the IEEE RBTS BUS6 system indicates that the fast method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
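
    For orientation, a brute-force Monte Carlo estimate of LOLP and EENS with Weibull-distributed wind is sketched below; the paper's point is precisely that its discretization and linearized-power-flow approach is faster than this kind of simulation. The power curve, capacities and load model are all illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000                                   # simulated hourly states

    # Wind speed ~ Weibull(shape 2, scale 8 m/s); values are illustrative.
    v = 8.0 * rng.weibull(2.0, n)

    # Simple power curve: cut-in 3, rated 12, cut-out 25 m/s, 2 MW turbine.
    p_wind = np.where(v < 3.0, 0.0,
              np.where(v < 12.0, 2.0 * (v - 3.0) / 9.0,
               np.where(v < 25.0, 2.0, 0.0)))

    p_conv = 5.0                                  # MW conventional capacity
    load = rng.normal(5.5, 0.8, n).clip(min=0.0)  # MW, illustrative load model

    shortfall = load - (p_conv + p_wind)
    lolp = np.mean(shortfall > 0)                         # Loss Of Load Probability
    eens = np.mean(np.clip(shortfall, 0.0, None)) * 8760  # MWh/yr
    print(f"LOLP = {lolp:.4f}, EENS = {eens:.1f} MWh/yr")
    ```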

  15. A comparative study on the HW reliability assessment methods for digital I and C equipment

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hoan Sung; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Lee, G. Y. [Korea Atomic Energy Research Institute, Taejeon (Korea); Kim, M. C. [Korea Advanced Institute of Science and Technology, Taejeon (Korea); Jun, S. T. [KHNP, Taejeon (Korea)

    2002-03-01

    It is necessary to predict or evaluate the reliability of electronic equipment for the probabilistic safety analysis of digital instrumentation and control (I and C) equipment. However, most reliability prediction databases have no data for up-to-date equipment, and the failure modes are not classified. Prediction results for a specific component differ according to the method and database used, and the same holds at the board and system level. This study predicts the reliability of the PDC system of Wolsong NPP1 as an example of digital I and C equipment. Various reliability prediction methods and failure databases are used in the calculation so as to compare the sensitivity and accuracy of each model and database. Many considerations for the reliability assessment of digital systems are derived from the results of this study. 14 refs., 19 figs., 15 tabs. (Author)

  16. Method of reliability allocation based on fault tree analysis and fuzzy math in nuclear power plants

    International Nuclear Information System (INIS)

    Chen Zhaobing; Deng Jian; Cao Xuewu

    2005-01-01

    Reliability allocation is a difficult multi-objective optimization problem. It can be applied not only to determine the reliability characteristics of reactor systems, subsystems and main components but also to improve the design, operation and maintenance of nuclear plants. Fuzzy mathematics, one of the powerful tools for fuzzy optimization, and fault tree analysis, one of the effective methods of reliability analysis, are applied in this paper to the reliability allocation model in order to handle, respectively, the fuzzy character of some factors and the choice of subsystems. A failure rate allocation model is thus developed on the basis of fault tree analysis and fuzzy mathematics. For the reliability constraint factors, the six most important ones are chosen according to the practical needs of the allocation. The subsystems selected by the top-level fault tree analysis avoid allocating reliability to all equipment and components, including unnecessary parts. During the allocation process, some factors can be calculated or measured quantitatively while others can only be assessed qualitatively by expert rating. Therefore fuzzy decision and dualistic contrast are adopted to carry out the reliability allocation with the help of fault tree analysis. Finally, the example of the emergency diesel generator's reliability allocation is used to illustrate the model and to show that it is simple and applicable. (authors)

  17. Calibration Methods for Reliability-Based Design Codes

    DEFF Research Database (Denmark)

    Gayton, N.; Mohamed, A.; Sørensen, John Dalsgaard

    2004-01-01

    The calibration methods are applied to define the optimal code format according to some target safety levels. The calibration procedure can be seen as a specific optimization process where the control variables are the partial factors of the code. Different methods are available in the literature...

  18. Rapid and Reliable HPLC Method for the Determination of Vitamin ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an accurate, sensitive and reproducible high performance liquid chromatographic (HPLC) method for the quantitation of vitamin C in pharmaceutical samples. Method: The drug and the standard were eluted from Superspher RP-18 (250 mm x 4.6 mm, 10 µm particle size) at 20 °C.

  19. Evaluation and reliability of bone histological age estimation methods

    African Journals Online (AJOL)

    Human age estimation at death plays a vital role in forensic anthropology and bioarchaeology. Researchers have used morphological and histological methods to estimate human age from skeletal remains. This paper discusses different histological methods that use human long bones and ribs to determine age ...

  20. Method matters: Understanding diagnostic reliability in DSM-IV and DSM-5.

    Science.gov (United States)

    Chmielewski, Michael; Clark, Lee Anna; Bagby, R Michael; Watson, David

    2015-08-01

    Diagnostic reliability is essential for the science and practice of psychology, in part because reliability is necessary for validity. Recently, the DSM-5 field trials documented lower diagnostic reliability than past field trials and the general research literature, resulting in substantial criticism of the DSM-5 diagnostic criteria. Rather than indicating specific problems with DSM-5, however, the field trials may have revealed long-standing diagnostic issues that have been hidden due to a reliance on audio/video recordings for estimating reliability. We estimated the reliability of DSM-IV diagnoses using both the standard audio-recording method and the test-retest method used in the DSM-5 field trials, in which different clinicians conduct separate interviews. Psychiatric patients (N = 339) were diagnosed using the SCID-I/P; 218 were diagnosed a second time by an independent interviewer. Diagnostic reliability using the audio-recording method (N = 49) was "good" to "excellent" (M κ = .80) and comparable to the DSM-IV field trials estimates. Reliability using the test-retest method (N = 218) was "poor" to "fair" (M κ = .47) and similar to DSM-5 field-trials' estimates. Despite low test-retest diagnostic reliability, self-reported symptoms were highly stable. Moreover, there was no association between change in self-report and change in diagnostic status. These results demonstrate the influence of method on estimates of diagnostic reliability. (c) 2015 APA, all rights reserved.
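
    The κ values quoted are Cohen's kappa, chance-corrected agreement between two raters. As a reminder of what is being computed, here is the two-rater binary case with illustrative counts (not the study's data).

    ```python
    # 2x2 agreement table for a binary diagnosis:
    # a = both raters positive, d = both negative, b and c = disagreements.
    a, b, c, d = 40, 10, 15, 153          # illustrative counts
    n = a + b + c + d

    p_obs = (a + d) / n                                     # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    kappa = (p_obs - p_exp) / (1 - p_exp)

    print(f"observed = {p_obs:.3f}, chance = {p_exp:.3f}, kappa = {kappa:.3f}")
    ```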

  1. Methods for reliability evaluation of trust and reputation systems

    Science.gov (United States)

    Janiszewski, Marek B.

    2016-09-01

    Trust and reputation systems are a systematic approach to building security on the basis of observations of node behaviour. Exchange of nodes' opinions about other nodes is very useful for identifying nodes that act selfishly or maliciously. The idea behind trust and reputation systems gains significance because conventional security measures (based on cryptography) are often not sufficient. Trust and reputation systems can be used in various types of networks, such as WSN, MANET and P2P, and also in e-commerce applications. Trust and reputation systems not only provide benefits but can also be a threat themselves. Many attacks aimed at trust and reputation systems exist, but such attacks have not yet gained enough attention from research teams. Moreover, the joint effects of many of the known attacks have been identified as a very interesting field of research. The lack of an acknowledged methodology for evaluating trust and reputation systems is a serious problem. This paper presents various approaches to the evaluation of such systems. The work also describes a generalization of many trust and reputation systems which can be used to evaluate the reliability of such systems in the context of preventing various attacks.

  2. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    Full Text Available A control method based on real-time operational reliability evaluation for a space manipulator is presented to improve the success rate of the manipulator during the execution of a task. In this paper, a method for the quantitative analysis of operational reliability is given for a manipulator executing a specified task; a control model that regulates this quantitative operational reliability is then built. First, the control process is described by a state space equation. Second, process parameters are estimated in real time using a Bayesian method. Third, the expression of the system's real-time operational reliability is deduced based on the state space equation and the process parameters estimated with the Bayesian method. Finally, a control variable regulation strategy that considers the cost of control is given based on the theory of statistical process control. It is shown via simulations that this method effectively improves the operational reliability of the space manipulator control system.

  3. Reliability research to nuclear power plant operators based on several methods

    International Nuclear Information System (INIS)

    Fang Xiang; Li Fu; Zhao Bingquan

    2009-01-01

    The paper draws on many kinds of international reliability research methods and summarizes the reliability research on Chinese nuclear power plant operators over the past ten years, based on nuclear power plant simulator platforms. The paper shows the necessity and feasibility of research on nuclear power plant operators from many angles, including human cognitive reliability, fuzzy mathematical models and psychological research models. Applying these research methods to operator reliability will benefit the safe operation of nuclear power plants. (authors)

  4. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  5. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  6. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    Science.gov (United States)

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method is composed of the AMSAA model and test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conform to expectations. Conclusion. It is suggested that this method should be further researched and popularized.
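
    The AMSAA (Crow) model treats failures during development as a nonhomogeneous Poisson process with intensity λβt^(β-1); for a time-truncated test, the maximum-likelihood estimates take the closed form below. Failure times are illustrative only, and the paper's test data conversion step is not reproduced.

    ```python
    import math

    # Cumulative failure times (h) from a test truncated at time T. Illustrative.
    t = [12.0, 40.0, 95.0, 180.0, 320.0, 610.0, 850.0]
    T = 1000.0
    n = len(t)

    beta = n / sum(math.log(T / ti) for ti in t)     # growth (shape) parameter
    lam = n / T**beta                                # scale parameter
    mtbf_now = 1.0 / (lam * beta * T**(beta - 1))    # instantaneous MTBF at T

    print(f"beta = {beta:.3f} (beta < 1 indicates reliability growth)")
    print(f"instantaneous MTBF at {T:.0f} h: {mtbf_now:.0f} h")
    ```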

  7. Reliable method for fission source convergence of Monte Carlo criticality calculation with Wielandt's method

    International Nuclear Information System (INIS)

    Yamamoto, Toshihiro; Miyoshi, Yoshinori

    2004-01-01

    A new algorithm of Monte Carlo criticality calculations for implementing Wielandt's method, which is one of acceleration techniques for deterministic source iteration methods, is developed, and the algorithm can be successfully implemented into MCNP code. In this algorithm, part of fission neutrons emitted during random walk processes are tracked within the current cycle, and thus a fission source distribution used in the next cycle spread more widely. Applying this method intensifies a neutron interaction effect even in a loosely-coupled array where conventional Monte Carlo criticality methods have difficulties, and a converged fission source distribution can be obtained with fewer cycles. Computing time spent for one cycle, however, increases because of tracking fission neutrons within the current cycle, which eventually results in an increase of total computing time up to convergence. In addition, statistical fluctuations of a fission source distribution in a cycle are worsened by applying Wielandt's method to Monte Carlo criticality calculations. However, since a fission source convergence is attained with fewer source iterations, a reliable determination of convergence can easily be made even in a system with a slow convergence. This acceleration method is expected to contribute to prevention of incorrect Monte Carlo criticality calculations. (author)
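
    As a compact illustration of why the Wielandt shift reduces the number of iterations, the sketch below compares plain power iteration with shifted inverse (Wielandt) iteration for the dominant eigenvalue of a small matrix. This is a deterministic analogue with an illustrative operator and shift, not the Monte Carlo implementation described above.

    ```python
    import numpy as np

    A = np.array([[0.9, 0.3, 0.1],
                  [0.2, 0.8, 0.3],
                  [0.1, 0.2, 0.7]])
    sigma = 1.35          # shift above the dominant eigenvalue (max row sum 1.3)

    def dominant(apply_op, x0, tol=1e-12, max_it=2000):
        """Power-iterate apply_op; return (eigenvalue estimate, iterations)."""
        x = x0 / np.linalg.norm(x0)
        lam = 0.0
        for k in range(1, max_it + 1):
            y = apply_op(x)
            lam = x @ y                    # converges to the eigenvalue
            y = y / np.linalg.norm(y)
            if y[np.argmax(np.abs(y))] < 0:
                y = -y                     # fix sign (eigenvalue may be negative)
            if np.linalg.norm(y - x) < tol:
                return lam, k
            x = y
        return lam, max_it

    x0 = np.ones(3)
    B = np.linalg.inv(A - sigma * np.eye(3))        # Wielandt-shifted operator

    lam_p, it_p = dominant(lambda v: A @ v, x0)
    mu_w, it_w = dominant(lambda v: B @ v, x0)

    print(f"power iteration:    lambda = {lam_p:.8f} in {it_p} iterations")
    print(f"Wielandt iteration: lambda = {sigma + 1.0 / mu_w:.8f} in {it_w} iterations")
    ```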

  8. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Reliability evaluation of the condition identification of mechanical equipment requires the analysis of condition monitoring information. A new reliability evaluation method is proposed based on wavelet information entropy extracted from the vibration signals of the equipment. The method is quite different from traditional reliability evaluation models, which depend on the probabilistic and statistical analysis of large samples of data. The vibration signals of the mechanical equipment were analyzed by means of the second-generation wavelet packet (SGWP). The relative energy in each frequency band of the decomposed signal, i.e. the percentage of the whole signal energy, is taken as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of the system in place of a probability. The reliability degree is then obtained by transforming the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembly quality reliability of a kind of dismountable disk-drum aero-engine; the reliability degree indicates the assembly quality satisfactorily.
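
    A simplified sketch of the energy-entropy idea follows: the relative energies of frequency bands are treated as probabilities, and a normalized Shannon entropy is formed. For simplicity, FFT bands stand in for the second-generation wavelet packet of the paper, and the final entropy-to-reliability mapping shown is only one plausible choice, not the paper's exact transform.

    ```python
    import numpy as np

    fs = 1000.0
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(3)
    x = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.standard_normal(t.size)  # "vibration"

    spectrum = np.abs(np.fft.rfft(x)) ** 2
    bands = np.array_split(spectrum, 16)            # 16 frequency bands
    energy = np.array([b.sum() for b in bands])
    p = energy / energy.sum()                       # relative band energies
    p = p[p > 0]

    entropy = -(p * np.log(p)).sum()                # Shannon entropy
    entropy_norm = entropy / np.log(16)             # normalized to [0, 1]
    reliability = 1.0 - entropy_norm                # one plausible mapping
    print(f"normalized entropy = {entropy_norm:.3f}, reliability degree = {reliability:.3f}")
    ```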

  9. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian reliability assessment method is designed. Because of the drawbacks of the Markovian model for steady-state reliability computations and of the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. For managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian based reliability assessment method. • Managerial implication shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
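
    The Markovian ingredient can be shown in its smallest form: a two-state (up/down) continuous-time Markov model of a single AGV, whose steady-state availability follows from solving πQ = 0 with the probabilities summing to one. The rates are illustrative, and the neural network integration of the paper is not reproduced here.

    ```python
    import numpy as np

    lam, mu = 0.01, 0.5            # failure and repair rates per hour (illustrative)

    Q = np.array([[-lam, lam],     # generator matrix: state 0 = up, 1 = down
                  [mu, -mu]])

    # Solve pi @ Q = 0 together with sum(pi) = 1 as a least-squares system.
    A = np.vstack([Q.T, np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(f"steady-state availability = {pi[0]:.6f}")
    print(f"analytic check mu/(lam+mu) = {mu / (lam + mu):.6f}")
    ```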

  10. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis methods (Basic characteristics of the methods; Assets and drawbacks of the use of each HRA method; History and prospects of the use of the methods). (P.A.)

  11. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods for improving system reliability, but the design often involves the mutual coupling of multiple factors. In this study, the Direct Search Method is introduced into optimum redundancy configuration for design optimization, in which reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of a critical aircraft system are computed. The results show that the method is convenient and workable and, with appropriate modifications, applicable to the redundancy configuration and optimization of various designs. The method thus has good practical value.
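
    A minimal sketch of redundancy allocation by direct search: the small discrete design space is searched exhaustively for the per-stage redundancy levels that maximize series-system reliability within a cost budget. Unit reliabilities, costs and the budget are hypothetical, and no claim is made that this matches the paper's exact search variant.

    ```python
    from itertools import product

    r = [0.90, 0.95, 0.85]      # unit reliability per stage (illustrative)
    c = [4.0, 6.0, 3.0]         # unit cost per stage (illustrative)
    budget = 40.0

    best = None
    for n in product(range(1, 5), repeat=3):         # 1..4 parallel units per stage
        cost = sum(ni * ci for ni, ci in zip(n, c))
        if cost > budget:
            continue
        rel = 1.0
        for ni, ri in zip(n, r):
            rel *= 1.0 - (1.0 - ri) ** ni            # reliability of a parallel stage
        if best is None or rel > best[0]:
            best = (rel, n, cost)

    rel, n, cost = best
    print(f"best allocation {n}: reliability = {rel:.5f}, cost = {cost}")
    ```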

  12. Application of reliability analysis methods to the comparison of two safety circuits

    International Nuclear Information System (INIS)

    Signoret, J.-P.

    1975-01-01

    Two circuits of different design, intended to fulfil the "Low Pressure Safety Injection" function in PWR reactors, are analyzed using reliability methods. The reliability analysis of these circuits allows the fault trees to be established and the failure probabilities to be derived. The dependence of these results on testing and maintenance is emphasized, as well as the critical paths. The large number of results obtained may allow a well-informed choice taking account of the reliability required for this type of circuit [fr]

  13. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Erah

    2011-01-16

    acetonitrile in water (v/v) as mobile phase containing ammonium acetate and triethylamine (TEA), at a ... The pH of the mobile ... and Application to a Human Pharmacokinetic ... (Thesis), Rhodes University, Grahamstown ...

  14. Sensitive Bioanalytical Methods for Mustard Gas Exposure Diagnosis

    Science.gov (United States)

    2006-11-01

    bis(2-chloroethyl)-1-nitrosourea, thio-TEPA (N,N',N'-triethylenethiophosphoramide), busulfan (1,4-butanediol dimethanesulfonate), MNNG (1...N-nitrosourea), mounting solution, sucrose, and colloidal gold solution were purchased from Sigma (St. Louis, MO). 2.2 Cell culture: Frozen ...

  15. The Language Teaching Methods Scale: Reliability and Validity Studies

    Science.gov (United States)

    Okmen, Burcu; Kilic, Abdurrahman

    2016-01-01

    The aim of this research is to develop a scale to determine the language teaching methods used by English teachers. The research sample consisted of 300 English teachers who taught at Duzce University and in primary schools, secondary schools and high schools in the Provincial Management of National Education in the city of Duzce in 2013-2014…

  16. A method to determine validity and reliability of activity sensors

    NARCIS (Netherlands)

    Boerema, Simone Theresa; Hermens, Hermanus J.

    2013-01-01

    METHOD Four sensors were securely fastened to a mechanical oscillator (Vibration Exciter, type 4809, Brüel & Kjær) and moved at various frequencies (6.67Hz; 13.45Hz; 19.88Hz) within the range of human physical activity. For each of the three sensor axes, the sensors were simultaneously moved for

  17. Reliability and Validity of the Research Methods Skills Assessment

    Science.gov (United States)

    Smith, Tamarah; Smith, Samantha

    2018-01-01

    The Research Methods Skills Assessment (RMSA) was created to measure psychology majors' statistics knowledge and skills. The American Psychological Association's Guidelines for the Undergraduate Major in Psychology (APA, 2007, 2013) served as a framework for development. Results from a Rasch analysis with data from n = 330 undergraduates showed…

  18. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments are available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data was extracted by the primary reviewer. The results were synthesized qualitatively using a level of evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. The reliability, validity and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis were evaluated in retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exists in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity supports the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Radioisotope method potentialities in machine reliability and durability enhancement

    International Nuclear Information System (INIS)

    Postnikov, V.I.

    1975-01-01

    The development of a surface activation method is reviewed with regard to the wear of machine parts. Examples demonstrating the highly promising aspects and practical application of the method are cited. The use of high-sensitivity instruments and the variation of activation depth from 10 µm to 0.5 mm allow investigations to be performed at a sensitivity of 0.05 µm and the linear wear of machines to be estimated. Standard diagrams are presented for measuring the wear of different machine parts by means of surface activation. Investigations performed at several Soviet technological institutes provide a set of dependences which characterize the depth distribution of radioactive isotopes under different activation conditions for diverse metals and alloys and permit the wear of any metal to be studied

  20. TESTING METHODS FOR MECHANICALLY IMPROVED SOILS: RELIABILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Ana Petkovšek

    2017-10-01

    Full Text Available A possibility of in-situ mechanical improvement for reducing the liquefaction potential of silty sands was investigated by using three different techniques: Vibratory Roller Compaction, Rapid Impact Compaction (RIC and Soil Mixing. Material properties at all test sites were investigated before and after improvement with the laboratory and the in situ tests (CPT, SDMT, DPSH B, static and dynamic load plate test, geohydraulic tests. Correlation between the results obtained by different test methods gave inconclusive answers.

  1. Surface-enhanced Raman spectroscopy bioanalytical, biomolecular and medical applications

    CERN Document Server

    Procházka, Marek

    2016-01-01

    This book gives an overview of recent developments in RS and SERS for sensing and biosensing, also considering the limitations, possibilities and prospects of the technique. Raman scattering (RS) is a widely used vibrational technique providing highly specific molecular spectral patterns. A severe limitation on the application of this spectroscopic technique lies in the low cross section of RS. Surface-enhanced Raman scattering (SERS) spectroscopy overcomes this problem with a 6-11 orders of magnitude enhancement over standard RS for molecules in the close vicinity of certain rough metal surfaces. Thus, SERS combines molecular fingerprint specificity with potential single-molecule sensitivity. Due to the recent development of new SERS-active substrates, labeling and derivatization chemistry as well as new instrumentation, SERS has become a very promising tool for many varied applications, including bioanalytical studies and sensing. Both intrinsic and extrinsic SERS biosensing schemes have been employed to...

  2. Reliability

    African Journals Online (AJOL)

    eobe

    RELIABILITY ... V, given by the code of practice. However, checks must ... an optimization procedure over the failure domain F corresponding ... of Concrete Members based on Utility Theory. Technical ...

  3. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method for lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lateral lumbar radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the degree of lumbar lordosis was measured using the AutoCAD software and flexible ruler methods. The study was carried out in two parts: intratester and intertester evaluations of reliability, and assessment of the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, the reliability and validity of AutoCAD in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method for measuring lordosis. It is suggested that this method may replace those that are costly or involve health risks, such as radiography, in evaluating lumbar lordosis.

  4. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    OpenAIRE

    Chen, Xuyong; Chen, Qian; Bian, Xiaoya; Fan, Jianping

    2017-01-01

    Due to many uncertainties in nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method is a lengthy and oscillating iteration process and leads to difficultly solving the nonprobabilistic reliability index. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic ...

  5. System reliability with correlated components: Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, A.C.W.M.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  6. System reliability with correlated components : Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, T.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  7. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to measure the extent of cell migration automatically, based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison of the automated output with manual placement of the leading edge shows complete equivalence of automated and manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.

  8. Improving productivity and profitability of a bioanalytical business through sales and operation planning.

    Science.gov (United States)

    Islam, Rafiqul

    2013-07-01

    Today's bioanalytical CROs face increasing global competition, highly variable demand, high fixed costs, pricing pressure, and increasing demand for quality and speed. Most bioanalytical laboratories have responded to these challenges by implementing automation and by implementing process improvement methodologies (e.g., Six Sigma). These solutions have not resulted in a significant improvement in productivity and profitability since none of them are able to predict the upturn or downturn in demand. High volatility of demand causes long lead times and high costs during peak demand and poor productivity during trough demand. Most bioanalytical laboratories lack the tools to align supply efficiently to meet changing demand. In this paper, sales and operation planning (S&OP) has been investigated as a tool to balance supply and demand. The S&OP process, when executed effectively, can be the single greatest determinant of profitability for a bioanalytical business.

  9. A Reliable Method for Rhythm Analysis during Cardiopulmonary Resuscitation

    Directory of Open Access Journals (Sweden)

    U. Ayala

    2014-01-01

    Full Text Available Interruptions in cardiopulmonary resuscitation (CPR compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies.
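
    To illustrate only the final shock/no-shock decision stage, the sketch below trains a support vector machine on two synthetic features that loosely mimic a slope feature and a frequency feature; the CPR artifact filter, the low-electrical-activity detector and the paper's real feature definitions are not reproduced.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)
    n = 400
    # Synthetic (frequency, slope) features; shockable rhythms drawn higher.
    shockable = np.column_stack([rng.normal(5.5, 1.0, n), rng.normal(0.8, 0.2, n)])
    nonshock = np.column_stack([rng.normal(3.0, 1.0, n), rng.normal(0.3, 0.2, n)])
    X = np.vstack([shockable, nonshock])
    y = np.r_[np.ones(n), np.zeros(n)]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)).fit(X_tr, y_tr)

    sens = clf.score(X_te[y_te == 1], y_te[y_te == 1])   # accuracy on shockable
    spec = clf.score(X_te[y_te == 0], y_te[y_te == 0])   # accuracy on nonshockable
    print(f"sensitivity = {sens:.3f}, specificity = {spec:.3f}")
    ```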

  10. On the performance of bioanalytical fluorescence correlation spectroscopy measurements in a multiparameter photon-counting microscope

    Energy Technology Data Exchange (ETDEWEB)

    Mazouchi, Amir; Liu Baoxu; Bahram, Abdullah [Department of Physics, Institute for Optical Sciences, University of Toronto, Toronto (Canada); Department of Chemical and Physical Sciences, University of Toronto Mississauga, 3359 Mississauga Rd. N., Mississauga, ON, L5L 1C6 (Canada); Gradinaru, Claudiu C., E-mail: claudiu.gradinaru@utoronto.ca [Department of Physics, Institute for Optical Sciences, University of Toronto, Toronto (Canada); Department of Chemical and Physical Sciences, University of Toronto Mississauga, 3359 Mississauga Rd. N., Mississauga, ON, L5L 1C6 (Canada)

    2011-02-28

    Fluorescence correlation spectroscopy (FCS) data acquisition and analysis routines were developed and implemented in a home-built, multiparameter photon-counting microscope. Laser excitation conditions were investigated for two representative fluorescent probes, Rhodamine 110 and enhanced green fluorescent protein (EGFP). Reliable local concentrations and diffusion constants were obtained by fitting measured FCS curves, provided that the excitation intensity did not exceed 20% of the saturation level for each fluorophore. Accurate results were obtained from FCS measurements for sample concentrations varying from the pM to the µM range, as well as for conditions of high background signals. These experimental constraints were found to be determined by characteristics of the detection system and by the saturation behavior of the fluorescent probes. These factors actually limit the average number of photons that can be collected from a single fluorophore passing through the detection volume. The versatility of our setup and the data analysis capabilities were tested by measuring the mobility of EGFP in the nucleus of Drosophila cells under conditions of high concentration and molecular crowding. As a bioanalytical application, we studied by FCS the binding affinity of a novel peptide-based drug to the cancer-regulating STAT3 protein and corroborated the results with fluorescence polarization analysis derived from the same photon data.
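
    FCS curves of this kind are commonly fitted with the standard single-component 3D diffusion model; a sketch of such a fit on synthetic data is shown below. The aspect ratio, noise level and true parameters are illustrative, not those of the instrument described.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    KAPPA = 5.0   # axial-to-lateral aspect ratio of the focal volume (fixed)

    def g_model(tau, n_mol, tau_d):
        """3D diffusion: G = (1/N) / ((1 + tau/tau_D) * sqrt(1 + tau/(kappa^2 tau_D)))."""
        return (1.0 / n_mol) / ((1 + tau / tau_d)
                                * np.sqrt(1 + tau / (KAPPA**2 * tau_d)))

    tau = np.logspace(-6, 0, 200)                    # lag times in seconds
    rng = np.random.default_rng(5)
    g_data = g_model(tau, 2.0, 1e-4) + rng.normal(0, 0.002, tau.size)  # synthetic

    (n_mol, tau_d), _ = curve_fit(g_model, tau, g_data, p0=[1.0, 1e-3])
    print(f"N = {n_mol:.2f} molecules in focus, tau_D = {tau_d * 1e6:.1f} us")
    # Concentration then follows from N and the calibrated focal volume.
    ```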

  11. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach for minimizing structural cost or other performance measures under uncertain variables. However, it cannot by itself handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information into the uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
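
    One standard way a Bayesian layer absorbs incomplete test information is a conjugate Beta-Binomial update of a component failure probability, sketched below with an illustrative prior and data set; the paper's specific RBDO coupling is not reproduced.

    ```python
    from scipy.stats import beta

    a0, b0 = 1.0, 19.0           # Beta(1, 19) prior: mean failure probability 0.05
    k, n = 2, 100                # observed: 2 failures in 100 tests (illustrative)

    post = beta(a0 + k, b0 + n - k)   # conjugate posterior Beta(a0+k, b0+n-k)
    print(f"posterior mean p_f = {post.mean():.4f}")
    print(f"95% credible interval = ({post.ppf(0.025):.4f}, {post.ppf(0.975):.4f})")
    ```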

  12. Reliability Analysis Of Fire System On The Industry Facility By Use Fameca Method

    International Nuclear Information System (INIS)

    Sony T, D.T.; Situmorang, Johnny; Ismu W, Puradwi; Demon H; Mulyanto, Dwijo; Kusmono, Slamet; Santa, Sigit Asmara

    2000-01-01

    FAMECA is one of the analysis methods for determining system reliability in industrial facilities. The analysis follows a procedure of identifying component functions and determining failure modes, severity levels and the effects of failure. The reliability value is determined by combining three elements: the severity level, the component failure value and component criticality. A reliability analysis has been performed for the fire system of an industrial facility using the FAMECA method. The critical components identified are the pump, air release valve, check valve, manual test valve, isolation valve, control system, etc.

  13. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

    A variety of methods have been applied to the study of reliability analysis in which human factors are included, in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for quantitative analysis of reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analysis of various phases of plant operation, as well as to evaluation of an enhanced man-machine interface in the central control room. (author)

  14. A study in the reliability analysis method for nuclear power plant structures (I)

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Byung Hwan; Choi, Seong Cheol; Shin, Ho Sang; Yang, In Hwan; Kim, Yi Sung; Yu, Young; Kim, Se Hun [Seoul, Nationl Univ., Seoul (Korea, Republic of)

    1999-03-15

    Nuclear power plant structures may be exposed to aggressive environmental effects that may cause their strength and stiffness to decrease over their service life. Although the physics of these damage mechanisms is reasonably well understood and quantitative evaluation of their effects on time-dependent structural behavior is possible in some instances, such evaluations are generally very difficult and remain novel. The assessment of existing steel containments in nuclear power plants for continued service must provide quantitative evidence that they are able to withstand future extreme loads during a service period with an acceptable level of reliability. Rational methodologies for the reliability assessment can be developed from mechanistic models of structural deterioration, using time-dependent structural reliability analysis to take loading and strength uncertainties into account. The final goal of this study is to develop an analysis method for the reliability of containment structures. The cause and mechanism of corrosion are first clarified and a reliability assessment method has been established. By introducing the equivalent normal distribution, a procedure of reliability analysis which can determine the failure probabilities has been established. The influence of design variables on reliability and the relation between reliability and service life will be investigated in the second year of this research.

  15. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In bearing procurement analysis, price and reliability must both be considered as decision criteria, since price determines the direct cost (acquisition cost) while bearing reliability determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered when making a bearing procurement analysis. This paper explains a bearing evaluation method based on total cost of ownership analysis, with price and maintenance cost as decision criteria. Furthermore, since failure data are scarce at the bearing evaluation phase, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper but less reliable one is preferable. This contextuality can give rise to conflict between stakeholders; thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
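
    The prediction-from-load-rating idea can be illustrated with the standard basic rating life formula (ISO 281 style) feeding a simple total-cost-of-ownership comparison. All prices, loads and cost figures below are invented, and the paper's own cost model may differ:

```python
# Hedged sketch: basic-rating-life prediction (ISO 281 style) feeding a
# total-cost-of-ownership comparison. Prices, loads and cost figures are
# illustrative assumptions, not data from the paper.
def l10_hours(C_kN, P_kN, rpm, exponent=3.0):      # exponent 3 for ball bearings
    revolutions = (C_kN / P_kN) ** exponent * 1e6  # basic rating life in revs
    return revolutions / (rpm * 60.0)

def tco(price, C_kN, P_kN, rpm, horizon_h, replacement_cost):
    expected_replacements = horizon_h / l10_hours(C_kN, P_kN, rpm)
    return price + expected_replacements * replacement_cost

# cheaper bearing (lower dynamic load rating C) vs. pricier, more reliable one
print(tco(price=40.0, C_kN=20.0, P_kN=5.0, rpm=1500, horizon_h=40000, replacement_cost=300.0))
print(tco(price=90.0, C_kN=30.0, P_kN=5.0, rpm=1500, horizon_h=40000, replacement_cost=300.0))
```

    Over a long horizon the pricier, higher-rated bearing wins on total cost, matching the paper's long-term versus short-term observation.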

  16. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS...... that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...... competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17) and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  17. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements contributing to the stability of the tunnel structure are identified so that various aspects of reliability and sustainability in the system can be addressed. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces a smart approach by which decision-makers will be able to find the overall reliability of a tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to the field, and much effort has been made to use it in tunneling while investigating the reliability of the lining support system for the tunnel structure. Therefore, reliability analysis for evaluating the tunnel support performance is the main idea used in this research. Decomposition approaches are used to produce the system block diagram and to determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final values of reliability are obtained for different design scenarios. Considering the idea of linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for different structural subsystems and the results of numerical analyses are obtained in
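
    The decomposition idea, per-component reliabilities combined through a system block diagram, can be sketched in a few lines. Component values and the series/parallel arrangement below are illustrative only:

```python
# Hedged sketch of block-diagram decomposition: once per-component
# reliabilities are known (here invented values), series/parallel
# composition gives the overall support-system reliability.
from functools import reduce

def series(rs):    # all components must survive
    return reduce(lambda a, b: a * b, rs, 1.0)

def parallel(rs):  # system fails only if every redundant path fails
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), rs, 1.0)

rockbolts, shotcrete, steel_sets = 0.95, 0.98, 0.97
# e.g. steel sets back up shotcrete, in series with the rockbolt system
R_system = series([rockbolts, parallel([shotcrete, steel_sets])])
print(f"system reliability = {R_system:.4f}")
```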

  18. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate response surface function without cross terms, to identify the highest order of each random variable and to determine the significant variables connected with point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored in solving RBDO problems. Additionally, a sampling based reliability sensitivity analysis method is employed to reduce the computational effort further when design variables are distributional parameters of input random variables. The proposed methodology is applied on two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.
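
    The sampling idea can be sketched in one dimension: Gauss-Hermite nodes serve as supporting points for a polynomial response surface in a standard normal variable, and Monte Carlo simulation then runs on the cheap surrogate. The limit-state function g below is an invented stand-in for the costly model:

```python
# Hedged sketch of the supporting-point idea: Gauss-Hermite quadrature
# points (weight exp(-x^2), rescaled for a standard normal) anchor a
# high-order 1D response surface, which then replaces the costly model
# in Monte Carlo simulation. g() is an invented placeholder.
import numpy as np

def g(u):                      # placeholder for the costly model
    return 3.0 - u - 0.2 * u**2

nodes, _ = np.polynomial.hermite.hermgauss(5)   # nodes for weight exp(-x^2)
u_pts = np.sqrt(2.0) * nodes                    # rescale for standard normal
coeffs = np.polyfit(u_pts, g(u_pts), deg=4)     # high-order response surface

u_mc = np.random.default_rng(0).standard_normal(1_000_000)
pf = np.mean(np.polyval(coeffs, u_mc) < 0.0)    # failure prob. on the surrogate
print(f"Pf = {pf:.2e}")
```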

  19. A Novel Reliability Enhanced Handoff Method in Future Wireless Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    Wang YuPeng

    2016-01-01

    Full Text Available As demand increases, future networks will follow the trends of network variety and service flexibility, which require heterogeneous network deployment and reliable communication methods. In practice, most communication failures happen due to bad radio link quality; in particular, high-speed users suffer severely from radio link failures, which cause communication interrupts and radio link recovery procedures. To make communication more reliable, especially for high-mobility users, we propose a novel communication handoff mechanism to reduce the occurrence of service interrupts. Based on computer simulation, we find that the service reliability is greatly improved.

  20. Reliability analysis for thermal cutting method based non-explosive separation device

    International Nuclear Information System (INIS)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu

    2016-01-01

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn through and unwind, holder separation failure, ball separation failure, and pin release failure. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.
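
    For intuition only: if the five Ni-Cr coils were modeled as simple active redundancy (any one coil severing the wire suffices), device reliability would grow as 1 - (1 - r)^n. The per-coil value below is an assumption; the paper's own fault-tree model is more detailed:

```python
# Hedged sketch: parallel redundancy of n Ni-Cr coils under an assumed
# per-coil success probability r (illustrative, not from the paper).
r = 0.90                       # assumed single-coil success probability
for n in range(1, 6):
    print(n, 1.0 - (1.0 - r) ** n)
```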

  2. An attempt to use FMEA method for an approximate reliability assessment of machinery

    Directory of Open Access Journals (Sweden)

    Przystupa Krzysztof

    2017-01-01

    Full Text Available The paper presents a modified FMEA (Failure Mode and Effect Analysis) method to assess the reliability of the components that make up a type 2145 MAX Impactol TM Driver wrench from the Ingersoll Rand Company. The case concerns reliability analysis under conditions where full service data are not known. The aim of the study is to determine the weakest element in the design of the tool.

  3. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    International Nuclear Information System (INIS)

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method. - Highlights: • An indirect probability model (IPM) is proposed for mechanism reliability evaluation. • The dynamic equation, degradation function and Kaplan–Meier estimator are used. • Then the simplified form of indirect probability model is proposed. • The experimental results agree well with the predicted results.
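
    The Kaplan-Meier building block of the IPM can be sketched directly. The (time, event) pairs below are invented; event = 1 marks an observed output-accuracy failure and event = 0 a censored run:

```python
# Hedged sketch of the Kaplan-Meier product-limit estimator used inside
# the IPM: survival of the mechanism's output accuracy from censored
# lifetime data. The data below are invented for illustration.
import numpy as np

times  = np.array([120, 150, 150, 200, 240, 240, 300, 360])
events = np.array([  1,   1,   0,   1,   0,   1,   1,   0])

S = 1.0
for t in np.unique(times[events == 1]):
    at_risk = np.sum(times >= t)            # units still operating just before t
    d = np.sum((times == t) & (events == 1))
    S *= 1.0 - d / at_risk                  # KM product-limit step
    print(f"t = {t:>3}: S(t) = {S:.3f}")
```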

  4. Method for assessing software reliability of the document management system using the RFID technology

    Directory of Open Access Journals (Sweden)

    Kiedrowicz Maciej

    2016-01-01

    Full Text Available The deliberations presented in this study refer to the method for assessing software reliability of the document management system using the RFID technology. A method for determining the reliability structure of the discussed software, understood as the index vector for assessing reliability of its components, was proposed. The model of the analyzed software is the control transfer graph, in which the probability of activating individual components during the system's operation results from the so-called operational profile, which characterizes the actual working environment. The reliability structure is established as a result of the solution of a specific mathematical software task. Knowledge of the reliability structure of the software makes it possible to properly plan the time and financial expenses necessary to build software that meets the reliability requirements. The application of the presented method is illustrated by a numerical example corresponding to the software reality of the RFID document management system.
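
    One classical way to turn a control transfer graph plus an operational profile into a system reliability figure is Cheung's architecture-based model; whether the study uses exactly this formulation is not stated, so the sketch below is illustrative, with invented component reliabilities and transition probabilities:

```python
# Hedged sketch in the spirit of Cheung's (1980) control-transfer-graph
# model: component reliabilities R and transition probabilities P come
# from the operational profile; all numbers here are invented.
import numpy as np

R = np.array([0.999, 0.995, 0.990])        # per-component reliabilities
P = np.array([[0.0, 0.7, 0.3],             # control transfer probabilities
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])            # component 3 is terminal

Q = R[:, None] * P                         # successful-transfer matrix
S = np.linalg.inv(np.eye(3) - Q)           # expected successful visits
R_system = S[0, 2] * R[2]                  # reach component 3 and exit cleanly
print(f"system reliability = {R_system:.4f}")
```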

  5. Perspectives on bioanalytical mass spectrometry and automation in drug discovery.

    Science.gov (United States)

    Janiszewski, John S; Liston, Theodore E; Cole, Mark J

    2008-11-01

    The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.

  6. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated, using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis based finite element modeling in engineering practice.

  7. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    DEFF Research Database (Denmark)

    Petersen, Bent; Petersen, Thomas Nordahl; Andersen, Pernille

    2009-01-01

    : The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best public available method, Real-SPINE. Both methods associate a reliability...... comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0.79 and 0.74 are obtained using our and the compared method, respectively. This tendency is true for any selected subset....

  8. A method to evaluate performance reliability of individual subjects in laboratory research applied to work settings.

    Science.gov (United States)

    1978-10-01

    This report presents a method that may be used to evaluate the reliability of performance of individual subjects, particularly in applied laboratory research. The method is based on analysis of variance of a tasks-by-subjects data matrix, with all sc...

  9. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    Directory of Open Access Journals (Sweden)

    Nielsen Morten

    2009-07-01

    Full Text Available Abstract Background Estimation of the reliability of specific real value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have implemented a method that predicts the relative surface accessibility of an amino acid and simultaneously predicts the reliability for each prediction, in the form of a Z-score. Results An ensemble of artificial neural networks has been trained on a set of experimentally solved protein structures to predict the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures where reliabilities are obtained by post-processing the output. Conclusion The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best public available method, Real-SPINE. Both methods associate a reliability score with the individual predictions. However, our implementation of reliability scores in the form of a Z-score is shown to be the more informative measure for discriminating good predictions from bad ones in the entire range from completely buried to fully exposed amino acids. This is evident when comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0.79 and 0.74 are obtained using our and the compared method, respectively.

  10. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    Science.gov (United States)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems focuses on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines and without analyzing the effect on power grid reliability. As a result, the lightning protection design of general transmission lines is excessive, while that of key lines is insufficient. In order to solve this problem, an analysis method for lightning strikes on transmission lines aimed at power grid reliability is given. Full wave process theory is used to analyze lightning back striking; the leader propagation model is used to describe the shielding failure process of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.

  11. Investigation of Reliabilities of Bolt Distances for Bolted Structural Steel Connections by Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Ertekin Öztekin Öztekin

    2015-12-01

    Full Text Available The distances of bolts to each other and to the edges of connection plates are designed according to the minimum and maximum boundary values proposed by structural codes. In this study, the reliabilities of those distances were investigated, taking loading type, bolt type and plate thickness as variable parameters. The Monte Carlo Simulation (MCS) method was used in the reliability computations performed for all combinations of those parameters. At the end of the study, the reliability index values for all those distances are presented in graphics and tables. Results obtained from this study were compared with the values proposed by some structural codes, and evaluations of those comparisons were made. Finally, it is emphasized that it would be incorrect to use the same bolt distances in both traditional designs and designs at higher reliability levels.

  12. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be expressed explicitly, the response surface method is usually used because it has a very clear train of thought and simple programming. However, the traditional response surface method fits a response surface of quadratic polynomials, where the problem of accuracy cannot be solved, because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used when the performance function cannot be expressed explicitly in structural reliability analysis. In this method, a response surface of the fuzzy neural network for the whole area is constructed first, and then the structural reliability is calculated by the genetic algorithm. In the proposed method, all the sample points for training the network come from the whole area, so the true limit state surface in the whole area can be fitted. Calculation examples and comparative analysis show that the proposed method is much better than the traditional response surface method of quadratic polynomials: the amount of finite element analysis computation is largely reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well in the whole area. Thus, the method proposed in this paper is suitable for engineering application.

  13. A Bayesian reliability evaluation method with integrated accelerated degradation testing and field information

    International Nuclear Information System (INIS)

    Wang, Lizhi; Pan, Rong; Li, Xiaoyang; Jiang, Tongmin

    2013-01-01

    Accelerated degradation testing (ADT) is a common approach in reliability prediction, especially for products with high reliability. However, oftentimes the laboratory condition of ADT differs from the field condition; thus, to predict field failure, one needs to calibrate the prediction made using ADT data. In this paper a Bayesian evaluation method is proposed to integrate ADT data from the laboratory with failure data from the field. Calibration factors are introduced to calibrate the difference between the lab and field conditions so as to predict a product's actual field reliability more accurately. The information fusion and statistical inference procedure are carried out through a Bayesian approach and Markov chain Monte Carlo methods. The proposed method is demonstrated by two examples and a sensitivity analysis with respect to the prior distribution assumption.

  14. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
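
    The core estimator the book builds on fits in a dozen lines: sample the uncertain inputs, count limit-state violations, and attach a standard error. The load/resistance limit state below is an arbitrary illustration, not an example from the book:

```python
# Hedged sketch of crude Monte Carlo reliability estimation: sample the
# uncertain inputs, count failures, and report a binomial standard error.
# The normal load/resistance model is an invented illustration.
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000
load = rng.normal(50.0, 8.0, N)            # demand
resistance = rng.normal(80.0, 10.0, N)     # capacity

fail = resistance - load < 0.0
pf = fail.mean()
se = np.sqrt(pf * (1.0 - pf) / N)          # binomial standard error
print(f"Pf = {pf:.2e} +/- {1.96 * se:.1e} (95%)")
```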

  15. Study on reliability analysis based on multilevel flow models and fault tree method

    International Nuclear Information System (INIS)

    Chen Qiang; Yang Ming

    2014-01-01

    Multilevel flow models (MFM) and the fault tree method describe system knowledge in different forms, so the two methods express equivalent logic of the system reliability under the same boundary conditions and assumptions. Based on this, and combined with the characteristics of MFM, a method for mapping MFM to fault trees was put forward, providing a way to establish fault trees rapidly and to realize qualitative reliability analysis based on MFM. Taking the safety injection system of a pressurized water reactor nuclear power plant as an example, its MFM was established and its reliability was analyzed qualitatively. The analysis result shows that the logic of mapping MFM to fault trees is correct. The MFM is easily understood, created and modified. Compared with traditional fault tree analysis, the workload is greatly reduced and modeling time is saved. (authors)
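
    Once an MFM model has been mapped to a fault tree, the quantitative step is gate evaluation. A minimal sketch, assuming independent rare events and an invented two-train configuration (not the paper's actual safety injection model):

```python
# Hedged sketch of fault tree gate evaluation under independence:
# the two-train example and all probabilities are invented.
def AND(*ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

def OR(*ps):                    # complement of all-inputs-absent
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

pump_a, pump_b, valve, signal = 1e-3, 1e-3, 5e-4, 2e-4
top = OR(AND(pump_a, pump_b), valve, signal)   # both pumps, or valve, or signal
print(f"top event probability = {top:.2e}")
```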

  16. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computation-intensive calculations. A computer program has been developed to implement the PFTA.

  17. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  18. A human reliability based usability evaluation method for safety-critical software

    International Nuclear Information System (INIS)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.; Ragsdale, A.

    2006-01-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with the human reliability analysis method SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis for heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach that standardizes the priority weighting of the UEP in an effort to improve the method's reliability. (authors)

  19. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive to obtain due to the complexity of the system. Moreover, conventional reliability models have some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. The approach is based on a data-mining algorithm, the K2, to discover the grid system structure from raw historical system data, which allows minimum resource spanning trees (MRST) to be found within the grid; Bayesian networks (BN) are then used to model the MRST and estimate grid service reliability.

  20. Inter- and intra- observer reliability of risk assessment of repetitive work without an explicit method.

    Science.gov (United States)

    Eliasson, Kristina; Palm, Peter; Nyman, Teresia; Forsman, Mikael

    2017-07-01

    A common way to conduct practical risk assessments is to observe a job and report the observed long-term risks for musculoskeletal disorders. The aim of this study was to evaluate the inter- and intra-observer reliability of ergonomists' risk assessments without the support of an explicit risk assessment method. Twenty-one experienced ergonomists assessed the risk level (low, moderate, high risk) of eight upper body regions, as well as the global risk, of 10 video-recorded work tasks. Intra-observer reliability was assessed by having nine of the ergonomists repeat the procedure at least three weeks after the first assessment. Each ergonomist made the risk assessment based on his or her experience and knowledge. The statistical parameters of reliability included agreement in %, kappa, linearly weighted kappa, intraclass correlation and Kendall's coefficient of concordance. The average inter-observer agreement on the global risk was 53% and the corresponding weighted kappa (K w ) was 0.32, indicating fair reliability. The intra-observer agreement was 61% and 0.41 (K w ). This study indicates that risk assessments of the upper body, without the use of an explicit observational method, have unacceptable reliability. It is therefore recommended to use systematic risk assessment methods to a higher degree. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
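
    The agreement statistics reported above (percent agreement and linearly weighted kappa) are straightforward to reproduce. A minimal sketch with two invented rating vectors over the three ordinal risk levels:

```python
# Hedged sketch: percent agreement and linearly weighted kappa for
# ordinal risk levels (0 = low, 1 = moderate, 2 = high).
# The two rating vectors are invented for illustration.
import numpy as np

a = np.array([0, 1, 2, 1, 0, 2, 1, 1, 2, 0])   # rater 1 (or occasion 1)
b = np.array([0, 1, 1, 1, 0, 2, 2, 1, 2, 1])   # rater 2 (or occasion 2)
k = 3                                          # number of categories

agreement = np.mean(a == b)

w = 1.0 - np.abs(np.subtract.outer(np.arange(k), np.arange(k))) / (k - 1)
obs = np.zeros((k, k))
for i, j in zip(a, b):
    obs[i, j] += 1                             # observed joint proportions
obs /= obs.sum()
exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))   # chance-expected
kappa_w = ((w * obs).sum() - (w * exp).sum()) / (1.0 - (w * exp).sum())
print(f"agreement = {agreement:.0%}, weighted kappa = {kappa_w:.2f}")
```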

  1. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    Full Text Available At present, cloud computing is one of the newest trends in distributed computation, and it is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development. The optimization of the reliability and performance of a cloud services composition application, which is a typical stochastic optimization problem, is confronted with severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamics. Traditional reliability and performance optimization techniques, for example Markov models and state space analysis, have some defects: they are too time-consuming, they easily cause state space explosion, and they rely on an assumption of component execution independence that is often not satisfied. To overcome these defects, we propose a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud service composition applications based on multi-state system theory is presented. Then a reliability and performance definition based on the universal generating function is proposed. Based on this, a fast reliability and performance optimization algorithm is presented. In the end, illustrative examples are given.

  2. Reliability-Based Topology Optimization Using Stochastic Response Surface Method with Sparse Grid Design

    Directory of Open Access Journals (Sweden)

    Qinghai Zhao

    2015-01-01

    Full Text Available A mathematical framework is developed which integrates the reliability concept into topology optimization to solve reliability-based topology optimization (RBTO) problems under uncertainty. Two typical methodologies have been presented and implemented: the performance measure approach (PMA) and the sequential optimization and reliability assessment (SORA). To enhance the computational efficiency of reliability analysis, the stochastic response surface method (SRSM) is applied to approximate the true limit state function with respect to the normalized random variables, combined with a design of experiments generated by sparse grid design, which has proven to be an effective discretization technique. Uncertainties such as material property and external loads are considered in three numerical examples: a cantilever beam, a loaded knee structure, and a heat conduction problem. Monte-Carlo simulations are also performed to verify the accuracy of the failure probabilities computed by the proposed approach. Based on the results, it is demonstrated that application of SRSM with SGD produces an efficient reliability analysis in RBTO which enables a more reliable design than that obtained by DTO. It is also found that, at identical accuracy, SORA is superior to PMA in terms of computational efficiency.

  3. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

    Drivers' workload tends to increase during driving under complicated traffic environments such as a lane change. In such cases, rear collision warning is effective for reducing cognitive workload. On the other hand, it is pointed out that false alarms or missed alarms caused by sensor errors lead to a decrease of the driver's trust in the warning system, which can result in low efficiency of the system. Suppose that reliability information of the sensor is provided in real time. In this paper, we propose a new warning method to increase the driver's trust in the system, even with low sensor reliability, utilizing the sensor reliability information. The effectiveness of the warning methods is shown by driving simulator experiments.

  4. An Investment Level Decision Method to Secure Long-term Reliability

    Science.gov (United States)

    Bamba, Satoshi; Yabe, Kuniaki; Seki, Tomomichi; Shibaya, Tetsuji

    The slowdown in power demand growth and in facility replacement causes aging and lower reliability of power facilities. Aging is then followed by a rapid increase in repair and replacement when many facilities reach the end of their lifetime. This paper describes a method to estimate future repair and replacement costs by applying a life-cycle cost model and renewal theory to historical data. It also describes a method to decide the optimum investment plan, which replaces facilities in order of cost-effectiveness by setting a replacement priority formula, and the minimum investment level needed to maintain reliability. Estimation examples applied to substation facilities show that a reasonable and leveled future cash-out can maintain reliability by lowering the percentage of replacements caused by fatal failures.

  5. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photographic images. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that the interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the comparisons of the methods, and no systematic bias was observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flap viability.

  6. Proceeding of 35th domestic symposium on applications of structural reliability and risk assessment methods to nuclear power plants

    International Nuclear Information System (INIS)

    2005-06-01

    As the 35th domestic symposium of the Atomic Energy Research Committee of the Japan Welding Engineering Society, a symposium titled 'Applications of structural reliability/risk assessment methods to nuclear energy' was held. Six speakers gave lectures titled 'Structural reliability and risk assessment methods', 'Risk-informed regulation of US nuclear energy and role of probabilistic risk assessment', 'Reliability and risk assessment methods in chemical plants', 'Practical structural design methods based on reliability in architectural and civil areas', 'Maintenance activities based on reliability in thermal power plants' and 'LWR maintenance strategies based on Probabilistic Fracture Mechanics'. (T. Tanaka)

  7. Extended block diagram method for a multi-state system reliability assessment

    International Nuclear Information System (INIS)

    Lisnianski, Anatoly

    2007-01-01

    The presented method extends the classical reliability block diagram method to a repairable multi-state system. It is very suitable for engineering applications since the procedure is well formalized and based on the natural decomposition of the entire multi-state system (the system is represented as a collection of its elements). Until now, the classical block diagram method did not provide a reliability assessment for the repairable multi-state system. Straightforward stochastic process methods are very difficult to apply in engineering in such cases due to the 'dimension damnation': the huge number of system states. The suggested method is based on combined random processes and the universal generating function technique, and it drastically reduces the number of states in the multi-state model.
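
    The u-function formalism is compact enough to sketch: each element maps performance levels to probabilities, and composition operators combine elements (minimum of capacities for series flow, sum for parallel). The two-element example below is invented:

```python
# Hedged sketch of the universal generating function (u-function) idea:
# performance distributions combine via a composition operator.
# The pump/valve capacities and probabilities are invented.
from collections import defaultdict

def compose(u1, u2, op):
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

u_pump  = {0: 0.05, 50: 0.25, 100: 0.70}     # capacity: probability
u_valve = {0: 0.02, 100: 0.98}

u_series = compose(u_pump, u_valve, min)      # series: bottleneck capacity
availability = sum(p for g, p in u_series.items() if g >= 50)   # demand = 50
print(u_series, f"A(demand=50) = {availability:.4f}")
```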

  8. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered to be the most challenging step of the analytical procedure, since it has an effect on the whole analytical methodology and therefore contributes significantly to the greenness, or lack of it, of the entire process. The basis for greening sample preparation and analytical methods lies in eliminating sample treatment steps, reducing the amount of sample, strongly reducing the consumption of hazardous reagents and energy, avoiding the use of large amounts of organic solvents, and maximizing safety for operators and the environment. In the last decade, the development and utilization of greener and sustainable microextraction techniques has become an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid phase microextraction, stir bar sorptive extraction, hollow-fiber liquid phase microextraction, dispersive liquid-liquid microextraction, etc.) will be presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. [Knowledge of university students in Szeged, Hungary about reliable contraceptive methods and sexually transmitted diseases].

    Science.gov (United States)

    Devosa, Iván; Kozinszky, Zoltán; Vanya, Melinda; Szili, Károly; Fáyné Dombi, Alice; Barabás, Katalin

    2016-04-03

    Promiscuity and lack of use of reliable contraceptive methods increase the probability of sexually transmitted diseases and the risk of unwanted pregnancies, which are quite common among university students. The aim of the study was to assess the knowledge of university students about reliable contraceptive methods and sexually transmitted diseases, and to assess the effectiveness of sexual health education in secondary schools, with specific focus on education held by peers. An anonymous, self-administered questionnaire survey was carried out in a randomized sample of students at the University of Szeged (n = 472; 298 women and 174 men; average age 21 years) between 2009 and 2011. 62.1% of the respondents declared that reproductive health education lessons in high schools held by peers were a reliable and authentic source of information, 12.3% considered them a less reliable source, and 25.6% regarded school health education as an irrelevant source. Among those who considered the health education held by peers a reliable source, there were significantly more females (69.3% vs. 46.6%, p = 0.001), significantly fewer lived in cities (83.6% vs. 94.8%, p = 0.025), and significantly more knew that Candida infection can be transmitted through sexual intercourse (79.5% vs. 63.9%, p = 0.02), as compared to those who did not consider health education held by peers a reliable source. The majority of respondents obtained knowledge about sexual issues from the mass media. Young people who considered health education programs reliable were significantly better informed about Candida disease.

  10. Synthesis, Characterization and Utility of Carbon Nanotube Based Hybrid Sensors in Bioanalytical Applications

    Science.gov (United States)

    Badhulika, Sushmee

    The detection of gaseous analytes and biological molecules is of prime importance in the fields of environmental pollution control; food and water safety and analysis; and medical diagnostics. This necessitates the development of advanced and improved technology that is reliable, inexpensive and suitable for high-volume production. Conventional sensors are often thin-film based and lack sensitivity due to the phenomenon of current shunting across the charge-depleted region when an analyte binds with them. One-dimensional (1-D) nanostructures provide a better alternative for sensing applications by eliminating the issue of current shunting, owing to their 1-D geometries, and by facilitating device miniaturization and low-power operation. Carbon nanotubes (CNTs) are 1-D nanostructures whose small size, high mechanical strength, high electrical and thermal conductivity and high specific area have resulted in their widespread application in sensor technology. To overcome the issue of low sensitivity of pristine CNTs and to widen their scope, hybrid devices have been fabricated that combine the synergistic properties of CNTs with materials such as metals and conducting polymers (CPs). CPs exhibit the electronic, magnetic and optical properties of metals and semiconductors while retaining the processing advantages of polymers. Their high chemical sensitivity, room temperature operation and tunable charge transport properties have made them ideal for use as transducing elements in chemical sensors. In this dissertation, various CNT-based hybrid devices such as CNT-conducting polymer and graphene-CNT-metal nanoparticle based sensors have been developed and demonstrated for bioanalytical applications such as the detection of volatile organic compounds (VOCs) and saccharides. Electrochemical polymerization enabled the synthesis of CPs and metal nanoparticles in a simple, cost-effective and controlled way on the surface of CNT-based platforms, thus resulting in

  11. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.
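
    A Shrout-Fleiss ICC of the two-way random, absolute agreement, single-rater form, ICC(2,1), can be computed directly from ANOVA mean squares. A minimal sketch with an invented targets-by-raters matrix:

```python
# Hedged sketch of Shrout-Fleiss ICC(2,1) from two-way ANOVA mean
# squares. The small targets x raters data matrix is invented.
import numpy as np

x = np.array([[9.1, 9.3, 9.0],      # rows: targets (knees), cols: raters
              [7.8, 8.1, 8.0],
              [6.5, 6.4, 6.6],
              [8.9, 9.0, 9.2]])
n, k = x.shape
grand = x.mean()

MSR = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)   # between targets
MSC = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)   # between raters
SSE = np.sum((x - x.mean(axis=1, keepdims=True)
                - x.mean(axis=0, keepdims=True) + grand) ** 2)
MSE = SSE / ((n - 1) * (k - 1))                             # residual

icc21 = (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
print(f"ICC(2,1) = {icc21:.3f}")
```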

  12. Risk-based methods for reliability investments in electric power distribution systems

    Energy Technology Data Exchange (ETDEWEB)

    Alvehag, Karin

    2011-07-01

    Society relies more and more on a continuous supply of electricity. However, while underinvestment in reliability leads to an unacceptable number of power interruptions, overinvestment results in costs that are too high for society. To give incentives for a socioeconomically optimal level of reliability, quality regulations have been adopted in many European countries. These quality regulations imply new financial risks for the distribution system operator (DSO), since poor reliability can reduce the allowed revenue for the DSO and compensation may have to be paid to affected customers. This thesis develops a method for evaluating the incentives for reliability investments implied by different quality regulation designs. The method can be used to investigate whether socioeconomically beneficial projects are also beneficial for a profit-maximizing DSO subject to a particular quality regulation design. To investigate which reinvestment projects are preferable for society and a DSO, risk-based methods are developed. With these methods, the probability of power interruptions and their consequences can be simulated. The consequences of interruptions for the DSO will to a large extent depend on the quality regulation. The consequences for the customers, and hence also society, will depend on factors such as the interruption duration and time of occurrence. The proposed risk-based methods consider extreme outage events in the risk assessments by incorporating the impact of severe weather, estimating the full probability distribution of the total reliability cost, and formulating a risk-averse strategy. Results from case studies performed show that quality regulation design has a significant impact on reinvestment project profitability for a DSO. In order to adequately capture the financial risk that the DSO is exposed to, detailed risk-based methods, such as the ones developed in this thesis, are needed. Furthermore, when making investment decisions, a risk

  13. Transition from Partial Factors Method to Simulation-Based Reliability Assessment in Structural Design

    Czech Academy of Sciences Publication Activity Database

    Marek, Pavel; Guštar, M.; Permaul, K.

    1999-01-01

    Roč. 14, č. 1 (1999), s. 105-118 ISSN 0266-8920 R&D Projects: GA ČR GA103/94/0562; GA ČR GV103/96/K034 Keywords : reliability * safety * failure * durability * Monte Carlo method Subject RIV: JM - Building Engineering Impact factor: 0.522, year: 1999

  14. AK-SYS: An adaptation of the AK-MCS method for system reliability

    International Nuclear Information System (INIS)

    Fauriat, W.; Gayton, N.

    2014-01-01

    A lot of research work has been proposed over the last two decades to evaluate the probability of failure of a structure involving a very time-consuming mechanical model. Surrogate model approaches based on Kriging, such as the Efficient Global Reliability Analysis (EGRA) or the Active learning and Kriging-based Monte-Carlo Simulation (AK-MCS) methods, are very efficient and each has advantages of its own. EGRA is well suited to evaluating small probabilities, as the surrogate can be used to classify any population. AK-MCS is built in relation to a given population and requires no optimization program for the active learning procedure to be performed. It is therefore easier to implement and more likely to spend computational effort on areas with a significant probability content. When assessing system reliability, analytical approaches and first-order approximation are widely used in the literature. However, in the present paper we rather focus on sampling techniques and, considering the recent adaptation of the EGRA method for systems, a strategy is presented to adapt the AK-MCS method for system reliability. The AK-SYS method, “Active learning and Kriging-based SYStem reliability method”, is presented. Its high efficiency and accuracy are illustrated via various examples
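
    The AK-MCS ingredients (a fixed Monte Carlo population, a Kriging surrogate, and the U learning function with its min U > 2 stopping rule) can be sketched as follows; the limit state is a toy, and scikit-learn's Gaussian process stands in for a full Kriging implementation:

```python
# Hedged sketch of the AK-MCS loop: a GP surrogate classifies a fixed
# Monte Carlo population and the U function picks the next point for the
# "expensive" model. Toy limit state; not the paper's implementation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def g(x):                                   # stand-in for the costly model
    return 5.0 - x[:, 0] ** 2 - x[:, 1]

rng = np.random.default_rng(1)
pop = rng.standard_normal((20_000, 2))      # MCS population, kept fixed
X = rng.standard_normal((12, 2))            # initial design of experiments
y = g(X)

for _ in range(30):                         # active-learning iterations
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    mu, sd = gp.predict(pop, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)  # low U = uncertain sign
    i = int(np.argmin(U))
    if U[i] > 2.0:                          # AK-MCS stopping rule
        break
    X = np.vstack([X, pop[i]])
    y = np.append(y, g(pop[i:i + 1]))

print(f"Pf = {np.mean(mu < 0.0):.2e} after {len(y)} model calls")
```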

  15. Evaluation of the reliability of Levine method of wound swab for ...

    African Journals Online (AJOL)

    The aim of this paper is to evaluate the reliability of Levine swab in accurate identification of microorganisms present in a wound and identify the necessity for further studies in this regard. Methods: A semi structured questionnaire was administered and physical examination was performed on patients with chronic wounds ...

  16. Reliability analysis based on a novel density estimation method for structures with correlations

    Directory of Open Access Journals (Sweden)

    Baoyu LI

    2017-06-01

    Full Text Available Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, as the failure probability can then be obtained by integration over the failure domain. However, efficiently estimating the PDF is still an open problem. The existing fractional moment based maximum entropy approach provides a very advanced method for PDF estimation, but its main shortcoming is that it restricts the reliability analysis to structures with independent inputs. In fact, structures with correlated inputs are common in engineering, so this paper improves the maximum entropy method by applying the Unscented Transformation (UT) technique to compute the fractional moments of the performance function for structures with correlations; UT is a very efficient moment estimation method for models with any inputs. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Besides, the number of function evaluations of the proposed method in reliability analysis, which is determined by UT, is really small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
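
    The unscented transformation at the heart of this record is compact enough to sketch. The snippet below is illustrative only: it assumes jointly Gaussian, correlated inputs, the standard sigma-point construction, and an invented performance function; the paper's maximum entropy fit of the PDF from the resulting fractional moments is not shown.

```python
import numpy as np

def unscented_sigma_points(mean, cov, alpha=1.0, kappa=0.0):
    """Standard unscented transformation sigma points and mean weights."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)
    pts = [mean] + [mean + L[:, i] for i in range(n)] \
                 + [mean - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w[0] = lam / (n + lam)
    return np.array(pts), w

def fractional_moment(g, mean, cov, q):
    """Approximate E[|g(X)|^q] for correlated Gaussian inputs via UT."""
    pts, w = unscented_sigma_points(mean, cov)
    y = np.array([g(p) for p in pts])
    return np.sum(w * np.abs(y)**q)

# Invented performance function and correlated inputs (not from the paper).
g = lambda x: 3.0 * x[0] - x[1] + 0.5 * x[0] * x[1]
mean = np.array([1.0, 0.5])
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])   # off-diagonal terms encode the correlation

for q in (0.5, 1.0, 1.5):      # a few fractional orders for a MaxEnt fit
    print(f"E[|Y|^{q}] ~ {fractional_moment(g, mean, cov, q):.4f}")
```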

  17. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report consists of an overview of types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. Particularly with regard to safety assessments for nuclear power plants, many HRA methods have been developed. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the utilization of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of Expert System Technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.) [de

  18. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: a low-penetration simulation and a high-penetration simulation. The load shedding strategy and the simulation process are described in detail for each FMEA step. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
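
    A bare-bones Monte Carlo estimate of the customer reliability indices such a study reports can be written as below. Everything here is illustrative: the feeder sections, failure rates, repair times and customer counts are invented, and the effect of DG is reduced to a comment; the paper's active load shedding strategy is far richer.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented feeder sections: (failure rate [1/yr], mean repair time [h],
# customers interrupted when the section fails). With DG able to island
# part of the feeder, the customer counts of some sections would shrink.
SECTIONS = [(0.20, 4.0, 800), (0.15, 6.0, 500), (0.30, 3.0, 1200)]
N_CUSTOMERS = 2500
YEARS = 200_000

int_count = 0.0   # accumulated customer interruptions
int_hours = 0.0   # accumulated customer interruption hours
for lam, mttr, ncust in SECTIONS:
    n_fail = rng.poisson(lam * YEARS)          # failures over all years
    repair = rng.exponential(mttr, n_fail)     # repair time of each failure
    int_count += n_fail * ncust
    int_hours += repair.sum() * ncust

saifi = int_count / (N_CUSTOMERS * YEARS)   # interruptions/customer/yr
saidi = int_hours / (N_CUSTOMERS * YEARS)   # interruption hours/customer/yr
print(f"SAIFI ~ {saifi:.3f} int./cust./yr, SAIDI ~ {saidi:.3f} h/cust./yr")
```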

  19. Assessment of Electronic Circuits Reliability Using Boolean Truth Table Modeling Method

    International Nuclear Information System (INIS)

    El-Shanshoury, A.I.

    2011-01-01

    This paper explores the use of the Boolean Truth Table modeling Method (BTTM) in the analysis of qualitative data. It is widely used in certain fields, especially in electrical and electronic engineering. Our work focuses on the evaluation of power supply circuit reliability using the BTTM, which involves systematic attempts to falsify and identify hypotheses on the basis of truth tables constructed from qualitative data. Reliability parameters such as the system's failure rates for the power supply case study are estimated. All possible state combinations (operating and failed states) of the major components in the circuit were listed and their effects on the overall system were studied
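
    The truth-table arithmetic is easy to reproduce for a toy block diagram. The sketch below is a generic illustration, not the paper's power supply circuit: it enumerates every operating/failed state combination, weights each row by its probability, and sums the rows in which a hypothetical structure function reports an operating system.

```python
from itertools import product

# Invented power-supply block diagram: two redundant regulators in
# parallel, in series with a rectifier. Values are component reliabilities.
R = {"rectifier": 0.995, "reg_a": 0.98, "reg_b": 0.98}

def system_works(state):
    """Structure function: rectifier AND (reg_a OR reg_b)."""
    return state["rectifier"] and (state["reg_a"] or state["reg_b"])

names = list(R)
reliability = 0.0
for bits in product([True, False], repeat=len(names)):   # full truth table
    state = dict(zip(names, bits))
    p = 1.0
    for name in names:                 # probability of this state row
        p *= R[name] if state[name] else 1.0 - R[name]
    if system_works(state):
        reliability += p               # sum over the "operating" rows

print(f"system reliability = {reliability:.6f}")
# Cross-check against the closed form: R_rect * (1 - (1-R_a)(1-R_b)).
print(0.995 * (1 - 0.02 * 0.02))
```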

  20. Methods for estimating the reliability of the RBMK fuel assemblies and elements

    International Nuclear Information System (INIS)

    Klemin, A.I.; Sitkarev, A.G.

    1985-01-01

    Applied non-parametric methods for the calculation of point and interval estimates for the basic nomenclature of reliability factors of the RBMK fuel assemblies and elements are described. As the fuel assembly and element reliability factors, the average lifetime is considered at a preset operating time up to unloading due to fuel burnout, as well as the average lifetime at reactor transient operation and at the steady-state fuel reloading mode of reactor operation. The formulae obtained are included in the special standardized engineering documentation

  1. Screening, sensitivity, and uncertainty for the CREAM method of Human Reliability Analysis

    International Nuclear Information System (INIS)

    Bedford, Tim; Bayley, Clare; Revie, Matthew

    2013-01-01

    This paper reports a sensitivity analysis of the Cognitive Reliability and Error Analysis Method for Human Reliability Analysis. We consider three different aspects: the difference between the outputs of the Basic and Extended methods, on the same HRA scenario; the variability in outputs through the choices made for common performance conditions (CPCs); and the variability in outputs through the assignment of choices for cognitive function failures (CFFs). We discuss the problem of interpreting categories when applying the method, compare its quantitative structure to that of first generation methods and discuss also how dependence is modelled with the approach. We show that the control mode intervals used in the Basic method are too narrow to be consistent with the Extended method. This motivates a new screening method that gives improved accuracy with respect to the Basic method, in the sense that it (on average) halves the uncertainty associated with the Basic method. We make some observations on the design of a screening method that are generally applicable in Risk Analysis. Finally, we propose a new method of combining CPC weights with nominal probabilities so that the calculated probabilities are always in range (i.e. between 0 and 1), while satisfying sensible properties that are consistent with the overall CREAM method

  2. A method of bias correction for maximal reliability with dichotomous measures.

    Science.gov (United States)

    Penev, Spiridon; Raykov, Tenko

    2010-02-01

    This paper is concerned with the reliability of weighted combinations of a given set of dichotomous measures. Maximal reliability for such measures has been discussed in the past, but the pertinent estimator exhibits a considerable bias and mean squared error for moderate sample sizes. We examine this bias, propose a procedure for bias correction, and develop a more accurate asymptotic confidence interval for the resulting estimator. In most empirically relevant cases, the bias correction and mean squared error correction can be performed simultaneously. We propose an approximate (asymptotic) confidence interval for the maximal reliability coefficient, discuss the implementation of this estimator, and investigate the mean squared error of the associated asymptotic approximation. We illustrate the proposed methods using a numerical example.

  3. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    Science.gov (United States)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

    Standalone photovoltaic (SAPV) systems are seen as a promising method of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple, reliable and exhibits good performance over its lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at the different array sizes (areas), performance curves are obtained for optimal design of the SAPV system with a high degree of reliability in terms of autonomy at a specified value of loss of load probability (LOLP). Based on the array to load ratio (ALR) and levelized energy cost (LEC) through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.
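
    The LOLP figure that drives such sizing can be estimated with a simple daily energy balance. The sketch below is illustrative: the insolation statistics, load, panel efficiency and storage size are all invented, and the battery model ignores depth-of-discharge limits and conversion losses.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented daily insolation [kWh/m^2/day] and load [kWh/day]; a real design
# would use measured monthly averages for the site.
days = 365
insolation = np.clip(rng.normal(5.0, 1.5, days), 0.5, None)
load = np.full(days, 10.0)

def lolp(area_m2, batt_kwh, eff=0.12):
    """Fraction of days the load cannot be fully served (loss of load)."""
    soc, shortfall_days = batt_kwh, 0
    for gen, demand in zip(insolation * area_m2 * eff, load):
        soc = min(soc + gen - demand, batt_kwh)   # capacity-limited balance
        if soc < 0.0:                             # load not met today
            shortfall_days += 1
            soc = 0.0
    return shortfall_days / days

for area in (12, 16, 20, 24):
    print(f"array {area:2d} m^2, 20 kWh storage -> LOLP = {lolp(area, 20.0):.3f}")
```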

  4. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    It has been observed and shown that in some examples a sensitivity analysis of the first order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical application the behaviour is probably rather infrequent. A simple example is shown as illustration and to exemplify that for second order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.

  5. A fast approximation method for reliability analysis of cold-standby systems

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Amari, Suprasad V.

    2012-01-01

    Analyzing reliability of large cold-standby systems has been a complicated and time-consuming task, especially for systems with components having non-exponential time-to-failure distributions. In this paper, an approximation model, which is based on the central limit theorem, is presented for the reliability analysis of binary cold-standby systems. The proposed model can estimate the reliability of large cold-standby systems with binary-state components having arbitrary time-to-failure distributions in an efficient and easy way. The accuracy and efficiency of the proposed method are illustrated using several different types of distributions for both 1-out-of-n and k-out-of-n cold-standby systems.
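
    For a 1-out-of-n cold-standby system the central-limit idea in this record reduces to one line of statistics: the system lifetime is the sum of the n component lifetimes, so R(t) = P(T1 + ... + Tn > t) ~ 1 - Φ((t - nμ)/(σ√n)). A hedged sketch with invented Weibull components, checked against Monte Carlo:

```python
import math

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# 1-out-of-n cold standby: the system survives to t iff T1 + ... + Tn > t.
# Weibull lifetimes are chosen as an arbitrary non-exponential example.
n, shape, scale, t = 10, 1.8, 120.0, 1000.0

# Central-limit approximation from the component mean and variance.
mean = scale * math.gamma(1.0 + 1.0 / shape)
var = scale**2 * (math.gamma(1.0 + 2.0 / shape)
                  - math.gamma(1.0 + 1.0 / shape)**2)
r_clt = 1.0 - stats.norm.cdf(t, loc=n * mean, scale=np.sqrt(n * var))

# Monte Carlo reference.
lifetimes = scale * rng.weibull(shape, size=(200_000, n))
r_mc = np.mean(lifetimes.sum(axis=1) > t)

print(f"CLT approximation R(t): {r_clt:.4f}   Monte Carlo: {r_mc:.4f}")
```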

  6. A dynamic discretization method for reliability inference in Dynamic Bayesian Networks

    International Nuclear Information System (INIS)

    Zhu, Jiandao; Collette, Matthew

    2015-01-01

    The material and modeling parameters that drive structural reliability analysis for marine structures are subject to a significant uncertainty. This is especially true when time-dependent degradation mechanisms such as structural fatigue cracking are considered. Through inspection and monitoring, information such as crack location and size can be obtained to improve these parameters and the corresponding reliability estimates. Dynamic Bayesian Networks (DBNs) are a powerful and flexible tool to model dynamic system behavior and update reliability and uncertainty analysis with life cycle data for problems such as fatigue cracking. However, a central challenge in using DBNs is the need to discretize certain types of continuous random variables to perform network inference while still accurately tracking low-probability failure events. Most existing discretization methods focus on getting the overall shape of the distribution correct, with less emphasis on the tail region. Therefore, a novel scheme is presented specifically to estimate the likelihood of low-probability failure events. The scheme is an iterative algorithm which dynamically partitions the discretization intervals at each iteration. Through applications to two stochastic crack-growth example problems, the algorithm is shown to be robust and accurate. Comparisons are presented between the proposed approach and existing methods for the discretization problem. - Highlights: • A dynamic discretization method is developed for low-probability events in DBNs. • The method is compared to existing approaches on two crack growth problems. • The method is shown to improve on existing methods for low-probability events

  7. A Reliable Method to Measure Lip Height Using Photogrammetry in Unilateral Cleft Lip Patients.

    Science.gov (United States)

    van der Zeeuw, Frederique; Murabit, Amera; Volcano, Johnny; Torensma, Bart; Patel, Brijesh; Hay, Norman; Thorburn, Guy; Morris, Paul; Sommerlad, Brian; Gnarra, Maria; van der Horst, Chantal; Kangesu, Loshan

    2015-09-01

    There is still no reliable tool to determine the outcome of the repaired unilateral cleft lip (UCL). The aim of this study was therefore to develop an accurate, reliable tool to measure vertical lip height from photographs. The authors measured the vertical height of the cutaneous and vermilion parts of the lip in 72 anterior-posterior view photographs of 17 patients with repairs to a UCL. Points on the lip's white roll and vermilion were marked on both the cleft and the noncleft sides on each image. Two new concepts were tested. First, photographs were standardized using the horizontal (medial to lateral) eye fissure width (EFW) for calibration. Second, the authors tested the interpupillary line (IPL) and the alar base line (ABL) for their reliability as horizontal lines of reference. Measurements were taken by 2 independent researchers, at 2 different time points each. Overall, 2304 data points were obtained and analyzed. Results showed that the method was very effective in comparing the height of the lip on the cleft side with that on the noncleft side. When using the IPL, inter- and intra-rater reliability was 0.99 to 1.0; with the ABL it varied from 0.91 to 0.99, with one exception at 0.84. The IPL was easier to define because in some subjects the overhanging nasal tip obscured the alar base, and it gave more consistent measurements, possibly because the reconstructed alar base was sometimes indistinct. However, measurements from the IPL can only give the percentage difference between the left and right sides of the lip, whereas those from the ABL can also give exact measurements. Patient examples were given that show how the measurements correlate with clinical assessment. The authors propose this method of photogrammetry, with the innovative use of the IPL as a reliable horizontal plane and the use of the EFW for calibration, as a useful and reliable tool to assess the outcome of UCL repair.

  8. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information to assess impacts of climate change at regional and global scales. However, statistical downscaling methods have been applied to prepare climate model data for various applications such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data mainly depend on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data to regional modelling systems. However, inconsistencies in these climate products (for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape) have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis) and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products corresponding to the elevations of stations discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system

  9. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen a very rapid success in developing numerical methods based on explicit control over approximation errors. It may be said that nowadays a new direction is forming in numerical analysis, the main goal of which is to develop methods of reliable computations. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: solver and checker. In this book, we are chie

  10. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    Full Text Available The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of the authors' damage detection method is made, with focus on damage localization. It consists in comparing a histogram derived from measurement results with a large series of histograms, namely the damage location indexes for all locations along the beam, obtained by calculus. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found the Minkowski distance to be the method providing the best results. It was tested for numerous locations, using real measurement results as well as results artificially degraded by noise, proving its reliability.
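
    Each of the three estimators compared in the record is only a few lines of code. The histograms below are invented stand-ins for damage location index series; in the method described, the candidate location whose calculated histogram is least dissimilar to the measured one would be reported as the damage site.

```python
import numpy as np

def minkowski(h1, h2, p=1):
    """Minkowski-form distance between histograms (p=1: city block)."""
    return np.sum(np.abs(h1 - h2)**p)**(1.0 / p)

def kl_divergence(h1, h2, eps=1e-12):
    """Kullback-Leibler divergence between normalized histograms."""
    p1 = h1 / h1.sum() + eps
    p2 = h2 / h2.sum() + eps
    return np.sum(p1 * np.log(p1 / p2))

def intersection(h1, h2):
    """Histogram intersection similarity (1 = identical shapes)."""
    return np.sum(np.minimum(h1, h2)) / min(h1.sum(), h2.sum())

# Hypothetical histograms: measured response vs. one candidate location.
measured = np.array([0.1, 0.8, 2.1, 3.0, 2.2, 0.9, 0.2])
candidate = np.array([0.2, 0.9, 2.0, 2.8, 2.3, 1.0, 0.1])

print("Minkowski (p=1):", minkowski(measured, candidate))
print("KL divergence  :", kl_divergence(measured, candidate))
print("Intersection   :", intersection(measured, candidate))
```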

  11. A rapid reliability estimation method for directed acyclic lifeline networks with statistically dependent components

    International Nuclear Information System (INIS)

    Kang, Won-Hee; Kliese, Alyce

    2014-01-01

    Lifeline networks, such as transportation, water supply, sewers, telecommunications, and electrical and gas networks, are essential elements for the economic and societal functions of urban areas, but their components are highly susceptible to natural or man-made hazards. In this context, it is essential to provide effective pre-disaster hazard mitigation strategies and prompt post-disaster risk management efforts based on rapid system reliability assessment. This paper proposes a rapid reliability estimation method for node-pair connectivity analysis of lifeline networks especially when the network components are statistically correlated. Recursive procedures are proposed to compound all network nodes until they become a single super node representing the connectivity between the origin and destination nodes. The proposed method is applied to numerical network examples and benchmark interconnected power and water networks in Memphis, Shelby County. The connectivity analysis results show the proposed method's reasonable accuracy and remarkable efficiency as compared to the Monte Carlo simulations

  12. A Sequential Kriging reliability analysis method with characteristics of adaptive sampling regions and parallelizability

    International Nuclear Information System (INIS)

    Wen, Zhixun; Pei, Haiqing; Liu, Hai; Yue, Zhufeng

    2016-01-01

    The sequential Kriging reliability analysis (SKRA) method has been developed in recent years for nonlinear implicit response functions which are expensive to evaluate. This type of method includes EGRA: the efficient reliability analysis method, and AK-MCS: the active learning reliability method combining Kriging model and Monte Carlo simulation. The purpose of this paper is to improve SKRA by adaptive sampling regions and parallelizability. The adaptive sampling regions strategy is proposed to avoid selecting samples in regions where the probability density is so low that the accuracy of these regions has negligible effects on the results. The size of the sampling regions is adapted according to the failure probability calculated by last iteration. Two parallel strategies are introduced and compared, aimed at selecting multiple sample points at a time. The improvement is verified through several troublesome examples. - Highlights: • The ISKRA method improves the efficiency of SKRA. • Adaptive sampling regions strategy reduces the number of needed samples. • The two parallel strategies reduce the number of needed iterations. • The accuracy of the optimal value impacts the number of samples significantly.

  13. Reliability of an experimental method to analyse the impact point on a golf ball during putting.

    Science.gov (United States)

    Richardson, Ashley K; Mitchell, Andrew C S; Hughes, Gerwyn

    2015-06-01

    This study aimed to examine the reliability of an experimental method identifying the location of the impact point on a golf ball during putting. Forty trials were completed using a mechanical putting robot set to reproduce a putt of 3.2 m, with four different putter-ball combinations. After locating the centre of the dimple pattern (centroid) the following variables were tested; distance of the impact point from the centroid, angle of the impact point from the centroid and distance of the impact point from the centroid derived from the X, Y coordinates. Good to excellent reliability was demonstrated in all impact variables reflected in very strong relative (ICC = 0.98-1.00) and absolute reliability (SEM% = 0.9-4.3%). The highest SEM% observed was 7% for the angle of the impact point from the centroid. In conclusion, the experimental method was shown to be reliable at locating the centroid location of a golf ball, therefore allowing for the identification of the point of impact with the putter head and is suitable for use in subsequent studies.
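
    The relative and absolute reliability statistics quoted in this record (ICC and SEM%) can be reproduced with a short function. The sketch assumes the common two-way random effects, absolute agreement, single-measurement form ICC(2,1) of Shrout and Fleiss, plus invented rating data; the study's exact ICC model is not stated in the record.

```python
import numpy as np

def icc_2_1(Y):
    """Two-way random effects, absolute agreement, single rater: ICC(2,1)."""
    n, k = Y.shape                      # n trials (subjects) x k raters
    grand = Y.mean()
    ms_rows = k * np.sum((Y.mean(axis=1) - grand)**2) / (n - 1)
    ms_cols = n * np.sum((Y.mean(axis=0) - grand)**2) / (k - 1)
    resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0) + grand
    ms_err = np.sum(resid**2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical impact-point distances [mm] scored by 2 raters on 8 trials.
Y = np.array([[3.1, 3.2], [4.0, 4.1], [2.6, 2.5], [5.2, 5.3],
              [3.8, 3.7], [4.4, 4.6], [2.9, 2.9], [5.0, 4.9]])

icc = icc_2_1(Y)
sem = Y.std(ddof=1) * np.sqrt(1.0 - icc)   # standard error of measurement
print(f"ICC(2,1) = {icc:.3f},  SEM% = {100 * sem / Y.mean():.1f}%")
```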

  14. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCh) and the Scientific Working Group of Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright © Bentham Science Publishers.

  15. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well-known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.

  16. Reliability of fitness tests using methods and time periods common in sport and occupational management.

    Science.gov (United States)

    Burnstein, Bryan D; Steele, Russell J; Shrier, Ian

    2011-01-01

    Fitness testing is used frequently in many areas of physical activity, but the reliability of these measurements under real-world, practical conditions is unknown. To evaluate the reliability of specific fitness tests using the methods and time periods used in the context of real-world sport and occupational management. Cohort study. Eighteen different Cirque du Soleil shows. Cirque du Soleil physical performers who completed 4 consecutive tests (6-month intervals) and were free of injury or illness at each session (n = 238 of 701 physical performers). Performers completed 6 fitness tests on each assessment date: dynamic balance, Harvard step test, handgrip, vertical jump, pull-ups, and 60-second jump test. We calculated the intraclass coefficient (ICC) and limits of agreement between baseline and each time point and the ICC over all 4 time points combined. Reliability was acceptable (ICC > 0.6) over an 18-month time period for all pairwise comparisons and all time points together for the handgrip, vertical jump, and pull-up assessments. The Harvard step test and 60-second jump test had poor reliability (ICC < 0.6) between baseline and other time points. When we excluded the baseline data and calculated the ICC for 6-month, 12-month, and 18-month time points, both the Harvard step test and 60-second jump test demonstrated acceptable reliability. Dynamic balance was unreliable in all contexts. Limit-of-agreement analysis demonstrated considerable intraindividual variability for some tests and a learning effect by administrators on others. Five of the 6 tests in this battery had acceptable reliability over an 18-month time frame, but the values for certain individuals may vary considerably from time to time for some tests. Specific tests may require a learning period for administrators.

  17. Network reliability analysis of complex systems using a non-simulation-based method

    International Nuclear Information System (INIS)

    Kim, Youngsuk; Kang, Won-Hee

    2013-01-01

    Civil infrastructures such as transportation, water supply, sewers, telecommunications, and electrical and gas networks often establish highly complex networks, due to their multiple source and distribution nodes, complex topology, and functional interdependence between network components. To understand the reliability of such complex network system under catastrophic events such as earthquakes and to provide proper emergency management actions under such situation, efficient and accurate reliability analysis methods are necessary. In this paper, a non-simulation-based network reliability analysis method is developed based on the Recursive Decomposition Algorithm (RDA) for risk assessment of generic networks whose operation is defined by the connections of multiple initial and terminal node pairs. The proposed method has two separate decomposition processes for two logical functions, intersection and union, and combinations of these processes are used for the decomposition of any general system event with multiple node pairs. The proposed method is illustrated through numerical network examples with a variety of system definitions, and is applied to a benchmark gas transmission pipe network in Memphis TN to estimate the seismic performance and functional degradation of the network under a set of earthquake scenarios.

  18. Coupling finite elements and reliability methods - application to safety evaluation of pressurized water reactor vessels

    International Nuclear Information System (INIS)

    Pitner, P.; Venturini, V.

    1995-02-01

    When reliability studies are extended from deterministic calculations in mechanics, it is necessary to take into account the variability of input parameters, which is linked to the different sources of uncertainty. Integrals must then be calculated to evaluate the failure risk. This can be performed either by simulation methods or by approximation methods (FORM/SORM). Models in mechanics often require running calculation codes, which must then be coupled with the reliability calculations. These codes can involve large calculation times when they are invoked numerous times during simulation sequences or in complex iterative procedures. The response surface method gives an approximation of the real response from a reduced number of points at which the finite element code is run. Thus, when it is combined with FORM/SORM methods, a coupling can be carried out which gives results in a reasonable calculation time. An application of the response surface method to mechanics-reliability coupling for a mechanical model that calls a finite element code is presented. It corresponds to a probabilistic fracture mechanics study of a pressurized water reactor vessel. (authors). 5 refs., 3 figs
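
    The coupling pattern described here, namely run the expensive code at a handful of design points, fit a cheap surface, and do the probabilistic work on the surface, can be sketched as follows. The "finite element" model is replaced by an invented closed-form function, and plain Monte Carlo stands in for the FORM/SORM step on the surrogate.

```python
import numpy as np

rng = np.random.default_rng(5)

def expensive_model(x):
    """Stand-in for a finite element run (invented closed form)."""
    return 8.0 - x[:, 0]**2 - 0.5 * x[:, 1]

# 1. Small design of experiments: the only points where the "FE code" runs.
X_doe = rng.normal(size=(30, 2))
y_doe = expensive_model(X_doe)

# 2. Fit a quadratic response surface by least squares.
def basis(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2,
                            x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(basis(X_doe), y_doe, rcond=None)
surrogate = lambda X: basis(X) @ coef

# 3. Reliability analysis on the cheap surrogate (plain Monte Carlo here;
#    FORM/SORM could be run on it just as well).
X_mc = rng.normal(size=(1_000_000, 2))
pf = np.mean(surrogate(X_mc) <= 0.0)
print(f"failure probability on the response surface: {pf:.2e}")
```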

  19. A fracture mechanics and reliability based method to assess non-destructive testings for pressure vessels

    International Nuclear Information System (INIS)

    Kitagawa, Hideo; Hisada, Toshiaki

    1979-01-01

    Quantitative evaluation has not been made of the effects of carrying out preservice and in-service nondestructive tests for securing the soundness, safety and maintainability of pressure vessels, despite the large expense and labor they involve. In particular, the problems concerning the timing and interval of in-service inspections lack a reasonable, quantitative evaluation method. In this paper, these problems for pressure vessels are treated with an analysis method based on reliability technology and probability theory. The growth of surface cracks in pressure vessels was estimated using the results of previous studies. The effects of nondestructive inspection on the defects in pressure vessels were evaluated, and the influences of many factors, such as plate thickness, stress and the accuracy of inspection, on the effects of inspection, as well as the method of evaluating inspections at unequal intervals, were investigated. The analysis of reliability taking in-service inspection into consideration, the evaluation of in-service inspection and other affecting factors through typical examples of analysis, and a review concerning the timing of inspection are described. The method of analyzing the reliability of pressure vessels, considering the growth of defects and preservice and in-service nondestructive tests, was systematized so as to be practically usable. (Kako, I.)

  20. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    International Nuclear Information System (INIS)

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-01-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or "discount" methods. The current economic slowdown has necessitated innovative methods that are results-driven and cost-effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI

  1. Comparison of sample preparation methods for reliable plutonium and neptunium urinalysis using automatic extraction chromatography

    DEFF Research Database (Denmark)

    Qiao, Jixin; Xu, Yihong; Hou, Xiaolin

    2014-01-01

    This paper describes improvement and comparison of analytical methods for simultaneous determination of trace-level plutonium and neptunium in urine samples by inductively coupled plasma mass spectrometry (ICP-MS). Four sample pre-concentration techniques, including calcium phosphate, iron......), it endows urinalysis methods with better reliability and repeatability compared with co-precipitation techniques. In view of the applicability of different pre-concentration techniques proposed previously in the literature, the main challenge behind relevant method development is pointed to be the release...

  2. Reliability of different methods used for forming of working samples in the laboratory for seed testing

    Directory of Open Access Journals (Sweden)

    Opra Branislava

    2000-01-01

    Full Text Available The testing of seed quality starts from the moment a sample is formed in a warehouse during processing or packaging of the seed. Seed sampling as the process of obtaining the working sample also encompasses each step undertaken during its testing in the laboratory. For appropriate forming of a seed sample in the laboratory, the use of a seed divider is prescribed for large-seeded species, such as seed the size of wheat or larger (ISTA Rules, 1999). The aim of this paper was the comparison of different methods used for obtaining working samples of maize and wheat seed using conical, soil and centrifugal dividers. The number of seeds of added admixtures confirmed the reliability of working sample formation. To each maize sample (1000 g), 10 seeds of each of the following admixtures were added: Zea mays L. (red pericarp), Hordeum vulgare L., Triticum aestivum L., and Glycine max (L.) Merr. Two methods were used for the formation of the maize seed working sample. To each wheat sample (1000 g), 10 seeds of each of the following species were added: Avena sativa (hulled seeds), Hordeum vulgare L., Galium tricorne Stokes, and Polygonum lapathifolium L. For the formation of wheat seed working samples four methods were used. An optimum of 9, but not fewer than 7, seeds of admixture were expected to be found in the maize seed working sample, while for wheat at least one seed of admixture was expected to be found in the working sample. The obtained results confirmed that the formation of the maize seed working samples was most reliable when the centrifugal divider, the first method, was used (average of admixture: 9.37). Of the observed admixtures, the seed of Triticum aestivum L. was the most uniformly distributed, the first method also being used (6.93). The second method also gives high average values satisfying the given criterion, but it should be used with previous homogenization of the sample being tested. The forming of wheat seed working samples is the most reliable if the

  3. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far to analyze diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules

  4. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of neck and laryngeal muscles in muscle tension dysphonia (MTD). To review the available laryngeal palpation methods used in patients with MTD for assessment, diagnosis, or documentation of treatment outcomes. A systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and the Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. There were five main as well as miscellaneous palpation methods, which differed according to the target anatomical structures, the judgment or grading system, and the tasks used. There were only a few scales available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate tension in both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence for muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  5. DEPEND-HRA-A method for consideration of dependency in human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, Marko

    2008-01-01

    Consideration of dependencies between human actions is an important issue within human reliability analysis. A method was developed which integrates the features of existing methods and experience from a full-scope plant simulator. The method is used in a real plant-specific human reliability analysis as a part of the probabilistic safety assessment of a nuclear power plant. The method distinguishes dependency for pre-initiator events from dependency for initiator and post-initiator events. It identifies dependencies based on scenarios, where consecutive human actions are modeled, and based on a list of minimal cut sets, which is obtained by running the minimal cut set analysis with high values of human error probabilities in the evaluation. A large example study, which consisted of a large number of human failure events, demonstrated the applicability of the method. Comparative analyses show that both the selection of the dependency method and the selection of dependency levels within the method largely impact the results of the probabilistic safety assessment. Even if the core damage frequency is not impacted much, the listings of important basic events in terms of risk increase and risk decrease factors may change considerably. More effort is needed on the subject to prepare the background for more detailed guidelines, which would remove subjectivity from the evaluations as much as possible
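
    The record does not spell out DEPEND-HRA's own dependency equations, but methods in this family typically build on the classical THERP dependence levels, whose arithmetic is shown below. Treat this as background on the technique, not as the paper's method.

```python
# THERP-style conditional human error probabilities for the five classical
# dependence levels (zero, low, moderate, high, complete dependence).
def conditional_hep(p, level):
    """Probability of failing task N given failure of task N-1."""
    formulas = {
        "ZD": lambda p: p,                   # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,   # low
        "MD": lambda p: (1 + 6 * p) / 7,     # moderate
        "HD": lambda p: (1 + p) / 2,         # high
        "CD": lambda p: 1.0,                 # complete dependence
    }
    return formulas[level](p)

p = 1e-3  # nominal HEP of the second action in a sequence (illustrative)
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(f"{level}: conditional HEP = {conditional_hep(p, level):.3e}")
```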

  6. An overview of the China Bioanalytical Forum: interview with Daniel Tang.

    Science.gov (United States)

    Tang, Daniel

    2017-02-01

    Daniel Tang talks to Sankeetha Nadarajah, Commissioning Editor (Bioanalysis), regarding the China Bioanalysis Forum (CBF), of which Daniel was a co-founder and remains co-chair. Daniel is currently the CEO of UP Pharma, a biologics-focused bioanalytical CRO in China.

  7. pH adjustment of human blood plasma prior to bioanalytical sample preparation

    NARCIS (Netherlands)

    Hendriks, G.; Uges, D. R. A.; Franke, J. P.

    2008-01-01

    pH adjustment in bioanalytical sample preparation concerning ionisable compounds is one of the most common sample treatments. This is often done by mixing an aliquot of the sample with a proper buffer adjusted to the proposed pH. The pH of the resulting mixture, however, does not necessarily have to be the same as the pH of the buffer used.

  8. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    OBJECTIVES: To test the impact of the method of administration (MOA) on score level, reliability, and validity of scales developed in the Patient Reported Outcomes Measurement Information System (PROMIS). STUDY DESIGN AND SETTING: Two nonoverlapping parallel forms, each containing eight items, were administered: one form by paper questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC) and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICCs), and convergent/discriminant validity. RESULTS: In difference score analyses, no significant mode differences were found and all confidence intervals were within the prespecified minimal important difference of 0.2 standard deviation. Parallel-forms reliabilities were very high (ICC = 0.85-0.93). Only one across-mode ICC was significantly lower than the same-mode ICC. Tests of validity...

  9. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  10. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    International Nuclear Information System (INIS)

    Lee, Seokje; Kim, Ingul; Jang, Moonho; Kim, Jaeki; Moon, Jungwon

    2013-01-01

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well known; specifically, it is known that ply is less reliable than metallic materials and very sensitive to the loading direction. Therefore, it is important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method is identified, and a probabilistic sensitivity analysis is conducted. As a result, we can prove the applicability of the advanced design method for the stabilizer of an underwater vehicle

  11. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seokje; Kim, Ingul [Chungnam National Univ., Daejeon (Korea, Republic of); Jang, Moonho; Kim, Jaeki; Moon, Jungwon [LIG Nex1, Yongin (Korea, Republic of)

    2013-04-15

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well known; specifically, it is known that ply is less reliable than metallic materials and very sensitive to the loading direction. Therefore, it is important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method is identified, and a probabilistic sensitivity analysis is conducted. As a result, we can prove the applicability of the advanced design method for the stabilizer of an underwater vehicle.

  12. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliability of information in consolidated financial statements of Russian companies. Recommendations are made on their adaptation and application in commercial organizations. Beneish model indicators are implemented in one of the world's largest vertically integrated steel and mining companies. It is proposed that audit firms use these methods of assessing the reliability of information in the practical application of ISA.

  13. Use of simulation methods in the evaluation of reliability and availability of complex system

    International Nuclear Information System (INIS)

    Maigret, N.; Duchemin, B.; Robert, T.; Villeneuve, J.J. de; Lanore, J.M.

    1982-04-01

    After a short review of the available standard methods in the reliability field, such as Boolean algebra for fault trees and semi-regeneration theory for Markov processes, this paper shows how the BIAF code, based on a state description of a system and a simulation technique, can solve many problems. It also shows how the use of importance sampling and biasing techniques allows us to deal with the rare event problem

  14. A new method for evaluating the availability, reliability, and maintainability whatever may be the probability law

    International Nuclear Information System (INIS)

    Doyon, L.R.; CEA Centre d'Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette

    1975-01-01

    A simple method is presented for computer solution of any system model (availability, reliability, and maintenance) with intervals between failures and repair durations distributed according to any probability law, and for any maintenance policy. A matrix equation is obtained using Markov diagrams. An example is given with the solution by the APAFS program (Algorithme Pour l'Analyse de la Fiabilite des Systemes) [fr
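
    The matrix formulation the record alludes to is easiest to see in the classical exponential special case: assemble a generator matrix from the transition rates and solve πQ = 0 with Σπ = 1. The three-state system and its rates below are invented; the paper's contribution is precisely that it extends beyond exponential laws, which this sketch does not attempt.

```python
import numpy as np

# Generator matrix Q for a 3-state repairable system (invented rates, 1/h):
# state 0 = up, 1 = degraded, 2 = down. Each row of Q sums to zero.
lam1, lam2, mu1, mu2 = 1e-3, 5e-3, 5e-2, 2e-2
Q = np.array([[-lam1,          lam1,  0.0 ],
              [  mu1, -(mu1 + lam2),  lam2],
              [  0.0,           mu2,  -mu2]])

# Steady-state probabilities: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("state probabilities:", np.round(pi, 5))
print(f"availability (up or degraded): {pi[0] + pi[1]:.5f}")
```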

  15. Reliability design of a critical facility: An application of PRA methods

    International Nuclear Information System (INIS)

    Souza Vieira Neto, A.; Souza Borges, W. de

    1987-01-01

    Although general agreement concerning the enforcement of reliability (probabilistic) design criteria for nuclear utilities is yet to be achieved, PRA methodology can still be used successfully as a project design and review tool aimed at improving a system's prospective performance or minimizing expected accident consequences. In this paper, the potential of such an application of PRA methods is examined in the special case of a critical design project currently being developed in Brazil. (orig.)

  16. Data collection on the unit control room simulator as a method of operator reliability analysis

    International Nuclear Information System (INIS)

    Holy, J.

    1998-01-01

    The report consists of the following chapters: (1) Probabilistic assessment of nuclear power plant operation safety and human factor reliability analysis; (2) Simulators and simulations as human reliability analysis tools; (3) DOE project for using the collection and analysis of data from the unit control room simulator in human factor reliability analysis at the Paks nuclear power plant; (4) General requirements for the organization of the simulator data collection project; (5) Full-scale simulator at the Nuclear Power Plants Research Institute in Trnava, Slovakia, used as a training means for operators of the Dukovany NPP; (6) Assessment of the feasibility of quantification of important human actions modelled within a PSA study by employing simulator data analysis; (7) Assessment of the feasibility of using the various exercise topics for the quantification of the PSA model; (8) Assessment of the feasibility of employing the simulator in the analysis of the individual factors affecting the operator's activity; and (9) Examples of application of statistical methods in the analysis of the human reliability factor. (P.A.)

  17. System principles, mathematical models and methods to ensure high reliability of safety systems

    Science.gov (United States)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of various components designed for detection, localization, tracking, collecting, and processing of information from monitoring, telemetry, and control systems, among others. They are required to be highly reliable in order to correctly perform data aggregation, processing and analysis for subsequent decision-making support. In the design and construction phases of the manufacturing of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Various types of components perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of task performance and eliminates common cause failures. We consider the type-variety principle as an engineering principle of system analysis, together with mathematical models based on this principle and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used for solving problems of optimal redundancy on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.

  18. Decreasing inventory of a cement factory roller mill parts using reliability centered maintenance method

    Science.gov (United States)

    Witantyo; Rindiyah, Anita

    2018-03-01

    According to data from maintenance planning and control, the highest inventory value lies in non-routine components. Maintenance components are components that are procured based on maintenance activities. The problem arises because there is no synchronization between maintenance activities and the components they require. The Reliability Centered Maintenance method is used to overcome this problem by reevaluating the components required by maintenance activities. The case chosen is the roller mill system because it has the highest unscheduled downtime record. The components required for each maintenance activity are determined by its failure distribution, so the number of components needed can be predicted. Moreover, those components are reclassified from non-routine components to routine components, so that procurement can be carried out regularly. Based on the analysis conducted, the failures behind almost every maintenance task are classified into scheduled on-condition tasks, scheduled discard tasks, scheduled restoration tasks and no scheduled maintenance. Of the 87 components used in maintenance activities, 19 were reclassified from non-routine components to routine components. The reliability of, and the need for, those components were then calculated for a one-year operation period. Based on these findings, it is suggested to change all of the components in the overhaul activity to increase the reliability of the roller mill system. In addition, the inventory system should follow the maintenance schedule and the number of components required by maintenance activities, so that procurement value decreases and system reliability increases.

  19. Reliability of Lyapunov characteristic exponents computed by the two-particle method

    Science.gov (United States)

    Mei, Lijie; Huang, Li

    2018-03-01

    For highly complex problems, such as the post-Newtonian formulation of compact binaries, the two-particle method may be a better, or even the only, choice to compute the Lyapunov characteristic exponent (LCE). This method avoids the complex calculation of variational equations required by the variational method. However, the two-particle method sometimes provides spurious estimates of LCEs. In this paper, we first analyze the equivalence of the definition of the LCE between the variational and two-particle methods for Hamiltonian systems. Then, we develop a criterion to determine the reliability of LCEs computed by the two-particle method by considering the magnitude of the initial tangent (or separation) vector ξ0 (or δ0), the renormalization time interval τ, the machine precision ε, and the global truncation error ɛT. The reliable Lyapunov characteristic indicators estimated by the two-particle method form a V-shaped region, which is restricted by δ0, ε, and ɛT. Finally, numerical experiments with the Hénon-Heiles system, the spinning compact binaries, and the post-Newtonian circular restricted three-body problem strongly support the theoretical results.
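
    The two-particle recipe itself is short: integrate a reference and a nearby "shadow" trajectory, accumulate ln(d/d0) at every renormalization interval, and pull the shadow back to the initial separation. The sketch below applies it to the Hénon-Heiles system; the initial condition is merely illustrative, and d0 is chosen comfortably above machine precision in the spirit of the record's criterion, while τ and the iteration count are arbitrary choices.

```python
import numpy as np
from scipy.integrate import solve_ivp

def henon_heiles(t, s):
    """Equations of motion for H = (px^2+py^2)/2 + (x^2+y^2)/2 + x^2*y - y^3/3."""
    x, y, px, py = s
    return [px, py, -x - 2.0 * x * y, -y - x * x + y * y]

def step(state, tau):
    sol = solve_ivp(henon_heiles, (0.0, tau), state, rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

d0, tau, n_iter = 1e-8, 1.0, 2000   # d0 well above machine precision
ref = np.array([0.0, 0.3, 0.35, 0.0])          # illustrative initial condition
shadow = ref + np.array([d0, 0.0, 0.0, 0.0])   # the second "particle"

log_sum = 0.0
for _ in range(n_iter):
    ref, shadow = step(ref, tau), step(shadow, tau)
    delta = shadow - ref
    d = np.linalg.norm(delta)
    log_sum += np.log(d / d0)
    shadow = ref + delta * (d0 / d)   # renormalize back to separation d0

print(f"two-particle LCE estimate: {log_sum / (n_iter * tau):.4f}")
```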

  20. A dynamic particle filter-support vector regression method for reliability prediction

    International Nuclear Information System (INIS)

    Wei, Zhao; Tao, Tao; ZhuoShu, Ding; Zio, Enrico

    2013-01-01

    Support vector regression (SVR) has been applied to time series prediction, and some works have demonstrated the feasibility of using it to forecast system reliability. For accurate reliability forecasting, the selection of SVR's parameters is important. Existing work on SVR parameter selection divides the example dataset into training and test subsets and tunes the parameters on the training data. However, such fixed parameters can lead to poor prediction capability if the data of the test subset differ significantly from those of training. In contrast, the novel method proposed in this paper uses particle filtering to estimate the SVR model parameters according to the whole measurement sequence up to the last observation instance. By treating the SVR training model as the observation equation of a particle filter, our method updates the SVR model parameters dynamically whenever a new observation arrives. Because the parameters adapt to the dynamic data pattern, the new PF–SVR method has superior prediction performance over standard SVR. Four application results show that PF–SVR is more robust than SVR to a decrease in the number of training data and to changes of the initial SVR parameter values. Moreover, even if there are trends in the test data different from those in the training data, the method can capture the changes, correct the SVR parameters, and obtain good predictions. -- Highlights: •A dynamic PF–SVR method is proposed to predict system reliability. •The method can adjust the SVR parameters according to changes in the data. •The method is robust to the size of the training data and to the initial parameter values. •Cases based on both artificial and real data are studied. •PF–SVR shows superior prediction performance over standard SVR.
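
    A schematic Python sketch of the PF–SVR idea, under strong simplifications and with hypothetical settings (particles carry only C and gamma, Gaussian observation weights with an assumed noise scale, multiplicative jitter after resampling), might look as follows; it is meant to convey the mechanism, not to reproduce the authors' formulation.

      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)

      def one_step_pred(params, history, lag=3):
          # Fit SVR on lagged values of the series and predict the next point.
          C, gamma = params
          X = np.array([history[i:i+lag] for i in range(len(history) - lag)])
          y = history[lag:]
          model = SVR(C=C, gamma=gamma).fit(X, y)
          return model.predict(history[-lag:].reshape(1, -1))[0]

      series = np.sin(0.3*np.arange(60)) + 0.05*rng.standard_normal(60)
      particles = np.column_stack([10**rng.uniform(-1, 2, 50),    # C
                                   10**rng.uniform(-2, 1, 50)])   # gamma
      sigma = 0.1  # assumed observation-noise scale in the particle weights

      for t in range(20, len(series)):
          hist, obs = series[:t], series[t]
          errs = np.array([obs - one_step_pred(p, hist) for p in particles])
          w = np.exp(-0.5*(errs/sigma)**2); w /= w.sum()
          idx = rng.choice(len(particles), len(particles), p=w)  # resample
          particles = particles[idx] * np.exp(0.05*rng.standard_normal(particles.shape))

      print("posterior-mean parameters:", particles.mean(axis=0))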

  1. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mAs) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mAs) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  2. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of human reliability analysis (HRA), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  3. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  4. Matrix-based system reliability method and applications to bridge networks

    International Nuclear Information System (INIS)

    Kang, W.-H.; Song Junho; Gardoni, Paolo

    2008-01-01

    Using a matrix-based system reliability (MSR) method, one can estimate the probabilities of complex system events by simple matrix calculations. Unlike existing system reliability methods whose complexity depends highly on that of the system event, the MSR method describes any general system event in a simple matrix form and therefore provides a more convenient way of handling the system event and estimating its probability. Even in the case where one has incomplete information on the component probabilities and/or the statistical dependence thereof, the matrix-based framework enables us to estimate the narrowest bounds on the system failure probability by linear programming. This paper presents the MSR method and applies it to a transportation network consisting of bridge structures. The seismic failure probabilities of bridges are estimated by use of the predictive fragility curves developed by a Bayesian methodology based on experimental data and existing deterministic models of the seismic capacity and demand. Using the MSR method, the probability of disconnection between each city/county and a critical facility is estimated. The probability mass function of the number of failed bridges is computed as well. In order to quantify the relative importance of bridges, the MSR method is used to compute the conditional probabilities of bridge failures given that there is at least one city disconnected from the critical facility. The bounds on the probability of disconnection are also obtained for cases with incomplete information
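
    For independent components, the core matrix calculation can be sketched in a few lines of Python: the probability vector over all 2^n joint component states is a Kronecker product of per-component vectors, and any system event becomes a 0/1 event vector c, so P(event) = c·p. The three-bridge topology and failure probabilities below are invented for illustration; the paper's treatment of dependence and incomplete information via linear programming is not shown.

      import numpy as np
      from functools import reduce

      # Failure probabilities of three hypothetical bridges (assumed
      # independent; the full MSR framework also treats dependence).
      pf = [0.10, 0.05, 0.20]
      n = len(pf)

      # Per-component vectors [P(survive), P(fail)]; their Kronecker product
      # is the probability vector over all 2^n joint component states.
      p = reduce(np.kron, ([1.0 - q, q] for q in pf))

      def fails(state, k):
          # Bit k of the state index (component 0 = most significant bit)
          # equals 1 when that component has failed.
          return (state >> (n - 1 - k)) & 1 == 1

      # 0/1 event vector for an illustrative system event: the city is cut
      # off if bridge 0 fails, or bridges 1 and 2 both fail.
      c = np.array([1.0 if fails(s, 0) or (fails(s, 1) and fails(s, 2))
                    else 0.0 for s in range(2 ** n)])

      print("P(disconnection) =", c @ p)  # simple matrix calculation
      # Check by hand: 0.10 + 0.90 * 0.05 * 0.20 = 0.109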

  5. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  6. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  7. ImageJ: A Free, Easy, and Reliable Method to Measure Leg Ulcers Using Digital Pictures.

    Science.gov (United States)

    Aragón-Sánchez, Javier; Quintana-Marrero, Yurena; Aragón-Hernández, Cristina; Hernández-Herero, María José

    2017-12-01

    Wound measurement to document the healing course of chronic leg ulcers has an important role in the management of these patients. Digital cameras in smartphones are readily available and easy to use, and taking pictures of wounds is becoming routine in specialized departments. Analyzing digital pictures with appropriate software provides clinicians a quick, clean, and easy-to-use tool for measuring wound area. A set of 25 digital pictures of plain foot and leg ulcers was the basis of this study. Photographs were taken by placing a ruler next to the wound in parallel with the healthy skin, using the iPhone 6S (Apple Inc, Cupertino, CA), which has a 12-megapixel camera, with the flash. The digital photographs were visualized with ImageJ 1.45s freeware (National Institutes of Health, Rockville, MD; http://imagej.net/ImageJ ). Wound area measurement was carried out by 4 raters: the head of the department, a wound care nurse, a physician, and a medical student. We assessed intra- and interrater reliability using the intraclass correlation coefficient. To determine intraobserver reliability, 2 of the raters repeated the measurement of the set 1 week after the first reading. The interrater model displayed an intraclass correlation coefficient of 0.99 with a 95% confidence interval of 0.999 to 1.000, showing excellent reliability. The intrarater model of both examiners also showed excellent reliability. In conclusion, analyzing digital images of leg ulcers with ImageJ estimates wound area with excellent reliability. This method provides a free, rapid, and accurate way to measure wounds and could routinely be used to document wound healing in daily clinical practice.

  8. Reliability and Validity of 3 Methods of Assessing Orthopedic Resident Skill in Shoulder Surgery.

    Science.gov (United States)

    Bernard, Johnathan A; Dattilo, Jonathan R; Srikumaran, Uma; Zikria, Bashir A; Jain, Amit; LaPorte, Dawn M

    Traditional measures for evaluating resident surgical technical skills (e.g., case logs) assess operative volume but not level of surgical proficiency. Our goal was to compare the reliability and validity of 3 tools for measuring surgical skill among orthopedic residents when performing 3 open surgical approaches to the shoulder. A total of 23 residents at different stages of their surgical training were tested for technical skill pertaining to 3 shoulder surgical approaches using the following measures: Objective Structured Assessment of Technical Skills (OSATS) checklists, the Global Rating Scale (GRS), and a final pass/fail assessment determined by 3 upper extremity surgeons. Adverse events were recorded. The Cronbach α coefficient was used to assess reliability of the OSATS checklists and GRS scores. Interrater reliability was calculated with intraclass correlation coefficients. Correlations among OSATS checklist scores, GRS scores, and pass/fail assessment were calculated with Spearman ρ. Validity of OSATS checklists was determined using analysis of variance with postgraduate year (PGY) as a between-subjects factor. Significance was set at p < 0.05 for all 3 shoulder approaches. Checklist scores showed superior interrater reliability compared with GRS and subjective pass/fail measurements. GRS scores were positively correlated across training years. The incidence of adverse events was significantly higher among PGY-1 and PGY-2 residents compared with more experienced residents. OSATS checklists are a valid and reliable assessment of technical skills across 3 surgical shoulder approaches. However, checklist scores do not measure quality of technique. Documenting adverse events is necessary to assess quality of technique and ultimate pass/fail status. Multiple methods of assessing surgical skill should be considered when evaluating orthopedic resident surgical performance. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  9. Simple, reliable, and nondestructive method for the measurement of vacuum pressure without specialized equipment.

    Science.gov (United States)

    Yuan, Jin-Peng; Ji, Zhong-Hua; Zhao, Yan-Ting; Chang, Xue-Fang; Xiao, Lian-Tuan; Jia, Suo-Tang

    2013-09-01

    We present a simple, reliable, and nondestructive method for the measurement of vacuum pressure in a magneto-optical trap. The vacuum pressure is verified to be proportional to the collision rate constant between cold atoms and the background gas, with a coefficient k that can be calculated by means of the simple ideal gas law. The rate constant for loss due to collisions with all background gases can be derived from the total collision loss rate via a series of loading curves of cold atoms recorded under different trapping laser intensities. The presented method is also applicable to other cold atomic systems and meets the miniaturization requirement of commercial applications.
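
    The measurable part of such a method can be sketched in Python as follows: fit the trap loading curve N(t) = N_ss(1 − exp(−Γt)) to extract the total loss rate Γ, then convert the background-collision loss rate to pressure through a proportionality coefficient k. The data are synthetic and the value of k is a placeholder; the actual coefficient is system-specific and, per the paper, calculable from the ideal gas law.

      import numpy as np
      from scipy.optimize import curve_fit

      def loading(t, n_ss, gamma):
          # MOT loading toward steady state with total loss rate gamma.
          return n_ss * (1.0 - np.exp(-gamma * t))

      # Synthetic loading curve (illustrative: gamma = 0.8 s^-1).
      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 10.0, 200)
      data = loading(t, 1.0e7, 0.8) * (1.0 + 0.02 * rng.standard_normal(t.size))

      (n_ss, gamma), _ = curve_fit(loading, t, data, p0=(5.0e6, 0.5))

      # Placeholder coefficient k (s^-1 per Pa) -- system-specific, not from
      # the paper; the pressure estimate is linear in the loss rate.
      k = 2.0e7
      print(f"gamma = {gamma:.3f} 1/s, P ~ {gamma / k:.2e} Pa")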

  10. Features of an advanced human reliability analysis method, AGAPE-ET

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun

    2005-01-01

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. The method is capable of dealing with diagnosis failures and errors of commission (EOC), which are not normally treated in conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework called misdiagnosis tree analysis (MDTA) and a taxonomy of misdiagnosis causes with appropriate quantification schemes are provided. For the identification of EOC events arising from misdiagnosis, procedural guidance is given. An example application of the method is also provided.

  11. Features of an advanced human reliability analysis method, AGAPE-ET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun [Korea Atomic Energy Research Institute, Taejeon (Korea, Republic of)

    2005-11-15

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. The method is capable of dealing with diagnosis failures and errors of commission (EOC), which are not normally treated in conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework called misdiagnosis tree analysis (MDTA) and a taxonomy of misdiagnosis causes with appropriate quantification schemes are provided. For the identification of EOC events arising from misdiagnosis, procedural guidance is given. An example application of the method is also provided.

  12. Reliability and applications of statistical methods based on oligonucleotide frequencies in bacterial and archaeal genomes

    DEFF Research Database (Denmark)

    Bohlin, J; Skjerve, E; Ussery, David

    2008-01-01

    with here are mainly used to examine similarities between archaeal and bacterial DNA from different genomes. These methods compare observed genomic frequencies of fixed-size oligonucleotides with expected values, which can be determined by genomic nucleotide content, smaller oligonucleotide frequencies..., or be based on specific statistical distributions. Advantages of these statistical methods include measurement of phylogenetic relationships from relatively small pieces of DNA sampled from almost anywhere within genomes, detection of foreign/conserved DNA, and homology searches. Our aim was to explore... the reliability and best-suited applications of some popular methods, which include relative oligonucleotide frequencies (ROF), di- to hexanucleotide zeroth-order Markov methods (ZOM) and the 2nd-order Markov chain method (MCM). Tests were performed on distant homology searches with large DNA sequences, detection...

  13. Between-day reliability of a method for non-invasive estimation of muscle composition.

    Science.gov (United States)

    Simunič, Boštjan

    2012-08-01

    Tensiomyography is a method for valid and non-invasive estimation of skeletal muscle fibre type composition. The validity of selected temporal tensiomyographic measures has been well established recently; there is, however, no evidence regarding the method's between-day reliability. It is therefore the aim of this paper to establish the between-day repeatability of tensiomyographic measures in three skeletal muscles. On three consecutive days, 10 healthy male volunteers (mean±SD: age 24.6 ± 3.0 years; height 177.9 ± 3.9 cm; weight 72.4 ± 5.2 kg) were examined in a supine position. Four temporal measures (delay, contraction, sustain, and half-relaxation time) and the maximal amplitude were extracted from the displacement-time tensiomyogram. A reliability analysis was performed with calculations of bias, random error, coefficient of variation (CV), standard error of measurement, and intra-class correlation coefficient (ICC) with a 95% confidence interval. The ICC analysis demonstrated excellent agreement (ICCs were over 0.94 in 14 out of 15 tested parameters). However, a lower CV was observed in half-relaxation time, presumably because of the specifics of the parameter definition itself. These data indicate that, for the three muscles tested, tensiomyographic measurements were reproducible across consecutive test days. Furthermore, we identified the most likely origin of the lowest reliability, detected in half-relaxation time. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Cervical vertebral maturation method and mandibular growth peak: a longitudinal study of diagnostic reliability.

    Science.gov (United States)

    Perinetti, Giuseppe; Primozic, Jasmina; Sharma, Bhavna; Cioffi, Iacopo; Contardo, Luca

    2018-03-28

    The capability of the cervical vertebral maturation (CVM) method to identify the mandibular growth peak on an individual basis remains undetermined. The diagnostic reliability of the six-stage CVM method in the identification of the mandibular growth peak was therefore investigated. From the files of the Oregon and Burlington Growth Studies (data obtained between the early 1950s and mid-1970s), 50 subjects (26 females, 24 males) with at least seven annual lateral cephalograms taken from age 9 to 16 years were identified. Cervical vertebral maturation was assessed according to the CVM code staging system, and mandibular growth was defined as the annual increment in Co-Gn distance. A diagnostic reliability analysis was carried out to establish the capability of the circumpubertal CVM stages 2, 3, and 4 to identify an imminent mandibular growth peak. The durations of CVM stages 2, 3, and 4 varied across subjects. The overall diagnostic accuracy values for CVM stages 2, 3, and 4 were 0.70, 0.76, and 0.77, respectively; these low values appeared to be due to false positive cases. Limitations include secular trends in conjunction with the use of a discrete staging system, and the fact that in most of the Burlington Growth Study sample the lateral head film at age 15 was missing. None of the CVM stages 2, 3, and 4 reached satisfactory diagnostic reliability in the identification of an imminent mandibular growth peak.

  15. Machine Maintenance Scheduling with Reliability Engineering Method and Maintenance Value Stream Mapping

    Science.gov (United States)

    Sembiring, N.; Nasution, A. H.

    2018-02-01

    Corrective maintenance, i.e., replacing or repairing a machine component after the machine breaks down, is routinely performed in manufacturing companies. It requires the production process to be stopped, and production time decreases because the maintenance team must replace or repair the damaged machine component. This paper proposes a preventive maintenance schedule for a critical component of a critical machine in a crude palm oil and kernel company in order to increase maintenance efficiency. Reliability Engineering and Maintenance Value Stream Mapping are used as methods and tools to analyze the reliability of the component and to reduce waste in the process by segregating value-added and non-value-added activities.

  16. A review of the evolution of human reliability analysis methods at nuclear industry

    International Nuclear Information System (INIS)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R.

    2017-01-01

    This paper reviews the status of research on the application of human reliability analysis (HRA) methods in the nuclear industry and their evolution over the years. HRA is one of the elements used in probabilistic safety analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, such as errors of omission and errors of commission. Although HRA may be used in many areas, the focus of this paper is to review the applicability of HRA methods over the years in the nuclear industry, especially in nuclear power plants (NPPs). An electronic search of the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers published from the first generation of HRA methods until those published in March 2017. A total of 94 papers were retrieved by the initial search, and 13 were selected to be fully reviewed and used for data extraction after the application of inclusion and exclusion criteria and an evaluation of quality and suitability according to applicability in the nuclear industry. The results point out that first-generation methods are more used in practice than second-generation methods. This occurs because they concentrate on quantification, in terms of success or failure of human action, which makes them useful for the quantitative risk assessment of PSA. Although second-generation methods consider context and errors of commission in human error prediction, they are not widely used in practice in the nuclear industry for PSA. (author)

  17. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purposes of HRA implementation are (1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that minimize personnel errors, and (2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance are represented via context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs, and HRA methods have been developed to identify the possibility and mechanism of human errors. In performing an HRA, the effect of PSFs, which may increase or decrease human error, should be investigated; so far, however, this effect has been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework that estimates PSFs by using PSF profiles is introduced in this paper.

  18. A review of the evolution of human reliability analysis methods at nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R., E-mail: lecionoliveira@gmail.com, E-mail: luquetti@ien.gov.br, E-mail: paulov@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    This paper reviews the status of research on the application of human reliability analysis (HRA) methods in the nuclear industry and their evolution over the years. HRA is one of the elements used in probabilistic safety analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, such as errors of omission and errors of commission. Although HRA may be used in many areas, the focus of this paper is to review the applicability of HRA methods over the years in the nuclear industry, especially in nuclear power plants (NPPs). An electronic search of the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers published from the first generation of HRA methods until those published in March 2017. A total of 94 papers were retrieved by the initial search, and 13 were selected to be fully reviewed and used for data extraction after the application of inclusion and exclusion criteria and an evaluation of quality and suitability according to applicability in the nuclear industry. The results point out that first-generation methods are more used in practice than second-generation methods. This occurs because they concentrate on quantification, in terms of success or failure of human action, which makes them useful for the quantitative risk assessment of PSA. Although second-generation methods consider context and errors of commission in human error prediction, they are not widely used in practice in the nuclear industry for PSA. (author)

  19. METHODS OF IMPROVING THE RELIABILITY OF THE CONTROL SYSTEM TRACTION POWER SUPPLY OF ELECTRIC TRANSPORT BASED ON AN EXPERT INFORMATION

    Directory of Open Access Journals (Sweden)

    O. O. Matusevych

    2009-03-01

    Full Text Available The author proposes numerous methods for solving the multi-criterion task of increasing the reliability of a control system on the basis of expert information. The information that allows a well-founded choice of the method of increasing reliability for an electric transport control system is considered.

  20. Reliability Analysis of a Composite Wind Turbine Blade Section Using the Model Correction Factor Method: Numerical Study and Validation

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian

    2013-01-01

    by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...

  1. Critical role of bioanalytical strategies in investigation of clinical PK observations, a Phase I case study

    Science.gov (United States)

    Peng, Kun; Xu, Keyang; Liu, Luna; Hendricks, Robert; Delarosa, Reginald; Erickson, Rich; Budha, Nageshwar; Leabman, Maya; Song, An; Kaur, Surinder; Fischer, Saloumeh K

    2014-01-01

    RG7652 is a human immunoglobulin 1 (IgG1) monoclonal antibody (mAb) targeting proprotein convertase subtilisin/kexin type 9 (PCSK9) and is designed for the treatment of hypercholesterolemia. A target-binding enzyme-linked immunosorbent assay (ELISA) was developed to measure RG7652 levels in human serum in a Phase I study. Although target-binding assay formats are generally used to quantify free therapeutic, the actual therapeutic species being measured are affected by assay conditions, such as sample dilution and incubation time, and levels of soluble target in the samples. Therefore, in the presence of high concentrations of circulating target, the choice of reagents and assay conditions can have a significant effect on the observed pharmacokinetic (PK) profiles. Phase I RG7652 PK analysis using the ELISA data resulted in a nonlinear dose normalized exposure. An investigation was conducted to characterize the ELISA to determine whether the assay format and reagents may have contributed to the PK observation. In addition, to confirm the ELISA results, a second orthogonal method, liquid chromatography tandem mass spectrometry (LC-MS/MS) using a signature peptide as surrogate, was developed and implemented. A subset of PK samples, randomly selected from half of the subjects in the 6 single ascending dose (SAD) cohorts in the Phase I clinical study, was analyzed with the LC-MS/MS assay, and the data were found to be comparable to the ELISA data. This paper illustrates the importance of reagent characterization, as well as the benefits of using an orthogonal approach to eliminate bioanalytical contributions when encountering unexpected observations. PMID:25484037

  2. Reliability of the input admittance of bowed-string instruments measured by the hammer method.

    Science.gov (United States)

    Zhang, Ailin; Woodhouse, Jim

    2014-12-01

    The input admittance at the bridge, measured by hammer testing, is often regarded as the most useful and convenient measurement of the vibrational behavior of a bowed-string instrument. However, this method has been questioned, due especially to differences between human bowing and hammer impact. The goal of the research presented here is to investigate the reliability and accuracy of this classic hammer method. Experimental studies were carried out on cellos with three different driving conditions and three different boundary conditions. Results suggest that there is nothing fundamentally different about the hammer method compared to other kinds of excitation. The third series of experiments offers an opportunity to explore the difference between the input admittance measured from one bridge corner to the other and that of single strings. The classic measurement is found to give a reasonable approximation to that of all four strings. Some possible differences between the hammer method and normal bowing, and implications of the acoustical results, are also discussed.
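
    For context, a bridge admittance (mobility) is typically estimated from hammer-test records with the H1 estimator, Y(f) = S_fv(f)/S_ff(f). The Python sketch below applies this to synthetic force and velocity signals generated from a single damped mode; the signals, mode parameters, and sampling choices are stand-ins, not cello data from the study.

      import numpy as np
      from scipy.signal import bilinear, csd, lfilter, welch

      fs = 44100
      rng = np.random.default_rng(2)
      t = np.arange(0, 1.0, 1 / fs)

      # Stand-ins for measured signals: broadband hammer force, and velocity
      # obtained by filtering the force through one damped resonance.
      force = rng.standard_normal(t.size)
      f0, zeta = 440.0, 0.01            # illustrative mode, not a cello model
      w0 = 2 * np.pi * f0
      b, a = bilinear([1.0, 0.0], [1.0, 2 * zeta * w0, w0 ** 2], fs)
      vel = lfilter(b, a, force) + 1e-3 * rng.standard_normal(t.size)

      # H1 estimator: cross-spectrum force->velocity over force auto-spectrum.
      f, S_fv = csd(force, vel, fs=fs, nperseg=4096)
      _, S_ff = welch(force, fs=fs, nperseg=4096)
      admittance = np.abs(S_fv / S_ff)
      print("admittance peaks near", f[np.argmax(admittance)], "Hz")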

  3. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    Science.gov (United States)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.

  4. Study on Feasibility of Applying Function Approximation Moment Method to Achieve Reliability-Based Design Optimization

    International Nuclear Information System (INIS)

    Huh, Jae Sung; Kwak, Byung Man

    2011-01-01

    Robust optimization and reliability-based design optimization are among the methodologies employed to take into account the uncertainties of a system at the design stage. For applying such methodologies to industrial problems, accurate and efficient methods for estimating statistical moments and failure probability are required; furthermore, the results of the sensitivity analysis, which is needed for determining the search direction during the optimization process, should also be accurate. The aim of this study is to employ the function approximation moment method in the sensitivity analysis formulation, which is expressed in integral form, to verify the accuracy of the sensitivity results, and to solve a typical reliability-based design optimization problem. These results are compared with those of other moment methods, and the feasibility of the function approximation moment method is verified. The sensitivity analysis formula in integral form is an efficient formulation for evaluating sensitivity, because no additional function evaluations are needed once the failure probability or statistical moments have been calculated.

  5. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint

    Directory of Open Access Journals (Sweden)

    Ang Gong

    2015-12-01

    Full Text Available For Global Navigation Satellite System (GNSS) single frequency, single epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to rigorously reconstruct the objective function. Then, the searching strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space to ensure that the correct ambiguity candidates are within it, allowing the search to be carried out directly by the least-squares ambiguity decorrelation adjustment (LAMBDA) method. Some of the vector candidates are further eliminated by a derived approximate inequality, which accelerates the searching process. Experimental results show that, compared to the traditional method with only a baseline length constraint, the new method can utilize a priori three-dimensional baseline knowledge to fix the ambiguity reliably and achieve a high success rate. Experimental tests also verify that it is not very sensitive to baseline vector error and performs robustly when the angular error is not great.

  6. Prediction method of long-term reliability in improving residual stresses by means of surface finishing

    International Nuclear Information System (INIS)

    Sera, Takehiko; Hirano, Shinro; Chigusa, Naoki; Okano, Shigetaka; Saida, Kazuyoshi; Mochizuki, Masahito; Nishimoto, Kazutoshi

    2012-01-01

    Surface finishing methods such as water jet peening (WJP) have been applied to welds in some major components of nuclear power plants as a countermeasure to primary water stress corrosion cracking (PWSCC). In addition, the surface finishing method of buffing treatment is being standardized, and buffing has thus also come to be recognized as a well-established method of improving residual stress. The long-term stability of peening techniques has been confirmed by accelerated tests. However, the stress improvement achieved by surface treatment is limited to thin layers, and the effect of the complicated residual stress distribution in the weld metal beneath the surface is not strictly taken into account in assessments of long-term stability. This paper therefore describes accelerated tests which confirmed that the long-term stability of a layer subjected to buffing treatment is equal to that of a layer subjected to WJP. The long-term reliability of the very thin stress-improved layer was also confirmed through a trial evaluation by thermal elastic-plastic creep analysis, even when the effect of the complicated residual stress distribution in the weld metal was taken into account conservatively. Considering the above findings, an approach is proposed for constructing a method of predicting the long-term reliability of stress improvement by surface finishing. (author)

  7. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample, 86 cephalograms from 18 subjects (5 males and 13 females) were selected from a longitudinal database. Total mandibular length was measured on each film, and its annual increase rate served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but full interobserver agreement was documented both times in less than 50% of cases. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was shown by kappa values of 0.53-0.86. With regard to validity, moderate agreement was found between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed unacceptable for clinical use, especially for cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that many other growth indicators should be taken into consideration when evaluating adolescent skeletal maturation.
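
    The agreement statistic used here can be reproduced in a few lines of Python; the sketch below computes a linearly weighted kappa between a hypothetical gold-standard staging and one observer's staging, where linear weights penalize disagreements by how many stages apart they are, as suits an ordinal maturation scale. The ratings are invented, not data from the study.

      from sklearn.metrics import cohen_kappa_score

      # Illustrative CVM stagings (stages 1-6) of the same films by the
      # gold standard and one trained observer -- not data from the study.
      gold     = [1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 3, 2, 4, 5]
      observer = [1, 2, 3, 3, 4, 4, 4, 5, 6, 6, 5, 3, 2, 4, 5]

      # Linear weights: a two-stage disagreement counts twice as much as a
      # one-stage disagreement.
      print(cohen_kappa_score(gold, observer, weights="linear"))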

  8. Accounting for Model Uncertainties Using Reliability Methods - Application to Carbon Dioxide Geologic Sequestration System. Final Report

    International Nuclear Information System (INIS)

    Mok, Chin Man; Doughty, Christine; Zhang, Keni; Pruess, Karsten; Kiureghian, Armen; Zhang, Miao; Kaback, Dawn

    2010-01-01

    A new computer code, CALRELTOUGH, which uses reliability methods to incorporate parameter sensitivity and uncertainty analysis into subsurface flow and transport models, was developed by Geomatrix Consultants, Inc. in collaboration with Lawrence Berkeley National Laboratory and the University of California at Berkeley. The CALREL reliability code was developed at the University of California at Berkeley for geotechnical applications, and the TOUGH family of codes was developed at Lawrence Berkeley National Laboratory for subsurface flow and transport applications. The integration of the two codes provides a new approach to dealing with uncertainties in flow and transport modeling of the subsurface, such as those associated with hydrogeologic parameters, boundary conditions, and initial conditions, using data from site characterization and monitoring for conditioning. The new code enables computation of the reliability of a system and of the components that make up the system, instead of calculating the complete probability distributions of model predictions at all locations and all times. The new CALRELTOUGH code has tremendous potential to advance subsurface understanding for a variety of applications, including subsurface energy storage, nuclear waste disposal, carbon sequestration, extraction of natural resources, and environmental remediation. The new code was tested on a carbon sequestration problem as part of the Phase I project. Phase II was not awarded.

  9. A reliable method for reconstituting thymectomized, lethally irradiated guinea pigs with bone marrow cells

    International Nuclear Information System (INIS)

    Terata, N.; Tanio, Y.; Zbar, B.

    1984-01-01

    The authors developed a reliable method for reconstituting thymectomized, lethally irradiated guinea pigs. Injection of 2.5-10 x 10^7 syngeneic bone marrow cells into adult thymectomized, lethally irradiated guinea pigs produced survival of 46-100% of treated animals. Gentamycin sulfate (5 mg/kg of body weight) for 10 days was required for optimal results, and acidified drinking water (pH 2.5) also appeared to be required. Thymectomized, lethally irradiated, bone marrow reconstituted ('B') guinea pigs had an impaired ability to develop delayed cutaneous hypersensitivity to mycobacterial antigens and cutaneous basophil hypersensitivity to keyhole limpet hemocyanin; proliferative responses to phytohemagglutinin were also impaired. (Auth.)

  10. Radiologic identification of disaster victims: A simple and reliable method using CT of the paranasal sinuses

    International Nuclear Information System (INIS)

    Ruder, Thomas D.; Kraehenbuehl, Markus; Gotsmy, Walther F.; Mathier, Sandra; Ebert, Lars C.; Thali, Michael J.; Hatch, Gary M.

    2012-01-01

    Objective: To assess the reliability of radiologic identification using visual comparison of ante mortem and post mortem paranasal sinus computed tomography (CT). Subjects and methods: The study was approved by the responsible justice department and university ethics committee. Four blinded readers with varying radiological experience separately compared 100 post mortem head CTs to 25 ante mortem head CTs with the goal of identifying as many matching pairs as possible (out of 23 possible matches). Sensitivity, specificity, and positive and negative predictive values were calculated for all readers, and the chi-square test was applied to establish whether there was a significant difference in sensitivity between radiologists and non-radiologists. Results: For all readers, sensitivity was 83.7%, specificity was 100.0%, negative predictive value (NPV) was 95.4%, positive predictive value (PPV) was 100.0%, and accuracy was 96.3%. For radiologists, sensitivity was 97.8%, NPV was 99.4%, and accuracy was 99.5%. For non-radiologists, average sensitivity was 69.6%, NPV was 91.7%, and accuracy was 93.0%. Radiologists achieved a significantly higher sensitivity (p < 0.01) than non-radiologists. Conclusions: Visual comparison of ante mortem and post mortem CT of the head is a robust and reliable method for identifying unknown decedents, particularly in regard to positive matches. The sensitivity and NPV of the method depend on the reader's experience.

  11. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org).

  12. Accuracy and Reliability of the Klales et al. (2012) Morphoscopic Pelvic Sexing Method.

    Science.gov (United States)

    Lesciotto, Kate M; Doershuk, Lily J

    2018-01-01

    Klales et al. (2012) devised an ordinal scoring system for the morphoscopic pelvic traits described by Phenice (1969) and used for sex estimation of skeletal remains. The aim of this study was to test the accuracy and reliability of the Klales method using a large sample from the Hamann-Todd collection (n = 279). Two observers were blinded to sex, ancestry, and age and used the Klales et al. method to estimate the sex of each individual. Sex was correctly estimated for females with over 95% accuracy; however, the male allocation accuracy was approximately 50%. Weighted Cohen's kappa and intraclass correlation coefficient analysis for evaluating intra- and interobserver error showed moderate to substantial agreement for all traits. Although each trait can be reliably scored using the Klales method, low accuracy rates and high sex bias indicate better trait descriptions and visual guides are necessary to more accurately reflect the range of morphological variation. © 2017 American Academy of Forensic Sciences.

  13. The Global Optimal Algorithm of Reliable Path Finding Problem Based on Backtracking Method

    Directory of Open Access Journals (Sweden)

    Liang Shen

    2017-01-01

    Full Text Available There is growing interest in finding a global optimal path in transportation networks, particularly when the network suffers from unexpected disturbances. This paper studies the problem of finding a global optimal path that guarantees a given probability of arriving on time in a network with uncertainty, in which travel time is stochastic instead of deterministic. Traditional path finding methods based on least expected travel time cannot capture network users' risk-taking behavior. To overcome this limitation, reliable path finding algorithms have been proposed, but their convergence to the global optimum is seldom addressed in the literature. This paper integrates the K-shortest-path algorithm into a backtracking method to propose a new path finding algorithm under uncertainty whose global optimality can be guaranteed. Numerical examples are conducted to demonstrate the correctness and efficiency of the proposed algorithm.
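
    The core idea, finding the path that maximizes the probability of on-time arrival rather than the one with the least expected time, can be illustrated with a plain backtracking enumeration on a toy network with independent normally distributed link times. (The paper's actual algorithm integrates K-shortest paths into the backtracking; the graph and numbers below are invented.)

      import math

      # Directed graph: edge -> (mean travel time, variance); invented values.
      G = {
          "A": {"B": (10, 4), "C": (12, 1)},
          "B": {"D": (10, 9)},
          "C": {"D": (9, 1)},
          "D": {},
      }

      def on_time_prob(mean, var, budget):
          # P(T <= budget) for T ~ Normal(mean, var), via the error function.
          return 0.5 * (1.0 + math.erf((budget - mean) / math.sqrt(2.0 * var)))

      def best_path(src, dst, budget):
          best = (None, -1.0)
          def backtrack(node, path, mean, var):
              nonlocal best
              if node == dst:
                  p = on_time_prob(mean, var, budget)
                  if p > best[1]:
                      best = (list(path), p)
                  return
              for nxt, (m, v) in G[node].items():
                  if nxt not in path:          # keep paths simple
                      path.append(nxt)
                      backtrack(nxt, path, mean + m, var + v)
                      path.pop()
          backtrack(src, [src], 0.0, 0.0)
          return best

      # A-C-D wins (higher mean but far lower variance): risk-aware choice.
      print(best_path("A", "D", budget=22.0))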

  14. Response and reliability analysis of nonlinear uncertain dynamical structures by the probability density evolution method

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri

    2016-01-01

    The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics...... of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the dimensional curse of traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov......–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these need to be relatively low. In order to handle this problem an approach is suggested, which...

  15. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and combines engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for the risk computing of building construction projects.
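
    One step of such a procedure, deriving objective index weights from information entropy and combining them into a synthesis score, can be sketched in Python as follows, with an invented index matrix (rows are candidate schemes; columns are the four first-order indexes, pre-scaled so that larger is better):

      import numpy as np

      # Index matrix: rows = candidate construction schemes, columns =
      # (cost, progress, quality, safety) scores; illustrative values.
      X = np.array([[0.8, 0.6, 0.9, 0.7],
                    [0.6, 0.9, 0.7, 0.8],
                    [0.9, 0.7, 0.6, 0.9]])

      P = X / X.sum(axis=0)                          # column-wise proportions
      m = len(X)                                     # number of schemes
      E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy per index in [0, 1]
      w = (1 - E) / (1 - E).sum()                    # low entropy -> high weight

      scores = X @ w                                 # synthesis score per scheme
      print("weights:", w.round(3), "ranking:", np.argsort(-scores) + 1)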

  16. Calculation of noninformative prior of reliability parameter and initiating event frequency with Jeffreys method

    International Nuclear Information System (INIS)

    He Jie; Zhang Binbin

    2013-01-01

    In the probabilistic safety assessment (PSA) of nuclear power plants, there are few historical records of some initiating events or component failures in industry. In order to determine the noninformative priors of such reliability parameters and initiating event frequencies, the Jeffreys method of Bayesian statistics was employed. The mathematical mechanism of the Jeffreys prior and the simplified constrained noninformative distribution (SCNID) are elaborated in this paper, and the Jeffreys noninformative formulas and credible intervals of the Gamma-Poisson and Beta-Binomial models are introduced. As an example, the small-break loss-of-coolant accident (SLOCA) is used to show the application of the Jeffreys prior in determining an initiating event frequency. The result shows that the Jeffreys method is an effective method for noninformative prior calculation. (authors)
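
    For the Gamma-Poisson case the calculation is short: the Jeffreys prior for a Poisson rate is proportional to λ^(-1/2), so observing n events over an exposure of T reactor-years yields a Gamma(n + 1/2) posterior with rate T, from whose quantiles a credible interval follows. The Python sketch below uses illustrative numbers, not data from the paper:

      from scipy.stats import gamma

      n, T = 2, 1500.0  # illustrative: 2 events in 1500 reactor-years

      # Jeffreys prior for a Poisson rate ~ lambda^(-1/2), so the posterior
      # is Gamma(shape = n + 0.5) with rate T, i.e. scale = 1/T.
      post = gamma(a=n + 0.5, scale=1.0 / T)

      lo, hi = post.ppf(0.05), post.ppf(0.95)
      print(f"mean {post.mean():.2e}/yr, "
            f"90% credible interval [{lo:.2e}, {hi:.2e}]")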

  17. A summary of methods of predicting reliability life of nuclear equipment with small samples

    International Nuclear Information System (INIS)

    Liao Weixian

    2000-03-01

    Some nuclear equipment is manufactured in small batches, e.g., 1-3 sets, and its service life may be very difficult to determine experimentally for economic and technical reasons. A method combining theoretical analysis with material tests to predict the life of equipment is put forward, based on the fact that equipment consists of parts or elements made of different materials. The whole life of an equipment part consists of the crack-forming life (i.e., the fatigue life or damage accumulation life) and the crack-extension life. Methods of predicting machine life are systematically summarized, with emphasis on those which use theoretical analysis to substitute for large-scale prototype experiments. Meanwhile, the methods and steps of predicting reliability life are described, taking into consideration the randomness of various variables and parameters in engineering. Finally, the latest advances and trends in machine life prediction are discussed.

  18. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and combines engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for the risk computing of building construction projects.

  19. Reliability Quantification Method for Safety Critical Software Based on a Finite Test Set

    International Nuclear Information System (INIS)

    Shin, Sung Min; Kim, Hee Eun; Kang, Hyun Gook; Lee, Seung Jun

    2014-01-01

    Software inside digitalized systems plays a very important role because it may cause irreversible consequences and affect the whole system as a common cause failure. However, test-based reliability quantification methods for safety critical software have limitations caused by the difficulty of developing input sets in the form of trajectories, i.e., series of successive values of variables. To address these limitations, this study proposes another method, which conducts the test using combinations of single values of variables. To substitute combinations of variable values for the trajectory form of input, the possible range of each variable should be identified. For this purpose, the assigned range of each variable, the logical relations between variables, the plant dynamics under given situations, and the characteristics of information acquisition by digital devices are considered. The feasibility of the proposed method was confirmed through an application to the Reactor Protection System (RPS) software trip logic.
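
    Not the paper's RPS procedure, but a Python sketch of the surrounding arithmetic: a finite test set is built from combinations of discretized variable values (the variable names and ranges below are hypothetical), and if all N tests pass, a classical bound gives an upper limit on the per-demand failure probability at a chosen confidence level.

      from itertools import product

      # Hypothetical discretized inputs for a trip-logic test -- illustrative
      # names and values, not the actual RPS variables.
      pressure  = [11.0, 12.5, 14.0]   # MPa
      level     = [20.0, 50.0, 80.0]   # %
      rate_flag = [0, 1]

      test_set = list(product(pressure, level, rate_flag))
      N = len(test_set)  # finite test set built from value combinations

      def p_upper(n_tests, confidence=0.95):
          # If all n_tests pass: (1 - p)^n >= 1 - confidence, so
          # p <= 1 - (1 - confidence)^(1/n_tests).
          return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)

      print(N, "tests; 95% upper bound on failure probability:",
            f"{p_upper(N):.3f}")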

  20. Numerical methods for reliability and safety assessment multiscale and multiphysics systems

    CERN Document Server

    Hami, Abdelkhalak

    2015-01-01

    This book offers unique insight on structural safety and reliability by combining computational methods that address multiphysics problems, involving multiple equations describing different physical phenomena, and multiscale problems, involving discrete sub-problems that together describe important aspects of a system at multiple scales. The book examines a range of engineering domains and problems using dynamic analysis, nonlinear methods, error estimation, finite element analysis, and other computational techniques. The book also introduces novel numerical methods, illustrates new practical applications, examines recent engineering applications, presents up-to-date theoretical results, and offers perspective relevant to a wide audience, including teaching faculty, graduate students, researchers, and practicing engineers.

  1. Approximation of the Monte Carlo Sampling Method for Reliability Analysis of Structures

    Directory of Open Access Journals (Sweden)

    Mahdi Shadab Far

    2016-01-01

    Structural load types, on the one hand, and structural capacity to withstand these loads, on the other hand, are of a probabilistic nature as they cannot be calculated and presented in a fully deterministic way. As such, the past few decades have witnessed the development of numerous probabilistic approaches towards the analysis and design of structures. Among the conventional methods used to assess structural reliability, the Monte Carlo sampling method has proved to be very convenient and efficient. However, it does suffer from certain disadvantages, the biggest one being the requirement of a very large number of samples to handle small probabilities, leading to a high computational cost. In this paper, a simple algorithm was proposed to estimate low failure probabilities using a small number of samples in conjunction with the Monte Carlo method. This revised approach was then presented in a step-by-step flowchart, for the purpose of easy programming and implementation.
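    The paper's revised low-probability algorithm is not given in the abstract. The sketch below shows the baseline it improves on, a crude Monte Carlo estimate of failure probability for the limit state g = R - S, with assumed load and resistance distributions; the estimator's coefficient of variation makes the small-probability sample-size problem visible.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_failure_probability(n_samples):
    """Crude Monte Carlo estimate of P(g < 0) for g = R - S, with
    illustrative (assumed) resistance and load distributions."""
    R = rng.normal(loc=300.0, scale=30.0, size=n_samples)   # resistance
    S = rng.normal(loc=200.0, scale=25.0, size=n_samples)   # load effect
    failures = np.count_nonzero(R - S < 0.0)
    pf = failures / n_samples
    # estimator coefficient of variation: why small pf needs many samples
    cov = np.sqrt((1.0 - pf) / (pf * n_samples)) if failures else float("inf")
    return pf, cov

for n in (10_000, 1_000_000):
    pf, cov = mc_failure_probability(n)
    print(f"n = {n:>9}: pf ≈ {pf:.2e}, estimator CoV ≈ {cov:.2f}")
```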

  2. Using graph models for evaluating in-core monitoring systems reliability by the method of imitating simulation

    International Nuclear Information System (INIS)

    Golovanov, M.N.; Zyuzin, N.N.; Levin, G.L.; Chesnokov, A.N.

    1987-01-01

    An approach to estimating the reliability factors of complex redundant systems at early stages of development using the method of imitating simulation is considered. Different types of models, with their merits and shortcomings, are given. Features of in-core monitoring systems, and the advisability of applying graph models and elements of graph theory to estimating the reliability of such systems, are shown. The results of an investigation of the reliability factors of the reactor monitoring, control and core local protection subsystem are presented.

  3. A fast method for calculating reliable event supports in tree reconciliations via Pareto optimality.

    Science.gov (United States)

    To, Thu-Hien; Jacox, Edwin; Ranwez, Vincent; Scornavacca, Celine

    2015-11-14

    Given a gene tree and a species tree, reconciliation methods attempt to retrieve the macro-evolutionary events that best explain the discrepancies between the two tree topologies. The DTL parsimonious approach searches for a most parsimonious reconciliation between a gene tree and a (dated) species tree, considering four possible macro-evolutionary events (speciation, duplication, transfer, and loss) with specific costs. Unfortunately, many events are erroneously predicted due to errors in the input trees, inappropriate input cost values, or the existence of several equally parsimonious scenarios. It is thus crucial to provide a measure of reliability for predicted events. It has recently been proposed that the reliability of an event can be estimated via its frequency in the set of most parsimonious reconciliations obtained using a variety of reasonable input cost vectors. To compute such a support, a straightforward but time-consuming approach is to generate cost vectors slightly departing from the original ones, independently compute the set of all most parsimonious reconciliations for each vector, and combine these sets a posteriori. Another proposed approach uses Pareto-optimality to partition cost values into regions which induce reconciliations with the same number of DTL events. The support of an event is then defined as its frequency in the set of regions. Often, however, the number of regions is not large enough to provide reliable supports. We present here a method to compute event supports efficiently via a polynomial-sized graph, which can represent all reconciliations for several different costs. Moreover, two methods are proposed to take into account alternative input costs: either explicitly providing an input cost range or allowing a tolerance for the over-cost of a reconciliation. Our methods are faster than the region-based method, substantially faster than the sampling-costs approach, and have a higher event-prediction accuracy on

  4. Procedures and methods that increase reliability and reproducibility of the transplanted kidney perfusion index

    International Nuclear Information System (INIS)

    Smokvina, A.

    1994-01-01

    At different times following surgery and during various complications, 119 studies were performed on 57 patients; in many patients the studies were repeated several times. Twenty-three studies were performed in as many patients in whom normal function of the transplanted kidney had been established by other diagnostic methods and retrospective analysis. The perfusion index results obtained by the 1978 method of Hilson et al. were compared with those obtained by the author's modified method, which for calculating the index also takes into account: the time difference in the appearance of the initial portions of the artery and kidney curves; the positioning of the region of interest over the distal part of the aorta; bolus injection into an arteriovenous shunt of the forearm of small volumes of Tc-99m labelled agents with high specific activity; fast data collection at 0.5-second intervals; and a standard for the normalization of numerical data. The reliability of the two methods, tested by a simulated time shift of the peak of the arterial curves, shows that the percentage deviation from the mean index value in the unmodified method is 2-5 times greater than in the modified method. The normal value of the perfusion index with the modified method is 91-171. (author)

  5. A survey on the human reliability analysis methods for the design of Korean next generation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Lee, J. W.; Park, J. C.; Kwack, H. Y.; Lee, K. Y.; Park, J. K.; Kim, I. S.; Jung, K. W

    2000-03-01

    Enhanced features achieved by applying recent domestic technologies characterize the expected safety and efficiency of the KNGR (Korea Next Generation Reactor). Human-engineered interfaces and the control room environment are expected to benefit the human aspects of the KNGR design. However, since current methods for human reliability analysis have not been brought up to date since THERP/SHARP, it is hard to assess the potential for human error arising from both the positive and the negative effects of the design changes in the KNGR. This is a state-of-the-art report on the human reliability analysis methods potentially available for application to the KNGR design. We surveyed every technical aspect of existing HRA methods and compared them in order to obtain the requirements for assessing human error potential within the KNGR design. We categorized the more than 10 methods into a first and a second generation, following the suggestion of Dr. Hollnagel. THERP was revisited in detail. ATHEANA, proposed by the US NRC for advanced designs, and CREAM, proposed by Dr. Hollnagel, were reviewed and compared. We conclude that the key requirements include enhancements to the early steps of human error identification and to the quantification steps, with consideration of error shaping factors extending beyond PSFs (performance shaping factors). Utilization of the steps and approaches of ATHEANA and CREAM will help in attaining an appropriate HRA method for the KNGR. However, the steps and data from THERP will still be maintained for continuity with previous PSA activities in the KNGR design.

  6. A survey on the human reliability analysis methods for the design of Korean next generation reactor

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Lee, J. W.; Park, J. C.; Kwack, H. Y.; Lee, K. Y.; Park, J. K.; Kim, I. S.; Jung, K. W.

    2000-03-01

    Enhanced features achieved by applying recent domestic technologies characterize the expected safety and efficiency of the KNGR (Korea Next Generation Reactor). Human-engineered interfaces and the control room environment are expected to benefit the human aspects of the KNGR design. However, since current methods for human reliability analysis have not been brought up to date since THERP/SHARP, it is hard to assess the potential for human error arising from both the positive and the negative effects of the design changes in the KNGR. This is a state-of-the-art report on the human reliability analysis methods potentially available for application to the KNGR design. We surveyed every technical aspect of existing HRA methods and compared them in order to obtain the requirements for assessing human error potential within the KNGR design. We categorized the more than 10 methods into a first and a second generation, following the suggestion of Dr. Hollnagel. THERP was revisited in detail. ATHEANA, proposed by the US NRC for advanced designs, and CREAM, proposed by Dr. Hollnagel, were reviewed and compared. We conclude that the key requirements include enhancements to the early steps of human error identification and to the quantification steps, with consideration of error shaping factors extending beyond PSFs (performance shaping factors). Utilization of the steps and approaches of ATHEANA and CREAM will help in attaining an appropriate HRA method for the KNGR. However, the steps and data from THERP will still be maintained for continuity with previous PSA activities in the KNGR design.

  7. A Reliable Method for the Evaluation of the Anaphylactoid Reaction Caused by Injectable Drugs

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2016-10-01

    Adverse reactions to injectable drugs usually occur at first administration and are closely associated with the dosage and speed of injection. This phenomenon is correlated with the anaphylactoid reaction. However, up to now, study methods based on antigen detection have still not gained wide acceptance, and single physiological indicators cannot be used to differentiate anaphylactoid reactions from allergic and inflammatory reactions. In this study, a reliable method for the evaluation of anaphylactoid reactions caused by injectable drugs was established using multiple physiological indicators. We used compound 48/80, ovalbumin and endotoxin as sensitization agents to induce anaphylactoid, allergic and inflammatory reactions. Different experimental animals (guinea pig and nude rat), modes of administration (intramuscular, intravenous and intraperitoneal injection) and times (15 min, 30 min and 60 min) were evaluated to optimize the study protocol. The results showed that the optimal sensitization protocol was to treat guinea pigs with the different agents by intravenous injection for 30 min. Further, seven related humoral factors, including 5-HT, SC5b-9, Bb, C4d, IL-6, C3a and histamine, were detected by HPLC analysis and ELISA assay to determine their expression levels. Five of them (5-HT, SC5b-9, Bb, C4d and IL-6) displayed significant differences between anaphylactoid, allergic and inflammatory reactions, indicating that their combination could be used to distinguish the three reactions. Different injectable drugs were then used to verify the method, and the chosen indicators exhibited good correlation with the anaphylactoid reaction, indicating that the established method is both practical and reliable. Our research provides a feasible method for the diagnosis of serious adverse reactions caused by injectable drugs.

  8. Validity and reliability of the session-RPE method for quantifying training load in karate athletes.

    Science.gov (United States)

    Tabben, M; Tourny, C; Haddad, M; Chaabane, H; Chamari, K; Coquart, J B

    2015-04-24

    To test the construct validity and reliability of the session rating of perceived exertion (sRPE) method by examining the relationship between RPE and physiological parameters (heart rate, HR, and blood lactate concentration, [La-]) and the correlations between sRPE and two HR-based methods for quantifying internal training load (Banister's method and Edwards's method) during a karate training camp. Eighteen elite karate athletes, ten men (age: 24.2 ± 2.3 y, body mass: 71.2 ± 9.0 kg, body fat: 8.2 ± 1.3%, height: 178 ± 7 cm) and eight women (age: 22.6 ± 1.2 y, body mass: 59.8 ± 8.4 kg, body fat: 20.2 ± 4.4%, height: 169 ± 4 cm), were included in the study. During the training camp, subjects participated in eight karate training sessions covering three training modes (4 tactical-technical, 2 technical-development, and 2 randori training), during which RPE, HR, and [La-] were recorded. Significant correlations were found between RPE and the physiological parameters (percentage of maximal HR: r = 0.75, 95% CI = 0.64-0.86; [La-]: r = 0.62, 95% CI = 0.49-0.75) and between sRPE and the two HR-based measures of internal training load (r = 0.65-0.95), together with good reliability of the same intensity across training sessions (Cronbach's α = 0.81, 95% CI = 0.61-0.92). This study demonstrates that the sRPE method is valid for quantifying internal training load and intensity in karate.

  9. Novel Methods to Enhance Precision and Reliability in Muscle Synergy Identification during Walking

    Science.gov (United States)

    Kim, Yushin; Bulea, Thomas C.; Damiano, Diane L.

    2016-01-01

    Muscle synergies are hypothesized to reflect modular control of muscle groups via descending commands sent through multiple neural pathways. Recently, the number of synergies has been reported as a functionally relevant indicator of motor control complexity in individuals with neurological movement disorders. Yet the number of synergies extracted during a given activity, e.g., gait, varies within and across studies, even for unimpaired individuals. With no standardized method for its precise determination, this variability remains unexplained, making comparisons across studies and cohorts difficult. Here, we utilize k-means clustering and intra-class and between-level correlation coefficients to precisely discriminate reliable from unreliable synergies. Electromyography (EMG) was recorded bilaterally from eight leg muscles during treadmill walking at self-selected speed. Muscle synergies were extracted from 20 consecutive gait cycles using non-negative matrix factorization. We demonstrate that the number of synergies is highly dependent on the threshold applied to the variance accounted for by the reconstructed EMG. Beyond the use of a threshold, our method utilized a quantitative metric to reliably identify four or five synergies underpinning walking in unimpaired adults, and revealed synergies with poor reproducibility that should not be considered true synergies. PMID:27695403
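    As a minimal illustration of the extraction step described above (not the authors' full pipeline, which adds k-means clustering and correlation-based reliability screening), the sketch below factorizes a synthetic EMG-like matrix with non-negative matrix factorization and shows how the variance accounted for (VAF) depends on the chosen number of synergies; all data are simulated.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Stand-in for rectified, low-pass-filtered EMG: 8 muscles x 2000 samples
# built from 4 underlying "synergies" plus noise (all data simulated).
W_true = rng.random((8, 4))
H_true = rng.random((4, 2000))
emg = W_true @ H_true + 0.05 * rng.random((8, 2000))

def vaf(data, n_synergies):
    """Variance accounted for by an n-synergy NMF reconstruction of the EMG."""
    model = NMF(n_components=n_synergies, init="nndsvda",
                max_iter=1000, random_state=0)
    W = model.fit_transform(data)     # muscle weightings
    H = model.components_             # activation profiles
    residual = data - W @ H
    return 1.0 - np.sum(residual**2) / np.sum(data**2)

# The extracted number of synergies depends strongly on the VAF threshold.
for k in range(1, 7):
    print(f"{k} synergies: VAF = {vaf(emg, k):.3f}")
```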

  10. Identification of a practical and reliable method for the evaluation of litter moisture in turkey production.

    Science.gov (United States)

    Vinco, L J; Giacomelli, S; Campana, L; Chiari, M; Vitale, N; Lombardi, G; Veldkamp, T; Hocking, P M

    2018-02-01

    1. An experiment was conducted to compare 5 different methods for the evaluation of litter moisture. 2. For litter collection and assessment, 55 farms were selected; one shed on each farm was inspected and 9 points were identified within each shed. 3. For each device used for the evaluation of litter moisture, the mean and standard deviation of the wetness measures per collection point were assessed. 4. The reliability and overall consistency between the 5 instruments used to measure wetness were high (α = 0.72). 5. Measurements at three of the 9 collection points were sufficient to provide a reliable assessment of litter moisture throughout the shed. 6. Based on the direct correlation between litter moisture and footpad lesions, litter moisture measurement can be used as a resource-based on-farm animal welfare indicator. 7. Of the 5 methods analysed, visual scoring is the simplest and most practical, and therefore the best candidate for on-farm animal welfare assessment.

  11. A Newly Developed Method for Computing Reliability Measures in a Water Supply Network

    Directory of Open Access Journals (Sweden)

    Jacek Malinowski

    2016-01-01

    A reliability model of a water supply network is examined. Its main features are: (1) a topology that can be decomposed by so-called state factorization into a relatively small number of derivative networks, each having a series-parallel structure; (2) binary-state components (either operative or failed) with given flow capacities; (3) a multi-state character of the whole network and its sub-networks, where a network state is defined as the maximal flow between the source(s) and the sink(s); (4) integer values for all capacities (component, network, and sub-network). As the network operates, its state changes due to component failures, repairs, and replacements. A newly developed method of computing the inter-state transition intensities is presented, based on state factorization and series-parallel aggregation. The analysis of these intensities shows that the failure-repair process of the considered system is an asymptotically homogeneous Markov process. It is also demonstrated how certain reliability parameters useful for network maintenance planning can be determined on the basis of the asymptotic intensities. For better understanding of the presented method, an illustrative example is given.
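    The transition-intensity computation itself is not given in the abstract; the sketch below shows only the series-parallel aggregation building block for binary-state components that the method relies on, with assumed availability values.

```python
import numpy as np

def series(p):
    """Availability of components in series: all must work."""
    return float(np.prod(np.asarray(p)))

def parallel(p):
    """Availability of components in parallel: at least one must work."""
    return float(1.0 - np.prod(1.0 - np.asarray(p)))

# Component availabilities (illustrative values, not from the paper).
pump_a, pump_b, main_line, backup_line = 0.95, 0.90, 0.98, 0.92

# Two pumps in parallel feeding two lines in parallel, joined in series.
pumps = parallel([pump_a, pump_b])
lines = parallel([main_line, backup_line])
print(f"network availability ≈ {series([pumps, lines]):.4f}")
```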

  12. Uncertainty analysis methods for estimation of reliability of passive system of VHTR

    International Nuclear Information System (INIS)

    Han, S.J.

    2012-01-01

    An estimation of the reliability of passive systems for the probabilistic safety assessment (PSA) of a very high temperature reactor (VHTR) is under development in Korea. The essential approach of this estimation is to measure the uncertainty of the system performance under a specific accident condition. The uncertainty propagation approach, based on simulation of the phenomenological models (computer codes), is adopted as the typical method of estimating the uncertainty for this purpose. This presentation introduces uncertainty propagation and discusses the related issues, focusing on the propagation object and its surrogates. To achieve a sufficient depth of uncertainty results, the applicability of the propagation should be carefully reviewed. As an example study, the Latin hypercube sampling (LHS) method, used as a direct propagation, was tested for a specific accident sequence of the VHTR. The reactor cavity cooling system (RCCS) developed by KAERI was considered for this example study. This is an air-cooled passive system that has no active components. The accident sequence is a low pressure conduction cooling (LPCC) accident, which is considered a design basis accident in the safety design of the VHTR. This sequence results from a large failure of the pressure boundary of the reactor system, such as a guillotine break of the coolant pipe lines. The presentation discusses the insights obtained (benefits and weaknesses) in applying the estimation of passive system reliability.
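    A minimal sketch of direct propagation by Latin hypercube sampling, using SciPy's qmc module; the input names, ranges, and the placeholder surrogate model stand in for the phenomenological code and are assumptions, not the RCCS analysis itself.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample over two uncertain inputs of a performance model,
# e.g. decay-heat level and air-side heat transfer coefficient (assumed names).
sampler = qmc.LatinHypercube(d=2, seed=1)
unit_sample = sampler.random(n=100)              # 100 points in [0, 1)^2
lo, hi = [0.9, 5.0], [1.1, 25.0]                 # assumed input ranges
X = qmc.scale(unit_sample, lo, hi)

def peak_temperature(decay_heat, htc):
    """Placeholder surrogate for the phenomenological code (assumption)."""
    return 800.0 * decay_heat - 4.0 * htc

T = peak_temperature(X[:, 0], X[:, 1])
# Failure = exceeding an assumed temperature limit; the spread of T is the
# propagated uncertainty on which the reliability estimate is built.
print(f"P(T > 780) ≈ {np.mean(T > 780.0):.2f}")
```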

  13. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  14. Human reliability analysis for probabilistic safety assessments - review of methods and issues

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Malhotra, P.K.; Ghadge, S.G.; Chandra, Umesh

    2011-01-01

    It is well known that the two major events in world nuclear power plant operating history, namely Three Mile Island and Chernobyl, were human failure events. Subsequent to these two events, several significant changes have been incorporated in plant design, control room design and operator training to reduce the possibility of human errors during plant transients. Still, the contribution of human error to risk in nuclear power plant operations has been a topic of continued attention in research, development and analysis. Probabilistic safety assessments attempt to capture all potential human errors, each with a scientifically computed failure probability, through human reliability analysis. Different countries follow several methods to quantify human error probability. This paper reviews the popular methods in use, critically examines them with reference to their criticisms, and brings out issues for future research. (author)

  15. Regulatory relevant and reliable methods and data for determining the environmental fate of manufactured nanomaterials

    DEFF Research Database (Denmark)

    Baun, Anders; Sayre, Phil; Steinhäuser, Klaus Günter

    2017-01-01

    The widespread use of manufactured nanomaterials (MN) increases the need for describing and predicting their environmental fate and behaviour. A number of recent reviews have addressed the scientific challenges in disclosing the governing processes for the environmental fate and behaviour of MNs; however, there has been less focus on the regulatory adequacy of the data available for MN. The aim of this paper is therefore to review data, testing protocols and guidance papers which describe the environmental fate and behaviour of MN with a focus on their regulatory reliability and relevance. Gaps do however exist in test methods for environmental fate, such as methods to estimate heteroagglomeration and the tendency for MNs to transform in the environment.

  16. A two-step method for fast and reliable EUV mask metrology

    Science.gov (United States)

    Helfenstein, Patrick; Mochi, Iacopo; Rajendran, Rajeev; Yoshitake, Shusuke; Ekinci, Yasin

    2017-03-01

    One of the major obstacles to the implementation of extreme ultraviolet lithography for upcoming technology nodes in the semiconductor industry remains the realization of fast and reliable detection methods for patterned mask defects. We are developing a reflective EUV mask-scanning lensless imaging tool (RESCAN), installed at the Swiss Light Source synchrotron at the Paul Scherrer Institut. Our system is based on a two-step defect inspection method. In the first step, a low-resolution defect map is generated by die-to-die comparison of the diffraction patterns from areas with programmed defects to those from areas that are known to be defect-free on our test sample. At a later stage, a die-to-database comparison will be implemented, in which the measured diffraction patterns will be compared to those calculated directly from the mask layout. This Scattering Scanning Contrast Microscopy technique operates purely in the Fourier domain without the need to obtain the aerial image; given a sufficient signal-to-noise ratio, defects are found in a fast and reliable way, albeit with a location accuracy limited by the spot size of the incident illumination. Having thus identified rough locations for the defects, a fine scan is carried out in the vicinity of these locations. Since our source delivers coherent illumination, we can use an iterative phase-retrieval method to reconstruct the aerial image of the scanned area with, in principle, diffraction-limited resolution and without the need for an objective lens. Here, we focus on the aerial image reconstruction technique and give a few examples to illustrate the capability of the method.

  17. On the Reliability of Source Time Functions Estimated Using Empirical Green's Function Methods

    Science.gov (United States)

    Gallegos, A. C.; Xie, J.; Suarez Salas, L.

    2017-12-01

    The Empirical Green's Function (EGF) method (Hartzell, 1978) has been widely used to extract source time functions (STFs). In this method, seismograms generated by collocated events of different magnitudes are deconvolved. Under the fundamental assumption that the STF of the small event is a delta function, the deconvolved Relative Source Time Function (RSTF) yields the large event's STF. While this assumption can be empirically justified by examining differences in event size and in the frequency content of the seismograms, a rigorous justification is often lacking. In practice, a small event might have a finite duration, so that the retrieved RSTF is interpreted as the large event's STF with a bias. In this study, we rigorously analyze this bias using synthetic waveforms generated by convolving a realistic Green's function waveform with pairs of finite-duration triangular or parabolic STFs. The RSTFs are found using a time-domain matrix deconvolution. We find that when the STF of the smaller event is finite, the RSTFs are a series of narrow non-physical spikes; interpreting them as a series of high-frequency source radiations would be very misleading. The only reliable and unambiguous information we can retrieve from these RSTFs is the difference in durations and the moment ratio of the two STFs. We can apply Tikhonov smoothing to obtain a single-pulse RSTF, but its duration depends on the choice of weighting, which may be subjective. We then test the Multi-Channel Deconvolution (MCD) method (Plourde & Bostock, 2017), which assumes that both STFs have finite durations to be solved for. A concern about the MCD method is that the larger number of unknown parameters tends to make the problem rank-deficient. Because the kernel matrix depends on the STFs to be solved for under a positivity constraint, we can only estimate the rank deficiency with a semi-empirical approach. Based on the results so far, we find that the
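    The study uses a time-domain matrix deconvolution; the sketch below instead uses a common frequency-domain regularized variant to reproduce the qualitative point that a finite small-event STF biases the retrieved RSTF. All signals are synthetic and illustrative.

```python
import numpy as np

def deconvolve(large, small, eps=1e-3):
    """Frequency-domain deconvolution with Tikhonov-style (water-level)
    regularization; a common alternative to the study's time-domain matrix
    deconvolution, used here only to reproduce the qualitative bias."""
    n = len(large) + len(small) - 1
    L, S = np.fft.rfft(large, n), np.fft.rfft(small, n)
    denom = np.abs(S) ** 2 + eps * np.max(np.abs(S)) ** 2
    return np.fft.irfft(L * np.conj(S) / denom, n)

rng = np.random.default_rng(3)
# Synthetic Green's function and two triangular STFs of different durations.
green = rng.standard_normal(256) * np.exp(-np.arange(256) / 40.0)
stf_big = np.convolve(np.ones(20), np.ones(20)); stf_big /= stf_big.sum()
stf_small = np.convolve(np.ones(4), np.ones(4)); stf_small /= stf_small.sum()

big_event = np.convolve(green, stf_big)       # seismogram of the large event
small_event = np.convolve(green, stf_small)   # seismogram of the small event
rstf = deconvolve(big_event, small_event)
# With a finite small-event STF the RSTF is biased; only the duration
# difference and the moment ratio of the two STFs are robust.
width = int(np.count_nonzero(rstf > 0.5 * rstf.max()))
print(f"RSTF half-height width ≈ {width} samples")
```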

  18. Impact of the implementation of a well-designed electronic laboratory notebook on bioanalytical laboratory function.

    Science.gov (United States)

    Zeng, Jianing; Hillman, Mark; Arnold, Mark

    2011-07-01

    This paper shares the experiences of the Bristol-Myers Squibb Company during the design, validation and implementation of an electronic laboratory notebook (ELN) in the GLP/regulated bioanalytical analysis area, and addresses the impact of the implementation of the electronic notebook on bioanalytical laboratory functions. Some of the key points covered are: knowledge management - the project-based electronic notebook takes full advantage of the available technology, focusing on data organization and sharing, so that scientific data generated by individual scientists become department knowledge; bioanalytical workflows in the ELN - the custom-built workflows, which include data entry templates, validated calculation processes, integration with laboratory information management systems/laboratory instruments, and reporting capability, improve data quality and overall workflow efficiency; regulatory compliance - carefully designed notebook review processes, cross-referencing of distributed information, audit trails and software validation reduce compliance risks. By taking into consideration both data generation and project documentation needs, a well-designed ELN can deliver significant improvements in laboratory efficiency, work productivity, and regulatory compliance.

  19. Comprehensive reliability allocation method for CNC lathes based on cubic transformed functions of failure mode and effects analysis

    Science.gov (United States)

    Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin

    2015-03-01

    Reliability allocation for computerized numerical controlled (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. Aiming at the problem of allocating reliability for CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure mode and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponential transformed FMEA method are investigated. Subsequently, a cubic transformed function is established to overcome these limitations. Properties of the new transformed function are discussed by considering failure severity and failure occurrence, and designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as an example to verify the new allocation method. Seven criteria are considered to compare the results of the new method with those of traditional methods. The allocation results indicate that the new method is more flexible than traditional methods: by employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.
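    The published cubic transformed function is not reproduced in the abstract; the sketch below is an illustrative stand-in showing the general shape of FMEA-based allocation, assuming a series reliability model (component failure rates sum to the system rate) and an amplitude-tunable cubic transform of severity and occurrence ratings. Every number and the direction of the weighting are assumptions.

```python
import numpy as np

def allocate_failure_rates(system_rate, severity, occurrence, amplitude=1.0):
    """Allocate a system failure-rate budget across components using a cubic
    transform of FMEA severity/occurrence ratings (1-10 scales). This transform
    is a stand-in with the paper's general shape (amplitude-tunable, cubic);
    the published function itself is not reproduced here. Components with more
    critical ratings get a smaller share, i.e. a tougher reliability target."""
    s = np.asarray(severity, dtype=float) / 10.0
    o = np.asarray(occurrence, dtype=float) / 10.0
    criticality = (amplitude * s * o) ** 3        # cubic transform (assumption)
    weight = 1.0 / criticality                    # less budget for critical parts
    weight /= weight.sum()
    return weight * system_rate                   # rates sum to the system rate

# spindle, turret, feed drive, control unit (illustrative FMEA ratings)
rates = allocate_failure_rates(
    system_rate=1e-4,                             # failures/hour, whole lathe
    severity=[8, 6, 5, 9],
    occurrence=[6, 7, 4, 3],
)
print(rates)
```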

  20. Human reliability analysis of errors of commission: a review of methods and applications

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2007-06-15

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addresses both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). Based on a thorough investigation of various PSA/HRA applications, this report furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies, and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because it provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) an EOC identification guidance like CESA, which is strongly based on procedural guidance and on important measures of the systems or components affected by inappropriate actions, should nevertheless pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations for advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  1. Human reliability analysis of errors of commission: a review of methods and applications

    International Nuclear Information System (INIS)

    Reer, B.

    2007-06-01

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addresses both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). Based on a thorough investigation of various PSA/HRA applications, this report furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies, and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because it provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) an EOC identification guidance like CESA, which is strongly based on procedural guidance and on important measures of the systems or components affected by inappropriate actions, should nevertheless pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations for advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  2. A Hierarchical Reliability Control Method for a Space Manipulator Based on the Strategy of Autonomous Decision-Making

    Directory of Open Access Journals (Sweden)

    Xin Gao

    2016-01-01

    In order to maintain and enhance the operational reliability of a robotic manipulator deployed in space, an operational reliability system control method is presented in this paper. First, a method of classifying the factors affecting operational reliability is proposed, dividing them into task-related factors and cost-related factors. Models describing the relationships between the two kinds of factors and the control variables are then established, and on this basis a multivariable, multiconstraint optimization model is constructed. Second, a hierarchical system control model incorporating the operational reliability factors is constructed. The control process of the space manipulator is divided into three layers: task planning, path planning, and motion control. Performance parameters related to operational reliability are measured and used as the system's feedback. Taking the factors affecting operational reliability into consideration, the system can autonomously decide which control layer should be optimized, and how, using a control-level adjustment decision module. The operational reliability factors affect the three control levels in the form of control variable constraints. Simulation results demonstrate that the proposed method achieves a greater probability of meeting the task accuracy requirements while extending the expected lifetime of the space manipulator.

  3. Using DOProC method in reliability assessment of steel elements exposed to fatigue

    Directory of Open Access Journals (Sweden)

    Krejsa Martin

    2017-01-01

    Fatigue crack damage depends on the number of stress range cycles; this introduces a time factor into the course of reliability over the entire designed service life. Three sizes are important for characterizing the propagation of fatigue cracks: the initial size, the detectable size and the acceptable size. The theoretical model of fatigue crack progression can be based on linear fracture mechanics. Depending on the location of the initial crack, the crack may propagate in a structural element, e.g., from the edge or from the surface. When the required degree of reliability is specified, it is possible to determine the time of the first inspection of the structure focused on fatigue damage. Using conditional probability and a Bayesian approach, the times of subsequent inspections can be determined. Probabilistic modelling of fatigue crack progression was carried out with an original, newly developed probabilistic method, the Direct Optimized Probabilistic Calculation ("DOProC"), which uses a purely numerical approach based on optimized numerical integration, without any simulation techniques or approximations.
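    The DOProC computation itself is not detailed in the abstract; a minimal sketch of the underlying deterministic ingredient, Paris-law crack growth between the three characteristic crack sizes, can be written as follows. The material constants, geometry factor, and stress range are illustrative assumptions, not values from the paper.

```python
import numpy as np

def cycles_between(a_start, a_end, C=3e-13, m=3.0, dsigma=100.0, Y=1.12):
    """Number of stress cycles for a crack to grow from a_start to a_end (metres),
    integrating the Paris law da/dN = C * dK**m with dK = Y * dsigma * sqrt(pi*a).
    Material constants and stress range are illustrative, not from the paper."""
    a = np.linspace(a_start, a_end, 10_000)
    dK = Y * dsigma * np.sqrt(np.pi * a)      # stress intensity range, MPa*sqrt(m)
    dN_da = 1.0 / (C * dK**m)                 # cycles per metre of crack growth
    # trapezoidal integration of dN/da over the crack-size interval
    return float(np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a)))

# initial, detectable and acceptable crack sizes (metres; assumed values)
a_init, a_detect, a_accept = 0.5e-3, 2.0e-3, 20.0e-3
print(f"initiation -> detectable : {cycles_between(a_init, a_detect):.3g} cycles")
print(f"detectable -> acceptable : {cycles_between(a_detect, a_accept):.3g} cycles")
```

    In the probabilistic setting these inputs become random variables, and the inspection times follow from when the probability of exceeding the acceptable size crosses the target reliability level.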

  4. Relationship of Ambient Atmosphere and Biological Aerosol Responses from a Fielded Pyrolysis-Gas Chromatography-Ion Mobility Spectrometry Bioanalytical Detector

    National Research Council Canada - National Science Library

    Snyder, A

    2003-01-01

    .... A pyrolysis-gas chromatography-ion mobility spectrometry stand-alone bioaerosol system was interfaced to an aerosol concentrator to collect ambient background aerosols and produce bioanalytical...

  5. A Systematic Review of Statistical Methods Used to Test for Reliability of Medical Instruments Measuring Continuous Variables

    Directory of Open Access Journals (Sweden)

    Rafdzah Zaki

    2013-06-01

    Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify the statistical methods used to measure the reliability of equipment measuring continuous variables. The study also aims to highlight inappropriate uses of statistical methods in reliability analyses and their implications for medical practice. Materials and Methods: In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified; 282 were potentially related, and 42 finally fitted the inclusion criteria. Results: The intra-class correlation coefficient (ICC) was the most popular method, used in 25 (60%) of the studies, followed by the comparison of means (8, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: This study finds that the intra-class correlation coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to perform reliability analyses correctly.
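    Since the review centres on the ICC, a self-contained sketch of one common variant, ICC(2,1) computed from the Shrout and Fleiss two-way random-effects mean squares, may be useful; the device readings below are invented for illustration.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater,
    computed from the classic Shrout & Fleiss mean squares.
    `ratings` is an (n subjects x k raters/instruments) array."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    ms_rows = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)   # subjects
    ms_cols = n * np.sum((Y.mean(axis=0) - grand) ** 2) / (k - 1)   # instruments
    sse = np.sum((Y - Y.mean(axis=1, keepdims=True)
                    - Y.mean(axis=0, keepdims=True) + grand) ** 2)
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# two devices measuring the same 5 subjects (illustrative readings)
data = [[10.1, 10.4], [12.0, 11.8], [9.5, 9.9], [14.2, 14.6], [11.1, 11.0]]
print(f"ICC(2,1) = {icc_2_1(data):.3f}")
```

    Reporting which ICC form was used, and its confidence interval, is exactly the detail the review found missing in most of the surveyed studies.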

  6. Control of the large renal vein in limited dissected space during laparoscopic nephrectomy: a simple and reliable method

    NARCIS (Netherlands)

    Kijvikai, Kittinut; Laguna, M. Pilar; de la Rosette, Jean

    2006-01-01

    We describe our technique for control of a large renal vein in the limited dissected space during laparoscopic nephrectomy. The technique is a simple, inexpensive and reliable method, especially for the ligation of a large, short renal vein.

  7. New Methods for Building-In and Improvement of Integrated Circuit Reliability

    NARCIS (Netherlands)

    van der Pol, J.A.; van der Pol, Jacob Antonius

    2000-01-01

    Over the past 30 years the reliability of semiconductor products has improved by a factor of 100, while at the same time the complexity of the circuits has increased by a factor of 10^5. This 7-decade reliability improvement has been realised by implementing a sophisticated reliability assurance system.

  8. Simple and Reliable Method to Estimate the Fingertip Static Coefficient of Friction in Precision Grip.

    Science.gov (United States)

    Barrea, Allan; Bulens, David Cordova; Lefevre, Philippe; Thonnard, Jean-Louis

    2016-01-01

    The static coefficient of friction (µ_static) plays an important role in dexterous object manipulation. The minimal normal force (i.e., grip force) needed to avoid dropping an object is determined by the tangential force at the fingertip-object contact and the frictional properties of the skin-object contact. Although frequently assumed to be constant for all levels of normal force (NF, the force normal to the contact), µ_static actually varies nonlinearly with NF and increases at low NF levels. No method has been available to measure the relationship between µ_static and NF easily. We therefore propose a new method allowing simple and reliable measurement of the fingertip µ_static at different NF levels, together with an algorithm for determining µ_static from measured forces and torques. Our method is based on active, back-and-forth movements of a subject's finger on the surface of a fixed six-axis force and torque sensor. µ_static is computed as the ratio of the tangential to the normal force at slip onset, and a negative power law captures its relationship to NF. Our method allows the continuous estimation of µ_static as a function of NF during dexterous manipulation, based on the relationship between µ_static and NF measured before manipulation.
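    A minimal sketch of the negative power law fit mentioned above: given (NF, µ_static) pairs measured at slip onset, the relationship µ_static = k · NF^(-a) is linear in log-log space and can be fitted by least squares. The data points are invented for illustration.

```python
import numpy as np

# (normal force [N], static friction coefficient at slip onset) pairs;
# values are illustrative, mimicking the rise of friction at low normal force.
nf = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
mu = np.array([1.80, 1.45, 1.20, 0.95, 0.80])

# Fit mu = k * NF**(-a) by least squares in log space:
# log(mu) = log(k) - a * log(NF).
slope, intercept = np.polyfit(np.log(nf), np.log(mu), 1)
k, a = np.exp(intercept), -slope
print(f"mu_static ≈ {k:.2f} * NF^(-{a:.2f})")

# Continuous estimate of mu_static at any NF seen during manipulation:
print(f"predicted mu_static at 3 N: {k * 3.0 ** (-a):.2f}")
```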

  9. Reliability Verification of DBE Environment Simulation Test Facility by using Statistics Method

    International Nuclear Information System (INIS)

    Jang, Kyung Nam; Kim, Jong Soeg; Jeong, Sun Chul; Kyung Heum

    2011-01-01

    In nuclear power plants, all safety-related equipment, including cables, that operates under a harsh environment should undergo equipment qualification (EQ) according to IEEE Std 323. There are three types of qualification methods: type testing, operating experience and analysis. In order to environmentally qualify safety-related equipment using the type testing method, rather than the analysis or operating experience methods, a representative sample of the equipment, including interfaces, should be subjected to a series of tests. Among these tests, the Design Basis Event (DBE) environment simulation test is the most important. The DBE simulation test is performed in a DBE simulation test chamber according to the postulated DBE conditions, including the specified high-energy line break (HELB), loss of coolant accident (LOCA), main steam line break (MSLB), etc., after thermal and radiation aging. Because most DBE conditions involve 100% humidity, high temperature steam should be used to trace the temperature and pressure of the DBE condition. During the DBE simulation test, if high temperature steam under high pressure is injected into the DBE test chamber, the temperature and pressure in the chamber rapidly increase above the target temperature. The temperature and pressure in the test chamber therefore keep fluctuating during the DBE simulation test around the target values. We should ensure the fairness and accuracy of test results by confirming the performance of the DBE environment simulation test facility. In this paper, a statistical method is used to verify the reliability of the DBE environment simulation test facility.

  10. A critical evaluation of deterministic methods in size optimisation of reliable and cost effective standalone hybrid renewable energy systems

    International Nuclear Information System (INIS)

    Maheri, Alireza

    2014-01-01

    The reliability of a hybrid renewable energy system (HRES) depends strongly on the various uncertainties affecting the amount of power produced by the system. In the design of systems subject to uncertainties, both deterministic and nondeterministic design approaches can be adopted. In a deterministic design approach, the designer accounts for the uncertainties indirectly, by applying safety factors: it is assumed that, by employing suitable safety factors and considering worst-case scenarios, reliable systems can be designed. In effect, the multi-objective optimisation problem with the two objectives of reliability and cost is reduced to a single-objective optimisation problem with cost as the only objective. In this paper the competence of deterministic design methods in the size optimisation of reliable standalone wind–PV–battery, wind–PV–diesel and wind–PV–battery–diesel configurations is examined. For each configuration, the optimal size of the system components minimising the system cost is first found deterministically for different values of the safety factors. Then, for each case, the effect of the safety factors on the reliability and the cost is investigated using a Monte Carlo simulation. In the reliability analysis, several reliability measures are considered, namely, unmet load, blackout durations (total, maximum and average) and mean time between failures. It is shown that the traditional means of incorporating uncertainties in deterministic designs, such as designing for an autonomy period and employing safety factors, have either little or unpredictable impact on the actual reliability of the designed wind–PV–battery configuration. In the case of the wind–PV–diesel and wind–PV–battery–diesel configurations it is shown that, while a high-enough margin of safety in sizing the diesel generator leads to reliable systems, the optimum value for this margin of safety leading to a
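    As a sketch of the paper's central comparison (deterministic sizing checked by Monte Carlo simulation), the toy model below scales a nominal battery size by a safety factor and then estimates the unmet-load fraction under uncertain daily PV yield; the energy-balance model and every number are assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)

def unmet_load_fraction(battery_kwh, days=365, trials=200):
    """Monte Carlo estimate of the unmet-load fraction of a standalone
    PV-battery system. The daily energy-balance model and all numbers are
    illustrative assumptions."""
    demand = 10.0                                     # kWh/day, assumed constant
    unmet = 0.0
    for _ in range(trials):
        pv = np.clip(rng.normal(12.0, 4.0, size=days), 0.0, None)  # daily PV yield
        soc = battery_kwh                             # start fully charged
        for gain in pv:
            soc = min(soc + gain - demand, battery_kwh)
            if soc < 0.0:                             # demand not fully served
                unmet += -soc
                soc = 0.0
    return unmet / (demand * days * trials)

# A deterministic design scales the nominal battery size by a safety factor;
# the simulation reveals how much actual reliability that choice buys.
for sf in (1.0, 1.5, 2.0):
    print(f"safety factor {sf:.1f}: unmet load ≈ {unmet_load_fraction(10.0 * sf):.3%}")
```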

  11. Methods for Calculating Frequency of Maintenance of Complex Information Security System Based on Dynamics of Its Reliability

    Science.gov (United States)

    Varlataya, S. K.; Evdokimov, V. E.; Urzov, A. Y.

    2017-11-01

    This article describes the process of calculating the reliability of a complex information security system (CISS), using the example of a technospheric security management model, as well as the ability to determine the frequency of its maintenance from the system reliability parameter, which allows one to assess man-made risks and to forecast natural and man-made emergencies. The relevance of this article is explained by the fact that CISS reliability is closely related to information security (IS) risks. Since reliability (or resiliency) is a probabilistic characteristic of the system showing the possibility of its failure (and, as a consequence, the emergence of threats to the protected information assets), it is seen as a component of the overall IS risk in the system. As is known, there is a certain acceptable level of IS risk assigned by experts for a particular information system; where reliability is a risk-forming factor, an acceptable risk level should be maintained by routine analysis of the condition of the CISS and its elements and by their timely service. The article presents a reliability parameter calculation for a CISS with a mixed type of element connection, and a formula for the dynamics of such system reliability is written. The chart of CISS reliability change is an S-shaped curve which can be divided into 3 periods: an almost invariably high level of reliability, a uniform reliability reduction, and an almost invariably low level of reliability. By setting the minimum acceptable level of reliability, the graph (or formula) can be used to determine the period of time during which the system meets requirements; ideally, this period should not be longer than the first period of the curve. Thus, the proposed method of calculating the CISS maintenance frequency helps to solve a voluminous and critical task of information asset risk management.
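    The article's reliability-dynamics formula is not given in the abstract; the sketch below assumes a logistic S-curve with the three periods described and solves numerically for the moment reliability first drops below an acceptable level, which bounds the maintenance interval. All parameter values are assumptions.

```python
import numpy as np

def reliability(t, r_hi=0.99, r_lo=0.10, t_mid=24.0, steepness=0.4):
    """Illustrative S-shaped reliability decay over time t (months).
    A logistic form is assumed here; the article's own formula is not
    given in the abstract."""
    return r_lo + (r_hi - r_lo) / (1.0 + np.exp(steepness * (t - t_mid)))

def months_until(threshold):
    """First month at which reliability drops below the acceptable level."""
    t = np.arange(0.0, 60.0, 0.1)
    below = np.nonzero(reliability(t) < threshold)[0]
    return t[below[0]] if below.size else None

# Maintenance should be scheduled no later than this point.
print(f"service due after ≈ {months_until(0.95):.1f} months")
```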

  12. Computing interval-valued reliability measures: application of optimal control methods

    DEFF Research Database (Denmark)

    Kozin, Igor; Krymsky, Victor

    2017-01-01

    The paper describes an approach to deriving interval-valued reliability measures given partial statistical information on the occurrence of failures. We apply methods of optimal control theory, in particular Pontryagin's principle of maximum, to solve the non-linear optimisation problem and derive the probabilistic interval-valued quantities of interest. It is proven that the optimisation problem can be translated into another problem statement that can be solved on the class of piecewise continuous probability density functions (pdfs). This class often consists of piecewise exponential pdfs, which appear as soon as the constraints include bounds on the failure rate of a component under consideration. Finding the number of switching points of the piecewise continuous pdfs, and their values, becomes the focus of the approach described in the paper. Examples are provided.

  13. Uncertainty analysis of nonlinear systems employing the first-order reliability method

    International Nuclear Information System (INIS)

    Choi, Chan Kyu; Yoo, Hong Hee

    2012-01-01

    In most mechanical systems, the properties of the system elements have uncertainties for several reasons. For example, mass, the stiffness coefficient of a spring, the damping coefficient of a damper and friction coefficients all have uncertain characteristics, and these element uncertainties have a direct effect on the uncertainty of the system performance. It is very important to estimate the performance uncertainty, since it is directly related to manufacturing yield and consumer satisfaction; for this reason, it should be estimated accurately and considered in the system design. In this paper, performance measures are defined for nonlinear vibration systems and the performance measure uncertainties are estimated employing the first-order reliability method (FORM). It was found that the FORM provides good results despite the nonlinear characteristics of the systems. The accuracy of the uncertainty analysis results obtained by the FORM is validated by comparison with results obtained by Monte Carlo simulation (MCS).
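    A minimal sketch of FORM via the standard Hasofer-Lind/Rackwitz-Fiessler iteration, applied to a linear limit state in standard normal space; the distributions are assumptions for illustration, not the paper's nonlinear vibration systems.

```python
import numpy as np
from scipy.stats import norm

def form_beta(g, grad_g, n_vars, tol=1e-8, max_iter=100):
    """Hasofer-Lind / Rackwitz-Fiessler iteration for the FORM reliability
    index; a generic textbook scheme, not the paper's specific formulation.
    g and grad_g are evaluated at points u in standard normal space."""
    u = np.zeros(n_vars)
    for _ in range(max_iter):
        grad = grad_g(u)
        u_new = (grad @ u - g(u)) / (grad @ grad) * grad
        if np.linalg.norm(u_new - u) < tol:
            return np.linalg.norm(u_new)
        u = u_new
    return np.linalg.norm(u)       # reliability index beta

# Limit state g = R - S with R ~ N(300, 30) and S ~ N(200, 25), rewritten in
# standard normal space (u1, u2); the distributions are assumed for illustration.
g = lambda u: (300.0 + 30.0 * u[0]) - (200.0 + 25.0 * u[1])
grad_g = lambda u: np.array([30.0, -25.0])

beta = form_beta(g, grad_g, n_vars=2)
print(f"beta = {beta:.3f}, pf = Phi(-beta) ≈ {norm.cdf(-beta):.2e}")
```

    For a nonlinear limit state the gradient changes at each iterate, but the update rule is the same; the FORM result can then be checked against MCS exactly as the paper does.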

  14. A novel reliable method of DNA extraction from olive oil suitable for molecular traceability.

    Science.gov (United States)

    Raieta, Katia; Muccillo, Livio; Colantuoni, Vittorio

    2015-04-01

    Extra virgin olive oil production has a worldwide economic impact. The use of this brand, however, is of great concern to institutions and private industry because of the increasing number of fraud and adulteration attempts on market products. Here, we present a novel, reliable and inexpensive method for extracting DNA from commercial virgin and extra virgin olive oils. The DNA is stable over time and amenable to molecular analyses; indeed, by carrying out simple sequence repeat (SSR) marker analysis, we characterise the genetic profile of monovarietal olive oils. By comparing the oil-derived pattern with that of the corresponding tree, we can unambiguously identify four cultivars from Samnium, a region of Southern Italy, and distinguish them from reference and more widely used varieties. Through a statistical parentage analysis, we also identify the putative pollinators, establishing an unprecedented and powerful tool for olive oil traceability. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. A reliability design method for a lithium-ion battery pack considering the thermal disequilibrium in electric vehicles

    Science.gov (United States)

    Xia, Quan; Wang, Zili; Ren, Yi; Sun, Bo; Yang, Dezhen; Feng, Qiang

    2018-05-01

    With the rapid development of lithium-ion battery technology in the electric vehicle (EV) industry, the lifetime of the battery cell has increased substantially; the reliability of the battery pack, however, is still inadequate. Because of the complexity of the battery pack, a reliability design method for a lithium-ion battery pack considering thermal disequilibrium is proposed in this paper, based on cell redundancy. Building on this method, a three-dimensional electric-thermal-flow-coupled model, a stochastic degradation model of cells under field dynamic conditions, and a multi-state system reliability model of the battery pack are established. The relationships between the multi-physics coupling model, the degradation model and the system reliability model are first constructed to analyze the reliability of the battery pack, followed by analysis examples with different redundancy strategies. By comparing the reliability of battery packs with different numbers and configurations of redundant cells, several conclusions on the redundancy strategy are obtained. Most notably, the reliability does not increase monotonically with the number of redundant cells, because of the thermal disequilibrium effects. In this work, the 6 × 5 parallel-series configuration is the optimal system structure. In addition, the effects of the cell arrangement and the cooling conditions are investigated.
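    The paper's multi-state model couples degradation with the thermal field; as a first-order contrast, the classical independent-cells calculation below treats each parallel group as a k-out-of-n subsystem and chains the groups in series. The cell reliabilities and the 4-of-5 demand are illustrative assumptions.

```python
from math import comb

def group_reliability(p_cell, n_parallel, k_required):
    """At least k of n cells in a parallel group must work (binomial model,
    assuming independent identical cells, an idealization that ignores the
    thermal coupling the paper explicitly models)."""
    return sum(comb(n_parallel, j) * p_cell**j * (1.0 - p_cell)**(n_parallel - j)
               for j in range(k_required, n_parallel + 1))

def pack_reliability(p_cell, n_series, n_parallel, k_required):
    """Series chain of parallel groups: every group must meet its demand."""
    return group_reliability(p_cell, n_parallel, k_required) ** n_series

# 6 series groups of 5 parallel cells, each group needing 4 working cells
# (one redundant cell per group); p_cell = cell reliability at end of mission.
for p in (0.90, 0.95, 0.99):
    print(f"p_cell = {p:.2f}: pack reliability = {pack_reliability(p, 6, 5, 4):.4f}")
```

    Under the independence assumption, adding redundant cells always helps; the paper's finding that it does not, once thermal disequilibrium is modelled, is precisely what distinguishes its coupled approach from this classical calculation.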

  16. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. The book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion of failure rate in software reliability growth models.

  17. Bioassay battery interlaboratory investigation of emerging contaminants in spiked water extracts - Towards the implementation of bioanalytical monitoring tools in water quality assessment and monitoring.

    Science.gov (United States)

    Di Paolo, Carolina; Ottermanns, Richard; Keiter, Steffen; Ait-Aissa, Selim; Bluhm, Kerstin; Brack, Werner; Breitholtz, Magnus; Buchinger, Sebastian; Carere, Mario; Chalon, Carole; Cousin, Xavier; Dulio, Valeria; Escher, Beate I; Hamers, Timo; Hilscherová, Klára; Jarque, Sergio; Jonas, Adam; Maillot-Marechal, Emmanuelle; Marneffe, Yves; Nguyen, Mai Thao; Pandard, Pascal; Schifferli, Andrea; Schulze, Tobias; Seidensticker, Sven; Seiler, Thomas-Benjamin; Tang, Janet; van der Oost, Ron; Vermeirssen, Etienne; Zounková, Radka; Zwart, Nick; Hollert, Henner

    2016-11-01

    equivalency factors reliably reflected the sample content. In the Ames assay, strong revertant induction occurred following 3-NBA spike incubation with the TA98 strain; the induction was of lower magnitude after metabolic transformation and when compared to TA100. Differences in experimental protocols, model organisms, and data analysis can be sources of variation, indicating that the respective harmonized standard procedures should be followed when implementing bioassays in water monitoring. Together with other ongoing activities for the validation of a basic bioassay battery, the present study is an important step towards the implementation of bioanalytical monitoring tools in water quality assessment and monitoring. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    Science.gov (United States)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis while accounting for the associated uncertainties in the analysis parameters. However, FORM cannot be used directly in numerical slope stability evaluations, as it requires the definition of an explicit limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, in which an explicit performance function is fitted to the results of numerical simulations so that FORM can then be applied. The implementation of the proposed methodology is demonstrated on a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in representing the true limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of a 24% error in accuracy.
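
    To make the FORM step concrete, the sketch below runs the standard Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration on a hypothetical quadratic response surface of the kind such a methodology would fit to numerical slope-stability runs. The polynomial coefficients and the assumption of independent standard normal variables are illustrative, not values taken from the study above.

```python
import numpy as np
from scipy.stats import norm

def grad(g, u, h=1e-6):
    """Central finite-difference gradient of g at u."""
    return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

def form_hlrf(g, n_dim, tol=1e-8, max_iter=100):
    """HL-RF iteration in standard normal space.

    Returns the reliability index beta and the first-order failure
    probability Pf = Phi(-beta).
    """
    u = np.zeros(n_dim)                      # start at the mean point
    for _ in range(max_iter):
        gval, gvec = g(u), grad(g, u)
        u_new = (gvec @ u - gval) * gvec / (gvec @ gvec)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    return beta, norm.cdf(-beta)

# Hypothetical quadratic response surface fitted to numerical slope runs
g = lambda u: 2.5 - u[0] - 0.6 * u[1] + 0.05 * u[0] ** 2
beta, pf = form_hlrf(g, 2)
print(f"beta = {beta:.3f}, Pf = {pf:.2e}")
```

    For a linear performance function the iteration recovers the exact reliability index in a single step, which makes it a convenient self-check before applying the routine to a fitted surrogate.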

  19. GPUs, a new tool of acceleration in CFD: efficiency and reliability on smoothed particle hydrodynamics methods.

    Directory of Open Access Journals (Sweden)

    Alejandro C Crespo

    Full Text Available Smoothed Particle Hydrodynamics (SPH) is a numerical method commonly used in Computational Fluid Dynamics (CFD) to simulate complex free-surface flows. Simulations with this mesh-free particle method far exceed the capacity of a single processor. In this paper, as part of a dual-functioning code for either central processing units (CPUs) or Graphics Processor Units (GPUs), a parallelisation using GPUs is presented. The GPU parallelisation technique uses the Compute Unified Device Architecture (CUDA) of nVidia devices. Simulations with more than one million particles on a single GPU card exhibit speedups of up to two orders of magnitude over using a single-core CPU. It is demonstrated that the code achieves different speedups with different CUDA-enabled GPUs. The numerical behaviour of the SPH code is validated with a standard benchmark test case of dam-break flow impacting on an obstacle, where good agreement with the experimental results is observed. Both the achieved speedups and the quantitative agreement with experiments suggest that CUDA-based GPU programming can be used in SPH methods with efficiency and reliability.
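
    As a point of reference for what each GPU thread computes in such codes, the sketch below evaluates an SPH density summation with the Monaghan cubic-spline kernel in serial NumPy. The kernel choice, particle count and smoothing length are illustrative assumptions rather than details of the code described above; a CUDA version would assign the per-particle sum to one thread each.

```python
import numpy as np

def cubic_spline_W(r, h):
    """Monaghan cubic-spline kernel in 3D (compact support radius 2h)."""
    q = r / h
    sigma = 1.0 / (np.pi * h ** 3)
    w = np.where(q <= 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
        np.where(q <= 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def density_summation(pos, mass, h):
    """Naive O(N^2) SPH density: rho_i = sum_j m_j W(|r_i - r_j|, h).

    The inner per-particle sum is what one GPU thread per particle
    would compute in a CUDA implementation."""
    diff = pos[:, None, :] - pos[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (mass[None, :] * cubic_spline_W(r, h)).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.random((200, 3))                      # 200 particles in a unit box
rho = density_summation(pos, np.full(200, 1.0 / 200), h=0.1)
```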

  20. A human reliability assessment screening method for the NRU upgrade project

    International Nuclear Information System (INIS)

    Bremner, F.M.; Alsop, C.J.

    1997-01-01

    The National Research Universal (NRU) reactor is a 130 MW, low pressure, heavy water cooled and moderated research reactor. The reactor is used for research, both in support of Canada's CANDU development program, and for a wide variety of other research applications. In addition, NRU plays an important part in the production of medical isotopes, e.g., generating 80% of worldwide supplies of Molybdenum-99. NRU is owned and operated by Atomic Energy of Canada Ltd. (AECL), and is currently undergoing upgrading as part of AECL's continuing commitment to operate their facilities in a safe manner. As part of these upgrades both deterministic and probabilistic safety assessments are being carried out. It was recognized that the assignment of Human Error Probabilities (HEPs) is an important part of the Probabilistic Safety Assessment (PSA) studies, particularly for a facility whose design predates modern ergonomic practices, and which will undergo a series of backfitted modifications whilst continuing to operate. A simple Human Reliability Assessment (HRA) screening method, looking at both pre- and post-accident errors, was used in the initial safety studies. However, following review of this method within AECL and externally by the regulator, it was judged that benefits could be gained for future error reduction by including additional features, as later described in this document. The HRA development project consisted of several stages: needs analysis, literature review, development of method (including testing and evaluation), and implementation. This paper discusses each of these stages in further detail. (author)

  1. Diagnosing developmental dyscalculia on the basis of reliable single case FMRI methods: promises and limitations.

    Directory of Open Access Journals (Sweden)

    Philipp Johannes Dinkel

    Full Text Available FMRI-studies are mostly based on a group study approach, either analyzing one group or comparing multiple groups, or on approaches that correlate brain activation with clinically relevant criteria or behavioral measures. In this study we investigate the potential of fMRI-techniques focusing on individual differences in brain activation within a test-retest reliability context. We employ a single-case analysis approach, which contrasts dyscalculic children with a control group of typically developing children. In a second step, support-vector machine analysis and cluster analysis techniques served to investigate similarities in multivariate brain activation patterns. Children were confronted with a non-symbolic number comparison and a non-symbolic exact calculation task during fMRI acquisition. Conventional second level group comparison analysis only showed small differences around the angular gyrus bilaterally and the left parieto-occipital sulcus. Analyses based on single-case statistical procedures revealed that developmental dyscalculia is characterized by individual differences predominantly in visual processing areas. Dyscalculic children seemed to compensate for relative under-activation in the primary visual cortex through an upregulation in higher visual areas. However, overlap in deviant activation was low for the dyscalculic children, indicating that developmental dyscalculia is a disorder characterized by heterogeneous brain activation differences. Using support vector machine analysis and cluster analysis, we tried to group dyscalculic and typically developing children according to brain activation. Fronto-parietal systems seem to qualify for a distinction between the two groups. However, this was only effective when reliable brain activations of both tasks were employed simultaneously. Results suggest that deficits in number representation in the visual-parietal cortex get compensated for through finger-related aspects of number

  2. Diagnosing developmental dyscalculia on the basis of reliable single case FMRI methods: promises and limitations.

    Science.gov (United States)

    Dinkel, Philipp Johannes; Willmes, Klaus; Krinzinger, Helga; Konrad, Kerstin; Koten, Jan Willem

    2013-01-01

    FMRI-studies are mostly based on a group study approach, either analyzing one group or comparing multiple groups, or on approaches that correlate brain activation with clinically relevant criteria or behavioral measures. In this study we investigate the potential of fMRI-techniques focusing on individual differences in brain activation within a test-retest reliability context. We employ a single-case analysis approach, which contrasts dyscalculic children with a control group of typically developing children. In a second step, support-vector machine analysis and cluster analysis techniques served to investigate similarities in multivariate brain activation patterns. Children were confronted with a non-symbolic number comparison and a non-symbolic exact calculation task during fMRI acquisition. Conventional second level group comparison analysis only showed small differences around the angular gyrus bilaterally and the left parieto-occipital sulcus. Analyses based on single-case statistical procedures revealed that developmental dyscalculia is characterized by individual differences predominantly in visual processing areas. Dyscalculic children seemed to compensate for relative under-activation in the primary visual cortex through an upregulation in higher visual areas. However, overlap in deviant activation was low for the dyscalculic children, indicating that developmental dyscalculia is a disorder characterized by heterogeneous brain activation differences. Using support vector machine analysis and cluster analysis, we tried to group dyscalculic and typically developing children according to brain activation. Fronto-parietal systems seem to qualify for a distinction between the two groups. However, this was only effective when reliable brain activations of both tasks were employed simultaneously. Results suggest that deficits in number representation in the visual-parietal cortex get compensated for through finger-related aspects of number representation in

  3. A Method for Improving Reliability of Radiation Detection using Deep Learning Framework

    International Nuclear Information System (INIS)

    Chang, Hojong; Kim, Tae-Ho; Han, Byunghun; Kim, Hyunduk; Kim, Ki-duk

    2017-01-01

    Radiation detection is an essential technology for the overall field of radiation and nuclear engineering. Previously, radiation detection relied on tables mapping input spectra to output spectra prepared in advance, which requires simulating numerous predicted output spectra using parameters that model the spectrum. In this paper, we propose a new technique to improve the performance of radiation detectors. The software in radiation detectors has stagnated for some time and carries possible intrinsic simulation errors. In the proposed method, a deep neural network is used to predict the input source from the output spectrum measured by the radiation detector. With a highly complex model, we expect that the complex patterns relating the data to the labels can be captured well. Furthermore, a radiation detector should be calibrated regularly and in advance; we therefore also propose a method to calibrate radiation detectors using a generative adversarial network (GAN). We hope that the power of deep learning will reach radiation detectors as well and bring major improvements to the field. With an improved radiation detector, the reliability of detection would be more dependable, and many tasks remain to be solved using deep learning in the nuclear engineering community.
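
    As an illustration of the kind of model the abstract gestures at, the sketch below trains a small PyTorch multilayer perceptron to predict the input source from a measured output spectrum. The channel count, number of candidate sources, network shape and the random stand-in data are all assumptions, since the abstract specifies no architecture.

```python
import torch
import torch.nn as nn

# Hypothetical setup: 1024-channel measured spectra, 8 candidate sources.
N_CHANNELS, N_SOURCES = 1024, 8

model = nn.Sequential(
    nn.Linear(N_CHANNELS, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),
    nn.Linear(64, N_SOURCES),            # logits over candidate sources
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in training data: random spectra and labels. In practice these
# would be simulated detector responses paired with their true sources.
x = torch.rand(512, N_CHANNELS)
y = torch.randint(0, N_SOURCES, (512,))

for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```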

  4. Development of a Method for Quantifying the Reliability of Nuclear Safety-Related Software

    International Nuclear Information System (INIS)

    Yi Zhang; Golay, Michael W.

    2003-01-01

    The work of our project is intended to help introduce digital technologies into nuclear power plant safety-related software applications. In our project we utilize a combination of modern software engineering methods: design process discipline and feedback, formal methods, automated computer-aided software engineering tools, automatic code generation, and extensive feasible structure flow path testing to improve software quality. The tactics include ensuring that the software structure is kept simple, permitting routine testing during design development, permitting extensive finished-product testing in the input data space of most likely service, and using test-based Bayesian updating to estimate the probability that a random software input will encounter an error upon execution. From the results obtained, the software reliability can be both improved and its value estimated. Hopefully our success in the project's work can aid the transition of the nuclear enterprise into the modern information world. In our work, we have been using the proprietary sample software, the digital Signal Validation Algorithm (SVA), provided by Westinghouse, and our work is being done with their collaboration. The SVA software is used for selecting the plant instrumentation signal set which is to be used as the input to the digital Plant Protection System (PPS). This is the system that automatically decides whether to trip the reactor. In our work, we are using the 001 computer-assisted software engineering (CASE) tool of Hamilton Technologies Inc. This tool is capable of stating the syntactic structure of a program reflecting its state requirements, logical functions and data structure

  5. Modified Inverse First Order Reliability Method (I-FORM) for Predicting Extreme Sea States.

    Energy Technology Data Exchange (ETDEWEB)

    Eckert-Gallup, Aubrey Celia; Sallaberry, Cedric Jean-Marie; Dallman, Ann Renee; Neary, Vincent Sinclair

    2014-09-01

    Environmental contours describing extreme sea states are generated as the input for numerical or physical model simulations as part of the standard current practice for designing marine structures to survive extreme sea states. Such environmental contours are characterized by combinations of significant wave height (Hs) and energy period (Te) values calculated for a given recurrence interval using a set of data based on hindcast simulations or buoy observations over a sufficient period of record. The use of the inverse first-order reliability method (IFORM) is standard design practice for generating environmental contours. In this paper, the traditional application of the IFORM to generating environmental contours representing extreme sea states is described in detail and its merits and drawbacks are assessed. The application of additional methods for analyzing sea state data, including the use of principal component analysis (PCA) to create an uncorrelated representation of the data under consideration, is proposed. A reexamination of the components of the IFORM application to the problem at hand, including the use of new distribution fitting techniques, is shown to contribute to the development of more accurate and reasonable representations of extreme sea states for use in survivability analysis for marine structures. Keywords: Inverse FORM, Principal Component Analysis, Environmental Contours, Extreme Sea State Characterization, Wave Energy Converters
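
    A minimal sketch of the traditional IFORM contour construction follows: points on a circle of radius beta in standard normal space are mapped through assumed marginal and conditional sea-state distributions. The Weibull marginal for Hs, the conditional lognormal for Te and every parameter value are hypothetical placeholders for distributions one would actually fit to hindcast data.

```python
import numpy as np
from scipy.stats import norm, weibull_min

# 50-year return period, sea states recorded every 3 h (assumed discretization)
T_RETURN_YR, STATES_PER_YR = 50, 365.25 * 8
beta = norm.ppf(1.0 - 1.0 / (T_RETURN_YR * STATES_PER_YR))

theta = np.linspace(0.0, 2.0 * np.pi, 360)
u1, u2 = beta * np.cos(theta), beta * np.sin(theta)

# Assumed Weibull marginal for Hs (shape and scale are hypothetical)
hs = weibull_min.ppf(norm.cdf(u1), c=1.5, scale=2.0)

# Assumed conditional lognormal for Te given Hs (coefficients hypothetical)
mu_t, sig_t = 1.0 + 0.4 * np.log(hs + 1.0), 0.2
te = np.exp(mu_t + sig_t * u2)
# (hs, te) now traces the 50-year environmental contour
```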

  6. Reliability Analysis of Corroded Reinforced Concrete Beams Using Enhanced HL-RF Method

    Directory of Open Access Journals (Sweden)

    Arash Mohammadi Farsani

    2015-12-01

    Full Text Available Steel corrosion of bars in concrete structures is a complex process which leads to reduction of the bar cross-sections and decreased resistance of the concrete and steel materials. In this study, reliability analysis of a reinforced concrete beam with corrosion defects under a distributed load was carried out using the enhanced Hasofer-Lind and Rackwitz-Fiessler (EHL-RF) method based on a relaxed approach. The robustness of the EHL-RF algorithm was compared with that of the HL-RF algorithm using a complicated example, and the EHL-RF algorithm was found to be more robust. Finally, the effects of corrosion time were investigated using the EHL-RF algorithm for a reinforced concrete beam based on flexural strength under pitting and general corrosion. Model uncertainties were considered in the resistance and load terms of the flexural strength limit state function. The results illustrate that increasing the corrosion time-period leads to an increase in the failure probability of the corroded concrete beam.

  7. A reliable method for intracranial electrode implantation and chronic electrical stimulation in the mouse brain.

    Science.gov (United States)

    Jeffrey, Melanie; Lang, Min; Gane, Jonathan; Wu, Chiping; Burnham, W McIntyre; Zhang, Liang

    2013-08-06

    Electrical stimulation of brain structures has been widely used in rodent models for kindling or for modeling the deep brain stimulation used clinically. This requires surgical implantation of intracranial electrodes and subsequent chronic stimulation in individual animals for several weeks. Anchoring screws and dental acrylic have long been used to secure implanted intracranial electrodes in rats. However, such an approach is limited when carried out in mouse models, as the thin mouse skull may not be strong enough to accommodate the anchoring screws. We describe here a screw-free, glue-based method for implanting bipolar stimulating electrodes in the mouse brain and validate this method in a mouse model of hippocampal electrical kindling. Male C57 black mice (initial ages of 6-8 months) were used in the present experiments. Bipolar electrodes were implanted bilaterally in the hippocampal CA3 area for electrical stimulation and electroencephalographic recordings. The electrodes were secured onto the skull via glue and dental acrylic, without anchoring screws. A daily stimulation protocol was used to induce electrographic discharges and motor seizures. The locations of implanted electrodes were verified by hippocampal electrographic activities and later histological assessments. Using the glue-based implantation method, we implanted bilateral bipolar electrodes in 25 mice. Electrographic discharges and motor seizures were successfully induced via hippocampal electrical kindling. Importantly, no animal encountered infection in the implanted area or loss of implanted electrodes after 4-6 months of repetitive stimulation/recording. We suggest that the glue-based, screw-free method is reliable for chronic brain stimulation and high-quality electroencephalographic recordings in mice. The technical aspects described in this study may help future studies in mouse models.

  8. Bridging Human Reliability Analysis and Psychology, Part 1: The Psychological Literature Review for the IDHEAS Method

    Energy Technology Data Exchange (ETDEWEB)

    April M. Whaley; Stacey M. L. Hendrickson; Ronald L. Boring; Jeffrey C. Joe; Katya L. Le Blanc; Jing Xing

    2012-06-01

    In response to Staff Requirements Memorandum (SRM) SRM-M061020, the U.S. Nuclear Regulatory Commission (NRC) is sponsoring work to update the technical basis underlying human reliability analysis (HRA) in an effort to improve the robustness of HRA. The ultimate goal of this work is to develop a hybrid of existing methods addressing limitations of current HRA models and in particular issues related to intra- and inter-method variabilities and results. This hybrid method is now known as the Integrated Decision-tree Human Event Analysis System (IDHEAS). Existing HRA methods have looked at elements of the psychological literature, but there has not previously been a systematic attempt to translate the complete span of cognition from perception to action into mechanisms that can inform HRA. Therefore, a first step of this effort was to perform a literature search of psychology, cognition, behavioral science, teamwork, and operating performance to incorporate current understanding of human performance in operating environments, thus affording an improved technical foundation for HRA. However, this literature review went one step further by mining the literature findings to establish causal relationships and explicit links between the different types of human failures, performance drivers and associated performance measures ultimately used for quantification. This is the first of two papers that detail the literature review (paper 1) and its product (paper 2). This paper describes the literature review and the high-level architecture used to organize the literature review, and the second paper (Whaley, Hendrickson, Boring, & Xing, these proceedings) describes the resultant cognitive framework.

  9. Reliability of a new method for measuring coronal trunk imbalance, the axis-line-angle technique.

    Science.gov (United States)

    Zhang, Rui-Fang; Liu, Kun; Wang, Xue; Liu, Qian; He, Jia-Wei; Wang, Xiang-Yang; Yan, Zhi-Han

    2015-12-01

    Accurate determination of the extent of trunk imbalance in the coronal plane plays a key role in the evaluation of patients with trunk imbalance, such as patients with adolescent idiopathic scoliosis. An established, widely used practice in evaluating trunk imbalance is to drop a plumb line from the C7 vertebra to a key reference axis, the central sacral vertical line (CSVL), in full-spine standing anterioposterior radiographs and to measure the distance between them, the C7-CSVL. However, measuring the CSVL is subject to intraobserver differences, is error-prone, and is of poor reliability. Therefore, the development of a different way to measure trunk imbalance is needed. This study aimed to describe a new method to measure coronal trunk imbalance, the axis-line-angle technique (ALAT), which measures the angle at the intersection between the C7 plumb line and an axis line drawn from the vertebral centroid of C7 to the middle of the superior border of the symphysis pubis, and to compare the reliability of the ALAT with that of the C7-CSVL. A prospective study at a university hospital was used. The patient sample consisted of sixty-nine consecutively enrolled male and female patients, aged 10-18 years, who had trunk imbalance defined as a C7-CSVL longer than 20 mm on computed full-spine standing anterioposterior radiographs. Using a picture archiving and communication system, three radiologists independently evaluated trunk imbalance on the 69 computed radiographs by measuring the C7-CSVL and by measuring the angle determined by the ALAT. Data were analyzed to determine the correlations between the two measures of trunk imbalance and to determine the intraobserver and interobserver reliabilities of each of them. Overall results from the measurements by the C7-CSVL and the ALAT were significantly moderately correlated

  10. A reliable and economical method for gaining mouse embryonic fibroblasts capable of preparing feeder layers.

    Science.gov (United States)

    Jiang, Guangming; Wan, Xiaoju; Wang, Ming; Zhou, Jianhua; Pan, Jian; Wang, Baolong

    2016-08-01

    Mouse embryonic fibroblasts (MEFs) are widely used to prepare feeder layers for culturing embryonic stem cells (ESCs) or induced pluripotent stem cells (iPSCs) in vitro. Damage during transportation and exorbitant prices make commercially obtained MEFs unsuitable for long-term research. The aim of the present study was to establish a method that enables researchers to obtain MEFs from mice and prepare feeder layers themselves in ordinary laboratories. MEFs were isolated from ICR mouse embryos at 12.5-17.5 days post-coitum (DPC) and cultured in vitro. At P2-P7, the cells were inactivated with mitomycin C or by X-ray irradiation and then used to prepare feeder layers. The key factors of the whole protocol were analyzed to determine the optimal conditions for the method. The results revealed that MEFs isolated at 12.5-13.5 DPC and cultured to P3 were the best choice for feeder preparation, while P2 and P4-P5 MEFs were also suitable for the purpose. P3-P5 MEFs treated with 10 μg/ml of mitomycin C for 3 h, or irradiated with X-rays at 1.5 Gy/min to a total dose of 25 Gy, were the most suitable feeder cells. Treating MEFs with 10 μg/ml of mitomycin C for 2.5 h or with 15 μg/ml for 2.0 h, or irradiating the cells with 20 Gy of X-rays at 2.0 Gy/min, could all serve as alternative methods for P3-P4 cells. Our study provides a reliable and economical way to obtain large amounts of qualified MEFs for long-term research on ESCs or iPSCs.

  11. Noninvasive Hemoglobin Monitoring: A Rapid, Reliable, and Cost-Effective Method Following Total Joint Replacement.

    Science.gov (United States)

    Martin, J Ryan; Camp, Christopher L; Stitz, Amber; Young, Ernest Y; Abdel, Matthew P; Taunton, Michael J; Trousdale, Robert T

    2016-03-02

    Noninvasive hemoglobin (nHgb) monitoring was initially introduced in the intensive care setting as a means of rapidly assessing Hgb values without performing a blood draw. We conducted a prospective analysis to compare reliability, cost, and patient preference between nHgb monitoring and invasive Hgb (iHgb) monitoring performed via a traditional blood draw. We enrolled 100 consecutive patients undergoing primary or revision total hip or total knee arthroplasty. On postoperative day 1, nHgb and iHgb values were obtained within thirty minutes of one another. iHgb and nHgb values, cost, patient satisfaction, and the duration of time required to obtain each reading were recorded. The concordance correlation coefficient (CCC) was utilized to evaluate the agreement of the two Hgb measurement methods. Paired t tests and Wilcoxon signed-rank tests were utilized to compare mean Hgb values, time, and pain for all readings. The mean Hgb values did not differ significantly between the two measurement methods: the mean iHgb value (and standard deviation) was 11.3 ± 1.4 g/dL (range, 8.2 to 14.3 g/dL), and the mean nHgb value was 11.5 ± 1.8 g/dL (range, 7.0 to 16.0 g/dL) (p = 0.11). The CCC between the two Hgb methods was 0.69. One hundred percent of the patients with an nHgb value of ≥ 10.5 g/dL had an iHgb value of >8.0 g/dL. The mean time to obtain an Hgb value was 0.9 minute for the nHgb method and 51.1 minutes for the iHgb method; in addition, a savings of $26 per Hgb assessment results when the noninvasive method is used. Noninvasive Hgb monitoring was found to be more efficient, less expensive, and preferred by patients compared with iHgb monitoring. Providers could consider screening total joint arthroplasty patients with nHgb monitoring and only order iHgb measurement if the nHgb value falls below the 10.5 g/dL screening threshold; had this protocol been applied to the first blood draw in our 100 patients, approximately $2000 would have been saved. Extrapolated to the U.S. total joint arthroplasty practice

  12. Packaging of silicon sensors for microfluidic bio-analytical applications

    International Nuclear Information System (INIS)

    Wimberger-Friedl, Reinhold; Prins, Menno; Megens, Mischa; Dittmer, Wendy; Witz, Christiane de; Nellissen, Ton; Weekamp, Wim; Delft, Jan van; Ansems, Will; Iersel, Ben van

    2009-01-01

    A new industrial concept is presented for packaging biosensor chips in disposable microfluidic cartridges to enable medical diagnostic applications. The inorganic electronic substrates, such as silicon or glass, are integrated in a polymer package which provides the electrical and fluidic interconnections to the world and provides mechanical strength and protection for out-of-lab use. The demonstrated prototype consists of a molded interconnection device (MID), a silicon-based giant magneto-resistive (GMR) biosensor chip, a flex and a polymer fluidic part with integrated tubing. The various processes are compatible with mass manufacturing and run at a high yield. The devices show a reliable electrical interconnection between the sensor chip and readout electronics during extended wet operation. Sandwich immunoassays were carried out in the cartridges with surface functionalized sensor chips. Biological response curves were determined for different concentrations of parathyroid hormone (PTH) on the packaged biosensor, which demonstrates the functionality and biocompatibility of the devices. The new packaging concept provides a platform for easy further integration of electrical and fluidic functions, as for instance required for integrated molecular diagnostic devices in cost-effective mass manufacturing

  13. Activity-Based Detection and Bioanalytical Confirmation of a Fatal Carfentanil Intoxication

    Directory of Open Access Journals (Sweden)

    Annelies Cannaert

    2018-05-01

    Full Text Available Carfentanil, one of the most potent opioids known, has recently been reported as a contaminant in street heroin in the United States and Europe, and is associated with an increased number of life-threatening emergency department admissions and deaths. Here, we report on the application of a novel in vitro opioid activity reporter assay and a sensitive bioanalytical assay in the context of a fatal carfentanil intoxication, revealing the highest carfentanil concentrations reported until now. A 21-year-old male was found dead at home with a note stating that he had taken carfentanil with suicidal intentions. A foil bag and plastic bag labeled “C.50” were found at the scene. These bags were similar to a sample obtained by the Belgian Early Warning System on Drugs from a German darknet shop and to those found in the context of a fatality in Norway. Blood, urine and vitreous, obtained during autopsy, were screened with a newly developed in vitro opioid activity reporter assay able to detect compounds based on their μ-opioid receptor activity rather than their chemical structure. All extracts showed strong opioid activity. Results were confirmed by a bioanalytical assay, which revealed extremely high concentrations for carfentanil and norcarfentanil. It should be noted that carfentanil concentrations are typically in pg/mL, but here they were 92 ng/mL in blood, 2.8 ng/mL in urine, and 23 ng/mL in vitreous. The blood and vitreous contained 0.532 and 0.300 ng/mL norcarfentanil, respectively. No norcarfentanil was detected in urine. This is the first report where a novel activity-based opioid screening assay was successfully deployed in a forensic case. Confirmation and quantification using a validated bioanalytical procedure revealed the, to our knowledge, highest carfentanil concentrations reported in humans so far.

  14. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
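
    The sketch below illustrates the flavor of such a scheme on the simplest special case: a closed batch-settling column with a Vesilind hindered-settling flux, advanced with a monotone Rusanov (local Lax-Friedrichs) flux and an explicit CFL-limited step. The constitutive parameters are hypothetical, and the compression, dispersion, feed-source and bulk-flow terms of the full consistent SST model are deliberately omitted.

```python
import numpy as np

v0, rh = 2.78e-3, 0.4                  # Vesilind parameters (hypothetical)

def f(C):
    """Hindered-settling flux f(C) = C * v(C) with v(C) = v0 * exp(-rh*C)."""
    return C * v0 * np.exp(-rh * C)

def rusanov(cl, cr, a):
    """Monotone local Lax-Friedrichs numerical flux at a cell interface."""
    return 0.5 * (f(cl) + f(cr)) - 0.5 * a * (cr - cl)

nz, L, T = 100, 4.0, 3600.0            # cells, column depth (m), horizon (s)
dz = L / nz
C = np.full(nz, 3.0)                   # uniform initial concentration (kg/m^3)
a = v0                                 # bound on |f'(C)| for this flux
dt = 0.4 * dz / a                      # CFL-limited explicit time step

t = 0.0
while t < T:
    Fi = rusanov(C[:-1], C[1:], a)             # interior interface fluxes
    F = np.concatenate(([0.0], Fi, [0.0]))     # closed top/bottom boundaries
    C -= dt / dz * (F[1:] - F[:-1])            # conservative update
    t += dt
```

    Because the boundary fluxes are zero and the update is conservative, the total sludge mass in the column is preserved to machine precision, which is one of the consistency checks the methodology emphasizes.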

  15. A reliable morphological method to assess the age of male Anopheles gambiae

    Directory of Open Access Journals (Sweden)

    Killeen Gerry F

    2006-07-01

    Full Text Available Abstract. Background: Release of genetically-modified (GM) or sterile male mosquitoes for malaria control is hampered by the inability to assess the age and mating history of free-living male Anopheles. Methods: Age- and mating-related changes in the reproductive system of male Anopheles gambiae were quantified and used to fit predictive statistical models. These models, based on numbers of spermatocysts, relative size of the sperm reservoir and presence/absence of a clear area around the accessory gland, were evaluated using an independent sample of mosquitoes whose status was blinded during the experiment. Results: The number of spermatocysts in male testes decreased with age, and the relative size of their sperm reservoir increased. The presence of a clear area around the accessory glands was also linked to age and mating status. A quantitative model was able to categorize males from the blind trial into age groups of young (≤ 4 days) and old (> 4 days) with an overall efficiency of 89%. Using the parameters of this model, a simple table was compiled that can be used to predict male age. In contrast, mating history could not be reliably assessed, as virgins could not be distinguished from mated males. Conclusion: Simple assessment of a few morphological traits which are easily collected in the field allows accurate age-grading of male An. gambiae. This simple, yet robust, model enables evaluation of demographic patterns and mortality in wild and released males in populations targeted by GM or sterile-male-based control programmes.

  16. Development of a reliability-analysis method for category I structures

    International Nuclear Information System (INIS)

    Shinozuka, M.; Kako, T.; Hwang, H.; Reich, M.

    1983-01-01

    The present paper develops a reliability analysis method for category I nuclear structures, particularly for reinforced concrete containment structures subjected to various load combinations. The loads considered here include dead loads, accidental internal pressure and earthquake ground acceleration. For mathematical tractability, earthquake occurrence is assumed to be governed by the Poisson arrival law, while the acceleration history is idealized as a Gaussian vector process of finite duration. The vector process consists of three component processes, each with zero mean. The second-order statistics of this process are specified by a three-by-three spectral density matrix with a multiplying factor representing the overall intensity of the ground acceleration. With respect to accidental internal pressure, the following assumptions are made: (a) it occurs in accordance with the Poisson law; (b) its intensity and duration are random; and (c) its temporal rise and fall behaviors are such that a quasi-static structural analysis applies. A dead load is considered to be a deterministic constant
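
    As a worked example of what the Poisson assumptions buy, the equations below give the probability of at least one load occurrence in a service life T, and a standard load-coincidence approximation (Wen's rule) for the rate at which two independent pulse-type loads overlap. These are textbook load-combination forms consistent with the stated assumptions, not necessarily the paper's exact formulation.

```latex
P\{N(T) \ge 1\} = 1 - e^{-\lambda T},
\qquad
\lambda_{EP} \;\approx\; \lambda_E \,\lambda_P\, \left(\mu_{d_E} + \mu_{d_P}\right)
```

    Here lambda_E and lambda_P are the occurrence rates of earthquakes and accidental pressurizations, and mu_{d_E}, mu_{d_P} are their mean durations; the coincidence rate lambda_EP governs how often the two transient loads must be combined in the reliability analysis.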

  17. Method and apparatus for a nuclear reactor for increasing reliability to scram control elements

    International Nuclear Information System (INIS)

    Bevilacqua, F.

    1976-01-01

    A description is given of a method and apparatus for increasing the reliability of linear drive devices of a nuclear reactor to scram the control elements held in a raised position thereby. Each of the plurality of linear drive devices includes a first type of holding means associated with the drive means of the linear drive device and a second type of holding means distinct and operatively dissimilar from the first type. The system of linear drive devices having both types of holding means are operated in such a manner that the control elements of a portion of the linear drive devices are only held in a raised position by the first holding means and the control elements of the remaining portion of linear drive devices are held in a raised position by only the second type of holding means. Since the two types of holding means are distinct from one another and are operatively dissimilar, the probability of failure of both systems to scram as a result of common mode failure will be minimized. Means may be provided to positively detect disengagement of the first type of holding means and engagement of the second type of holding means for those linear drive devices being operative to hold the control elements in a raised position with the second type of holding means

  18. Small metal soft tissue foreign body extraction by using 3D CT guidance: A reliable method

    International Nuclear Information System (INIS)

    Tao, Kai; Xu, Sen; Liu, Xiao-yan; Liang, Jiu-long; Qiu, Tao; Tan, Jia-nan; Che, Jian-hua; Wang, Zi-hua

    2012-01-01

    Objective: To introduce a useful and accurate technique for locating and removing small metal foreign bodies in the soft tissues. Methods: Eight patients presented with suspected small metal foreign bodies retained in the soft tissues of various body regions. Under local anesthesia, 3–6 needles from 5 ml or 1 ml syringes were introduced through three different planes around the entry point of the foreign bodies. Using these finders, the small metal FBs were confirmed under 3D CT guidance. Based on the CT findings, the soft tissues were dissected along the path of the closest needle and the FBs were easily found and removed based on their position relative to the closest needle finder. Results: Eight metal foreign bodies (3 slices, 3 nails, 1 fish hook, 1 needlepoint) were successfully removed under 3D CT guidance in all patients. The procedures took between 35 and 50 min, and the operations took between 15 and 25 min. No complications arose after the treatment. Conclusion: The 3D CT-guided technique is a good alternative for the removal of small metal foreign bodies retained in the soft tissues, as it is relatively accurate, reliable and quick, carries a low risk of complications, and can be a first-choice procedure for the extraction of small metal foreign bodies.

  19. Reliability evaluation of I-123 ADAM SPECT imaging using SPM software and AAL ROI methods

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Bang-Hung [Department of Biomedical Imaging and Radiological Sciences, National Yang-Ming University, Taipei, Taiwan (China); Department of Nuclear Medicine, Taipei Veterans General Hospital, Taiwan (China); Tsai, Sung-Yi [Department of Biomedical Imaging and Radiological Sciences, National Yang-Ming University, Taipei, Taiwan (China); Department of Imaging Medical, St.Martin De Porres Hospital, Chia-Yi, Taiwan (China); Wang, Shyh-Jen [Department of Biomedical Imaging and Radiological Sciences, National Yang-Ming University, Taipei, Taiwan (China); Department of Nuclear Medicine, Taipei Veterans General Hospital, Taiwan (China); Su, Tung-Ping; Chou, Yuan-Hwa [Department of Psychiatry, Taipei Veterans General Hospital, Taipei, Taiwan (China); Chen, Chia-Chieh [Institute of Nuclear Energy Research, Longtan, Taiwan (China); Chen, Jyh-Cheng, E-mail: jcchen@ym.edu.tw [Department of Biomedical Imaging and Radiological Sciences, National Yang-Ming University, Taipei, Taiwan (China)

    2011-08-21

    The level of serotonin is regulated by the serotonin transporter (SERT), a decisive protein in the regulation of the serotonin neurotransmission system. Many psychiatric disorders and therapies are also related to the concentration of cerebral serotonin. I-123 ADAM is a novel radiopharmaceutical for imaging SERT in the brain. The aim of this study was to measure the reliability of SERT densities in healthy volunteers by the automated anatomical labeling (AAL) method. Furthermore, we also used statistical parametric mapping (SPM) in a voxel-by-voxel analysis to find differences in the cortex between test and retest I-123 ADAM single photon emission computed tomography (SPECT) images. Twenty-one healthy volunteers were scanned twice with SPECT at 4 h after intravenous administration of 185 MBq of I-123 ADAM. The image matrix size was 128 × 128 and the pixel size was 3.9 mm. All images were obtained through a filtered back-projection (FBP) reconstruction algorithm. Region of interest (ROI) definition was performed based on the AAL brain template in the PMOD version 2.95 software package. ROI demarcations were placed on the midbrain, pons, striatum, and cerebellum. All images were spatially normalized to the SPECT MNI (Montreal Neurological Institute) templates supplied with SPM2, and each image was transformed into standard stereotactic space, matched to the Talairach and Tournoux atlas. Differences across scans were then statistically estimated on a voxel-by-voxel basis using a paired t-test (population main effect: 2 conditions, 1 scan/condition), which was applied to compare the concentration of SERT between the test and retest cerebral scans. The average specific uptake ratio (SUR: target/cerebellum - 1) of I-123 ADAM binding to SERT was 1.78 ± 0.27 in the midbrain, 1.21 ± 0.53 in the pons, and 0.79 ± 0.13 in the striatum. The Cronbach's α of the intra-class correlation coefficient (ICC) was 0.92. In addition, there was no significant statistical finding in the cerebral area using SPM2

  20. Surface Modification of Photoresist SU-8 for Low Autofluorescence and Bioanalytical Applications

    DEFF Research Database (Denmark)

    Cao, Cuong; Birtwell, Sam W.; Høgberg, Jonas

    2011-01-01

    This paper reports a surface modification of the epoxy-based negative photoresist SU-8 for reducing its autofluorescence while enhancing its biofunctionality. By covalently depositing a thin layer of 20 nm Au nanoparticles (AuNPs) onto the SU-8 surface, we found that the AuNPs-coated SU-8 surface is much less fluorescent than untreated SU-8. Moreover, DNA probes can easily be immobilized on the Au surface and are thermally stable over a wide range of temperatures. These improvements will benefit bioanalytical applications such as DNA hybridization and solid-phase PCR (SP-PCR).

  1. Reliable Viscosity Calculation from Equilibrium Molecular Dynamics Simulations: A Time Decomposition Method.

    Science.gov (United States)

    Zhang, Yong; Otani, Akihito; Maginn, Edward J

    2015-08-11

    Equilibrium molecular dynamics is often used in conjunction with a Green-Kubo integral of the pressure tensor autocorrelation function to compute the shear viscosity of fluids. This approach is computationally expensive and is subject to a large amount of variability because the plateau region of the Green-Kubo integral is difficult to identify unambiguously. Here, we propose a time decomposition approach for computing the shear viscosity using the Green-Kubo formalism. Instead of one long trajectory, multiple independent trajectories are run and the Green-Kubo relation is applied to each trajectory. The averaged running integral as a function of time is fit to a double-exponential function with a weighting function derived from the standard deviation of the running integrals. Such a weighting function minimizes the uncertainty of the estimated shear viscosity and provides an objective means of estimating the viscosity. While the formal Green-Kubo integral requires an integration to infinite time, we suggest an integration cutoff time tcut, which can be determined by the relative values of the running integral and the corresponding standard deviation. This approach for computing the shear viscosity can be easily automated and used in computational screening studies where human judgment and intervention in the data analysis are impractical. The method has been applied to the calculation of the shear viscosity of a relatively low-viscosity liquid, ethanol, and relatively high-viscosity ionic liquid, 1-n-butyl-3-methylimidazolium bis(trifluoromethane-sulfonyl)imide ([BMIM][Tf2N]), over a range of temperatures. These test cases show that the method is robust and yields reproducible and reliable shear viscosity values.
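
    A compact sketch of the time-decomposition recipe is given below: several independent running Green-Kubo integrals are averaged, the standard deviation across runs supplies the fitting weights, and a double-exponential plateau model yields the viscosity estimate. The synthetic stand-in data, the sigma-based cutoff heuristic and the exact parameterization of the plateau model are assumptions for illustration, not the paper's verbatim procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def running_gk(t, eta_inf, alpha, tau1, tau2):
    """Double-exponential model of the running Green-Kubo integral;
    plateaus at eta_inf as t -> infinity (one common parameterization)."""
    return eta_inf * (alpha * (1 - np.exp(-t / tau1))
                      + (1 - alpha) * (1 - np.exp(-t / tau2)))

# Stand-in for running integrals from 16 independent MD trajectories
# (replace with integrals of the pressure-tensor autocorrelation function).
rng = np.random.default_rng(1)
t = np.linspace(0.01, 10.0, 500)                       # time, e.g. ns
true = running_gk(t, 1.0, 0.6, 0.2, 2.0)
runs = true + rng.normal(0.0, 0.02 * np.sqrt(t), size=(16, t.size))

mean, std = runs.mean(axis=0), runs.std(axis=0, ddof=1)

# Simple cutoff heuristic: fit only where the run-to-run scatter stays
# below a fraction of the running average.
ok = std < 0.4 * mean
tcut = t[ok][-1] if np.any(ok) else t[-1]
mask = t <= tcut

popt, _ = curve_fit(running_gk, t[mask], mean[mask],
                    p0=[mean[-1], 0.5, 0.1, 1.0], sigma=std[mask])
eta = popt[0]                                          # estimated viscosity
print(f"eta = {eta:.3f} (true value in this synthetic test: 1.000)")
```

    Weighting by the standard deviation down-weights the long-time tail, where the running integral is noisiest, which is the central idea that makes the estimate reproducible enough for automated screening.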

  2. Development of a Method for Quantifying the Reliability of Nuclear Safety-Related Software

    Energy Technology Data Exchange (ETDEWEB)

    Yi Zhang; Michael W. Golay

    2003-10-01

    The work of our project is intended to help introduce digital technologies into nuclear power plant safety-related software applications. In our project we utilize a combination of modern software engineering methods: design process discipline and feedback, formal methods, automated computer aided software engineering tools, automatic code generation, and extensive feasible structure flow path testing to improve software quality. The tactics include ensuring that the software structure is kept simple, permitting routine testing during design development, permitting extensive finished-product testing in the input data space of most likely service, and using test-based Bayesian updating to estimate the probability that a random software input will encounter an error upon execution. From the results obtained, the software reliability can be both improved and its value estimated. Hopefully our success in the project's work can aid the transition of the nuclear enterprise into the modern information world. In our work, we have been using the proprietary sample software, the digital Signal Validation Algorithm (SVA), provided by Westinghouse, and our work is being done with their collaboration. The SVA software is used for selecting the plant instrumentation signal set which is to be used as the input to the digital Plant Protection System (PPS). This is the system that automatically decides whether to trip the reactor. In our work, we are using the 001 computer-assisted software engineering (CASE) tool of Hamilton Technologies Inc. This tool is capable of stating the syntactic structure of a program reflecting its state requirements, logical functions and data structure.

  3. Condition-based fault tree analysis (CBFTA): A new method for improved fault tree analysis (FTA), reliability and safety calculations

    International Nuclear Information System (INIS)

    Shalev, Dan M.; Tiran, Joseph

    2007-01-01

    Condition-based maintenance methods have changed systems reliability in general and that of individual systems in particular. Yet this change is not reflected in system reliability analysis. System fault tree analysis (FTA) is performed during the design phase, using component failure rates derived from available sources such as handbooks. Condition-based fault tree analysis (CBFTA) starts with the known FTA. Condition monitoring (CM) methods applied to systems (e.g. vibration analysis, oil analysis, electric current analysis, bearing CM, electric motor CM, and so forth) are used to determine updated failure rate values of sensitive components. The CBFTA method accepts the updated failure rates and applies them to the FTA. CBFTA periodically recalculates the top event (TE) failure rate (λTE), thus determining the probability of system failure and the probability of successful system operation, i.e. the system's reliability. FTA is a tool for enhancing system reliability during the design stages, but it has disadvantages, chiefly that it does not relate to a specific system undergoing maintenance. CBFTA is a tool for updating the reliability values of a specific system and for calculating the residual life according to the system's monitored conditions. Using CBFTA, the original FTA is upgraded into a practical tool for use during the system's field life phase, not just during the system design phase. This paper describes the CBFTA method, and its advantages are demonstrated by an example
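
    The mechanics of the recalculation are simple enough to show directly. The sketch below re-evaluates a tiny fault tree's top-event probability whenever condition monitoring updates a component failure rate; the tree structure, rates and mission time are hypothetical, component independence is assumed, and constant rates are converted to mission probabilities via p = 1 - exp(-lambda*t).

```python
import math

RATES = {"pump": 2e-5, "valve": 5e-6, "sensor": 1e-5}   # failures/hour

def p_fail(lmbda, t):
    """Probability of failure by time t for a constant failure rate."""
    return 1.0 - math.exp(-lmbda * t)

def top_event(rates, t=8760.0):
    """TE probability for: pump fails OR (valve AND sensor both fail)."""
    p = {c: p_fail(l, t) for c, l in rates.items()}
    p_and = p["valve"] * p["sensor"]                    # AND gate
    return 1.0 - (1.0 - p["pump"]) * (1.0 - p_and)      # OR gate

print(f"design-basis TE probability:      {top_event(RATES):.3e}")

# Vibration analysis flags pump degradation -> update its rate, recompute
RATES["pump"] = 8e-5
print(f"condition-updated TE probability: {top_event(RATES):.3e}")
```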

  4. Reliability determination of aluminium electrolytic capacitors by the mean of various methods application to the protection system of the LHC

    CERN Document Server

    Perisse, F; Rojat, G

    2004-01-01

    The lifetime of power electronic components is often calculated from reliability reports, but this method is open to question. In this article we compare the results of various reliability reports with an accelerated ageing test of the component, and introduce the load-strength concept. Large aluminium electrolytic capacitors are taken as an example, in the context of the protection system of the LHC (Large Hadron Collider) at CERN, where a high level of reliability is essential. We note important differences in MTBF (Mean Time Between Failures) depending on the reliability report used. The accelerated ageing tests carried out show that a Weibull law is better suited to determining component failure rates. The load-strength concept, associated with accelerated ageing tests, can be a solution for determining the lifetime of power electronic components.
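
    For concreteness, the sketch below evaluates a two-parameter Weibull reliability model of the kind such ageing tests would be fitted with. The shape and scale values are hypothetical stand-ins for fitted results; a shape parameter above 1 reflects the wear-out behavior typical of electrolytic capacitors.

```python
import math

# Hypothetical Weibull parameters, as if fitted to accelerated-ageing data
beta, eta = 4.2, 90_000.0          # shape (dimensionless), scale (hours)

def R(t):
    """Weibull survival function R(t) = exp(-(t/eta)^beta)."""
    return math.exp(-((t / eta) ** beta))

# Mean life of a non-repairable part: eta * Gamma(1 + 1/beta)
mtbf = eta * math.gamma(1.0 + 1.0 / beta)
print(f"MTBF = {mtbf:,.0f} h, R(50,000 h) = {R(50_000):.4f}")
```

    Unlike the constant-failure-rate assumption behind most handbook MTBF figures, the Weibull hazard rate here grows with time, which is why the two approaches can disagree so strongly for ageing-dominated parts.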

  5. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, Cristina [Mendeley, Broderna Ugglasgatan, Linkoping (Sweden); Derelov, Micael; Olvander, Johan [Linkoping University, IEI, Dept. of Machine Design, Linkoping (Sweden)

    2017-03-15

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications.

  6. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    International Nuclear Information System (INIS)

    Johansson, Cristina; Derelov, Micael; Olvander, Johan

    2017-01-01

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications

  7. A high efficiency, high quality and low cost internal regulated bioanalytical laboratory to support drug development needs.

    Science.gov (United States)

    Song, Yan; Dhodda, Raj; Zhang, Jun; Sydor, Jens

    2014-05-01

    In the recent past, we have seen an increase in the outsourcing of bioanalysis in pharmaceutical companies in support of their drug development pipeline. This trend is largely driven by the effort to reduce internal cost, especially in support of late-stage pipeline assets where established bioanalytical assays are used to analyze a large volume of samples. This article will highlight our perspective of how bioanalytical laboratories within pharmaceutical companies can be developed into the best partner in the advancement of drug development pipelines with high-quality support at competitive cost.

  8. Reliability Assessment Method of Reactor Protection System Software by Using V and V Based Bayesian Nets

    International Nuclear Information System (INIS)

    Eom, H. S.; Park, G. Y.; Kang, H. G.; Son, H. S.

    2010-07-01

    A methodology was developed that can be practically used in the quantitative reliability assessment of safety-critical software for a protection system of nuclear power plants. The basis of the proposed methodology is the V and V process used in the nuclear industry, which means that it is not tied to specific software development environments or to parameters that are necessary for the reliability calculation. The modular and formal sub-BNs in the proposed methodology are a useful tool for constituting the whole BN model for the reliability assessment of a target software. The proposed V and V based BN model estimates the defects in the software according to the performance of the V and V results and then calculates the reliability of the software. A case study was carried out to validate the proposed methodology. The target software is the RPS software, which was developed in the KNICS project

  9. A Method for The Assessing of Reliability Characteristics Relevant to an Assumed Position-Fixing Accuracy in Navigational Positioning Systems

    Directory of Open Access Journals (Sweden)

    Specht Cezary

    2016-09-01

    Full Text Available This paper presents a method which makes it possible to determine the reliability characteristics of navigational positioning systems relevant to an assumed value of permissible position-fixing error. The method allows calculation of the availability, reliability and operational continuity of a position-fixing system for an assumed position-fixing accuracy, determined on the basis of formal requirements, both worldwide and national. The proposed mathematical model makes it possible to verify that a navigational positioning system satisfies not only the position-fixing accuracy requirements of a given navigational application (for air, sea or land traffic), but also the remaining characteristics associated with the technical serviceability of the system.
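
    As a hedged illustration of the three characteristics in question, the standard exponential-failure textbook forms are shown below; they are not necessarily the paper's exact model, in which "failure" would mean the position error exceeding the permissible value for the given application, so MTBF and MTTR are estimated from the durations of accuracy-conforming and non-conforming intervals.

```latex
A = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}},
\qquad
R(t) = e^{-t/\mathrm{MTBF}},
\qquad
C(\tau) = \Pr\{\text{no failure in } (t, t+\tau] \mid \text{up at } t\}
        = e^{-\tau/\mathrm{MTBF}}
```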

  10. Reliability and validity in measurement of true humeral retroversion by a three-dimensional cylinder fitting method.

    Science.gov (United States)

    Saka, Masayuki; Yamauchi, Hiroki; Hoshi, Kenji; Yoshioka, Toru; Hamada, Hidetoshi; Gamada, Kazuyoshi

    2015-05-01

    Humeral retroversion is defined as the orientation of the humeral head relative to the distal humerus. Because none of the previous methods used to measure humeral retroversion strictly follow this definition, values obtained by these techniques vary and may be biased by morphologic variations of the humerus. The purpose of this study was 2-fold: to validate a method to define the axis of the distal humerus with a virtual cylinder and to establish the reliability of 3-dimensional (3D) measurement of humeral retroversion by this cylinder fitting method. Humeral retroversion in 14 baseball players (28 humeri) was measured by the 3D cylinder fitting method. The root mean square error was calculated to compare values obtained by a single tester and by 2 different testers using the embedded coordinate system. To establish the reliability, the intraclass correlation coefficient (ICC) and precision (standard error of measurement [SEM]) were calculated. The root mean square errors for the humeral coordinate system were small. Reliability and precision of the 3D measurement of retroversion yielded an intratester ICC of 0.99 (SEM, 1.0°) and an intertester ICC of 0.96 (SEM, 2.8°). The error in measurements obtained by the distal humerus cylinder fitting method was small enough not to affect retroversion measurement. The 3D measurement of retroversion by this method provides excellent intratester and intertester reliability. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  11. Feasibility to implement the radioisotopic method of nasal mucociliary transport measurement getting reliable results

    International Nuclear Information System (INIS)

    Troncoso, M.; Opazo, C.; Quilodran, C.; Lizama, V.

    2002-01-01

    Aim: Our goal was to implement the radioisotopic method of measuring the nasal mucociliary velocity of transport (NMVT) in a feasible way, in order to make it easily available and to validate the accuracy of the results. Such a method is needed when primary ciliary dyskinesia (PCD) is suspected, a disorder characterized by low NMVT and non-specific chronic respiratory symptoms that must be confirmed by electron microscopic cilia biopsy. Methods: We performed one hundred studies from February 2000 until February 2002 on patients aged 2 months to 39 years (mean, 9 years), all referred from the Respiratory Disease Department. Ninety had upper or lower respiratory symptoms; ten were healthy controls. The procedure, performed by the Nuclear Medicine Technologist, consists of placing a 20 μl drop of 99mTc-MAA (0.1 mCi, 4 MBq) behind the head of the inferior turbinate in one nostril using a frontal light, a nasal speculum and a teflon catheter attached to a tuberculin syringe. The drop movement was acquired in a gamma camera-computer system and the velocity was expressed in mm/min. As the patient must not move during the procedure, sedation has to be used in non-cooperative children. Cases with abnormal NMVT values were referred for nasal biopsy. Patients were classified into three groups: normal controls (NC), PCD confirmed by biopsy (PCDB), and cases with respiratory symptoms without biopsy (RSNB). In all patients with NMVT less than 2.4 mm/min, PCD was confirmed by biopsy. There was a clear-cut separation between normal and abnormal values and, interestingly, even the highest NMVT in PCDB cases was lower than the lowest NMVT in NC. The procedure is not as easy as generally described in the literature, because the operator must acquire some skill and because sedation is needed in some cases. Conclusion: The procedure gives reliable, reproducible and objective results. It is safe, inexpensive and quick in cooperative patients. Although, sometimes

  12. Efficient reliability analysis of structures with the rotational quasi-symmetric point- and the maximum entropy methods

    Science.gov (United States)

    Xu, Jun; Dang, Chao; Kong, Fan

    2017-10-01

    This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
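
    For context on the MCS comparison mentioned above, here is a minimal, hedged sketch: it estimates fractional moments E[|G|^a] of a performance function by plain Monte Carlo sampling (standing in for the RQ-SPM point set, which is not reproduced) and a reference failure probability P(G ≤ 0); the performance function is an invented toy.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    """Toy performance function: failure when g <= 0 (illustrative only)."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

x = rng.standard_normal((1_000_000, 2))   # standard normal input variables
gx = g(x)

# Fractional moments E[|G|^a] -- the quantities RQ-SPM evaluates cheaply,
# and which constrain the maximum-entropy PDF of G.
for a in (0.25, 0.5, 0.75):
    print(f"E[|G|^{a}] ~ {np.mean(np.abs(gx) ** a):.4f}")

# Reference failure probability by crude Monte Carlo.
pf = np.mean(gx <= 0.0)
print(f"P_f ~ {pf:.4e} (MCS reference)")
```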

  13. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software in Young People with Down Syndrome.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Rey-Abella, Ferran; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2016-05-01

    People with Down syndrome present skeletal abnormalities in their feet that can be analyzed by commonly used gold standard indices (the Hernández-Corvo index, the Chippaux-Smirak index, the Staheli arch index, and the Clarke angle) based on footprint measurements. The use of Photoshop CS5 software (Adobe Systems Software Ireland Ltd, Dublin, Ireland) to measure footprints has been validated in the general population. The present study aimed to assess the reliability and validity of this footprint assessment technique in the population with Down syndrome. Using optical podography and photography, 44 footprints from 22 patients with Down syndrome (11 men [mean ± SD age, 23.82 ± 3.12 years] and 11 women [mean ± SD age, 24.82 ± 6.81 years]) were recorded in a static bipedal standing position. A blinded observer performed the measurements using a validated manual method three times during the 4-month study, with 2 months between measurements. Test-retest was used to check the reliability of the Photoshop CS5 software measurements. Validity and reliability were obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed very good values for the Photoshop CS5 method (ICC, 0.982-0.995). Validity testing also found no differences between the techniques (ICC, 0.988-0.999). The Photoshop CS5 software method is reliable and valid for the study of footprints in young people with Down syndrome.

  14. A new method to evaluate the sealing reliability of the flanged connections for Molten Salt Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Li, Qiming, E-mail: liqiming@sinap.ac.cn [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); Key Laboratory of Nuclear Radiation and Nuclear Energy Technology, Chinese Academy of Sciences, Shanghai 201800 (China); Tian, Jian; Zhou, Chong [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); Key Laboratory of Nuclear Radiation and Nuclear Energy Technology, Chinese Academy of Sciences, Shanghai 201800 (China); Wang, Naxiu, E-mail: wangnaxiu@sinap.ac.cn [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); Key Laboratory of Nuclear Radiation and Nuclear Energy Technology, Chinese Academy of Sciences, Shanghai 201800 (China)

    2015-06-15

    Highlights: • We evaluate the sealing reliability of the flanged connections for MSRs in a novel way. • We focus on the passive decrease of the leak impetus in flanged connections. • The modified flanged connections acquire a self-adjusting sealing ability. • Effects of redesigned flange configurations on molten salt leakage are discussed. - Abstract: The Thorium based Molten Salt Reactor (TMSR) project is a future Generation IV nuclear reactor system proposed by the Chinese Academy of Sciences with the strategic goal of meeting the growing energy needs of Chinese economic development and social progress. It is based on liquid salts serving as both fuel and primary coolant, which consequently poses great challenges for the sealing of the flanged connections. In this study, an improved prototype flange assembly is developed on the basis of the Freeze-Flange initially developed by Oak Ridge National Laboratory (ORNL). The results of the finite element model established to analyze the temperature profile of the Freeze-Flange agree well with the experimental data, indicating that the numerical simulation method is credible. Furthermore, the ideal-gas thermodynamic model, together with a mathematical approximation, is borrowed to theoretically evaluate the sealing performance of the modified Freeze-Flange and of a traditional double-gasket bolted flange joint. This study focuses on the passive decrease of the leak driving force due to the multiple gaskets introduced in flanged connections for MSRs. The effects of the redesigned flange configuration on molten salt leakage resistance are discussed in detail.

  16. RELIABILITY AND ACCURACY ASSESSMENT OF INVASIVE AND NON-INVASIVE SEISMIC METHODS FOR SITE CHARACTERIZATION: FEEDBACK FROM THE INTERPACIFIC PROJECT

    OpenAIRE

    Garofalo , F.; Foti , S.; Hollender , F.; Bard , P.-Y.; Cornou , C.; Cox , B.R.; Dechamp , A.; Ohrnberger , M.; Sicilia , D.; Vergniault , C.

    2017-01-01

    The InterPacific project (Intercomparison of methods for site parameter and velocity profile characterization) aims to assess the reliability of seismic site characterization methods (borehole and surface wave methods) used for estimating shear wave velocity (VS) profiles and other related parameters (e.g., VS30). Three sites, representative of different geological conditions relevant for the evaluation of seismic site response effects, have been selected: (1) a hard r...

  17. Basic Concepts in Classical Test Theory: Tests Aren't Reliable, the Nature of Alpha, and Reliability Generalization as a Meta-analytic Method.

    Science.gov (United States)

    Helms, LuAnn Sherbeck

    This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…

  18. pd Scattering Using a Rigorous Coulomb Treatment: Reliability of the Renormalization Method for Screened-Coulomb Potentials

    International Nuclear Information System (INIS)

    Hiratsuka, Y.; Oryu, S.; Gojuki, S.

    2011-01-01

    Reliability of the screened Coulomb renormalization method, which was proposed in an elegant way by Alt-Sandhas-Zankel-Ziegelmann (ASZZ), is discussed on the basis of 'two-potential theory' for the three-body AGS equations with the Coulomb potential. In order to obtain ASZZ's formula, we define the on-shell Moller function and calculate it by using the Haeringen criterion, i.e., 'the half-shell Coulomb amplitude is zero'. By these two steps, we can finally obtain the ASZZ formula for a small Coulomb phase shift. Furthermore, the reliability of the Haeringen criterion is thoroughly checked by a numerically rigorous calculation for the Coulomb LS-type equation. We find that the Haeringen criterion can be satisfied only in the higher energy region. We conclude that the ASZZ method can be verified in the case that the on-shell approximation to the Moller function is reasonable and the Haeringen criterion is reliable. (author)

  19. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed, rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm, which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action.
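
    To make the quantification step concrete, here is a minimal, invented sketch of the basic CREAM idea of scaling a nominal failure probability by CPC-dependent multipliers; the multiplier values are illustrative placeholders, and the paper's fuzzy extension is not reproduced.

```python
# Sketch of the basic CREAM quantification idea: a nominal failure
# probability is scaled by multipliers reflecting the common performance
# conditions (CPCs). The multiplier values below are invented for
# illustration; the actual CREAM tables are not reproduced here.
NOMINAL_P = 1e-2  # nominal probability of action failure (assumed)

CPC_MULTIPLIERS = {           # rating -> effect on performance reliability
    "adequacy_of_organisation": {"efficient": 0.8, "inefficient": 1.2},
    "working_conditions":       {"advantageous": 0.8, "incompatible": 2.0},
    "available_time":           {"adequate": 0.5, "inadequate": 5.0},
}

def failure_probability(ratings):
    p = NOMINAL_P
    for cpc, rating in ratings.items():
        p *= CPC_MULTIPLIERS[cpc][rating]   # unequal CPC effects, as above
    return min(p, 1.0)                      # probability cannot exceed 1

print(failure_probability({"adequacy_of_organisation": "inefficient",
                           "working_conditions": "incompatible",
                           "available_time": "inadequate"}))
```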

  20. Implications of differences in bioanalytical regulations between Canada, USA and South America.

    Science.gov (United States)

    Arnold, Mark E

    2011-02-01

    To compete globally, pharmaceutical companies want to use bioanalytical data and reports as a single version for all filings, not revised for specific countries or regions. Historically, this meant following the US FDA and International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use guidance/guidelines, finding them sufficient to achieve global acceptance. However, a growing challenge of the past decade has been the additional country-specific and regional regulations that have been released. The differences between the bioanalytical regulations among countries have been recognized as a challenge to the pharmaceutical industry and its CRO partners. Harmonization of the regulations at a global level has been the subject of a number of recent articles and editorials, and the topic has been vigorously discussed at several conferences over the past year. Since all have been in agreement about the need to harmonize regulations, this article will not focus on harmonization but rather will provide a comparison of the USA/Canadian regulations versus those of South America, in particular Brazil, noting the additional work needed to achieve compliance with country-specific regulations. All countries discussed have specific guidance or regulations on clinical bioequivalence studies, and due to the higher standards for these studies, the regulations for bioequivalence studies will be used as the basis for comparison in the article.

  1. Application of fuzzy-MOORA method: Ranking of components for reliability estimation of component-based software systems

    Directory of Open Access Journals (Sweden)

    Zeeshan Ali Siddiqui

    2016-01-01

    Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. Just as hardware systems are presently constructed from kits of parts, software systems may also be assembled from components. It is more reliable to reuse software than to create it anew. It is the glue code and the reliability of the individual components that contribute to the reliability of the overall system. Every component contributes to overall system reliability according to the number of times it is used (its usage frequency), and some components see critical usage. The usage frequency decides the weight of each component, and according to these weights each component contributes to the overall reliability of the system. Therefore, a ranking of components may be obtained by analyzing their reliability impacts on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis (Fuzzy-MOORA). The method helps find the most suitable alternative (a software component) from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision-making problems that have imprecise and vague evaluation data. By the use of ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and its dimensionless measurements accomplish the ranking of components for estimating CBSS reliability in a non-subjective way. Finally, three case studies are shown to illustrate the use of the proposed technique.
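
    For readers unfamiliar with the ratio-analysis step, the sketch below implements crisp MOORA on invented component data; the paper's fuzzy extension (triangular fuzzy numbers in place of crisp scores) is omitted.

```python
import numpy as np

# Crisp MOORA ranking sketch. Rows: candidate software components;
# columns: criteria (invented data, e.g. reliability, size, defect count).
X = np.array([[0.80, 120.0, 4.0],    # component A
              [0.95, 200.0, 7.0],    # component B
              [0.70,  60.0, 2.0]])   # component C
weights = np.array([0.5, 0.3, 0.2])  # criterion weights, e.g. usage frequency
benefit = np.array([True, False, False])  # True = higher is better

# Vector (ratio) normalisation: divide each column by its Euclidean norm.
N = X / np.linalg.norm(X, axis=0)

# MOORA assessment value: weighted benefit sum minus weighted cost sum.
y = (N[:, benefit] * weights[benefit]).sum(axis=1) \
  - (N[:, ~benefit] * weights[~benefit]).sum(axis=1)

for name, score in sorted(zip("ABC", y), key=lambda t: -t[1]):
    print(name, round(score, 4))   # higher score = higher rank
```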

  2. Selection and reporting of statistical methods to assess reliability of a diagnostic test: Conformity to recommended methods in a peer-reviewed journal

    International Nuclear Information System (INIS)

    Park, Ji Eun; Sung, Yu Sub; Han, Kyung Hwa

    2017-01-01

    To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology is necessary.
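
    Several of the reporting deficiencies counted above concern weighted kappa. A minimal sketch of the recommended practice, assuming two readers grading the same cases on an ordinal scale (invented data) and using scikit-learn's kappa implementation, is:

```python
from sklearn.metrics import cohen_kappa_score

# Ordinal gradings of the same 10 cases by two readers (invented data).
reader1 = [1, 2, 2, 3, 1, 4, 3, 2, 4, 1]
reader2 = [1, 2, 3, 3, 1, 4, 2, 2, 4, 2]

# For ordinal data, report a *weighted* kappa and state the weighting
# scheme explicitly, as the review above recommends.
kappa_lin = cohen_kappa_score(reader1, reader2, weights="linear")
kappa_quad = cohen_kappa_score(reader1, reader2, weights="quadratic")
print(f"linear-weighted kappa = {kappa_lin:.2f}, "
      f"quadratic-weighted kappa = {kappa_quad:.2f}")
```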

  4. Development of reliability-based load and resistance factor design methods for piping

    International Nuclear Information System (INIS)

    Ayyub, Bilal M.; Hill, Ralph S. III; Balkey, Kenneth R.

    2003-01-01

    Current American Society of Mechanical Engineers (ASME) nuclear codes and standards rely primarily on deterministic and mechanistic approaches to design. The American Institute of Steel Construction and the American Concrete Institute, among other organizations, have incorporated probabilistic methodologies into their design codes. ASME nuclear codes and standards could benefit from developing a probabilistic, reliability-based, design methodology. This paper provides a plan to develop the technical basis for reliability-based, load and resistance factor design of ASME Section III, Class 2/3 piping for primary loading, i.e., pressure, deadweight and seismic. The plan provides a proof of concept in that LRFD can be used in the design of piping, and could achieve consistent reliability levels. Also, the results from future projects in this area could form the basis for code cases, and additional research for piping secondary loads. (author)
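
    As a loose illustration of the LRFD format that such a methodology would calibrate, a limit-state check takes the form phi * Rn >= sum(gamma_i * Q_i). The factor values and load effects below are invented, not the ASME-calibrated ones.

```python
# Generic LRFD limit-state check: the design passes when the factored
# resistance exceeds the sum of factored load effects,
#     phi * R_n >= sum_i gamma_i * Q_i.
# Factors and loads below are illustrative only, NOT calibrated values.
def lrfd_check(r_nominal, phi, loads, gammas):
    demand = sum(g * q for g, q in zip(gammas, loads))
    capacity = phi * r_nominal
    return capacity >= demand, capacity, demand

# Piping primary loads: pressure, deadweight, seismic (invented numbers, kN).
ok, cap, dem = lrfd_check(r_nominal=500.0, phi=0.9,
                          loads=[180.0, 60.0, 120.0],
                          gammas=[1.3, 1.2, 1.4])
print(f"capacity={cap:.0f}, demand={dem:.0f}, pass={ok}")
```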

  5. THE SIMULATION DIAGNOSTIC METHODS AND REGENERATION WAYS OF REINFORCED-CONCRETE CONSTRUCTIONS OF BRIDGES IN PROVIDING THEIR OPERATING RELIABILITY AND LONGEVITY

    Directory of Open Access Journals (Sweden)

    B. V. Savchinskiy

    2010-03-01

    On the basis of an analysis of existing diagnostic methods and regeneration approaches for reinforced-concrete bridge constructions, recommendations are offered on introducing new, modern technologies for the renewal of reinforced-concrete bridge constructions so as to provide their operating reliability and longevity.

  7. Complex method to calculate objective assessments of information systems protection to improve expert assessments reliability

    Science.gov (United States)

    Abdenov, A. Zh; Trushin, V. A.; Abdenova, G. A.

    2018-01-01

    The paper considers how to populate the relevant SIEM nodes with calculated objective assessments in order to improve the reliability of subjective expert assessments. The proposed methodology is needed for the most accurate possible security risk assessment of information systems. The technique is also intended to establish real-time operational information protection in enterprise information systems. Risk calculations are based on objective estimates of the probabilities that adverse events will occur and on predictions of the magnitude of the damage from information security violations. Calculations of objective assessments are necessary to increase the reliability of the proposed expert assessments.
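
    A minimal sketch of the objective risk calculation described above, with invented event probabilities and damage figures (the SIEM integration itself is not reproduced):

```python
# risk = (probability of an adverse event) x (predicted damage).
# Event names, probabilities and damage figures are invented.
events = {
    "unauthorised_access": {"p": 0.05, "damage": 200_000},
    "data_exfiltration":   {"p": 0.01, "damage": 1_500_000},
    "service_disruption":  {"p": 0.10, "damage": 50_000},
}

for name, e in events.items():
    print(f"{name}: expected loss = {e['p'] * e['damage']:,.0f}")

total = sum(e["p"] * e["damage"] for e in events.values())
print(f"total expected annual loss = {total:,.0f}")
```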

  8. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    Science.gov (United States)

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339

  9. Bioanalytical Applications of Real-Time ATP Imaging Via Bioluminescence

    Energy Technology Data Exchange (ETDEWEB)

    Gruenhagen, Jason Alan [Iowa State Univ., Ames, IA (United States)

    2003-01-01

    The research discussed within involves the development of novel applications of real-time imaging of adenosine 5'-triphosphate (ATP). ATP was detected via bioluminescence and the firefly luciferase-catalyzed reaction of ATP and luciferin. The use of a microscope and an imaging detector allowed for spatially resolved quantitation of ATP release. Employing this method, applications in both biological and chemical systems were developed. First, the mechanism by which the compound 48/80 induces release of ATP from human umbilical vein endothelial cells (HUVECs) was investigated. Numerous enzyme activators and inhibitors were utilized to probe the second messenger systems involved in release. Compound 48/80 activated a Gq-type protein to initiate ATP release from HUVECs. Ca2+ imaging along with ATP imaging revealed that activation of phospholipase C and induction of intracellular Ca2+ signaling were necessary for release of ATP. Furthermore, activation of protein kinase C inhibited the activity of phospholipase C and thus decreased the magnitude of ATP release. This novel release mechanism was compared to the existing theories of extracellular release of ATP. Bioluminescence imaging was also employed to examine the role of ATP in the field of neuroscience. The central nervous system (CNS) was dissected from the freshwater snail Lymnaea stagnalis. Electrophysiological experiments demonstrated that the neurons of the Lymnaea were not damaged by any of the components of the imaging solution. ATP was continuously released by the ganglia of the CNS for over eight hours and varied from ganglion to ganglion and within individual ganglia. Addition of the neurotransmitters K+ and serotonin increased release of ATP in certain regions of the Lymnaea CNS. Finally, the ATP imaging technique was investigated for the study of drug release systems. MCM-41-type mesoporous nanospheres were loaded with ATP and end-capped with mercaptoethanol

  10. Investigation of reliability of EC method for inspection of VVER steam generator tubes

    International Nuclear Information System (INIS)

    Corak, Z.

    2004-01-01

    Complete and accurate non-destructive examination (NDE) data provide the basis for performing mitigating actions and corrective repairs. It is important that detection and characterization of flaws are done properly at an early stage. The EPRI document PWR Steam Generator Examination Guidelines recommends an approach that is intended to: ensure accurate assessment of steam generator tube integrity; extend the reliable, cost-effective operating life of the steam generators; and maximize the availability of the unit. The Steam Generator Eddy Current Data Analysis Performance Demonstration represents the culmination of an intense two-year industry effort to develop a performance demonstration program for eddy current testing (ECT) of steam generator tubing. It is referred to as the Industry Database (IDB) and provides a capability for individual organizations to implement SG ECT performance demonstration programs in accordance with the requirements specified in Appendices G and H of the ISI Guidelines. Appendix G of the EPRI document PWR Steam Generator Examination Guidelines specifies personnel training and qualification requirements for NDE personnel who analyze NDE data for PWR steam generator tubing. Its purpose is to ensure a continuing uniform knowledge base and skill level for data analysis. The European methodology document is intended to provide a general framework for the development of qualifications for the inspection of specific components, to ensure they are developed in a consistent way throughout Europe while still allowing qualification to be tailored in detail to meet different national requirements. In the European methodology document one will not find a detailed description of how the inspection of a specific component should be qualified. A recommended practice is a document produced by ENIQ to support the production of detailed qualification procedures by individual countries. VVER SG tubes are inspected by the EC method but a

  11. A method and application study on holistic decision tree for human reliability analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Sun Feng; Zhong Shan; Wu Zhiyu

    2008-01-01

    The paper introduces a human reliability analysis method mainly used in nuclear power plant safety assessment, the Holistic Decision Tree (HDT) method, and how to apply it. The focus is primarily on providing the basic framework and some background of the HDT method and the steps to perform it. Influence factors and quality descriptors were formed from interviews with operators at the Qinshan Nuclear Power Plant, and HDT analyses were performed for SGTR and SLOCA based on this information. The HDT model uses a graphic tree structure to express the error rate as a function of the influence factors. The HDT method is capable of dealing with the uncertainty in HRA, and it is reliable and practical. (authors)

  12. Interobserver reliability when using the Van Herick method to measure anterior chamber depth

    Directory of Open Access Journals (Sweden)

    Ahmed Javed

    2017-01-01

    Conclusion: The Van Herick score has good interobserver reliability for Grades 1 and 4; however, Grades 2 and 3 require further tests such as gonioscopy or optical coherence tomography. Temporal and nasal scores demonstrated good agreement; therefore, if the nasal score cannot be measured due to nasal bridge size, the temporal score can be used as an approximation.

  13. A reliable method for ageing of whiting (Merlangius merlangus) for use in stock assessment and management

    DEFF Research Database (Denmark)

    Ross, Stine Dalmann; Hüssy, Karin

    2013-01-01

    Accurate age estimation is important for stock assessment and management. The importance of reliable ageing is emphasized by the impending analytical assessment of whiting (Merlangius merlangus) in the Baltic Sea. Whiting is a top predator in the western Baltic Sea, where it is fished commercially...

  14. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    NARCIS (Netherlands)

    Gharouni-Nik, M.; Naeimi, M.; Ahadi, S.; Alimoradi, Z.

    2014-01-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. Main support elements contribute

  15. Computational intelligence methods for the efficient reliability analysis of complex flood defence structures

    NARCIS (Netherlands)

    Kingston, Greer B.; Rajabali Nejad, Mohammadreza; Gouldby, Ben P.; van Gelder, Pieter H.A.J.M.

    2011-01-01

    With the continual rise of sea levels and deterioration of flood defence structures over time, it is no longer appropriate to define a design level of flood protection, but rather, it is necessary to estimate the reliability of flood defences under varying and uncertain conditions. For complex

  16. Reliability analysis and risk-based methods for planning of operation & maintenance of offshore wind turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2017-01-01

    for extreme and fatigue limit states are presented. Operation & Maintenance planning often follows corrective and preventive strategies based on information from condition monitoring and structural health monitoring systems. A reliability- and risk-based approach is presented where a life-cycle approach...

  17. Reliability-based design methods to determine the extreme response distribution of offshore wind turbines

    NARCIS (Netherlands)

    Cheng, P.W.; Bussel, van G.J.W.; Kuik, van G.A.M.; Vugts, J.H.

    2003-01-01

    In this article a reliability-based approach to determine the extreme response distribution of offshore wind turbines is presented. Based on hindcast data, the statistical description of the offshore environment is formulated. The contour lines of different return periods can be determined.

  18. Submission of scientifically sound and ethical manuscripts to peer-reviewed journals - a reviewer's personal perspective on bioanalytical publications.

    Science.gov (United States)

    Weng, Naidong

    2012-11-01

    In the pharmaceutical industry, bioanalysis is very dynamic and is probably one of the few fields of research covering the entire drug discovery, development and post-marketing process. Important decisions on drug safety can partially rely on bioanalytical data, which therefore can be subject to regulatory scrutiny. Bioanalytical scientists have historically contributed significant numbers of scientific manuscripts in many peer-reviewed analytical journals. All of these journals provide some high-level instructions, but they also leave sufficient flexibility for reviewers to perform independent critique and offer recommendations for each submitted manuscript. Reviewers play a pivotal role in the process of bioanalytical publication to ensure the publication of high-quality manuscripts in a timely fashion. Their efforts usually lead to improved manuscripts. However, it has to be a joint effort among authors, reviewers and editors to promote scientifically sound and ethically fair bioanalytical publications. Most of the submitted manuscripts were well written with only minor or moderate revisions required for further improvement. Nevertheless, there were small numbers of submitted manuscripts that did not meet the requirements for publications because of scientific or ethical deficiencies, which are discussed in this Letter to the Editor. Copyright © 2012 John Wiley & Sons, Ltd.

  19. Reliability analysis for radiographic measures of lumbar lordosis in adult scoliosis: a case–control study comparing 6 methods

    Science.gov (United States)

    Hong, Jae Young; Modi, Hitesh N.; Hur, Chang Yong; Song, Hae Ryong; Park, Jong Hoon

    2010-01-01

    Several methods are used to measure lumbar lordosis. In adult scoliosis patients, the measurement is difficult due to degenerative changes in the vertebral endplate as well as the coronal and sagittal deformity. We performed an observational study with three examiners to determine the reliability of six methods for measuring global lumbar lordosis in adult scoliosis patients. Ninety lateral lumbar radiographs were collected for the study and divided into groups by scoliosis severity. The reliability of lordosis measurement decreased with increasing severity of scoliosis. In the Cobb L1–S1, centroid and posterior tangent L1–S1 methods, the ICCs were relatively lower in the high-grade scoliosis group (≥0.60), and the mean absolute difference (MAD) in these methods was high in the high-grade scoliosis group (≤7.17°). However, in the Cobb L1–L5 and posterior tangent L1–L5 methods, the ICCs were ≥0.86 in all groups, and in the TRALL method the ICCs were ≥0.76 in all groups. In addition, in the Cobb L1–L5 and posterior tangent L1–L5 methods the MAD was ≤3.63°, and in the TRALL method the MAD was ≤3.84° in all groups. We concluded that the Cobb L1–L5 and the posterior tangent L1–L5 methods are reliable for measuring global lumbar lordosis in adult scoliosis, and that the TRALL method is more reliable than the other methods that include the L5–S1 joint in the lordosis measurement. PMID:20437183

  20. Larvas output and influence of human factor in reliability of meat inspection by the method of artificial digestion

    OpenAIRE

    Đorđević Vesna; Savić Marko; Vasilev Saša; Đorđević Milovan

    2013-01-01

    On the basis of analyses of the factors that allowed infected meat to reach the food chain, we found that the infection occurred after consumption of meat that had been inspected by the method of artificial digestion of collective samples using a magnetic stirrer (MM). This work presents assay results showing how modifications of the method, at the level of final sedimentation, influence the reliability of Trichinella larvae detect...

  1. Reliability and Validity of a New Method for Isometric Back Extensor Strength Evaluation Using A Hand-Held Dynamometer.

    Science.gov (United States)

    Park, Hee-Won; Baek, Sora; Kim, Hong Young; Park, Jung-Gyoo; Kang, Eun Kyoung

    2017-10-01

    To investigate the reliability and validity of a new method for isometric back extensor strength measurement using a portable dynamometer. A chair equipped with a small portable dynamometer was designed (Power Track II Commander Muscle Tester). A total of 15 men (mean age, 34.8±7.5 years) and 15 women (mean age, 33.1±5.5 years) with no current back problems or previous history of back surgery were recruited. Subjects were asked to push the back of the chair while seated, and their isometric back extensor strength was measured by the portable dynamometer. Test-retest reliability was assessed with intraclass correlation coefficient (ICC). For the validity assessment, isometric back extensor strength of all subjects was measured by a widely used physical performance evaluation instrument, BTE PrimusRS system. The limit of agreement (LoA) from the Bland-Altman plot was evaluated between two methods. The test-retest reliability was excellent (ICC=0.82; 95% confidence interval, 0.65-0.91). The Bland-Altman plots demonstrated acceptable agreement between the two methods: the lower 95% LoA was -63.1 N and the upper 95% LoA was 61.1 N. This study shows that isometric back extensor strength measurement using a portable dynamometer has good reliability and validity.
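
    The Bland-Altman limits of agreement used above are easy to reproduce. A minimal sketch with invented paired strength measurements:

```python
import numpy as np

# Bland-Altman 95% limits of agreement between two strength-measurement
# methods (values in newtons; data invented for illustration).
dyno = np.array([310., 280., 405., 350., 295., 330., 370., 415.])
bte  = np.array([300., 290., 398., 362., 301., 325., 380., 407.])

diff = dyno - bte
bias = diff.mean()                 # mean difference between the methods
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.1f} N, 95% LoA = [{loa_low:.1f}, {loa_high:.1f}] N")
```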

  2. Reliability of a method for establishing the capacity of individuals with an intellectual disability to respond to Likert scales.

    Science.gov (United States)

    Cuskelly, Monica; Moni, Karen; Lloyd, Jan; Jobling, Anne

    2013-12-01

    The study reported here was an examination of the reliability of a method for determining acquiescent responding and the capacity to respond to items using a Likert scale response format by adults with an intellectual disability. Reliability of the outcomes of these procedures was investigated using a test-retest design. Associations with receptive vocabulary were examined. The majority of the participants did not demonstrate acquiescent responding. Individuals' responses to the Likert-type discrimination tasks were consistent, although this varied somewhat depending upon the abstractness of the task. There was some association between receptive language age equivalence scores and respondent performance. It is recommended that the pretest protocol (a) be modified to improve its reliability, and (b) this modified version be used with study participants who have an intellectual disability to ascertain the appropriate level of choice to be used for items that use a Likert response format.

  3. Reliability of CRBR primary piping: critique of stress-strength overlap method for cold-leg inlet downcomer

    International Nuclear Information System (INIS)

    Bari, R.A.; Buslik, A.J.; Papazoglou, I.A.

    1976-04-01

    A critique is presented of the strength-stress overlap method for the reliability of the CRBR primary heat transport system piping. The report addresses, in particular, the reliability assessment of WARD-D-0127 (Piping Integrity Status Report), which is part of the CRBR PSAR docket. It was found that the reliability assessment is extremely sensitive to the assumed shape of the probability density function for the strength (regarded as a random variable) of the cold-leg inlet downcomer section of the primary piping. Based on the rigorous Chebyshev inequality, it is shown that the piping failure probability is less than 10^-2. On the other hand, it is shown that the failure probability can be much larger than approximately 10^-13, the typical value put forth in WARD-D-0127.
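
    The distribution-free character of the argument can be illustrated with a one-sided Chebyshev (Cantelli) bound on a safety margin M = strength - stress, using only its first two moments; the numbers below are invented:

```python
# One-sided Chebyshev (Cantelli) bound on the failure probability of a
# safety margin M = strength - stress, using only its mean and variance:
#     P(M <= 0) <= sigma^2 / (sigma^2 + mu^2),   mu > 0.
# Distribution-free, hence independent of the assumed PDF shape.
def cantelli_failure_bound(mu, sigma):
    assert mu > 0, "bound applies when the mean margin is positive"
    return sigma**2 / (sigma**2 + mu**2)

print(cantelli_failure_bound(mu=100.0, sigma=10.0))   # ~9.9e-3, i.e. < 10^-2
```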

  4. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    International Nuclear Information System (INIS)

    Echard, B.; Gayton, N.; Lemaire, M.; Relun, N.

    2013-01-01

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is, fortunately, designed with codified rules leading to a large safety margin, which means that failure is a small-probability event. Such a probability level is difficult to assess efficiently. Second, the structure's mechanical behaviour is modelled numerically in an attempt to reproduce the real response, and the numerical model tends to be more and more time-demanding as its complexity is increased to improve accuracy and to capture particular mechanical behaviour. As a consequence, performing a large number of model computations cannot be considered as a way to assess the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS, for active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145–54]. It associates the Kriging metamodel and its advantageous stochastic property with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only very few mechanical model computations. The efficiency of the method is first demonstrated on two academic applications. It is then applied to assess the reliability of a challenging aerospace case study subject to fatigue.
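
    The flavor of the active-learning Kriging loop can be sketched as follows. This is an AK-MCS-style illustration on an invented toy performance function, with scikit-learn's Gaussian process standing in for the Kriging metamodel; the importance-sampling refinement that distinguishes AK-IS is omitted.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

def g(x):  # toy performance function, failure when g <= 0 (invented)
    return 5.0 - x[:, 0]**2 - x[:, 1]

pop = rng.standard_normal((20_000, 2))      # Monte Carlo candidate population
idx = rng.choice(len(pop), 12, replace=False)
X, y = pop[idx], g(pop[idx])                # small initial design of experiments

for _ in range(40):                         # active learning loop
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    mu, sd = gp.predict(pop, return_std=True)
    u = np.abs(mu) / np.maximum(sd, 1e-12)  # U learning function
    if u.min() >= 2.0:                      # sign of g "certain" everywhere
        break
    j = int(np.argmin(u))                   # most ambiguous point
    X = np.vstack([X, pop[j]]); y = np.append(y, g(pop[j:j+1]))

pf = np.mean(mu <= 0)                       # failure fraction of population
print(f"P_f ~ {pf:.4e} after {len(y)} model calls")
```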

  5. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins by asking what reliability is, covering the origin of reliability problems, the definition of reliability, and its uses. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions in reliability, estimation of MTBF, processes of probability distributions, down time, maintainability and availability, breakdown and preventive maintenance, reliability design, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.

  6. N- versus O-alkylation: utilizing NMR methods to establish reliable primary structure determinations for drug discovery.

    Science.gov (United States)

    LaPlante, Steven R; Bilodeau, François; Aubry, Norman; Gillard, James R; O'Meara, Jeff; Coulombe, René

    2013-08-15

    A classic synthetic issue that remains unresolved is the reaction that involves the control of N- versus O-alkylation of ambident anions. This common chemical transformation is important for medicinal chemists, who require predictable and reliable protocols for the rapid synthesis of inhibitors. The uncertainty of whether the product(s) are N- and/or O-alkylated is common and can be costly if undetermined. Herein, we report an NMR-based strategy that focuses on distinguishing inhibitors and intermediates that are N- or O-alkylated. The NMR strategy involves three independent and complementary methods; however, any combination of two of the methods can be reliable if the third is compromised due to resonance overlap or other issues. The timely nature of these methods (HSQC/HMQC, HMBC, ROESY, and (13)C shift predictions) allows for contemporaneous determination of regioselective alkylation as needed during the optimization of synthetic routes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Efficient Estimation of Extreme Non-linear Roll Motions using the First-order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2007-01-01

    In on-board decision support systems, efficient procedures are needed for real-time estimation of the maximum ship responses to be expected within the next few hours, given on-line information on the sea state and user-defined ranges of possible headings and speeds. For linear responses, standard frequency domain methods can be applied. For non-linear responses like the roll motion, standard methods like direct time domain simulations are not feasible due to the required computational time. However, the statistical distribution of non-linear ship responses can be estimated very accurately using the first-order reliability method (FORM), well known from structural reliability problems. To illustrate the proposed procedure, the roll motion is modelled by a simplified non-linear procedure taking into account non-linear hydrodynamic damping, time-varying restoring and wave excitation moments...
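
    As a reminder of the FORM machinery invoked here, the sketch below finds the design point in standard normal space for an invented limit-state function and converts the reliability index into a probability:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# FORM sketch: in standard normal space, find the design point u* that
# minimises ||u|| subject to the limit state g(u) = 0; then
# beta = ||u*|| and P_f ~ Phi(-beta). The limit-state function below is
# an invented stand-in for the non-linear roll-response exceedance.
def g(u):
    return 4.0 - u[0] - 0.4 * u[1]**2   # failure surface at g(u) = 0

res = minimize(lambda u: np.dot(u, u), x0=np.array([1.0, 1.0]),
               method="SLSQP", constraints={"type": "eq", "fun": g})
beta = np.linalg.norm(res.x)            # Hasofer-Lind reliability index
print(f"beta = {beta:.3f}, P_f ~ {norm.cdf(-beta):.3e}")
```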

  8. Research of radioecological processes by methods of the theory of reliability

    International Nuclear Information System (INIS)

    Kutlakhmedov, Yu.A.; Salivon, A.G.; Pchelovskaya, S.A.; Rodina, V.V.; Bevza, A.G.; Matveeva, I.V.

    2012-01-01

    The theory and models of ecosystem radiocapacity, combined with the theory and models of reliability, have made it possible to adequately describe the laws of radionuclide migration and distribution for different types of reservoir and land ecosystems. The theory and models of radiocapacity also allow one to define precisely the critical elements of an ecosystem where temporary or final depositing of radionuclides is to be expected. An approach based on biogenic tracers allows, within the framework of the radiocapacity and reliability models, simultaneous estimation of radionuclide migration processes, determination of the dose loads on ecosystem biota, and establishment of fundamental parameters of the redistribution rates of radionuclides and other pollutants in different types of ecosystems.

  9. Case study on the use of PSA methods: Human reliability analysis

    International Nuclear Information System (INIS)

    1991-04-01

    The overall objective of treating human reliability in a probabilistic safety analysis is to ensure that the key human interactions of typical crews are accurately and systematically incorporated into the study in a traceable manner. An additional objective is to make the human reliability analysis (HRA) as realistic as possible, taking into account the emergency procedures, the man-machine interface, the focus of training process, and the knowledge and experience of the crews. Section 3 of the paper describes an overview of this analytical process which leads to three more detailed example problems described in Section 4. Section 5 discusses a peer review process. References are presented that are useful in performing HRAs. In addition appendices are provided for definitions, selected data and a generic list of performance shaping factors. 35 refs, figs and tabs

  10. Analyses of reliability characteristics of emergency diesel generator population using empirical Bayes methods

    International Nuclear Information System (INIS)

    Vesely, W.E.; Uryas'ev, S.P.; Samanta, P.K.

    1993-01-01

    Emergency Diesel Generators (EDGs) provide backup power to nuclear power plants in case of failure of AC buses. The reliability of EDGs is important to assure response to loss-of-offsite-power accident scenarios, a dominant contributor to plant risk. The reliable performance of EDGs has been of concern to both regulators and plant operators. In this paper the authors present an approach and results from the analysis of failure data from a large population of EDGs. They used an empirical Bayes approach to obtain both the population distribution and the individual failure probabilities from EDG failure-to-start and load-run data over 4 years for 194 EDGs at 63 plant units.
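
    A stripped-down empirical Bayes sketch for failure-on-demand data follows; it uses simple moment matching to fit a beta population prior and shrinks each EDG's raw rate toward it. The data are invented and the cited study's estimator is more elaborate.

```python
import numpy as np

# Method-of-moments empirical Bayes: fit a Beta(a, b) population prior
# across EDGs, then compute each EDG's posterior mean failure probability.
fails = np.array([1, 0, 2, 0, 0, 3, 1, 0, 0, 1])      # failures per EDG
demands = np.array([52, 48, 60, 45, 50, 55, 49, 47, 53, 51])

p = fails / demands                    # raw per-EDG failure rates
m, v = p.mean(), p.var(ddof=1)
common = m * (1 - m) / v - 1           # beta prior via moment matching
a, b = m * common, (1 - m) * common
print(f"population prior: Beta(a={a:.2f}, b={b:.2f})")

posterior_mean = (a + fails) / (a + b + demands)      # shrunken estimates
for i, pm in enumerate(posterior_mean):
    print(f"EDG {i}: raw {p[i]:.3f} -> empirical Bayes {pm:.4f}")
```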

  11. Investigation on the reliability of expansion joint for piping with probabilistic method

    International Nuclear Information System (INIS)

    Ishii, Y.; Kambe, M.

    1980-01-01

    Reduction of plant size is one of the major targets in LMFBR design. Usually, the piping work system is used extensively to absorb thermal expansion between components. In addition, expansion joints for piping have lately seemed attractive for the same purpose. This paper describes the significance of an expansion joint with multiple boundaries and the breakdown probability of the expansion joint assembly, and in part of the bellows, by introducing several hypothetical conditions in connection with the piping. The importance of in-service inspection (ISI) for expansion joints is also discussed, using a comparative table and reliability probabilities ranging from partial break to full penetration. In conclusion, an expansion joint with ISI should be manufactured with excellent reliability in order to compete with the piping work system; several conditions for the practical application to piping systems are suggested. (author)

  14. Increasing reliability of Gauss-Kronrod quadrature by Eratosthenes' sieve method

    Science.gov (United States)

    Adam, Gh.; Adam, S.

    2001-04-01

    The reliability of the local error estimates returned by the Gauss-Kronrod quadrature rules can be raised up to the theoretical 100% rate of success under error estimate sharpening, provided a number of natural validating conditions are imposed. The self-validating scheme for the local error estimates, which is easy to implement and adds little supplementary computing effort, considerably strengthens the correctness of the decisions within automatic adaptive quadrature.

  15. Method for assessing the reliability of molecular diagnostics based on multiplexed SERS-coded nanoparticles.

    Directory of Open Access Journals (Sweden)

    Steven Y Leigh

    Surface-enhanced Raman scattering (SERS) nanoparticles have been engineered to generate unique fingerprint spectra and are potentially useful as bright contrast agents for molecular diagnostics. One promising strategy for biomedical diagnostics and imaging is to functionalize various particle types ("flavors"), each emitting a unique spectral signature, to target a large multiplexed panel of molecular biomarkers. While SERS particles emit narrow spectral features that allow them to be easily separated under ideal conditions, the presence of competing noise sources and background signals such as detector noise, laser background, and autofluorescence confounds the reliability of demultiplexing algorithms. Results obtained during time-constrained in vivo imaging experiments may not be reproducible or accurate. Therefore, our goal is to provide experimentalists with a metric that may be monitored to enforce a desired bound on accuracy within a user-defined confidence level. We have defined a spectral reliability index (SRI), based on the output of a direct classical least-squares (DCLS) demultiplexing routine, which provides a measure of the reliability of the computed nanoparticle concentrations and ratios. We present simulations and experiments to demonstrate the feasibility of this strategy, which can potentially be utilized for a range of instruments and biomedical applications involving multiplexed SERS nanoparticles.
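
    The DCLS demultiplexing step can be sketched with a non-negative least-squares fit, which enforces physically meaningful concentrations; the reference spectra below are synthetic, and the residual printed at the end is only in the spirit of the SRI, whose exact definition is not reproduced here.

```python
import numpy as np
from scipy.optimize import nnls

# Least-squares demultiplexing of a multiplexed SERS spectrum:
# measured spectrum ~ A @ c, where the columns of A are the reference
# spectra of each nanoparticle flavor and c holds their concentrations.
rng = np.random.default_rng(3)
n_channels, n_flavors = 200, 3
A = np.abs(rng.normal(size=(n_channels, n_flavors)))   # reference spectra
c_true = np.array([1.0, 0.4, 0.0])                     # true concentrations
y = A @ c_true + rng.normal(scale=0.05, size=n_channels)

c_hat, rnorm = nnls(A, y)                              # enforce c >= 0
print("estimated concentrations:", np.round(c_hat, 3))
# Residual-based reliability proxy (illustrative only, not the paper's SRI):
print("relative residual:", rnorm / np.linalg.norm(y))
```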

  16. Conceptual transitions in methods of skull-photo superimposition that impact the reliability of identification: a review.

    Science.gov (United States)

    Jayaprakash, Paul T

    2015-01-01

    Establishing identification during skull-photo superimposition relies on correlating the salient morphological features of an unidentified skull with those of a face-image of a suspected dead individual using image overlay processes. Technical progression in the process of overlay has included the incorporation of video cameras, image-mixing devices and software that enables real-time vision-mixing. Conceptual transitions occur in the superimposition methods that involve 'life-size' images, that achieve orientation of the skull to the posture of the face in the photograph and that assess the extent of match. A recent report on the reliability of identification using the superimposition method adopted the currently prevalent methods and suggested an increased rate of failures when skulls were compared with related and unrelated face images. The reported reduction in the reliability of the superimposition method prompted a review of the transition in the concepts that are involved in skull-photo superimposition. The prevalent popular methods for visualizing the superimposed images at less than 'life-size', overlaying skull-face images by relying on the cranial and facial landmarks in the frontal plane when orienting the skull for matching and evaluating the match on a morphological basis by relying on mix-mode alone are the major departures in the methodology that may have reduced the identification reliability. The need to reassess the reliability of the method that incorporates the concepts which have been considered appropriate by the practitioners is stressed. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models.

    Science.gov (United States)

    Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E

    2018-05-10

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. The design of a self-tuning methodology and its implementation are presented in this paper, to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving-assistance user scenario connecting a five-sensor LiDAR network is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrated that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds.
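
    The Q-learning component can be caricatured as follows: a tabular agent chooses how many sensors to activate, with an invented reward trading detection reliability against sensor cost. This single-state (bandit-style) simplification is an assumption for illustration, not the paper's actual setup.

```python
import random

# Tabular value learning over the action "number of active LiDAR sensors".
ACTIONS = [1, 2, 3, 4, 5]
Q = {a: 0.0 for a in ACTIONS}          # single-state problem for brevity
alpha, epsilon = 0.1, 0.2

def reward(n_sensors):
    reliability = 1.0 - 0.5 ** n_sensors          # toy detection model
    return reliability - 0.05 * n_sensors         # penalise extra sensors

for _ in range(5000):
    a = random.choice(ACTIONS) if random.random() < epsilon \
        else max(Q, key=Q.get)                    # epsilon-greedy policy
    r = reward(a) + random.gauss(0.0, 0.02)       # noisy observation
    Q[a] += alpha * (r - Q[a])                    # bandit-style Q update

print("learned Q-values:", {a: round(q, 3) for a, q in Q.items()})
print("chosen sensor count:", max(Q, key=Q.get))
```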

  18. Assessing the reliability of the borderline regression method as a standard setting procedure for objective structured clinical examination

    Directory of Open Access Journals (Sweden)

    Sara Mortaz Hejri

    2013-01-01

    Full Text Available Background: One of the methods used for standard setting is the borderline regression method (BRM). This study aims to assess the reliability of the BRM when the pass-fail standard in an objective structured clinical examination (OSCE) is calculated by averaging the BRM standards obtained for each station separately. Materials and Methods: In nine stations of the OSCE with direct observation, the examiners gave each student a checklist score and a global score. Using a linear regression model for each station, we calculated the checklist cut-off score from the regression equation at a global scale cut-off of 2. The OSCE pass-fail standard was defined as the average of all stations' standards. To determine the reliability, the root mean square error (RMSE) was calculated. The R2 coefficient and the inter-grade discrimination were calculated to assess the quality of the OSCE. Results: The mean total test score was 60.78. The OSCE pass-fail standard and its RMSE were 47.37 and 0.55, respectively. The R2 coefficients ranged from 0.44 to 0.79. The inter-grade discrimination score varied greatly among stations. Conclusion: The RMSE of the standard was very small, indicating that the BRM is a reliable method of setting standards for an OSCE, which has the advantage of providing data for quality assurance.
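
    A minimal sketch of the borderline regression calculation described above: per station, checklist scores are regressed on global ratings and the cut-off is read off at the borderline global rating, then the station cut-offs are averaged. The scores below are made-up illustrations, not study data.

```python
import numpy as np

def station_cutoff(checklist, global_rating, borderline=2.0):
    """Regress checklist score on global rating; return the checklist
    score predicted at the borderline global rating."""
    slope, intercept = np.polyfit(global_rating, checklist, 1)
    return slope * borderline + intercept

rng = np.random.default_rng(0)
standards = []
for _ in range(9):                       # nine stations, as in the study design
    g = rng.integers(1, 6, size=40)      # global ratings 1..5 (invented)
    c = 10 * g + rng.normal(0, 5, 40)    # checklist scores loosely tied to g
    standards.append(station_cutoff(c, g))

print("per-station standards:", np.round(standards, 1))
print("OSCE pass-fail standard:", round(float(np.mean(standards)), 2))
```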

  19. Reliable and Accurate Release of Micro-Sized Objects with a Gripper that Uses the Capillary-Force Method

    Directory of Open Access Journals (Sweden)

    Suzana Uran

    2017-06-01

    Full Text Available There have been recent developments in grippers that are based on capillary force and condensed water droplets. These are used for manipulating micro-sized objects. Recently, one-finger grippers have been produced that are able to reliably grip using the capillary force. To release objects, either the van der Waals, gravitational or inertial-forces method is used. This article presents methods for reliably gripping and releasing micro-objects using the capillary force. The moisture from the surrounding air is condensed into a thin layer of water on the contact surfaces of the objects. From this thin layer of water, a water meniscus is created between the micro-sized object, the gripper and the releasing surface. Consequently, the water meniscus between the object and the releasing surface produces a capillary force high enough to release the micro-sized object from the tip of the one-finger gripper. Either polystyrene or glass beads with diameters between 5 and 60 µm, or irregularly shaped dust particles of similar sizes, were used. 3D structures made up of micro-sized objects could be constructed using this method. The method is reliable both for releasing during assembly and for gripping when objects are removed from the top of the 3D structure, the so-called "disassembling gripping" process. The release positioning error was below 0.5 µm.

  20. The BioSentinel Bioanalytical Microsystem: Characterizing DNA Radiation Damage in Living Organisms Beyond Earth Orbit

    Science.gov (United States)

    Ricco, A. J.; Hanel, R.; Bhattacharya, S.; Boone, T.; Tan, M.; Mousavi, A.; Rademacher, A.; Schooley, A.; Klamm, B.; Benton, J.; et al.

    2016-01-01

    We will present details and initial lab test results from an integrated bioanalytical microsystem designed to conduct the first biology experiments beyond low Earth orbit (LEO) since Apollo 17 (1972). The 14-kg, 12x24x37-cm BioSentinel spacecraft (Figure 1) assays radiation-responsive yeast in its science payload by measuring DNA double-strand breaks (DSBs) repaired via homologous recombination, a mechanism common to all eukaryotes including humans. S. cerevisiae (brewer's yeast) in 288 microwells are provided with nutrients and optically assayed for growth and metabolism via 3-color absorptimetry monthly during the 18-month mission. BioSentinel is one of several secondary payloads to be deployed by NASA's Exploration Mission 1 (EM-1) launch vehicle into an approximately 0.95 AU heliocentric orbit in July 2018; it will communicate with Earth from up to 100 million km.

  1. Transgene traceability in transgenic mice: a bioanalytical approach for potential gene-doping analysis.

    Science.gov (United States)

    Bogani, Patrizia; Spiriti, Maria Michela; Lazzarano, Stefano; Arcangeli, Annarosa; Buiatti, Marcello; Minunni, Maria

    2011-11-01

    The World Anti-Doping Agency fears the use of gene doping to enhance athletic performance. Thus, a bioanalytical approach based on end-point PCR for detecting markers of transgene traceability was developed. A few sequences from two different vectors were selected and traced in an animal model, in different tissues and at different times. In particular, the enhanced green fluorescent protein gene and a construct-specific new marker were targeted in the analysis. To make the developed detection approach open to future routine doping analysis, matrices such as urine and tears, as well as blood, were also tested. This study will have an impact on evaluating vector transgene traceability for the detection of gene doping events by non-invasive sampling.

  2. Multifunctional Fluorescent-Magnetic Polymeric Colloidal Particles: Preparations and Bioanalytical Applications.

    Science.gov (United States)

    Kaewsaneha, Chariya; Tangboriboonrat, Pramuan; Polpanich, Duangporn; Elaissari, Abdelhamid

    2015-10-28

    Fluorescent-magnetic particles (FMPs) play important roles in modern materials, especially as nanoscale devices in the biomedical field. The interesting features of FMPs are attributed to their dual detection ability, i.e., fluorescent and magnetic modes. Functionalization of FMPs can be performed using several types of polymers, allowing their use in various applications. The synergistic potentials for unique multifunctional, multilevel targeting nanoscale devices as well as combination therapies make them particularly attractive for biomedical applications. However, the synthesis of FMPs is challenging and must be further developed. In this review article, we summarized the most recent representative works on polymer-based FMP systems that have been applied particularly in the bioanalytical field.

  3. Going paperless: implementing an electronic laboratory notebook in a bioanalytical laboratory.

    Science.gov (United States)

    Beato, Brian; Pisek, April; White, Jessica; Grever, Timothy; Engel, Brian; Pugh, Michael; Schneider, Michael; Carel, Barbara; Branstrator, Laurel; Shoup, Ronald

    2011-07-01

    AIT Bioscience, a bioanalytical CRO, implemented a highly configurable, Oracle-based electronic laboratory notebook (ELN) from IDBS called E-WorkBook Suite (EWBS). This ELN provides a high degree of connectivity with other databases, including Watson LIMS. Significant planning and training, along with considerable design effort and template validation for dozens of laboratory workflows were required prior to EWBS being viable for either R&D or regulated work. Once implemented, EWBS greatly reduced the need for traditional quality review upon experiment completion. Numerous real-time error checks occur automatically when conducting EWBS experiments, preventing the majority of laboratory errors by pointing them out while there is still time to correct any issues. Auditing and reviewing EWBS data are very efficient, because all data are forever securely (and even remotely) accessible, provided a reviewer has appropriate credentials. Use of EWBS significantly increases both data quality and laboratory efficiency.

  4. A Method for the Preparation of Chicken Liver Pâté that Reliably Destroys Campylobacters

    OpenAIRE

    Hutchison, Mike; Harrison, Dawn; Richardson, Ian; Tchórzewska, Monika

    2015-01-01

    This study devised a protocol for the manufacture of commercial quantities of chicken liver pâté that reliably destroyed campylobacters. A literature search identified 40 pâté manufacture recipes. Recipe stages with a potential to be antimicrobial were assembled to form a new protocol that included washing with organic acid, freeze-thaw and flambé in alcohol. Naturally-contaminated, high-risk livers were obtained from clearance flocks at slaughter and the effect of each stage of the protocol...

  5. Reliability analysis of protection systems in NPP applying fault-tree analysis method

    International Nuclear Information System (INIS)

    Bokor, J.; Gaspar, P.; Hetthessy, J.; Szabo, G.

    1998-01-01

    This paper demonstrates the applicability and limits of dependability analysis in nuclear power plants (NPPs), based on the reactor protection refurbishment project (RRP) at NPP Paks. The paper illustrates case studies from the reliability analysis for NPP Paks. It also investigates the solutions for the connection between the data acquisition and subsystem control units (TSs) and the voter units (VTs), analyzes the influence of the voting at the VT computer level, and studies the effects of the testing procedures on the dependability parameters. (author)

  6. European-American workshop: Determination of reliability and validation methods on NDE. Proceedings

    International Nuclear Information System (INIS)

    1997-01-01

    The invited papers focused on the following issues: 1. The different technical and scientific approaches to the problem of how to guarantee or demonstrate the reliability of NDE: a. Application of established prescriptive standards, b. Probability of Detection (POD) and Probability of False Alarm (PFA) from blind trials, c. POD and PFA from signal statistics, d. Modeling, e. "Technical Justification"; 2. The dissimilar validation/qualification concepts used in different industries in Europe and North America: a. Nuclear Power Generation, b. Aerospace Industry, c. Offshore Industry and d. Service Companies

  7. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Directory of Open Access Journals (Sweden)

    Yury Fedotov

    2016-01-01

    Full Text Available Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by a YAG:Nd laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. The results indicate that this fluorescence ratio can be considered a reliable characteristic of the plant stress state.

  8. The effect of DLC-coating deposition method on the reliability and mechanical properties of abutment's screws.

    Science.gov (United States)

    Bordin, Dimorvan; Coelho, Paulo G; Bergamo, Edmara T P; Bonfante, Estevam A; Witek, Lukasz; Del Bel Cury, Altair A

    2018-04-10

    To characterize the mechanical properties of different methods for coating DLC (diamond-like carbon) onto dental implant abutment screws, and their effect on the probability of survival (reliability). Seventy-five abutment screws were allocated into three groups according to the coating method: control (no coating); UMS - DLC applied through unbalanced magnetron sputtering; and RFPA - DLC applied through radio-frequency plasma activation (n = 25/group). Twelve screws (n = 4/group) were used to determine hardness and Young's modulus (YM). A 3D finite element model composed of a titanium substrate, a DLC layer and a counterpart was constructed. The deformation (μm) and shear stress (MPa) were calculated. The remaining screws of each group were torqued into external hexagon abutments and subjected to step-stress accelerated life testing (SSALT) (n = 21/group). The probability Weibull curves and reliability (probability of survival) were calculated considering missions of 100, 150 and 200 N at 50,000 and 100,000 cycles. DLC-coated experimental groups evidenced higher hardness than the control (p < 0.05). The Weibull shape parameter was greater than 1, indicating that fatigue contributed to failure. High reliability was depicted at a mission of 100 N. At 200 N a significant decrease in reliability was detected for all groups (ranging from 39% to 66%). No significant difference was observed among groups regardless of mission. Screw fracture was the chief failure mode. DLC coatings have been used to improve titanium's mechanical properties and to increase the reliability of dental implant-supported restorations. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.

  9. Rapid and Reliable HPLC Method for the Simultaneous Determination of Dihydroxyacetone, Methylglyoxal and 5-Hydroxymethylfurfural in Leptospermum Honeys.

    Directory of Open Access Journals (Sweden)

    Matthew Pappalardo

    Full Text Available A reliable determination of dihydroxyacetone, methylglyoxal and 5-hydroxymethylfurfural is essential to establishing the commercial value and antimicrobial potential of honeys derived from the Leptospermum species endemic to Australia and New Zealand. We report a robust method for quantitation of all three compounds in a single HPLC run. Honey samples (n = 6) derivatized with O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine were quantitated against a stable anisole internal standard. Linear regression analysis was performed using calibration standards for each compound (n = 6), and the results indicated a high degree of accuracy (R2 = 0.999) for this method. The reliability of some commercial methylglyoxal solutions was found to be questionable. Effective quantitation of methylglyoxal content in honey is critical for researchers and industry, and the use of some commercial standards may bias data. Two accurate methylglyoxal standards are proposed, including a commercial standard and a derivative that can be prepared within the laboratory.

  10. Note: An online testing method for lifetime projection of high power light-emitting diode under accelerated reliability test.

    Science.gov (United States)

    Chen, Qi; Chen, Quan; Luo, Xiaobing

    2014-09-01

    In recent years, due to the fast development of high-power light-emitting diodes (LEDs), lifetime prediction and assessment have become a crucial issue. Although in situ measurement has been widely used for reliability testing in the laser diode community, it has not been applied commonly in the LED community. In this paper, an online testing method for LED lifetime projection under accelerated reliability testing was proposed and a prototype was built. The optical parametric data were collected. The systematic error and the measuring uncertainty were calculated to be within 0.2% and within 2%, respectively. With this online testing method, experimental data can be acquired continuously and a sufficient amount of data can be gathered. Thus, the projection fitting accuracy can be improved (r2 = 0.954) and the testing duration can be shortened.

  11. Application of SAW method for multiple-criteria comparative analysis of the reliability of heat supply organizations

    Science.gov (United States)

    Akhmetova, I. G.; Chichirova, N. D.

    2016-12-01

    Heat supply is the most energy-consuming sector of the economy. Approximately 30% of all primary fuel-and-energy resources used is spent on municipal heat-supply needs. One of the key indicators of the activity of heat-supply organizations is the reliability of an energy facility. The reliability index of a heat-supply organization is of interest to potential investors for assessing risks when investing in projects. The reliability indices established by federal legislation are actually reduced to a single numerical factor, which depends on the number of heat-supply outages connected with disturbances in the operation of heat networks and the volume of their resource recovery in the calculation year. This factor is rather subjective and may change over a wide range during several years. A technique is proposed for evaluating the reliability of heat-supply organizations with the use of the simple additive weighting (SAW) method. The technique for integrated-index determination satisfies the following conditions: the reliability level of the evaluated heat-supply system is represented as fully and objectively as possible, and the information used for the reliability-index evaluation is easily available (it is published on the Internet in accordance with the demands of data-disclosure standards). For the reliability estimation of heat-supply organizations, the following indicators were selected: the wear of equipment of thermal energy sources; the wear of heat networks; the number of thermal energy (heat carrier) supply outages due to technological disturbances on heat networks, per 1 km of heat networks; the number of thermal energy (heat carrier) supply outages due to technological disturbances at thermal energy sources, per 1 Gcal/h of installed power; the share of expenditures in the cost of thermal energy aimed at recovery of the resource (renewal of fixed assets); the coefficient of renewal of fixed assets; and the coefficient of fixed asset retirement. A versatile program is developed
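
    A minimal sketch of the simple additive weighting step itself: each indicator is normalized (benefit criteria as x/max, cost criteria as min/x), weighted, and summed. The indicator set mirrors the abstract, but none of the values or weights below come from the paper.

```python
import numpy as np

# rows = heat-supply organizations; columns = four illustrative indicators
# (equipment wear %, network wear %, outages per km, renewal coefficient)
X = np.array([
    [55.0, 60.0, 0.8, 0.12],
    [40.0, 45.0, 0.3, 0.20],
    [70.0, 65.0, 1.1, 0.05],
])
weights = np.array([0.3, 0.3, 0.2, 0.2])          # invented weights, sum to 1
benefit = np.array([False, False, False, True])   # True = higher is better

# SAW normalization: benefit criteria as x/max, cost criteria as min/x
norm = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
scores = norm @ weights
print("reliability scores:", np.round(scores, 3))
print("ranking (best first):", np.argsort(-scores))
```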

  12. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst

  13. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real-world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems, which would otherwise entail potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors' knowledge). The estimation accuracy and precision of RESTART depend strongly on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
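
    RESTART itself involves trajectory splitting and weighting machinery that the abstract only names; the sketch below shows the simpler fixed-effort multilevel-splitting idea it builds on, with a biased random walk standing in for the paper's hybrid system dynamics and fixed thresholds playing the role of the importance function levels. All numbers are illustrative.

```python
import random

# Rare event: a downward-biased random walk reaches level 8 before returning
# to 0. The probability is estimated as a product of conditional stage
# probabilities, one per importance-function threshold.
P_UP = 0.4                   # downward bias makes climbing rare
THRESHOLDS = [2, 4, 6, 8]    # importance levels; the event is "reach 8"
N_PER_STAGE = 10_000

def reaches(target, start):
    """Walk from 'start' until it hits 'target' (True) or 0 (False)."""
    x = start
    while 0 < x < target:
        x += 1 if random.random() < P_UP else -1
    return x == target

random.seed(1)
estimate, start = 1.0, 1
for level in THRESHOLDS:
    hits = sum(reaches(level, start) for _ in range(N_PER_STAGE))
    estimate *= hits / N_PER_STAGE   # P(reach this level | reached previous)
    start = level                    # next stage restarts from this level
print(f"P(reach {THRESHOLDS[-1]} before 0) ~ {estimate:.2e}")
```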

  14. Rapid methods for dioxin and dioxin-like PCBs in food and feedingstuffs. State of the art

    Energy Technology Data Exchange (ETDEWEB)

    Behnisch, P.A. [eurofins-GfA, Muenster (Germany); Hoogenboom, R. [RIKILT-Institute of Food Safety, Wageningen (Netherlands)

    2004-09-15

    The increasing number of local dioxin crises since 2002, which have become more and more apparent due to stricter controls of feed and food in the European Union, and the globally increasing number of countries applying similar guidelines make it necessary to establish reliable, time- and cost-effective screening methods for dioxin intake through nutritional pathways. Five years after the last overview presentation of the different bio-analytical detection methods (BDMs) and the establishment of quality guidelines for screening methods, the time has come to include the improvements in chemical methods to speed up the analysis as well. This review gives an overview of the state-of-the-art improvements and a future outlook for both approaches, chemical and bio-analytical, for rapid analyses of dioxins and dioxin-like compounds. Several new avenues of improvement are currently in the research and testing pipeline, such as PCR and proteomic biomarkers and, in the case of the clean-up, ASE and PowerPrep, as well as different detection methods and other indicators for dioxins (e.g. correlations to fatty acids).

  15. Reliability Analysis of Operation for Cableways by FTA (Fault Tree Analysis Method

    Directory of Open Access Journals (Sweden)

    Sergej Težak

    2010-05-01

    Full Text Available This paper examines the reliability of the operation of cableway systems in Slovenia, which has a major impact on the quality of service in mountain tourism, mainly in wintertime. Different types of cableway installations in Slovenia were captured in a sample, and fault tree analysis (FTA) was performed on the basis of the obtained data. The paper presents the results of the analysis. With these results it is possible to determine the probability of faults for different types of cableways, which types of faults have the greatest impact on the termination of operation, which components of cableways fail most often, and what the impact of cableway age is on the occurrence of faults. Finally, an attempt was made to establish whether the occurrence of faults on an individual cableway installation also affects traffic on that cableway due to reduced quality of service. KEYWORDS: cableways, aerial ropeways, chairlifts, ski-tows, quality, faults, fault tree analysis, reliability, service quality, winter tourism, mountain tourist centre
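
    A fault tree combines basic-event probabilities through AND/OR gates into a top-event probability. Below is a minimal sketch with an invented cableway tree and invented probabilities; the paper's actual tree and data are not reproduced in the abstract.

```python
# Gate formulas for independent basic events.
def p_or(*ps):   # at least one input event occurs
    prob = 1.0
    for p in ps:
        prob *= (1.0 - p)
    return 1.0 - prob

def p_and(*ps):  # all input events occur
    prob = 1.0
    for p in ps:
        prob *= p
    return prob

# Hypothetical annual failure probabilities of basic events.
drive_motor   = 0.02
backup_motor  = 0.05
rope_fault    = 0.001
control_fault = 0.01

# Operation stops if the rope or the controls fail, or if BOTH motors fail.
top = p_or(rope_fault, control_fault, p_and(drive_motor, backup_motor))
print(f"probability of operation stoppage: {top:.4f}")
```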

  16. A fast and reliable method for simultaneous waveform, amplitude and latency estimation of single-trial EEG/MEG data.

    Directory of Open Access Journals (Sweden)

    Wouter D Weeda

    Full Text Available The amplitude and latency of single-trial EEG/MEG signals may provide valuable information concerning human brain functioning. In this article we propose a new method to reliably estimate the single-trial amplitude and latency of EEG/MEG signals. The advantages of the method are fourfold. First, no a priori specified template function is required. Second, the method allows for multiple signals that may vary independently in amplitude and/or latency. Third, the method is less sensitive to noise as it models data with a parsimonious set of basis functions. Finally, the method is very fast since it is based on an iterative linear least squares algorithm. A simulation study shows that the method yields reliable estimates under different levels of latency variation and signal-to-noise ratios. Furthermore, it shows that the existence of multiple signals can be correctly determined. An application to empirical data from a choice reaction time study indicates that the method describes these data accurately.

  17. Reliable clarity automatic-evaluation method for optical remote sensing images

    Science.gov (United States)

    Qin, Bangyong; Shang, Ren; Li, Shengyang; Hei, Baoqin; Liu, Zhiwen

    2015-10-01

    Image clarity, which reflects the sharpness degree at the edges of objects in images, is an important quality-evaluation index for optical remote sensing images. Scholars at home and abroad have done a lot of work on the estimation of image clarity. At present, common clarity-estimation methods for digital images mainly include frequency-domain function methods, statistical parametric methods, gradient function methods and edge acutance methods. The frequency-domain function method is an accurate clarity-measure approach; however, its calculation process is complicated and cannot be carried out automatically. Statistical parametric methods and gradient function methods are both sensitive to the clarity of images, but their results are easily affected by the complexity of the image content. The edge acutance method is an effective approach to clarity estimation, but it requires picking out the edges manually. Due to these limits in accuracy, consistency or automation, the existing methods are not applicable to quality evaluation of optical remote sensing images. In this article, a new clarity-evaluation method, based on the principle of the edge acutance algorithm, is proposed. In the new method, an edge detection algorithm and a gradient search algorithm are adopted to automatically search for the object edges in images. Moreover, the calculation algorithm for edge sharpness has been improved. The new method has been tested with several groups of optical remote sensing images. Compared with the existing automatic evaluation methods, the new method performs better in both accuracy and consistency. Thus, the new method is an effective clarity-evaluation method for optical remote sensing images.
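
    The abstract describes the improved edge-acutance algorithm only qualitatively; the sketch below shows the general gradient-magnitude idea such clarity scores build on, with a simple relative threshold standing in for the paper's automatic edge search. The toy data and the threshold value are invented.

```python
import numpy as np

def clarity_score(img, rel_thresh=0.5):
    """Average gradient magnitude over the strongest-edge pixels,
    an edge-acutance-style sharpness measure."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    edges = mag[mag >= rel_thresh * mag.max()]   # keep only sharp-edge pixels
    return edges.mean()

# Toy check: an abrupt step edge scores higher than a gradual ramp.
sharp = np.tile(np.r_[np.zeros(32), np.full(32, 255.0)], (64, 1))
soft = np.tile(np.interp(np.arange(64), [24, 40], [0.0, 255.0]), (64, 1))
print(clarity_score(sharp) > clarity_score(soft))   # True
```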

  18. Application of Distribution-free Methods of Study for Identifying the Degree of Reliability of Ukrainian Banks

    Directory of Open Access Journals (Sweden)

    Burkina Natalia V.

    2014-03-01

    Full Text Available Bank ratings are integral elements of the information infrastructure that ensures sound development of the banking business. One of the key issues that clients of banking structures worry about is identifying the degree of reliability of, and trust in, a bank. As of now there are no commonly accepted methods of bank rating, and the issue of bank reliability remains problematic. The article considers data envelopment analysis (DEA), a modern method of economic and mathematical analysis that has become a popular instrument in foreign econometric studies for assessing the quality of services of different subjects. The article demonstrates the application of the DEA method for developing new bank ratings and identifies input and output indicators for building a DEA model as applied to the Ukrainian banking system. The authors also discuss some methodological problems that might appear when applying component indicators for ranking the subjects, and offer methods for their elimination.
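
    The DEA efficiency of each bank solves a small linear program. Below is a minimal input-oriented CCR sketch (envelopment form) with invented bank data; the abstract names input and output indicators for the Ukrainian banking system but does not publish their values.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Efficiency of DMU o: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o,
    lam >= 0 (constant returns to scale)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam]
    A_in = np.hstack([-X[:, [o]], X])           # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y@lam <= -y_o
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.fun

X = np.array([[20, 35, 50], [4.0, 6.5, 8.0]])   # invented inputs per bank
Y = np.array([[120, 210, 250]])                  # invented output per bank
for o in range(X.shape[1]):
    print(f"bank {o}: efficiency = {dea_efficiency(X, Y, o):.3f}")
```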

  19. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

    2017-01-01

    Highlights: • A complete optimization process is established for digital human-computer interfaces of NPPs. • A quick-convergence search method is proposed. • The authors propose an affinity error probability mapping function to test human reliability. - Abstract: This is the second in a series of papers describing optimal design methods for a digital human-computer interface of a nuclear power plant (NPP) from three different perspectives based on human reliability. The purpose of this series is to explore different optimization methods from varying perspectives. This paper mainly discusses the optimal design method for the quantity of components of the same factor. In the monitoring process, the quantity of components places a heavy burden on operators; thus, human errors are easily triggered. To solve this problem, the authors propose an optimization process, a quick-convergence search method and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained by experiments. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick-convergence search method for fuzzy segments divided by component quantity performs better than a general algorithm.

  20. Testing the reliability and efficiency of the pilot Mixed Methods Appraisal Tool (MMAT) for systematic mixed studies review.

    Science.gov (United States)

    Pace, Romina; Pluye, Pierre; Bartlett, Gillian; Macaulay, Ann C; Salsberg, Jon; Jagosh, Justin; Seller, Robbyn

    2012-01-01

    Systematic literature reviews identify, select, appraise, and synthesize relevant literature on a particular topic. Typically, these reviews examine primary studies based on similar methods, e.g., experimental trials. In contrast, interest in a new form of review, known as the mixed studies review (MSR), which includes qualitative, quantitative, and mixed methods studies, is growing. In MSRs, reviewers appraise studies that use different methods, allowing them to obtain in-depth answers to complex research questions. However, appraising the quality of studies with different methods remains challenging. To facilitate systematic MSRs, a pilot Mixed Methods Appraisal Tool (MMAT) has been developed at McGill University (a checklist and a tutorial), which can be used to concurrently appraise the methodological quality of qualitative, quantitative, and mixed methods studies. The purpose of the present study is to test the reliability and efficiency of a pilot version of the MMAT. The Center for Participatory Research at McGill conducted a systematic MSR on the benefits of Participatory Research (PR). Thirty-two PR evaluation studies were appraised by two independent reviewers using the pilot MMAT. Among these, 11 (34%) involved nurses as researchers or research partners. Appraisal time was measured to assess efficiency. Inter-rater reliability was assessed by calculating a kappa statistic based on dichotomized responses for each criterion. An appraisal score was determined for each study, which allowed the calculation of an overall intra-class correlation. On average, it took 14 min to appraise a study (excluding the initial reading of articles). Agreement between reviewers was moderate to perfect with regard to MMAT criteria, and substantial with respect to the overall quality score of appraised studies. The MMAT is unique; thus the reliability of the pilot MMAT is promising and encourages further development. Copyright © 2011 Elsevier Ltd. All rights reserved.
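
    Inter-rater agreement on dichotomized criterion ratings, as reported here, is typically computed as Cohen's kappa: observed agreement corrected for the agreement expected by chance from each reviewer's marginal proportions. A minimal sketch with invented ratings:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical ratings of the same items."""
    n = len(r1)
    categories = set(r1) | set(r2)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    # chance agreement from each rater's marginal proportions
    p_exp = sum((r1.count(k) / n) * (r2.count(k) / n) for k in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# invented dichotomized MMAT ratings (1 = criterion met) for ten studies
reviewer1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
reviewer2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
print(f"kappa = {cohens_kappa(reviewer1, reviewer2):.2f}")
```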

  1. Power Cycling Test Method for Reliability Assessment of Power Device Modules in Respect to Temperature Stress

    DEFF Research Database (Denmark)

    Choi, Ui-Min; Blaabjerg, Frede; Jørgensen, Søren

    2018-01-01

    Power cycling testing is one of the important tasks in investigating the reliability performance of power device modules with respect to temperature stress. From this, it is possible to predict the lifetime of a component in power converters. In this paper, representative power cycling test circuits, measurement circuits for wear-out failure indicators, and measurement strategies for different power cycling test circuits are discussed in order to provide the current state of knowledge of this topic by organizing and evaluating the current literature. In the first section of this paper, the structure of a conventional power device module and its related wear-out failure mechanisms with degradation indicators are discussed. Then, representative power cycling test circuits are introduced. Furthermore, on-state collector-emitter voltage (VCE,ON) and forward voltage (VF) measurement circuits for wear-out condition monitoring are discussed.

  2. Reliability and Discriminative Ability of a New Method for Soccer Kicking Evaluation

    Science.gov (United States)

    Radman, Ivan; Wessner, Barbara; Bachl, Norbert; Ruzic, Lana; Hackl, Markus; Baca, Arnold; Markovic, Goran

    2016-01-01

    The study aimed to evaluate the test–retest reliability of the newly developed 356 Soccer Shooting Test (356-SST), and the discriminative ability of this test with respect to the soccer players' proficiency level and leg dominance. Sixty-six male soccer players, divided into three groups based on their proficiency level (amateur, n = 24; novice semi-professional, n = 18; and experienced semi-professional players, n = 24), performed 10 kicks following a two-step run-up. Forty-eight of them repeated the test on a separate day. The following shooting variables were derived: ball velocity (BV; measured via radar gun), shooting accuracy (SA; average distance from the ball-entry point to the goal centre), and shooting quality (SQ; shooting accuracy divided by the time elapsed from hitting the ball to the point of entry). No systematic bias was evident in the selected shooting variables (SA: 1.98 ± 0.65 vs. 2.00 ± 0.63 m; BV: 24.6 ± 2.3 vs. 24.5 ± 1.9 m/s; SQ: 2.92 ± 1.0 vs. 2.93 ± 1.0 m/s; all p > 0.05). The intra-class correlation coefficients were high (ICC = 0.70–0.88), and the coefficients of variation were low (CV = 5.3–5.4%). Finally, all three 356-SST variables identify, with adequate sensitivity, differences in soccer shooting ability with respect to the players' proficiency and leg dominance. The results suggest that the 356-SST is a reliable and sensitive test of specific shooting ability in men's soccer. Future studies should test the validity of these findings in a fatigued state, as well as in other populations. PMID:26812247

  3. An enquiry into the method of paired comparison: reliability, scaling, and Thurstone's Law of Comparative Judgment

    Science.gov (United States)

    Thomas C. Brown; George L. Peterson

    2009-01-01

    The method of paired comparisons is used to measure individuals' preference orderings of items presented to them as discrete binary choices. This paper reviews the theory and application of the paired comparison method, describes a new computer program available for eliciting the choices, and presents an analysis of methods for scaling paired choice data to...
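
    Under Thurstone's Case V of the Law of Comparative Judgment, paired-choice proportions are mapped through the inverse normal CDF and averaged per item to obtain interval-scale values. A minimal sketch with an invented proportion matrix:

```python
import numpy as np
from statistics import NormalDist

# P[i, j] = proportion of respondents choosing item j over item i (invented)
P = np.array([
    [0.50, 0.70, 0.85],
    [0.30, 0.50, 0.65],
    [0.15, 0.35, 0.50],
])

# clip away 0/1 proportions, which would map to infinite z-scores
z = np.vectorize(NormalDist().inv_cdf)(np.clip(P, 0.01, 0.99))
scale = z.mean(axis=0)          # Case V scale value of each item
scale -= scale.min()            # anchor the lowest item at zero
print("scale values:", np.round(scale, 2))
```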

  4. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    Science.gov (United States)

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters used to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium-sensitized analysis methods were calculated by different methods, and the results were compared with the sensitivity parameter [lower limit of quantification (LLOQ)] of the U.S. Food and Drug Administration guidelines. The details of the calibration curves and the standard deviations of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of the LOD and LOQ values calculated by the various methods with the LLOQ shows a considerable difference. This significant difference between the LOD and LOQ calculated by various methods and the LLOQ should be considered when evaluating the sensitivity of spectroscopic methods.
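
    One of the common calculation routes compared in such studies is the calibration-curve approach, LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of blank responses and S the calibration slope. A minimal sketch with invented data:

```python
import numpy as np

conc = np.array([0.5, 1, 2, 5, 10, 20])          # standard concentrations
signal = np.array([12, 24, 47, 121, 238, 482])   # instrument responses
blanks = np.array([1.8, 2.3, 1.5, 2.1, 1.9, 2.4])  # blank-sample responses

S, intercept = np.polyfit(conc, signal, 1)       # calibration slope S
sigma = blanks.std(ddof=1)                       # SD of the blanks

print(f"LOD = {3.3 * sigma / S:.3f}, LOQ = {10 * sigma / S:.3f} (conc. units)")
```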

  5. Method for a reliable activation calculation of core components; Methode zur zuverlaessigen Berechnung von Aktivierungen in Kernbauteilen

    Energy Technology Data Exchange (ETDEWEB)

    Mispagel, T.; Phlippen, P.W.; Rose, J. [Wissenschaftlich-Technische Ingenieurberatung GmbH (WTI), Juelich (Germany)

    2013-07-01

    During nuclear power plant operation, components and materials are exposed to the neutron flux from the reactor core and radionuclides are produced. After removal of the fuel elements, the radioactivity of these radionuclides in the reactor pressure vessel and the core internals provides more than 99% of the activity of the power plant. For the transport, interim storage and final disposal of these radioactive components, the radioactive inventories have to be declared with respect to radiation and nuclides. The declaration of the nuclide and activity inventories requires a reliable calculation of the neutron-induced activation of reactor components. These activation calculations describe the build-up of nuclides due to irradiation and the decay of nuclides. For optimum usage of the activity capacities of the Konrad repository, it is necessary to have a qualified calculation procedure that keeps the conservatism as low as possible.

  6. A new method for improving the reliability of fracture toughness surveillance of nuclear pressure vessel by neutron irradiated embrittlement

    International Nuclear Information System (INIS)

    Zhang Xinping; Shi Yaowu

    1992-01-01

    In order to obtain more information from neutron-irradiated surveillance specimens and to raise the reliability of fracture toughness surveillance tests, it is of particular significance to re-use the broken Charpy-size specimens that have already been tested in surveillance programmes. In this work, through redesign and reuse of Charpy-size specimens, 9 fracture toughness data can be gained from one pre-cracked, side-grooved Charpy-size specimen, whereas at present usually only 1 to 3 fracture toughness data can be obtained from one Charpy-size specimen. Thus, the new method would obviously improve the reliability of fracture toughness surveillance testing and evaluation. Some factors which affect the reasonable design of the pre-cracked, deep side-grooved Charpy-size compound specimen are discussed.

  7. A study of digital hardware architectures for nuclear reactors protection systems applications - reliability and safety analysis methods

    International Nuclear Information System (INIS)

    Benko, Pedro Luiz

    1997-01-01

    A study of digital hardware architectures for protection systems of nuclear reactors is presented, including experience in many countries, topologies and solutions for interface circuits. A method for developing digital system architectures based on fault-tolerance and safety requirements is proposed, and directives for assessing such conditions are suggested. The techniques and most common tools employed in reliability and safety evaluation and in the modeling of hardware architectures are also presented. Markov chain modeling is used to evaluate the reliability of redundant architectures. In order to assess software quality, several mechanisms to be used in design, specification, and validation and verification (V and V) procedures are suggested. A digital protection system architecture has been analyzed as a case study. (author)
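
    A minimal sketch of the kind of Markov-chain reliability evaluation mentioned above, for a hypothetical duplicated (1-out-of-2) protection channel with repair; the states, rates and mission times are invented, not taken from the thesis.

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-2       # per-hour failure and repair rates (assumed)
# states: 2 channels working, 1 working, 0 working (absorbing system failure)
Q = np.array([
    [-2 * lam,        2 * lam,  0.0],
    [      mu, -(mu + lam),     lam],
    [     0.0,         0.0,     0.0],   # absorbing failure state
])

p0 = np.array([1.0, 0.0, 0.0])          # start with both channels healthy
for t in (1e3, 1e4, 1e5):               # mission times in hours
    p = p0 @ expm(Q * t)                # transient state probabilities p(t)
    print(f"t = {t:8.0f} h  ->  unreliability = {p[2]:.3e}")
```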

  8. A simple method of measuring tibial tubercle to trochlear groove distance on MRI: description of a novel and reliable technique.

    Science.gov (United States)

    Camp, Christopher L; Heidenreich, Mark J; Dahm, Diane L; Bond, Jeffrey R; Collins, Mark S; Krych, Aaron J

    2016-03-01

    Tibial tubercle-trochlear groove (TT-TG) distance is a variable that helps guide surgical decision-making in patients with patellar instability. The purpose of this study was to compare the accuracy and reliability of an MRI TT-TG measuring technique using a simple external alignment method with those of a previously validated gold-standard technique that requires advanced software read by radiologists. TT-TG was calculated by MRI on 59 knees with a clinical diagnosis of patellar instability, in a blinded and randomized fashion, by two musculoskeletal radiologists using advanced software and by two orthopaedists using the study technique, which utilizes measurements taken on a simple electronic imaging platform. Inter-rater reliability between the two radiologists and the two orthopaedists, and inter-method reliability between the two techniques, were calculated using intraclass correlation coefficients (ICC) and concordance correlation coefficients (CCC). ICC and CCC values greater than 0.75 were considered to represent excellent agreement. The mean TT-TG distance was 14.7 mm (standard deviation (SD) 4.87 mm) and 15.4 mm (SD 5.41) as measured by the radiologists and orthopaedists, respectively. Excellent interobserver agreement was noted between the radiologists (ICC 0.941; CCC 0.941), the orthopaedists (ICC 0.978; CCC 0.976), and the two techniques (ICC 0.941; CCC 0.933). The simple TT-TG distance measurement technique analysed in this study resulted in excellent agreement and reliability as compared with the gold-standard technique. This method can predictably be performed by orthopaedic surgeons without advanced radiologic software. Level of evidence: II.

  9. The quadrant method measuring four points is as a reliable and accurate as the quadrant method in the evaluation after anatomical double-bundle ACL reconstruction.

    Science.gov (United States)

    Mochizuki, Yuta; Kaneko, Takao; Kawahara, Keisuke; Toyoda, Shinya; Kono, Norihiko; Hada, Masaru; Ikegami, Hiroyasu; Musha, Yoshiro

    2017-11-20

    The quadrant method was described by Bernard et al., and it has been widely used for postoperative evaluation of anterior cruciate ligament (ACL) reconstruction. The purpose of this research is to further develop the quadrant method by measuring four points, which we named the four-point quadrant method, and to compare it with the quadrant method. Three-dimensional computed tomography (3D-CT) analyses were performed in 25 patients who underwent double-bundle ACL reconstruction using the outside-in technique. The four points in this study's quadrant method were defined as point 1 (highest), point 2 (deepest), point 3 (lowest), and point 4 (shallowest) in the femoral tunnel position. The depth and height at each point were measured. The antero-medial (AM) tunnel is (depth 1, height 2) and the postero-lateral (PL) tunnel is (depth 3, height 4) in this four-point quadrant method. The 3D-CT images were evaluated independently by 2 orthopaedic surgeons. A second measurement was performed by both observers after a 4-week interval. Intra- and inter-observer reliability was calculated by means of the intra-class correlation coefficient (ICC). The accuracy of the method was also evaluated against the quadrant method. Intra-observer reliability was almost perfect for both the AM and PL tunnels (ICC > 0.81). Inter-observer reliability of the AM tunnel was substantial (ICC > 0.61) and that of the PL tunnel was almost perfect (ICC > 0.81). The AM tunnel position was 0.13% deeper and 0.58% higher, and the PL tunnel position 0.01% shallower and 0.13% lower, compared with the quadrant method. The four-point quadrant method was found to have high intra- and inter-observer reliability and accuracy. This method can evaluate the tunnel position regardless of the shape and morphology of the bone tunnel aperture and can provide measurements that can be compared across various reconstruction methods. The four-point quadrant method of this study is considered to have clinical relevance in that it is a detailed and accurate tool for

  10. The treatment of commission errors in first generation human reliability analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves da, E-mail: bayout@cnen.gov.br, E-mail: rfonseca@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Melo, Paulo Fernando Frutuoso e, E-mail: frutuoso@nuclear.ufrj.br [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

    Human errors in human reliability analysis can be classified generically as errors of omission and errors of commission. Errors of omission are related to the omission of a human action that should have been performed but did not occur. Errors of commission are related to human actions that should not be performed but in fact are performed. Both involve specific types of cognitive error mechanisms; however, errors of commission are more difficult to model because they are characterized by non-anticipated actions that are performed instead of others that are omitted (errors of omission), or that enter an operational task without being part of the normal sequence of that task. The identification of actions that are not supposed to occur depends on the operational context, which will influence or facilitate certain unsafe actions of the operator depending on the performance of its parameters and variables. The survey of operational contexts and associated unsafe actions is a characteristic of second-generation models, unlike first-generation models. This paper discusses how first-generation models can treat errors of commission in the detection, diagnosis, decision-making and implementation steps of human information processing, particularly with the use of THERP error-quantification tables. (author)

  11. A Method for the Preparation of Chicken Liver Pâté that Reliably Destroys Campylobacters

    Science.gov (United States)

    Hutchison, Mike; Harrison, Dawn; Richardson, Ian; Tchórzewska, Monika

    2015-01-01

    This study devised a protocol for the manufacture of commercial quantities of chicken liver pâté that reliably destroyed campylobacters. A literature search identified 40 pâté manufacture recipes. Recipe stages with a potential to be antimicrobial were assembled to form a new protocol that included washing with organic acid, freeze-thaw and flambé in alcohol. Naturally-contaminated, high-risk livers were obtained from clearance flocks at slaughter and the effect of each stage of the protocol on Campylobacter populations was determined. Organic acid washing changed the color of the liver surfaces. However, there were no significant differences between liver surface color changes when a range of concentrations of lactic acid and ethanoic acid washes were compared by reflective spectrophotometry. A 5% (w/v) acid wash reduced numbers of indigenous campylobacters by around 1.5 log10 CFU/g for both acids. The use of a Bain Marie was found to more reproducibly apply heat compared with pan-frying. Antimicrobial recipe stages reduced the numbers of campylobacters, but not significantly if thermal processing was ineffective. Cooking to 63°C was confirmed to be a critical control point for campylobacters cooked in a Bain Marie. Organoleptic and sensory assessment of pâté determined an overall preference for pâté made from frozen livers. PMID:25927478

  12. Is epicardial adipose tissue, assessed by echocardiography, a reliable method for visceral adipose tissue prediction?

    Science.gov (United States)

    Silaghi, Alina Cristina; Poantă, Laura; Valea, Ana; Pais, Raluca; Silaghi, Horatiu

    2011-03-01

    Epicardial adipose tissue is an ectopic fat storage at the heart surface in direct contact with the coronary arteries. It is considered a metabolically active tissue, being a local source of pro-inflammatory factors that contribute to the pathogenesis of coronary artery disease. The aim of our study was to establish correlations between the echocardiographic assessment of epicardial adipose tissue and anthropometric and ultrasound measurements of the central and peripheral fat depots. The study was conducted on 22 patients with or without coronaropathy. Epicardial adipose tissue was measured using an Aloka Prosound α 10 machine with a 3.5-7.5 MHz variable-frequency transducer, and subcutaneous and visceral fat with an Esaote Megas GPX machine and a 3.5-7.5 MHz variable-frequency transducer. Epicardial adipose tissue measured by echocardiography is correlated with waist circumference (p < 0.05) and visceral adipose tissue thickness measured by ultrasonography (US), and is not correlated with body mass index (p = 0.315), hip and thigh circumference, or subcutaneous fat thickness measured by US. Our study confirms that US assessment of epicardial fat correlates with anthropometric and US measurements of the central fat, representing an indirect but reliable marker of the visceral fat.

  13. A Method for the Preparation of Chicken Liver Pâté that Reliably Destroys Campylobacters.

    Science.gov (United States)

    Hutchison, Mike; Harrison, Dawn; Richardson, Ian; Tchórzewska, Monika

    2015-04-28

    This study devised a protocol for the manufacture of commercial quantities of chicken liver pâté that reliably destroyed campylobacters. A literature search identified 40 pâté manufacture recipes. Recipe stages with a potential to be antimicrobial were assembled to form a new protocol that included washing with organic acid, freeze-thaw and flambé in alcohol. Naturally-contaminated, high-risk livers were obtained from clearance flocks at slaughter and the effect of each stage of the protocol on Campylobacter populations was determined. Organic acid washing changed the color of the liver surfaces. However, there were no significant differences between liver surface color changes when a range of concentrations of lactic acid and ethanoic acid washes were compared by reflective spectrophotometry. A 5% (w/v) acid wash reduced numbers of indigenous campylobacters by around 1.5 log₁₀ CFU/g for both acids. The use of a Bain Marie was found to more reproducibly apply heat compared with pan-frying. Antimicrobial recipe stages reduced the numbers of campylobacters, but not significantly if thermal processing was ineffective. Cooking to 63°C was confirmed to be a critical control point for campylobacters cooked in a Bain Marie. Organoleptic and sensory assessment of pâté determined an overall preference for pâté made from frozen livers.

  14. A Method for the Preparation of Chicken Liver Pâté that Reliably Destroys Campylobacters

    Directory of Open Access Journals (Sweden)

    Mike Hutchison

    2015-04-01

    Full Text Available This study devised a protocol for the manufacture of commercial quantities of chicken liver pâté that reliably destroyed campylobacters. A literature search identified 40 pâté manufacture recipes. Recipe stages with a potential to be antimicrobial were assembled to form a new protocol that included washing with organic acid, freeze-thaw and flambé in alcohol. Naturally-contaminated, high-risk livers were obtained from clearance flocks at slaughter and the effect of each stage of the protocol on Campylobacter populations was determined. Organic acid washing changed the color of the liver surfaces. However, there were no significant differences between liver surface color changes when a range of concentrations of lactic acid and ethanoic acid washes were compared by reflective spectrophotometry. A 5% (w/v) acid wash reduced numbers of indigenous campylobacters by around 1.5 log10 CFU/g for both acids. The use of a Bain Marie was found to more reproducibly apply heat compared with pan-frying. Antimicrobial recipe stages reduced the numbers of campylobacters, but not significantly if thermal processing was ineffective. Cooking to 63°C was confirmed to be a critical control point for campylobacters cooked in a Bain Marie. Organoleptic and sensory assessment of pâté determined an overall preference for pâté made from frozen livers.

  15. Dating of zircon from high-grade rocks: Which is the most reliable method?

    Directory of Open Access Journals (Sweden)

    Alfred Kröner

    2014-07-01

    Full Text Available Magmatic zircon in high-grade metamorphic rocks is often characterized by complex textures, as revealed by cathodoluminescence (CL), that result from multiple episodes of recrystallization, overgrowth, Pb-loss and modifications through fluid-induced disturbances of the crystal structure and the original U-Th-Pb isotopic systematics. Many of these features can be recognized in 2-dimensional CL images, and isotopic analysis of such domains using a high-resolution ion microprobe with only shallow penetration of the zircon surface may be able to reconstruct much of the magmatic and complex post-magmatic history of such grains. In particular, it is generally possible to find original magmatic domains yielding concordant ages. In contrast, destructive techniques such as LA-ICP-MS consume a large volume, leave a deep crater in the target grain, and often sample heterogeneous domains that are not visible and thus often yield discordant results which are difficult to interpret. We provide examples of complex magmatic zircon from a southern Indian granulite terrane where SHRIMP II and LA-ICP-MS analyses are compared. The SHRIMP data are shown to be more precise and reliable, and we caution against the use of LA-ICP-MS in deciphering the chronology of complex zircons from high-grade terranes.

  16. Rapid and reliable high-throughput methods of DNA extraction for use in barcoding and molecular systematics of mushrooms.

    Science.gov (United States)

    Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc

    2010-07-01

    We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent and old (up to 100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore cost, of generating DNA sequences from mushrooms and other fungi vs. traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.

  17. Reliability of different mark-recapture methods for population size estimation tested against reference population sizes constructed from field data.

    Directory of Open Access Journals (Sweden)

    Annegret Grimm

    Full Text Available Reliable estimates of population size are fundamental in many ecological studies and biodiversity conservation. Selecting appropriate methods to estimate abundance is often very difficult, especially if data are scarce. Most studies concerning the reliability of different estimators used simulation data based on assumptions about capture variability that do not necessarily reflect conditions in natural populations. Here, we used data from an intensively studied closed population of the arboreal gecko Gehyra variegata to construct reference population sizes for assessing twelve different population size estimators in terms of bias, precision, accuracy, and their 95%-confidence intervals. Two of the reference populations reflect natural biological entities, whereas the other reference populations reflect artificial subsets of the population. Since individual heterogeneity was assumed, we tested modifications of the Lincoln-Petersen estimator, a set of models in programs MARK and CARE-2, and a truncated geometric distribution. Ranking of methods was similar across criteria. Models accounting for individual heterogeneity performed best in all assessment criteria. For populations from heterogeneous habitats without obvious covariates explaining individual heterogeneity, we recommend using the moment estimator or the interpolated jackknife estimator (both implemented in CAPTURE/MARK. If data for capture frequencies are substantial, we recommend the sample coverage or the estimating equation (both models implemented in CARE-2. Depending on the distribution of catchabilities, our proposed multiple Lincoln-Petersen and a truncated geometric distribution obtained comparably good results. The former usually resulted in a minimum population size and the latter can be recommended when there is a long tail of low capture probabilities. Models with covariates and mixture models performed poorly. Our approach identified suitable methods and extended options to
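
    The simplest member of the tested family is the two-sample Lincoln-Petersen estimator; the abstract's multiple Lincoln-Petersen variant is not specified, so the sketch below shows the classic bias-corrected (Chapman) form with invented capture counts.

```python
def chapman_estimate(n1, n2, m2):
    """n1 animals marked in sample 1, n2 caught in sample 2, m2 recaptures.
    Returns the Chapman-corrected population estimate and its standard error."""
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
           / ((m2 + 1) ** 2 * (m2 + 2)))
    return n_hat, var ** 0.5

n_hat, se = chapman_estimate(n1=60, n2=55, m2=20)   # invented counts
print(f"population size ~ {n_hat:.0f} +/- {1.96 * se:.0f} (95% CI)")
```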

  18. Reliable B cell epitope predictions: impacts of method development and improved benchmarking

    DEFF Research Database (Denmark)

    Kringelum, Jens Vindahl; Lundegaard, Claus; Lund, Ole

    2012-01-01

biomedical applications such as: rational vaccine design, development of disease diagnostics and immunotherapeutics. However, experimental mapping of epitopes is resource intensive, making in silico methods an appealing complementary approach. To date, the reported performance of methods for in silico mapping...... evaluation data set improved from 0.712 to 0.727. Our results thus demonstrate that given proper benchmark definitions, B-cell epitope prediction methods achieve highly significant predictive performances, suggesting these tools to be a powerful asset in rational epitope discovery. The updated version......

  19. Reliability and reproducibility of several methods of arthroscopic assessment of femoral tunnel position during anterior cruciate ligament reconstruction.

    Science.gov (United States)

    Ilahi, Omer A; Mansfield, David J; Urrea, Luis H; Qadeer, Ali A

    2014-10-01

To assess interobserver and intraobserver agreement of estimating anterior cruciate ligament (ACL) femoral tunnel positioning arthroscopically using circular and linear (noncircular) estimation methods and to determine whether overlay template visual aids improve agreement. Standardized intraoperative pictures of femoral tunnel pilot holes (taken with a 30° arthroscope through an anterolateral portal at 90° of knee flexion with horizontal being parallel to the tibial surface) in 27 patients undergoing single-bundle ACL reconstruction were presented to 3 fellowship-trained arthroscopists on 2 separate occasions. On both viewings, each surgeon estimated the femoral tunnel pilot hole location to the nearest half-hour mark using a whole clock face and half clock face, to the nearest 15° using a whole compass and half compass, in the top or bottom half of a linear quadrant, and in the top or bottom half of a linear trisector. Evaluations were performed first without and then with an overlay template of each estimation method. The average difference among reviewers was quite similar for all 4 circular methods with the use of visual aids. Without overlay template visual aids, pair-wise κ statistic values for interobserver agreement ranged from -0.14 to 0.56 for the whole clock face and from 0.16 to 0.42 for the half clock face. With overlay visual guides, interobserver agreement ranged from 0.29 to 0.63 for the whole clock face and from 0.17 to 0.66 for the half clock face. The quadrant method's interobserver agreement ranged from 0.22 to 0.60, and that of the trisection method ranged from 0.17 to 0.57. Neither linear estimation method's reliability uniformly improved with the use of overlay templates. Intraobserver agreement without overlay templates ranged from 0.17 to 0.49 for the whole clock face, 0.11 to 0.47 for the half clock face, 0.01 to 0.66 for the quadrant method, and 0.20 to 0.57 for the trisection method. Use of overlay templates did not uniformly …

  20. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures in both normal and abnormal operating environments.

  1. Quantitative assessment of probability of failing safely for the safety instrumented system using reliability block diagram method

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Zhao, Shoutang; Hu, Bin

    2015-01-01

Highlights: • Models of PFS for the SIS were established using the reliability block diagram. • A more accurate PFS for the SIS can be obtained by using the SL. • Degraded operation of a complex SIS does not affect the availability of the SIS. • The safe undetected failure is the largest contributor to the PFS of the SIS. - Abstract: The spurious trip of a safety instrumented system (SIS) brings great economic losses to production, so ensuring that the SIS is both reliable and available has become a priority. However, the existing models of spurious trip rate (STR) and probability of failing safely (PFS) are oversimplified and inaccurate; more in-depth studies of availability are required to obtain an accurate PFS for the SIS. Based on an analysis of the factors that influence the PFS of the SIS, a quantitative study of the PFS is carried out using the reliability block diagram (RBD) method, and application examples are given. The results show that common cause failure increases the PFS; degraded operation does not affect the availability of the SIS; if the equipment is tested and repaired one by one, the unavailability of the SIS can be ignored; the exposure time for an independent safe undetected failure should be the system lifecycle (SL) rather than the proof test interval; and the independent safe undetected failure is the largest contributor to the PFS of the SIS.
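
    To make the reliability-block-diagram algebra concrete, here is a minimal sketch of how per-channel safe-failure probabilities combine into a system-level PFS, assuming exponential failure times; the failure rate and lifecycle values are hypothetical, not taken from the paper.

```python
import math

def pfs_channel(lambda_safe, exposure_hours):
    """Probability that one channel fails safely within the exposure time.

    Per the abstract, safe *undetected* failures should be exposed over the
    system lifecycle (SL), not just the proof-test interval.
    """
    return 1 - math.exp(-lambda_safe * exposure_hours)

def pfs_any(*probs):
    """System trips if any block trips (series blocks in the spurious-trip RBD)."""
    surviving = 1.0
    for p in probs:
        surviving *= 1 - p
    return 1 - surviving

def pfs_all(*probs):
    """System trips only if every block trips (e.g. a 2oo2 vote)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

sl_hours = 20 * 8760                   # hypothetical 20-year lifecycle
p = pfs_channel(1e-7, sl_hours)        # hypothetical safe failure rate
print(pfs_any(p, p), pfs_all(p, p))    # 1oo2-style vs 2oo2-style PFS
```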

  2. Methods to achieve high interrater reliability in data collection from primary care medical records.

    Science.gov (United States)

    Liddy, Clare; Wiens, Miriam; Hogg, William

    2011-01-01

    We assessed interrater reliability (IRR) of chart abstractors within a randomized trial of cardiovascular care in primary care. We report our findings, and outline issues and provide recommendations related to determining sample size, frequency of verification, and minimum thresholds for 2 measures of IRR: the κ statistic and percent agreement. We designed a data quality monitoring procedure having 4 parts: use of standardized protocols and forms, extensive training, continuous monitoring of IRR, and a quality improvement feedback mechanism. Four abstractors checked a 5% sample of charts at 3 time points for a predefined set of indicators of the quality of care. We set our quality threshold for IRR at a κ of 0.75, a percent agreement of 95%, or both. Abstractors reabstracted a sample of charts in 16 of 27 primary care practices, checking a total of 132 charts with 38 indicators per chart. The overall κ across all items was 0.91 (95% confidence interval, 0.90-0.92) and the overall percent agreement was 94.3%, signifying excellent agreement between abstractors. We gave feedback to the abstractors to highlight items that had a κ of less than 0.70 or a percent agreement less than 95%. No practice had to have its charts abstracted again because of poor quality. A 5% sampling of charts for quality control using IRR analysis yielded κ and agreement levels that met or exceeded our quality thresholds. Using 3 time points during the chart audit phase allows for early quality control as well as ongoing quality monitoring. Our results can be used as a guide and benchmark for other medical chart review studies in primary care.
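
    For readers wanting to reproduce the two IRR measures used above, a minimal sketch of percent agreement and Cohen's κ for two abstractors follows; the ratings are invented.

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of items on which the two abstractors agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_obs = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    p_exp = sum(counts_a[k] * counts_b[k] for k in counts_a) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical chart-indicator ratings from two abstractors.
r1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
r2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(percent_agreement(r1, r2), cohens_kappa(r1, r2))
```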

  3. Isometric hand grip strength measured by the Nintendo Wii Balance Board - a reliable new method.

    Science.gov (United States)

    Blomkvist, A W; Andersen, S; de Bruin, E D; Jorgensen, M G

    2016-02-03

Low hand grip strength is a strong predictor of both long-term and short-term disability and mortality. The Nintendo Wii Balance Board (WBB) is an inexpensive, portable, widespread instrument with the potential for multiple purposes in assessing clinically relevant measures, including muscle strength. The purpose of the study was to explore the intrarater reliability and concurrent validity of the WBB by comparing it to the Jamar hand dynamometer. Intrarater test-retest cohort design with randomized validity testing in the first session. Using custom WBB software, thirty older adults (69.0 ± 4.2 years of age) were studied for reproducibility and concurrent validity compared to the Jamar hand dynamometer. Reproducibility was tested for the dominant and non-dominant hands at the same time of day, one week apart. The intraclass correlation coefficient (ICC), standard error of measurement (SEM) and limits of agreement (LOA) were calculated to describe relative and absolute reproducibility, respectively. To describe concurrent validity, Pearson's product-moment correlation and the ICC were calculated. Reproducibility was high, with ICC values of >0.948 across all measures. Both SEM and LOA were low (0.2-0.5 kg and 2.7-4.2 kg, respectively) in both the dominant and non-dominant hand. For validity, Pearson correlations were high (0.80-0.88) and ICC values were fair to good (0.763-0.803). Reproducibility of the WBB was high for relative measures and acceptable for absolute measures. In addition, concurrent validity between the Jamar hand dynamometer and the WBB was acceptable. Thus, the WBB may be a valid instrument for assessing hand grip strength in older adults.
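
    As an illustration of the absolute-reproducibility statistics reported above, the sketch below computes the SEM and Bland-Altman 95% limits of agreement from test-retest data, given an ICC estimated elsewhere; the grip-strength values are invented.

```python
import numpy as np

def sem_and_loa(test, retest, icc):
    """Standard error of measurement and 95% limits of agreement.

    icc is assumed to come from a separate variance-components model;
    SEM = pooled SD * sqrt(1 - ICC) is one common formulation.
    """
    test, retest = np.asarray(test, float), np.asarray(retest, float)
    pooled_sd = np.concatenate([test, retest]).std(ddof=1)
    sem = pooled_sd * np.sqrt(1 - icc)
    diffs = retest - test
    bias, sd_diff = diffs.mean(), diffs.std(ddof=1)
    return sem, (bias - 1.96 * sd_diff, bias + 1.96 * sd_diff)

# Hypothetical grip-strength measurements (kg), one week apart.
print(sem_and_loa([30.1, 25.4, 28.9, 33.0, 27.2],
                  [30.5, 25.0, 29.3, 32.6, 27.8], icc=0.95))
```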

  4. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    Energy Technology Data Exchange (ETDEWEB)

Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol; Jeong, Dae Hong [Department of Chemistry Education, Seoul National University, Seoul 151-742 (Korea, Republic of)]; Kang, Homan [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of)]; Lee, Yoon-Sik [School of Chemical and Biological Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of)]; Lee, Ho-Young [Department of Nuclear Medicine, Seoul National University Bundang Hospital, Seongnam 463-707 (Korea, Republic of)]

    2015-05-15

Surface-enhanced Raman scattering (SERS) techniques have been widely used for bioanalysis due to their high sensitivity and multiplexing capacity. However, the point-scanning method using a micro-Raman system, which is the most common method in the literature, has the disadvantage of extremely long measurement times for on-chip immunoassays, which combine a large chip area of approximately 1-mm scale with a confocal beam spot of ca. 1-μm size. Alternative methods, such as a sampled spot scan with high confocality and a large-area scan with an enlarged field of view and low confocality, have been used to minimize the measurement time in practice. In this study, we analyzed the two methods with respect to signal-to-noise ratio and sampling-led signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we propose a methodology for fast and reliable quantitative measurement of the whole chip area. The proposed method adopts a raster scan covering the full 100 μm × 100 μm region as a proof-of-concept experiment while accumulating signals in the CCD detector as a single spectrum per frame. A single 10 s scan over the 100 μm × 100 μm area yielded much higher sensitivity than sampled spot scanning measurements and none of the signal fluctuations attributed to a sampled spot scan. This readout method can serve as one of the key technologies that will bring quantitative multiplexed detection and analysis into practice.

  5. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

This book resulted from the activity of Task Force 4.2, 'Human Reliability'. The group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, research institutes, technical control boards and universities, whose job is to study how people fit into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the branch of ergonomics dealing with human reliability in the use of technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  6. Reliability residual-life prediction method for thermal aging based on performance degradation

    International Nuclear Information System (INIS)

    Ren Shuhong; Xue Fei; Yu Weiwei; Ti Wenxin; Liu Xiaotian

    2013-01-01

This paper studies the main pipeline of a nuclear power plant. The residual life of a main pipeline that fails due to thermal aging is studied using performance degradation theory and Bayesian updating methods. First, the degradation of the impact properties of the main pipeline's austenitic stainless steel under thermal aging is analyzed using accelerated thermal aging test data. Then, a thermal aging residual-life prediction model based on the impact property degradation data is built with Bayesian updating methods. Finally, these models are applied to practical situations. It is shown that the proposed methods are feasible and that the prediction accuracy meets the needs of the project. The work also provides a foundation for the scientific aging management of the main pipeline. (authors)
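
    The Bayesian updating step can be pictured with a toy conjugate model: a normal prior on the degradation rate from accelerated-aging tests, updated with field observations, then used to extrapolate to a failure threshold. All numbers and the linear-degradation assumption below are illustrative, not the paper's actual model.

```python
def update_rate(prior_mean, prior_var, obs_rates, obs_var):
    """Normal-normal conjugate update of a linear degradation rate."""
    n = len(obs_rates)
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + sum(obs_rates) / obs_var)
    return post_mean, post_var

def residual_life_years(current_value, threshold, rate_per_year):
    """Years until the impact property degrades to the failure threshold."""
    return (current_value - threshold) / rate_per_year

# Prior from accelerated aging tests, updated with field inspection data.
rate, _ = update_rate(prior_mean=1.2, prior_var=0.09,
                      obs_rates=[1.05, 1.15, 0.98], obs_var=0.04)
print(residual_life_years(current_value=160.0, threshold=80.0,
                          rate_per_year=rate))
```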

  7. High-throughput screening assay used in pharmacognosy: Selection, optimization and validation of methods of enzymatic inhibition by UV-visible spectrophotometry

    Directory of Open Access Journals (Sweden)

    Graciela Granados-Guzmán

    2014-02-01

Full Text Available In research laboratories engaged in both organic synthesis and the extraction of natural products, a large number of products with potential biological activity are obtained every day. It is therefore necessary to have in vitro assays that provide reliable information for further evaluation in in vivo systems. From this point of view, the use of high-throughput screening assays has intensified in recent years. Such assays should be optimized and validated to give accurate and precise, i.e. reliable, results. The present review addresses the steps needed to develop and validate bioanalytical methods, emphasizing UV-Visible spectrophotometry as the detection system. It focuses particularly on the selection of the method, optimization to determine the best experimental conditions, validation, application of the optimized and validated method to real samples, and finally maintenance and possible transfer of the method to a new laboratory.

  8. Reactive power control methods for improved reliability of wind power inverters under wind speed variations

    DEFF Research Database (Denmark)

    Ma, Ke; Liserre, Marco; Blaabjerg, Frede

    2012-01-01

method to relieve the thermal cycling of power switching devices under severe wind speed variations, by circulating reactive power among the parallel power converters in a WTS or among the WTSs in a wind park. The amount of reactive power is adjusted to limit the junction temperature fluctuation......

  9. Active gate driving method for reliability improvement of IGBTs via junction temperature swing reduction

    DEFF Research Database (Denmark)

    Luo, Haoze; Iannuzzo, Francesco; Ma, Ke

    2016-01-01

be changed according to the amplitude of the AC current. Accordingly, a closed-loop thermal control method including the functions of root-mean-square calculation and phase analysis is proposed. Hence, ΔTj can be reduced by changing loss-related gate resistors on the basis of the output fundamental......

  10. A reliable morphological method to assess the age of male Anopheles gambiae

    NARCIS (Netherlands)

    Huho, B.J.; Ng'habi, K.R.; Killeen, G.F.; Nkwengulila, G.; Knols, B.G.J.; Ferguson, H.M.

    2006-01-01

Background - Release of genetically-modified (GM) or sterile male mosquitoes for malaria control is hampered by the inability to assess the age and mating history of free-living male Anopheles. Methods - Age and mating-related changes in the reproductive system of male Anopheles gambiae were quantified......

  11. A method for the automated, reliable retrieval of publication-citation records.

    Directory of Open Access Journals (Sweden)

    Derek Ruths

Full Text Available BACKGROUND: Publication records and citation indices are often used to evaluate academic performance. For this reason, obtaining or computing them accurately is important. This can be difficult, largely due to a lack of complete knowledge of an individual's publication list and/or the lack of time available to manually obtain or construct the publication-citation record. While online publication search engines have somewhat addressed these problems, using raw search results can yield inaccurate estimates of publication-citation records and citation indices. METHODOLOGY: In this paper, we present a new, automated method that produces estimates of an individual's publication-citation record from the individual's name and a set of domain-specific vocabulary that may occur in the individual's publication titles. Because this vocabulary can be harvested directly from a research web page or an online (partial) publication list, our method delivers an easy way to obtain estimates of a publication-citation record and the relevant citation indices. Our method works by applying a series of stringent name and content filters to the raw publication search results returned by an online publication search engine. In this paper, our method is run using Google Scholar, but the underlying filters can be easily applied to any existing publication search engine. When compared against a manually constructed data set of individuals and their publication-citation records, our method provides significant improvements over raw search results. The estimated publication-citation records returned by our method have an average sensitivity of 98% and specificity of 72% (in contrast to a raw search result specificity of less than 10%). When citation indices are computed using these records, the estimated indices are within 10% of the true value, compared to raw search results, which overestimate by, on average, 75%. CONCLUSIONS: These results confirm that our method provides......
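
    The filtering-plus-index pipeline the authors describe can be sketched in a few lines: keep only search hits whose titles share domain vocabulary with the researcher's known work, then compute a citation index from what survives. The helper names, vocabulary and citation counts below are hypothetical, and the real method's name filters are omitted.

```python
def passes_content_filter(title, vocabulary, min_hits=1):
    """Keep a search hit only if its title shares domain-specific words."""
    return len(set(title.lower().split()) & vocabulary) >= min_hits

def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

vocab = {"barcoding", "fungi", "systematics", "dna"}
raw_hits = [("DNA barcoding of mushrooms", 42),
            ("Molecular systematics of fungi", 17),
            ("Homonym's paper on metallurgy", 230)]   # should be filtered out
kept = [cites for title, cites in raw_hits
        if passes_content_filter(title, vocab)]
print(kept, h_index(kept))
```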

  12. Optimal design method for a digital human–computer interface based on human reliability in a nuclear power plant. Part 3: Optimization method for interface task layout

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Wang, Yiqun; Zhang, Li; Xie, Tian; Li, Min; Peng, Yuyuan; Wu, Daqing; Li, Peiyao; Ma, Congmin; Shen, Mengxu; Wu, Xing; Weng, Mengyun; Wang, Shiwei; Xie, Cen

    2016-01-01

Highlights: • The authors present an optimization algorithm for interface task layout. • The performing process of the proposed algorithm is depicted. • The performance evaluation method adopts a neural network. • The optimized layouts of an event's interface tasks were obtained by experiments. - Abstract: This is the last in a series of papers describing the optimal design of a digital human–computer interface of a nuclear power plant (NPP) from three different viewpoints based on human reliability. The purpose of this series is to propose different optimization methods, from varying perspectives, to decrease the human factor events that arise from defects of the human–computer interface. The present paper addresses how to effectively lay out interface tasks across different screens. Its purpose is to decrease human errors by reducing the distance that an operator moves among different screens in each operation. To solve this problem, the authors propose an optimization process for the interface task layout of a digital human–computer interface of an NPP. To automatically lay out each interface task onto one of the screens in each operation, the paper presents a shortest-moving-path optimization algorithm with a dynamic flag based on human reliability. To test the algorithm's performance, the evaluation method uses a neural network based on human reliability: the lower the human error probabilities, the better the interface task layout among the screens. Thus, by analyzing the performance of each interface task layout, the optimization result is obtained. Finally, the optimized layouts of the spurious safety injection event interface tasks of the NPP were obtained in an experiment; the proposed method shows good accuracy and stability.

13. Validity and reliability of the Thai version of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU)

    Directory of Open Access Journals (Sweden)

    Pipanmekaporn T

    2014-05-01

Full Text Available Tanyong Pipanmekaporn,1 Nahathai Wongpakaran,2 Sirirat Mueankwan,3 Piyawat Dendumrongkul,2 Kaweesak Chittawatanarat,3 Nantiya Khongpheng,3 Nongnut Duangsoy3 (1Department of Anesthesiology, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 2Department of Psychiatry, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 3Division of Surgical Critical Care and Trauma, Department of Surgery, Chiang Mai University Hospital, Chiang Mai, Thailand) Purpose: The purpose of this study was to determine the validity and reliability of the Thai version of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU), when compared to the diagnoses made by delirium experts. Patients and methods: This was a cross-sectional study conducted in both surgical intensive care and subintensive care units in Thailand between February and June 2011. Seventy patients aged 60 years or older who had been admitted to the units were enrolled into the study within the first 48 hours of admission. Each patient was randomly assessed as to whether they had delirium by a nurse using the Thai version of the CAM-ICU algorithm (Thai CAM-ICU) or by a delirium expert using the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision. Results: The prevalence of delirium was found to be 18.6% (n=13) by the delirium experts. The sensitivity of the Thai CAM-ICU's algorithms was found to be 92.3% (95% confidence interval [CI] = 64.0%-99.8%), while the specificity was 94.7% (95% CI = 85.4%-98.9%). The instrument displayed good interrater reliability (Cohen's κ=0.81; 95% CI = 0.64-0.99). The time taken to complete the Thai CAM-ICU was 1 minute (interquartile range, 1-2 minutes). Conclusion: The Thai CAM-ICU demonstrated good validity, reliability, and ease of use when diagnosing delirium in a surgical intensive care unit setting. The use of this diagnostic tool should be encouraged for daily, routine use, so as to promote the early detection......
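
    For readers reproducing the validity figures above, the sketch below rebuilds the 2×2 table implied by the abstract (13 delirium cases among 70 patients) and computes sensitivity and specificity with Wilson score intervals; the paper's intervals will differ slightly if an exact method was used.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# True/false positives and negatives vs the expert diagnosis.
tp, fn, tn, fp = 12, 1, 54, 3   # consistent with 92.3% / 94.7%
print("sensitivity", tp / (tp + fn), wilson_ci(tp, tp + fn))
print("specificity", tn / (tn + fp), wilson_ci(tn, tn + fp))
```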

  14. Wind turbine performance: Methods and criteria for reliability of measured power curves

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, D.A. [Advanced Wind Turbines Inc., Seattle, WA (United States)

    1996-12-31

    In order to evaluate the performance of prototype turbines, and to quantify incremental changes in performance through field testing, Advanced Wind Turbines (AWT) has been developing methods and requirements for power curve measurement. In this paper, field test data is used to illustrate several issues and trends which have resulted from this work. Averaging and binning processes, data hours per wind-speed bin, wind turbulence levels, and anemometry methods are all shown to have significant impacts on the resulting power curves. Criteria are given by which the AWT power curves show a high degree of repeatability, and these criteria are compared and contrasted with current published standards for power curve measurement. 6 refs., 5 figs., 5 tabs.
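
    The binning issues the paper raises can be made concrete with a small method-of-bins sketch: average power into fixed wind-speed bins and drop bins with too few data hours. The bin width, sampling period and minimum-hours threshold below are arbitrary illustrations, not AWT's criteria.

```python
import numpy as np

def method_of_bins(wind_speed, power, bin_width=0.5,
                   sample_minutes=10, min_hours=0.5):
    """Average measured power into wind-speed bins.

    Returns (mean wind speed, mean power, data hours) per retained bin.
    """
    ws, p = np.asarray(wind_speed, float), np.asarray(power, float)
    edges = np.arange(0.0, ws.max() + bin_width, bin_width)
    which = np.digitize(ws, edges)
    curve = []
    for b in np.unique(which):
        mask = which == b
        hours = mask.sum() * sample_minutes / 60.0
        if hours >= min_hours:   # discard sparsely populated bins
            curve.append((ws[mask].mean(), p[mask].mean(), hours))
    return curve

# Hypothetical 10-minute averages: wind speed (m/s) and power (kW).
rng = np.random.default_rng(0)
ws = rng.uniform(3, 15, 500)
pw = np.clip((ws - 3) ** 3 * 0.4, 0, 600) + rng.normal(0, 10, 500)
print(method_of_bins(ws, pw)[:3])
```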

  15. DNA Barcoding as a Reliable Method for the Authentication of Commercial Seafood Products

    Directory of Open Access Journals (Sweden)

    Silvia Nicolè

    2012-01-01

    Full Text Available Animal DNA barcoding allows researchers to identify different species by analyzing a short nucleotide sequence, typically the mitochondrial gene cox1. In this paper, we use DNA barcoding to genetically identify seafood samples that were purchased from various locations throughout Italy. We adopted a multi-locus approach to analyze the cob, 16S-rDNA and cox1 genes, and compared our sequences to reference sequences in the BOLD and GenBank online databases. Our method is a rapid and robust technique that can be used to genetically identify crustaceans, mollusks and fishes. This approach could be applied in the future for conservation, particularly for monitoring illegal trade of protected and endangered species. Additionally, this method could be used for authentication in order to detect mislabeling of commercially processed seafood.

  16. Safety and reliability of pressure components with special emphasis on advanced methods of NDT. Vol. 2

    International Nuclear Information System (INIS)

    1986-01-01

The 12 papers discuss topics of strength and safety in the field of materials technology and engineering, and conclusions for NPP component safety and materials are drawn. The measurements and studies relate to fracture mechanics methods (oscillation, burst, material strength, characteristics). The dynamic analysis of the behaviour of large test specimens, the influence of load velocity on the crack resistance curve, and the development of forged parts from austenitic steel for fast breeder reactors are presented. (DG) [de

  17. Safety and reliability of pressure components with special emphasis on advanced methods of NDT. Vol. 1

    International Nuclear Information System (INIS)

    1986-01-01

24 papers discuss various methods for the nondestructive testing of materials, e.g. eddy current measurement, the EMAG analyser, tomography, ultrasound, holographic interferometry, and the optical sound field camera. Special consideration is given to mathematical programmes and tests that allow fracture-mechanics parameters to be determined and cracks to be assessed in various components, system parts and individual specimens, both in pressurized systems and NPP systems. The studies focus on weld seams and adjacent areas. (DG) [de

  18. Reliability of Doppler and stethoscope methods of determining systolic blood pressures: considerations for calculating an ankle-brachial index.

    Science.gov (United States)

    Chesbro, Steven B; Asongwed, Elmira T; Brown, Jamesha; John, Emmanuel B

    2011-01-01

The purposes of this study were to: (1) identify the interrater and intrarater reliability of systolic blood pressure measurements using a stethoscope and a Doppler to determine an ankle-brachial index (ABI), and (2) determine the correlation between the 2 methods. Peripheral arterial disease (PAD) affects approximately 8 to 12 million people in the United States, and nearly half of those with the disease are asymptomatic. Early detection and prompt treatment of PAD improve health outcomes, so it is important that clinicians perform tests that determine the presence of PAD. Two raters trained in the ABI procedure measured the systolic blood pressures of 20 individuals' upper and lower extremities. Standard ABI measurement protocols were observed. The raters individually recorded the systolic blood pressures of each extremity using a stethoscope and a Doppler, for a total of 640 independent measures. Interrater reliability of Doppler measurements of SBP at the ankle was very strong (intraclass correlation coefficient [ICC], 0.93-0.99), compared with moderate to strong reliability using a stethoscope (ICC, 0.64-0.87). Agreement between the 2 devices in determining SBP was moderate to very weak (ICC, 0.13-0.61). Comparisons of the Doppler and stethoscope in determining the ABI showed weak to very weak intrarater correlation (ICC, 0.17-0.35). Linear regression analysis of the 2 methods to determine the ABI showed positive but weak to very weak correlations (r2 = .013, P = .184). A Doppler ultrasound is recommended over a stethoscope for accurate systolic pressure readings in ABI measurements.
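
    The index itself is a simple ratio, which is why the reliability of the underlying systolic readings matters so much. A minimal sketch of the usual calculation follows; the convention of taking the higher pressure on each side is common practice, and the readings are invented.

```python
def ankle_brachial_index(ankle_pressures, brachial_pressures):
    """ABI = highest ankle systolic pressure / highest brachial pressure.

    An ABI below roughly 0.9 is commonly taken to suggest PAD.
    """
    return max(ankle_pressures) / max(brachial_pressures)

# Hypothetical Doppler systolic readings (mmHg) for one leg and both arms.
abi = ankle_brachial_index(ankle_pressures=[112, 118],
                           brachial_pressures=[128, 132])
print(round(abi, 2))   # 0.89 -> borderline, would prompt follow-up
```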

  19. Reliability and limitation of various diagnostic methods including nuclear medicine in myocardial disease

    International Nuclear Information System (INIS)

    Tokuyasu, Yoshiki; Kusakabe, Kiyoko; Yamazaki, Toshio

    1981-01-01

Electrocardiography (ECG), echocardiography, nuclear methods, cardiac catheterization, left ventriculography and endomyocardial biopsy (biopsy) were performed in 40 cases of cardiomyopathy (CM), 9 of endocardial fibroelastosis and 19 of specific heart muscle disease, and the usefulness and limitations of each method were comparatively evaluated. In CM, various methods including biopsy were performed. The 40 patients were classified into 3 groups, i.e., hypertrophic (17), dilated (20) and non-hypertrophic/non-dilated (3), on the basis of left ventricular ejection fraction and hypertrophy of the ventricular wall. The hypertrophic group was divided into 4 subgroups: 9 septal, 4 apical, 2 posterior and 2 anterior. The nuclear study is useful in assessing the site of abnormal ventricular thickening, perfusion defects and ventricular function. Echocardiography is most useful in detecting asymmetric septal hypertrophy. The biopsy gives the sole diagnostic clue, especially in non-hypertrophic/non-dilated cardiomyopathy. ECG is useful in all cases, but no correlation with the site of disproportional hypertrophy was obtained. (J.P.N.)

  20. Robust and reliable banknote authentification and print flaw detection with opto-acoustical sensor fusion methods

    Science.gov (United States)

    Lohweg, Volker; Schaede, Johannes; Türke, Thomas

    2006-02-01

The authenticity checking and inspection of banknotes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of more and more sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As more print techniques and new security features are established, total quality in security, authenticity and banknote printing must be assured, and this necessitates a broader sensorial concept in general. We propose a concept of authenticity checking and inspection methods for pattern recognition and classification of securities and banknotes, based on sensor fusion and fuzzy interpretation of data measures. In this approach, different methods of authenticity analysis and print flaw detection are combined, which can be used for vending or sorting machines as well as for printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines visible camera images with IR-spectral sensors and with acoustical and other measurements, such as the temperature and pressure of printing machines.