WorldWideScience

Sample records for reliable analytical procedures

  1. Application of analytical procedure on system reliability, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    2000-01-01

    At the Ship Research Institute, research and development of the GO-FLOW procedure, a system reliability analysis method with various advanced functions that forms a core part of probabilistic safety assessment (PSA), has been promoted. In this study, aiming at a fundamental upgrade of the GO-FLOW procedure as a key evaluation technique for PSA below level 3, a safety assessment system based on GO-FLOW was developed, together with an analytical function coupling dynamic behavior analysis and the physical behavior of the system under stochastic state changes. In fiscal year 1998, functions such as adding dependencies between headings, rearranging headings in time order, assigning the same heading to plural positions, and calculating occurrence frequency as a function of elapsed time were prepared and verified. For the accident sequence simulation function, it was confirmed that the analysis covers all main accident sequences of the improved marine reactor MRX. In addition, a function for near-automatic generation of analysis input data was prepared. As a result, analytical results that were previously difficult to understand for anyone but a PSA expert became accessible, and understanding of accident phenomena, verification of the validity of an analysis, and feedback to both analysis and design can now be carried out easily. (G.K.)

  2. Analytical procedures. Pt. 1

    International Nuclear Information System (INIS)

    Weber, G.

    1985-01-01

    In analytical (Boolean) procedures, there is a close relationship between the safety assessment and the reliability assessment of technical facilities. The paper gives an overview of the organization of models, fault trees, the probabilistic evaluation of systems, evaluation with minimal cut sets or minimal paths, the treatment of statistically dependent components, and systems liable to suffer different kinds of outages. (orig.) [de]

  3. The use of analytical procedures in the internal audit of the restaurant business expenses

    Directory of Open Access Journals (Sweden)

    T.Yu. Kopotienko

    2015-06-01

    An important task in the internal audit of expenses is obtaining sufficient and reliable audit evidence, which can be achieved by using analytical procedures in the audit process. Identifying analytical procedures with the financial analysis of business activities prevents their efficient use in the internal audit of restaurant business expenses, and internal auditors' knowledge of the techniques of analytical procedures and of their tasks at each verification step is insufficient. The purpose of the article is to develop a methodology for the internal audit of restaurant business expenses based on the integrated application of analytical procedures. The nature and purpose of analytical procedures are investigated, and the factors influencing the auditor's choice of a complex of analytical procedures are identified; among them, it is recommended to consider the purpose of the analytical procedures, the type and structure of the enterprise, the sources of available information, the existence of financial and non-financial information, and the reliability and comparability of the available information. The tasks of analytical procedures at each verification step are identified, and a complex of analytical procedures is offered as part of the internal audit of restaurant business expenses, containing a list of the analytical procedures, the analysis techniques used in each procedure, and a brief overview of each procedure's content.

  4. Analytical procedures. Pt. 4

    International Nuclear Information System (INIS)

    Rackwitz, R.

    1985-01-01

    The semi-analytical procedures are summarized under the heading 'first- or second-order reliability methods'. The asymptotic refinement of the theory was repeatedly pointed out. In load-bearing structures, the failure probability of a component is always also a function of the condition of all other components, and it depends moreover on the stress acting jointly on almost all components. This fact causes a marked reduction of the benefit of redundant component arrangements in the system and moreover requires very special formulations. Although theoretically interesting and practically important developments will leave their mark on the further progress of the theory, the statements obtained by these approaches will continue to depend on how closely the chosen physical relationships and stochastic models match the scattering quantities. Sensitivity studies show that these aspects are partly of substantially higher importance for decision criteria than further refinement of the (probabilistic) method. Questions of the relevance and reliability of data, and of their adequate treatment in reliability analyses, seem to rank higher than exaggerated demands on methodology. (orig./HP) [de]

  5. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination. It was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, coloured from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user or reader but also offers exhaustive information on the evaluated procedures.

  6. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M. L.

    2016-01-13

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.

  7. Practical approach to a procedure for judging the results of analytical verification measurements

    International Nuclear Information System (INIS)

    Beyrich, W.; Spannagel, G.

    1979-01-01

    For practical safeguards a particularly transparent procedure is described to judge analytical differences between declared and verified values based on experimental data relevant to the actual status of the measurement technique concerned. Essentially it consists of two parts: Derivation of distribution curves for the occurrence of interlaboratory differences from the results of analytical intercomparison programmes; and judging of observed differences using criteria established on the basis of these probability curves. By courtesy of the Euratom Safeguards Directorate, Luxembourg, the applicability of this judging procedure has been checked in practical data verification for safeguarding; the experience gained was encouraging and implementation of the method is intended. Its reliability might be improved further by evaluation of additional experimental data. (author)
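The judging logic described above can be sketched as follows. This is a hypothetical illustration, not the Euratom procedure itself: the function names, the 95% criterion, and the intercomparison data are all assumed for the example.

```python
# Hypothetical sketch: derive an empirical limit from historical
# interlaboratory differences, then judge a new declared-versus-
# verified difference against it.

def percentile(sorted_vals, q):
    """Linear-interpolation quantile (q in [0, 1]) of pre-sorted data."""
    idx = q * (len(sorted_vals) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1.0 - frac) + sorted_vals[hi] * frac

def judge_difference(observed_diff, historical_diffs, q=0.95):
    """Accept the observed difference if its magnitude lies within the
    q-quantile of historical absolute interlaboratory differences."""
    hist = sorted(abs(d) for d in historical_diffs)
    limit = percentile(hist, q)
    return abs(observed_diff) <= limit, limit

# Illustrative intercomparison results (relative differences, %)
history = [0.1, -0.2, 0.3, 0.15, -0.1, 0.25, -0.3, 0.05, 0.2, -0.15]
accepted, limit = judge_difference(0.18, history)
```

Adding more intercomparison data sharpens the empirical distribution, which matches the abstract's remark that reliability might be improved further by additional experimental data.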

  8. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes, with studies showing different performance, has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, which were used in the majority of studies generating the current PG cut-points, leading to a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG is under tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the analytical performance of the test on the clinical classification of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices.
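The abstract's point about deriving analytical performance specifications from biological variation can be sketched with the widely used biological-variation model (desirable imprecision, bias, and total allowable error). The CV values below are illustrative placeholders, not authoritative figures for plasma glucose, and the function name is an assumption for this example.

```python
import math

# Sketch of the biological-variation model for deriving analytical
# performance specifications. CV values are illustrative only.

def desirable_specs(cv_within, cv_between):
    """cv_within / cv_between: within- and between-subject biological
    CVs in percent. Returns (imprecision, bias, total error) in %."""
    imprecision = 0.5 * cv_within
    bias = 0.25 * math.sqrt(cv_within ** 2 + cv_between ** 2)
    total_error = 1.65 * imprecision + bias  # one-sided 95% limit
    return imprecision, bias, total_error

cv_i, cv_g = 5.0, 7.0  # assumed biological variation (%)
imp, bias, tea = desirable_specs(cv_i, cv_g)
```

With these assumed CVs, the desirable imprecision works out to half the within-subject CV, and the total error combines it with the bias term at the one-sided 95% level.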

  9. Reliability of application of inspection procedures

    Energy Technology Data Exchange (ETDEWEB)

    Murgatroyd, R A

    1988-12-31

    This document deals with the reliability of the application of inspection procedures. A method is described for ensuring that the inspection of defects assessed by fracture mechanics is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector in order to estimate the possibility of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC). 3 refs.

  10. Reliability of application of inspection procedures

    International Nuclear Information System (INIS)

    Murgatroyd, R.A.

    1988-01-01

    This document deals with the reliability of the application of inspection procedures. A method is described for ensuring that the inspection of defects assessed by fracture mechanics is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector in order to estimate the possibility of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC)

  11. An off-line two-dimensional analytical procedure for determination of polycyclic aromatic hydrocarbons in smoke aerosol

    NARCIS (Netherlands)

    Claessens, H.A.; Lammerts van Bueren, L.G.D.

    1987-01-01

    Smoke aerosol from stoves consists of a wide variety of chemical substances, a number of which have toxic properties. To study the impact of aerosol emissions on health and the environment, reliable analytical procedures must be available for these samples. An off-line two-dimensional HPLC method is

  12. Analytical procedures for determining the impacts of reliability mitigation strategies.

    Science.gov (United States)

    2013-01-01

    Reliability of transport, especially the ability to reach a destination within a certain amount of time, is a regular concern of travelers and shippers. The definition of reliability used in this research is how travel time varies over time. The vari...

  13. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs
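The dependence treatment that underlies THERP/ASEP-style HRA can be sketched with the dependence-level equations of NUREG/CR-1278, on which the ASEP HRA Procedure builds. The nominal HEP value and function name below are illustrative, not taken from the document.

```python
# Sketch of the THERP dependence model: the conditional human error
# probability (HEP) of a task, given failure of the preceding task,
# is adjusted according to the assessed level of dependence.

def conditional_hep(nominal_hep, dependence):
    """Conditional HEP for the standard THERP dependence levels."""
    n = nominal_hep
    levels = {
        "zero": n,
        "low": (1 + 19 * n) / 20,
        "moderate": (1 + 6 * n) / 7,
        "high": (1 + n) / 2,
        "complete": 1.0,
    }
    return levels[dependence]

hep = conditional_hep(0.003, "high")  # (1 + 0.003) / 2 = 0.5015
```

Note how strongly dependence dominates: a nominal HEP of 0.003 rises above 0.5 under high dependence, which is why screening HRAs treat dependence assignments so carefully.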

  14. An Analytical Cost Estimation Procedure

    National Research Council Canada - National Science Library

    Jayachandran, Toke

    1999-01-01

    Analytical procedures that can be used to do a sensitivity analysis of a cost estimate, and to perform tradeoffs to identify input values that can reduce the total cost of a project, are described in the report...

  15. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  16. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute has proceeded with a developmental study of the GO-FLOW method, adding various advanced functionalities to this system reliability analysis method, which occupies a main part of PSA (probabilistic safety assessment). The work aimed to upgrade the functionality of the GO-FLOW method, to develop an analytical function integrating dynamic behavior analysis, physical behavior, and stochastic state transitions, and to prepare a function for picking out the main accident sequences. In fiscal year 1997, an analytical function was developed for the dynamic event-tree analytical system by adding dependency between headings. With the simulation analytical function for accident sequences, it became possible to cover completely the main accident sequences of MRX, an improved ship propulsion reactor. Input data for analysis can also be prepared easily by the analysis operator through a dedicated setting function. (G.K.)

  17. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  18. Argon analytical procedures for potassium-argon dating

    International Nuclear Information System (INIS)

    Gabites, J.E.; Adams, C.J.

    1981-01-01

    A manual for the argon analytical methods involved in potassium-argon geochronology, including: i) operating procedures for the ultra-high vacuum argon extraction/purification equipment for the analysis of nanolitre quantities of radiogenic argon in rocks, minerals and gases; ii) operating procedures for the AEI-MS10 gas source mass spectrometer

  19. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software increases at nuclear power plants (NPPs), the necessity of including software reliability and/or safety in the NPP Probabilistic Safety Assessment (PSA) also increases. This work proposes an application procedure for software reliability growth models (RGMs), which are the most widely used means of quantifying software reliability, to NPP PSA. Through the proposed procedure, it can be determined whether a software reliability growth model can be applied to the NPP PSA before its actual application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA.
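As a sketch of the kind of software reliability growth model such a procedure would screen for applicability, the following implements the classic Goel-Okumoto NHPP model. The model choice and parameter values are assumptions for illustration, not the paper's prescription.

```python
import math

# Goel-Okumoto NHPP software reliability growth model (illustrative).
# a: expected total number of faults; b: per-fault detection rate.

def expected_failures(t, a, b):
    """Mean cumulative failures by test time t: m(t) = a*(1 - exp(-b*t))."""
    return a * (1.0 - math.exp(-b * t))

def failure_intensity(t, a, b):
    """Instantaneous failure intensity: lambda(t) = a*b*exp(-b*t)."""
    return a * b * math.exp(-b * t)

a, b = 100.0, 0.05  # assumed parameter values
m_40 = expected_failures(40.0, a, b)    # ~86.5 expected failures by t = 40
lam_40 = failure_intensity(40.0, a, b)  # intensity decays as faults are removed
```

A screening step of the kind the paper proposes would, among other things, check whether observed failure data are consistent with such a monotonically decaying intensity before trusting the model's reliability estimates.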

  20. Interim Reliability Evaluation Program procedures guide

    International Nuclear Information System (INIS)

    Carlson, D.D.; Gallup, D.R.; Kolaczkowski, A.M.; Kolb, G.J.; Stack, D.W.; Lofgren, E.; Horton, W.H.; Lobner, P.R.

    1983-01-01

    This document presents procedures for conducting analyses of a scope similar to those performed in Phase II of the Interim Reliability Evaluation Program (IREP). It documents the current state of the art in performing the plant systems analysis portion of a probabilistic risk assessment. Insights gained into managing such an analysis are discussed. Step-by-step procedures and methodological guidance constitute the major portion of the document. While not to be viewed as a cookbook, the procedures set forth the principal steps in performing an IREP analysis. Guidance for resolving the problems encountered in previous analyses is offered. Numerous examples and representative products from previous analyses clarify the discussion

  1. An analytical framework for reliability growth of one-shot systems

    International Nuclear Information System (INIS)

    Hall, J. Brian; Mosleh, Ali

    2008-01-01

    In this paper, we introduce a new reliability growth methodology for one-shot systems that is applicable to the case where all corrective actions are implemented at the end of the current test phase. The methodology consists of four model equations for assessing: expected reliability, the expected number of failure modes observed in testing, the expected probability of discovering new failure modes, and the expected portion of system unreliability associated with repeat failure modes. These model equations provide an analytical framework for which reliability practitioners can estimate reliability improvement, address goodness-of-fit concerns, quantify programmatic risk, and assess reliability maturity of one-shot systems. A numerical example is given to illustrate the value and utility of the presented approach. This methodology is useful to program managers and reliability practitioners interested in applying the techniques above in their reliability growth program
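One quantity central to such frameworks, the expected number of distinct failure modes surfaced by testing, can be sketched directly. The per-mode probabilities and helper names below are assumptions for illustration, not the paper's four model equations.

```python
# Hedged sketch: expected failure-mode discovery in a one-shot
# reliability growth program. Mode probabilities are illustrative.

def expected_modes_observed(n, mode_probs):
    """E[# distinct modes seen in n trials] = sum_i (1 - (1 - p_i)^n)."""
    return sum(1.0 - (1.0 - p) ** n for p in mode_probs)

def prob_new_mode_next_trial(n, mode_probs):
    """Probability that trial n+1 exposes a mode not yet observed."""
    return sum(p * (1.0 - p) ** n for p in mode_probs)

probs = [0.05, 0.02, 0.01, 0.01, 0.005]  # assumed failure-mode probabilities
seen_after_50 = expected_modes_observed(50, probs)
new_on_51st = prob_new_mode_next_trial(50, probs)
```

Tracking how the discovery probability decays with the number of trials is one way such frameworks quantify programmatic risk and reliability maturity.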

  2. Genesis of theory and analysis of practice of applying the analytical procedures in auditing

    OpenAIRE

    Сурніна, К. С.

    2012-01-01

    The article investigates how different researchers define the concept of "analytical procedures" in an audit and sets out the author's own view of the necessity of their wide use in auditing. A classification of analytical procedures is presented that takes into account the specificity of the auditing process as a whole.

  3. Role of the IAEA's ALMERA network in harmonization of analytical procedures applicable worldwide for radiological emergencies

    International Nuclear Information System (INIS)

    Pitois, A.; Osvath, I.; Tarjan, S.; Groening, M.; Osborn, D.; )

    2016-01-01

    The International Atomic Energy Agency (IAEA) coordinates and provides analytical support to the worldwide network of Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA), consisting at the end of 2015 of 154 laboratories in 85 countries. This network, established by the IAEA in 1995, aims to provide timely and reliable measurement results of environmental radioactivity in routine monitoring and emergency situations. The IAEA supports the ALMERA laboratories in their routine and emergency-response environmental monitoring activities by organizing proficiency tests and inter-laboratory comparison exercises, developing validated analytical procedures for environmental radioactivity measurement, and organizing training courses and workshops. The network also acts as a forum for sharing knowledge and expertise. The aim of this paper is to describe the current status of ALMERA analytical method development activities for radiological emergencies and the plans for further development in the field.

  4. Procedures for treating common cause failures in safety and reliability studies: Analytical background and techniques

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1989-01-01

    Volume I of this report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. This document, Volume 2, contains a series of appendices that provide additional background and methodological detail on several important topics discussed in Volume I
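The simplest parametric common cause failure (CCF) model consistent with this kind of framework, the beta-factor model, can be sketched as follows. The report itself covers more general treatments; the numbers and function name here are illustrative.

```python
# Beta-factor CCF sketch: a fraction beta of a component's failure
# probability is assumed to fail all redundant trains at once.

def one_out_of_two_failure(q_total, beta):
    """Failure probability of a 1-out-of-2 redundant pair under the
    beta-factor model (rare-event approximation)."""
    q_independent = (1.0 - beta) * q_total
    q_common = beta * q_total
    return q_independent ** 2 + q_common

q_no_ccf = one_out_of_two_failure(1.0e-3, 0.0)   # independent failures only
q_with_ccf = one_out_of_two_failure(1.0e-3, 0.1)
```

Even a modest beta makes the common cause term dominate the pair's failure probability, which is exactly why explicit CCF treatment matters in risk and reliability evaluations of redundant systems.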

  5. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    Science.gov (United States)

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts, and to apply metallic leaf decorations. Although the technical literature often reports the use of plant gums as binders, several other saccharide materials can in fact be encountered in paint samples, not only as major binders but also as additives. The literature describes a variety of GC-MS analytical procedures for characterizing saccharide materials in paint samples; however, the resulting chromatographic profiles are often extremely different, making it impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different GC-MS-based analytical procedures for the analysis of saccharide materials in works of art. The research evaluates the influence of the analytical procedure used and how it affects the sugar profiles obtained from paint samples containing saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and at the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled, including data from materials that contain only a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  6. A Novel Analytic Technique for the Service Station Reliability in a Discrete-Time Repairable Queue

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2013-01-01

    This paper presents a decomposition technique for the service station reliability in a discrete-time repairable GeomX/G/1 queueing system in which the server takes exhaustive service and follows a multiple adaptive delayed vacation discipline. Using this novel analytic technique, some important reliability indices and reliability relation equations of the service station are derived, and the structure of these indices is also obtained. Finally, special cases and numerical examples validate the derived results and show that the analytic technique is applicable to the reliability analysis of complex discrete-time repairable bulk-arrival queueing systems.

  7. Procedure for hazards analysis of plutonium gloveboxes used in analytical chemistry operations

    International Nuclear Information System (INIS)

    Delvin, W.L.

    1977-06-01

    A procedure is presented to identify and assess hazards associated with gloveboxes used for analytical chemistry operations involving plutonium. The procedure is based upon analytic tree methodology and has been adapted from the Management Oversight and Risk Tree of the US Energy Research and Development Administration's safety program.

  8. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

    NREL develops laboratory analytical procedures (LAPs) for the analysis of raw and upgraded pyrolysis bio-oils. These standard procedures have been validated and allow for reliable bio-oil analysis.

  9. Assessment of passive drag in swimming by numerical simulation and analytical procedure.

    Science.gov (United States)

    Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A

    2018-03-01

    The aim was to compare passive drag while gliding underwater obtained by numerical simulation and by an analytical procedure. An Olympic swimmer was scanned by computer tomography and modelled gliding at a 0.75 m depth in the streamlined position. Steady-state computational fluid dynamics (CFD) analyses were performed in Fluent, and a set of analytical procedures was selected concurrently. Friction drag (D_f), pressure drag (D_pr), total passive drag force (D_f+pr) and drag coefficient (C_D) were computed between 1.3 and 2.5 m/s by both techniques. D_f+pr ranged from 45.44 to 144.06 N with CFD and from 46.03 to 167.06 N with the analytical procedure (differences: 1.28% to 13.77%). C_D ranged between 0.698 and 0.622 by CFD, and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for D_f+pr plotted in absolute values (R² = 0.98) and after log-log transformation (R² = 0.99). The C_D also showed a very high adjustment for both absolute (R² = 0.97) and log-log plots (R² = 0.97). The bias for D_f+pr was 8.37 N, and 0.076 N after logarithmic transformation. D_f represented between 15.97% and 18.82% of D_f+pr by CFD, and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gathering insight into one's hydrodynamic characteristics.
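The reduction of a measured drag force to a drag coefficient rests on the standard relation C_D = 2D / (rho A v²). The sketch below assumes a water density and a reference area; neither value is taken from the study.

```python
# Standard drag relation, with assumed density and area (not the
# swimmer model from the study).

RHO_WATER = 998.0  # kg/m^3 (assumed fresh-water density)

def drag_coefficient(drag_force_n, area_m2, velocity_ms, rho=RHO_WATER):
    """Invert D = 0.5 * rho * A * C_D * v^2 for C_D."""
    return 2.0 * drag_force_n / (rho * area_m2 * velocity_ms ** 2)

def drag_force(c_d, area_m2, velocity_ms, rho=RHO_WATER):
    """D = 0.5 * rho * A * C_D * v^2."""
    return 0.5 * rho * area_m2 * c_d * velocity_ms ** 2

# Round trip with the study's lowest reported CFD drag (45.44 N at
# 1.3 m/s) and an assumed reference area of 0.1 m^2:
cd = drag_coefficient(45.44, 0.1, 1.3)
```

Because the coefficient scales inversely with the assumed reference area, comparing C_D values between studies only makes sense when the same area definition is used.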

  10. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    Science.gov (United States)

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economical and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria.
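The outranking logic of PROMETHEE II can be condensed as follows. The "usual" preference function (any positive difference counts as full preference) is used for brevity, and the alternatives, criteria scores and weights are invented for illustration; they are not the study's 25 procedures or expert-derived weights.

```python
# Condensed PROMETHEE II sketch with the "usual" preference function.

def promethee_ii(scores, weights, maximize):
    """scores[a][c]: value of alternative a on criterion c;
    maximize[c]: True if higher values are better.
    Returns the net outranking flow for each alternative."""
    n = len(scores)
    phi_plus = [0.0] * n
    phi_minus = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pi = 0.0  # aggregated preference of a over b
            for c, w in enumerate(weights):
                d = scores[a][c] - scores[b][c]
                if not maximize[c]:
                    d = -d
                if d > 0:
                    pi += w
            phi_plus[a] += pi / (n - 1)
            phi_minus[b] += pi / (n - 1)
    return [p - m for p, m in zip(phi_plus, phi_minus)]

# Criteria: limit of detection (minimize), solvent volume per sample
# (minimize), recovery (maximize); weights stress the environmental
# criterion, in the spirit of the "green" scenarios.
weights = [0.2, 0.6, 0.2]
maximize = [False, False, True]
scores = [
    [0.10, 5.0, 95.0],   # microextraction-like procedure (assumed)
    [0.05, 50.0, 98.0],  # liquid-liquid extraction-like (assumed)
    [0.20, 2.0, 90.0],   # another microextraction-like (assumed)
]
net_flows = promethee_ii(scores, weights, maximize)
# Higher net flow = better compromise; with these weights the
# solvent-frugal procedures rank highest, as in the study's findings.
```

Real applications replace the "usual" function with smoother preference functions (linear, Gaussian) and elicit the weights from experts, as the study did.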

  11. Adjoint sensitivity analysis procedure of Markov chains with applications on reliability of IFMIF accelerator-system facilities

    Energy Technology Data Exchange (ETDEWEB)

    Balan, I.

    2005-05-01

This work presents the implementation of the Adjoint Sensitivity Analysis Procedure (ASAP) for Continuous Time, Discrete Space Markov chains (CTMC), as an alternative to other computationally expensive methods. In order to develop this procedure as an end product in reliability studies, the reliability of the physical systems is analyzed using a coupled Fault-Tree/Markov-chain technique, i.e. the abstraction of the physical system is performed using the Fault-Tree as the high-level interface, which is then automatically converted into a Markov chain. The resulting differential equations based on the Markov chain model are solved in order to evaluate the system reliability. Further sensitivity analyses using ASAP applied to the CTMC equations are performed to study the influence of uncertainties in the input data on the reliability measures and to gain confidence in the final reliability results. The methods to generate the Markov chain and the ASAP for the Markov chain equations have been implemented in the new computer code system QUEFT/MARKOMAGS/MCADJSEN for reliability and sensitivity analysis of physical systems. The validation of this code system has been carried out using simple problems for which analytical solutions can be obtained. Typical sensitivity results show that the numerical solution using ASAP is robust, stable and accurate. The method and the code system developed during this work can be used further as an efficient and flexible tool to evaluate the sensitivities of reliability measures for any physical system analyzed using a Markov chain. Reliability and sensitivity analyses using these methods have been performed during this work for the IFMIF Accelerator System Facilities. The reliability studies using Markov chains have been concentrated on the availability of the main subsystems of this complex physical system for a typical mission time. The sensitivity studies for two typical responses using ASAP have been
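A toy version of the Markov-chain reliability evaluation described above, reduced to a single repairable component (a 2-state up/down CTMC) with assumed failure and repair rates, can be solved both in closed form and by integrating the Markov differential equations; this sketch does not implement ASAP itself (a sensitivity could be approximated here by perturbing the rates):

```python
import math

lam, mu = 1e-3, 1e-1   # assumed failure and repair rates (per hour)

def availability_analytic(t):
    """Closed-form availability of the 2-state up/down chain."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

def availability_euler(t, dt=0.01):
    """Explicit Euler integration of the CTMC balance equations."""
    p_up, p_down = 1.0, 0.0            # start in the 'up' state
    for _ in range(int(t / dt)):
        flow = lam * p_up - mu * p_down  # net flow from 'up' to 'down'
        p_up -= dt * flow
        p_down += dt * flow
    return p_up

a100 = availability_analytic(100.0)
e100 = availability_euler(100.0)
```

The numerical solution tracks the analytic one closely, and both approach the steady-state availability mu/(lam + mu).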

  12. Analytical approach for confirming the achievement of LMFBR reliability goals

    International Nuclear Information System (INIS)

    Ingram, G.E.; Elerath, J.G.; Wood, A.P.

    1981-01-01

The approach recommended by GE-ARSD for confirming the achievement of LMFBR reliability goals relies upon a comprehensive understanding of the physical and operational characteristics of the system and of the environments to which the system will be subjected during its operational life. This kind of understanding is required whether the approach is based on system hardware testing or on analyses, as recommended in this report. However, for a system as complex and expensive as the LMFBR, an approach that relies primarily on system hardware testing would be prohibitive in both the cost and the time needed to obtain the required system reliability test information. With an analytical approach, results of tests (reliability and functional) at a low level within the specific system of interest, as well as results from other similar systems, can be used to form the data base for confirming the achievement of the system reliability goals. These data, along with information relating to the design characteristics and operating environments of the specific system, are used in the assessment of the system's reliability.

  13. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective

  14. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  15. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  16. Theoretical, analytical, and statistical interpretation of environmental data

    International Nuclear Information System (INIS)

    Lombard, S.M.

    1974-01-01

    The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)

  17. Reliability Based Optimization of Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    1987-01-01

The optimization problem of designing structural systems such that the reliability is satisfactory during the whole lifetime of the structure is considered in this paper. Some of the quantities modelling the loads and the strength of the structure are modelled as random variables. The reliability ... is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem of minimizing a given cost function such that the reliability of the single elements satisfies given requirements or such that the system reliability satisfies a given requirement. ... For these optimization problems it is described how a sensitivity analysis can be performed. Next, new optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally ...

  18. EML procedures manual

    International Nuclear Information System (INIS)

    Volchok, H.L.; de Planque, G.

    1982-01-01

This manual contains the procedures currently used by the Environmental Measurements Laboratory of the US Department of Energy. In addition, a number of analytical methods from other laboratories have been included. These were tested for reliability at the Battelle Pacific Northwest Laboratory under contract with the Division of Biomedical and Environmental Research of the AEC; these methods are clearly distinguished. The manual is prepared in loose-leaf form to facilitate revision of the procedures and inclusion of additional procedures or data sheets. Anyone receiving the manual through EML should receive this additional material automatically. The contents are as follows: (1) general; (2) sampling; (3) field measurements; (4) general analytical chemistry; (5) chemical procedures; (6) data section; (7) specifications

  19. Control Chart on Semi Analytical Weighting

    Science.gov (United States)

    Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

Semi-analytical balance verification intends to assess balance performance using graphs that illustrate measurement dispersion through time, and to demonstrate that measurements were performed in a reliable manner. This study presents the internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, two weight standards were monitored before any balance operation. This work was intended to evaluate whether any significant difference or bias was present in the weighing procedure over time, in order to check the reliability of the generated data. This work also exemplifies how control intervals are established.
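A minimal Shewhart-style version of such a balance-verification control chart could flag daily check weighings that fall outside limits derived from a baseline period. The readings below are invented (a hypothetical 100 g standard), and real control intervals would be established as the study describes:

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Classic 3-sigma Shewhart limits from a baseline of check weighings."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(readings, baseline):
    """Indices of readings outside the control limits."""
    lo, hi = control_limits(baseline)
    return [i for i, x in enumerate(readings) if not lo <= x <= hi]

# Baseline verification weighings of a 100 g standard, in grams (invented):
baseline = [100.001, 99.999, 100.000, 100.002, 99.998, 100.001,
            100.000, 99.999, 100.002, 100.000]
new = [100.001, 100.000, 100.015]   # the last value has drifted
flagged = out_of_control(new, baseline)
```

Only the drifted third reading is flagged, signalling that the balance should be investigated before further use.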

  20. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1983-05-01

    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study

  1. Dispersant testing : a study on analytical test procedures

    International Nuclear Information System (INIS)

    Fingas, M.F.; Fieldhouse, B.; Wang, Z.; Environment Canada, Ottawa, ON

    2004-01-01

    Crude oil is a complex mixture of hydrocarbons, ranging from small, volatile compounds to very large, non-volatile compounds. Analysis of the dispersed oil is crucial. This paper described Environment Canada's ongoing studies on various traits of dispersants. In particular, it describes small studies related to dispersant effectiveness and methods to improve analytical procedures. The study also re-evaluated the analytical procedure for the Swirling Flask Test, which is now part of the ASTM standard procedure. There are new and improved methods for analyzing oil-in-water using gas chromatography (GC). The methods could be further enhanced by integrating the entire chromatogram rather than just peaks. This would result in a decrease in maximum variation from 5 per cent to about 2 per cent. For oil-dispersant studies, the surfactant-dispersed oil hydrocarbons consist of two parts: GC-resolved hydrocarbons and GC-unresolved hydrocarbons. This study also tested a second feature of the Swirling Flask Test in which the side spout was tested and compared with a new vessel with a septum port instead of a side spout. This decreased the variability as well as the energy and mixing in the vessel. Rather than being a variation of the Swirling Flask Test, it was suggested that a spoutless vessel might be considered as a completely separate test. 7 refs., 2 tabs., 4 figs

  2. Risk and reliability allocation to risk control

    International Nuclear Information System (INIS)

    Vojnovic, D.; Kozuh, M.

    1992-01-01

    The risk allocation procedure is used as an analytical model to support the optimal decision making for reliability/availability improvement planning. Both levels of decision criteria, the plant risk measures and plant performance indices, are used in risk allocation procedure. Decision support system uses the multi objective decision making concept. (author) [sl

  3. Big data analytics for the Future Circular Collider reliability and availability studies

    Science.gov (United States)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  4. Effects of Analytical and Holistic Scoring Patterns on Scorer Reliability in Biology Essay Tests

    Science.gov (United States)

    Ebuoh, Casmir N.

    2018-01-01

    Literature revealed that the patterns/methods of scoring essay tests had been criticized for not being reliable and this unreliability is more likely to be more in internal examinations than in the external examinations. The purpose of this study is to find out the effects of analytical and holistic scoring patterns on scorer reliability in…

  5. Tritium fractionation in biological systems and in analytical procedures

    International Nuclear Information System (INIS)

    Kim, M.A.; Baumgaertner, F.

    1991-01-01

The organically bound tritium (OBT) in biological systems is evaluated by measuring the tritium distribution ratio (R-value), i.e. the ratio of the tritium concentration in organic substance to that in tissue water. The determination of the R-value is found to always involve isotope fractionation in the applied analytical procedures, and hence the evaluation of the true OBT value in a given biological system appears more complicated than hitherto known in the literature. The present work concentrates on the tritium isotope fractionation in the tissue water separation and on the resulting effects on the R-value. The analytical procedures examined are vacuum freeze drying under equilibrium and non-equilibrium conditions and azeotropic distillation. The vaporization isotope effects are determined separately for the phase transition of solid or liquid to gas in pure water systems as well as in real biological systems, e.g. the maize plant. The results are systematically analysed and the influence of isotope effects on the R-value is rigorously quantified. (orig.)

  6. Validity, reliability and support for implementation of independence-scaled procedural assessment in laparoscopic surgery.

    Science.gov (United States)

    Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N

    2016-06-01

There is no widely used method to evaluate procedure-specific laparoscopic skills. The first aim of this study was to develop a procedure-based assessment method. The second aim was to compare its validity, reliability and feasibility with currently available global rating scales (GRSs). An independence-scaled procedural assessment was created by linking the procedural key steps of the laparoscopic cholecystectomy to an independence scale. Subtitled and blinded videos of a novice, an intermediate and an almost competent trainee were evaluated with GRSs (OSATS and GOALS) and the independence-scaled procedural assessment by seven surgeons, three senior trainees and six scrub nurses. Participants received a short introduction to the GRSs and independence-scaled procedural assessment before assessment. The validity was estimated with the Friedman and Wilcoxon tests and the reliability with the intra-class correlation coefficient (ICC). A questionnaire was used to evaluate user opinion. Independence-scaled procedural assessment and GRS scores improved significantly with surgical experience (OSATS p = 0.001, GOALS p < 0.001, independence-scaled procedural assessment p < 0.001). The ICCs of the OSATS, GOALS and independence-scaled procedural assessment were 0.78, 0.74 and 0.84, respectively, among surgeons. The ICCs increased when the ratings of scrub nurses were added to those of the surgeons. The independence-scaled procedural assessment was not considered more of an administrative burden than the GRSs (p = 0.692). A procedural assessment created by linking procedural key steps to an independence scale is a valid, reliable and acceptable assessment instrument in surgery. In contrast to the GRSs, the reliability of the independence-scaled procedural assessment exceeded the threshold of 0.8, indicating that it can also be used for summative assessment. It furthermore seems that scrub nurses can assess the operative competence of surgical trainees.

  7. Comparative evaluation of analytical procedures for the recovery of Enterobacteriaceae in coastal marine waters; Valutazione comparativa di procedure analitiche per il rilevamento di Enterobacteriaceae in acque marine costiere

    Energy Technology Data Exchange (ETDEWEB)

    Bonadonna, Lucia; Chiaretti, Gianluca; Coccia, Anna Maria; Semproni, Maurizio [Istituto Superiore di Sanita`, Rome (Italy). Lab. di Igiene Ambientale

    1997-03-01

The use of quick and reliable procedures is fundamental for water quality control. In order to improve the analytical methods for the microbiological examination of bathing waters, a comparison of different substrates for the recovery of Enterobacteriaceae from coastal marine waters was carried out. The medium indicated in the Italian technical standard showed good selectivity when the red colonies with a green metallic surface sheen were counted, as stated in the Standard Methods. On the other hand, the chromogenic substrate used in this study proved easy to read, selective and specific for both Escherichia coli and total coliforms.

  8. Nonspecialist Raters Can Provide Reliable Assessments of Procedural Skills

    DEFF Research Database (Denmark)

    Mahmood, Oria; Dagnæs, Julia; Bube, Sarah

    2018-01-01

was significant (p ...). Pearson's correlation was 0.77 for the nonspecialists and 0.75 for the specialists. The test-retest reliability showed the biggest difference between the 2 groups, 0.59 and 0.38 for the nonspecialist raters and the specialist raters, respectively (p ...). ... was chosen as it is a simple procedural skill that is crucial to master in a resident urology program. RESULTS: The internal consistency of assessments was high, Cronbach's α = 0.93 and 0.95 for nonspecialist and specialist raters, respectively (p ... correlations). The interrater reliability ...

  9. Analytic of elements for the determination of soil->plant transfer factors

    International Nuclear Information System (INIS)

    Liese, T.

    1985-02-01

This article describes part of the conventional analytical work that was done to determine soil-to-plant transfer factors. The analytical methods, the experiments to find the best way of sample digestion, and the resulting analytical procedures are described. The analytical methods are graphite furnace atomic absorption spectrometry (GFAAS) and inductively coupled plasma atomic emission spectrometry (ICP-AES). In the case of ICP-AES, the necessity of proper background correction and correction of spectral interferences is shown. The reliability of the analytical procedure is demonstrated by measuring different kinds of standard reference materials and by comparison of AAS and AES. (orig./HP) [de

  10. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    Science.gov (United States)

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code in order to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and good agreement was found between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The results of the radioactivity measurement with both detectors using the advanced analytical procedure received 'Accepted' status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for the monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
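The monthly patient-median check described above can be sketched as follows: compute each month's median of patient results and flag months whose relative deviation from a target median exceeds the allowable bias. All numbers below are invented; in practice the target and the allowable bias would come from the laboratory's own data and from biological-variation specifications:

```python
from statistics import median

def flag_months(monthly_results, target, allowable_bias_pct):
    """Return {month: True} where the median deviates beyond the limit."""
    flags = {}
    for month, results in monthly_results.items():
        deviation_pct = 100.0 * (median(results) - target) / target
        flags[month] = abs(deviation_pct) > allowable_bias_pct
    return flags

# Hypothetical monthly patient results for one analyte:
monthly = {
    "Jan": [4.1, 4.3, 4.2, 4.4, 4.2],
    "Feb": [4.2, 4.1, 4.3, 4.2, 4.4],
    "Mar": [4.6, 4.8, 4.7, 4.9, 4.6],   # simulated analytical drift
}
flags = flag_months(monthly, target=4.2, allowable_bias_pct=5.0)
```

January and February pass, while the simulated March drift exceeds the allowable bias and is flagged for investigation.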

  12. Procedures for controlling the risks of reliability, safety, and availability of technical systems

    International Nuclear Information System (INIS)

    1987-01-01

The reference book comprises four sections. Apart from the fundamental aspects of the reliability problem, of risk and safety, and the relevant reliability criteria, the material presented explains reliability in terms of maintenance, logistics and availability, and presents procedures for reliability assessment and for determining the factors influencing reliability, together with suggestions for systems-technical integration. The reliability assessment consists of diagnostic and prognostic analyses. The section on factors influencing reliability discusses aspects of organisational structures, programme planning and control, and critical activities. (DG) [de

  13. Tritium isotope fractionation in biological systems and in analytical procedures

    International Nuclear Information System (INIS)

    Kim, M.A.; Baumgaertner, Franz

    1989-01-01

The organically bound tritium (OBT) in biological systems is evaluated by determining the tritium distribution ratio (R-value), i.e. the ratio of the tritium concentration in organic substance to that in cell water. The determination of the R-value always involves isotope fractionation in the applied analytical procedures, and hence the evaluation of the true OBT value in a given biological system appears more complicated than hitherto known in the literature. The present work concentrates on the tritium isotope fractionation in the cell water separation and on the resulting effects on the R-value. The analytical procedures examined are vacuum freeze drying under equilibrium and non-equilibrium conditions and azeotropic distillation. The vaporization isotope effects are determined separately for the phase transition of solid or liquid to gas in pure tritiated water systems as well as in real biological systems, e.g. the corn plant. The results are systematically analyzed and the influence of isotope effects on the R-value is rigorously quantified

  14. Reliability of procedures used for scaling loudness

    DEFF Research Database (Denmark)

    Jesteadt, Walt; Joshi, Suyash Narendra

    2013-01-01

In this study, 16 normally-hearing listeners judged the loudness of 1000-Hz sinusoids using magnitude estimation (ME), magnitude production (MP), and categorical loudness scaling (CLS). Listeners in each of four groups completed the loudness scaling tasks in a different sequence on the first visit (ME, MP, CLS; MP, ME, CLS; CLS, ME, MP; CLS, MP, ME), and the order was reversed on the second visit. This design made it possible to compare the reliability of estimates of the slope of the loudness function across procedures in the same listeners. The ME data were well fitted by an inflected ... Although the CLS results were the most reproducible, they do not provide direct information about the slope of the loudness function because the numbers assigned to CLS categories are arbitrary. This problem can be corrected by using data from the other procedures to assign numbers that are proportional to loudness ...

  15. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  16. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection ... is always carried out under the supervision of an experienced FMEA facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating ...

  17. Procedures for treating common cause failures in safety and reliability studies: Volume 2, Analytic background and techniques: Final report

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1988-12-01

    This report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. This document, Volume 2, contains a series of appendices that provide additional background and methodological detail on several important topics discussed in Volume 1

  18. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare to explicitly use reliability concepts in the design of structures, but the problems of structural engineering are better understood through them. Some of the main methods for estimating the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo simulation is used in this paper because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer-aided simulation of a large number of tests. The procedure of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
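The Monte Carlo route from failure probability to reliability index can be sketched for the simplest case, a linear limit state g = R - S with normal resistance R and load effect S. The means and standard deviations below are illustrative, not the bridge-pier data of the paper:

```python
import random
from statistics import NormalDist

random.seed(1)                      # reproducible run
muR, sR = 300.0, 30.0               # resistance (assumed units)
muS, sS = 150.0, 40.0               # load effect

# Monte Carlo estimate of the probability of failure P(g < 0):
N = 200_000
failures = sum(
    1 for _ in range(N)
    if random.gauss(muR, sR) - random.gauss(muS, sS) < 0
)
pf = failures / N
beta_mc = -NormalDist().inv_cdf(pf)   # index of reliability from pf

# For this linear Gaussian case the exact index is known in closed form:
beta_exact = (muR - muS) / (sR**2 + sS**2) ** 0.5   # = 3.0
```

With 200,000 samples the simulated index lands close to the exact value of 3.0; tighter agreement simply needs more samples, which is why simulation is attractive for multivariate problems with no closed form.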

  19. A Simplified Procedure for Reliability Estimation of Underground Concrete Barriers against Normal Missile Impact

    Directory of Open Access Journals (Sweden)

    N. A. Siddiqui

    2011-06-01

    Full Text Available Underground concrete barriers are frequently used to protect strategic structures, such as nuclear power plants (NPPs), deep under the soil against any possible high-velocity missile impact. For a given range and type of missile (or projectile), it is of paramount importance to examine the reliability of underground concrete barriers under the uncertainties expected in the missile, concrete, and soil parameters. In this paper, a simple procedure for the reliability assessment of underground concrete barriers against normal missile impact is presented using the First Order Reliability Method (FORM). The procedure is illustrated by applying it to a concrete barrier that lies at a certain depth in the soil. Some parametric studies are also conducted to obtain design values that make the barrier as reliable as desired.
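
As a companion to the record's mention of FORM, here is a minimal sketch of the Hasofer-Lind / Rackwitz-Fiessler iteration for a hypothetical two-variable limit state. The variables and the limit state g = X1·X2 − M are invented and do not represent the barrier's actual missile, concrete, and soil parameters.

```python
import math

# Invented limit state: failure when g = X1*X2 - M < 0, with X1, X2
# independent normals (means and sigmas assumed for illustration).
mu = [40.0, 50.0]
sigma = [5.0, 2.5]
M = 1500.0

def g(x):
    return x[0] * x[1] - M

def x_from_u(u):
    """Map standard-normal coordinates u back to physical space."""
    return [mu[i] + sigma[i] * u[i] for i in range(2)]

def grad_g_u(u):
    """Gradient of g w.r.t. u (chain rule: dg/du_i = dg/dx_i * sigma_i)."""
    x = x_from_u(u)
    return [x[1] * sigma[0], x[0] * sigma[1]]

u = [0.0, 0.0]
for _ in range(50):                        # HL-RF fixed-point iteration
    gv, gr = g(x_from_u(u)), grad_g_u(u)
    norm2 = sum(c * c for c in gr)
    dot = sum(gr[i] * u[i] for i in range(2))
    u = [((dot - gv) / norm2) * c for c in gr]

beta = math.sqrt(sum(c * c for c in u))    # distance to the design point
pf = 0.5 * math.erfc(beta / math.sqrt(2))  # FORM estimate Phi(-beta)
print(f"beta = {beta:.2f}, pf = {pf:.2e}")
```

At convergence the iterate u sits on the limit-state surface (g = 0) at the point closest to the origin of standard normal space, which is exactly the FORM design point.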

  20. Optimally Fortifying Logic Reliability through Criticality Ranking

    Directory of Open Access Journals (Sweden)

    Yu Bai

    2015-02-01

    Full Text Available With CMOS technology aggressively scaling towards the 22-nm node, modern FPGA devices face tremendous aging-induced reliability challenges due to bias temperature instability (BTI and hot carrier injection (HCI. This paper presents a novel anti-aging technique at the logic level that is both scalable and applicable for VLSI digital circuits implemented with FPGA devices. The key idea is to prolong the lifetime of FPGA-mapped designs by strategically elevating the VDD values of some LUTs based on their modular criticality values. Although the idea of scaling VDD in order to improve either energy efficiency or circuit reliability has been explored extensively, our study distinguishes itself by approaching this challenge through an analytical procedure, therefore being able to maximize the overall reliability of the target FPGA design by rigorously modeling the BTI-induced device reliability and optimally solving the VDD assignment problem. Specifically, we first develop a systematic framework to analytically model the reliability of an FPGA LUT (look-up table, which consists of both RAM memory bits and associated switching circuit. We also, for the first time, establish the relationship between signal transition density and a LUT’s reliability in an analytical way. This key observation further motivates us to define the modular criticality as the product of signal transition density and the logic observability of each LUT. Finally, we analytically prove, for the first time, that the optimal way to improve the overall reliability of a whole FPGA device is to fortify individual LUTs according to their modular criticality. To the best of our knowledge, this work is the first to draw such a conclusion.
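
The modular-criticality idea in this record (criticality = signal transition density × logic observability, with VDD elevated for the most critical LUTs) can be sketched as a toy ranking. The LUT names, densities, observabilities, voltage values, and budget below are all invented for illustration.

```python
# Toy modular-criticality ranking: each LUT's criticality is the product
# of its signal transition density and logic observability; the highest
# ranked LUTs get their VDD elevated first. All numbers are made up.
luts = {
    "lut_a": {"transition_density": 0.42, "observability": 0.90},
    "lut_b": {"transition_density": 0.10, "observability": 0.85},
    "lut_c": {"transition_density": 0.33, "observability": 0.20},
    "lut_d": {"transition_density": 0.27, "observability": 0.95},
}

criticality = {
    name: p["transition_density"] * p["observability"]
    for name, p in luts.items()
}
ranked = sorted(criticality, key=criticality.get, reverse=True)

vdd_nominal, vdd_boost = 0.8, 0.9   # volts, assumed values
budget = 2                          # how many LUTs we can afford to boost
vdd = {name: (vdd_boost if name in ranked[:budget] else vdd_nominal)
       for name in luts}
print(ranked, vdd)
```

The paper's contribution is proving that fortifying in this criticality order is optimal for overall reliability; the sketch only shows the greedy assignment step, not the BTI reliability model behind it.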

  1. Systems reliability analyses and risk analyses for the licensing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licensing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant as a whole needs to be assessed, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement follows from the safety criteria for nuclear power plants issued by the Federal Ministry of the Interior (BMI). Systems reliability studies and risk analyses used in licensing procedures under atomic law are identified. The emphasis is on licensing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings, and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licensing decisions are shown by means of examples. (orig./HP) [de

  2. Analytical procedures for determining Pb and Sr isotopic compositions in water samples by ID-TIMS

    Directory of Open Access Journals (Sweden)

    Veridiana Martins

    2008-01-01

    Full Text Available Few articles deal with lead and strontium isotopic analysis of water samples. The aim of this study was to define the chemical procedures for Pb and Sr isotopic analyses of groundwater samples from an urban sedimentary aquifer. Thirty lead and fourteen strontium isotopic analyses were performed to test different analytical procedures. Pb and Sr isotopic ratios as well as Sr concentration did not vary using different chemical procedures. However, the Pb concentrations were very dependent on the different procedures. Therefore, the choice of the best analytical procedure was based on the Pb results, which indicated a higher reproducibility from samples that had been filtered and acidified before the evaporation, had their residues totally dissolved, and were purified by ion chromatography using the Biorad® column. Our results showed no changes in Pb ratios with the storage time.

  3. NASA reliability preferred practices for design and test

    Science.gov (United States)

    1991-01-01

    Given here is a manual that was produced to communicate within the aerospace community design practices that have contributed to NASA mission success. The information represents the best technical advice that NASA has to offer on reliability design and test practices. Topics covered include reliability practices, including design criteria, test procedures, and analytical techniques that have been applied to previous space flight programs; and reliability guidelines, including techniques currently applied to space flight projects, where sufficient information exists to certify that the technique will contribute to mission success.

  4. Further HTGR core support structure reliability studies. Interim report No. 1

    International Nuclear Information System (INIS)

    Platus, D.L.

    1976-01-01

    Results of a continuing effort to investigate high temperature gas cooled reactor (HTGR) core support structure reliability are described. Graphite material and core support structure component physical, mechanical and strength properties required for the reliability analysis are identified. Also described are experimental and associated analytical techniques for determining the required properties, a procedure for determining the number of tests required, properties that might be monitored by special surveillance of the core support structure to improve reliability predictions, and recommendations for further studies. Emphasis in the study is directed towards developing a basic understanding of graphite failure and strength degradation mechanisms, and towards validating analytical methods for predicting strength and strength degradation from basic material properties.

  5. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified failure modes above the 90th percentile of RPN values as needing urgent corrective action, and failure modes falling between the 75th and 90th percentiles as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that an FMEA is always carried out under the supervision of an experienced FMEA facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback.
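
The RPN scoring and percentile cut-offs described in this record can be sketched in a few lines. The failure-mode names and S/O/D scores below are invented for illustration and are not taken from the cited HPLC-DAD-MS study.

```python
from statistics import quantiles

# Invented failure modes with (S, O, D) scores on a 1-10 scale.
fmea = {
    "wrong mobile phase":  (8, 4, 3),
    "column degradation":  (6, 5, 4),
    "detector drift":      (5, 3, 6),
    "sample mislabelling": (9, 2, 7),
    "integration error":   (4, 6, 2),
    "carry-over":          (7, 3, 5),
    "wrong dilution":      (8, 3, 4),
    "MS tuning off":       (6, 4, 6),
}

rpn = {mode: s * o * d for mode, (s, o, d) in fmea.items()}  # RPN = S*O*D

cuts = quantiles(rpn.values(), n=20, method="inclusive")  # 5%-step percentiles
p75, p90 = cuts[14], cuts[17]          # 75th and 90th percentile cut points

urgent = sorted(m for m, v in rpn.items() if v > p90)
necessary = sorted(m for m, v in rpn.items() if p75 < v <= p90)
print("urgent corrective action:", urgent)
print("necessary corrective action:", necessary)
```

With only a handful of failure modes the percentile estimates are sensitive to the interpolation method, which is why the inclusive method is used here; a real FMEA would typically have many more modes.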

  6. The Usefulness of Analytical Procedures - An Empirical Approach in the Auditing Sector in Portugal

    Directory of Open Access Journals (Sweden)

    Carlos Pinho

    2014-08-01

    Full Text Available The conceptual conflict between efficiency and efficacy in financial auditing arises from the fact that resources are scarce, both in terms of the time available to carry out the audit and the quality and timeliness of the information available to the external auditor. Audits tend to be more efficient the lower the combined assessment of inherent risk and control risk, allowing the auditor to carry out less extensive and less timely auditing tests, meaning that in some cases analytical audit procedures are a good tool to support the opinions formed by the auditor. This research, by means of an empirical study of financial auditing in Portugal, aims to evaluate the extent to which analytical procedures are used during financial audit engagements, throughout the different phases of the audit. The conclusions point to the fact that, in general terms and regardless of the size of the audit company and the way in which professionals work, Portuguese auditors use analytical procedures more frequently during the planning phase than during the phases of evidence gathering and opinion formation.

  7. Analytical quality assurance procedures developed for the IAEA's Reference Asian Man Project (Phase 2)

    International Nuclear Information System (INIS)

    Kawamura, H.; Parr, R.M.; Dang, H.S.; Tian, W.; Barnes, R.M.; Iyengar, G.V.

    2000-01-01

    Analytical quality assurance procedures adopted for use in the IAEA Co-ordinated Research Project on Ingestion and Organ Content of Trace Elements of Importance in Radiological Protection are designed to ensure comparability of the analytical results for Cs, I, Sr, Th, U and other elements in human tissues and diets collected and analysed in nine participating countries. The main analytical techniques are NAA and ICP-MS. For sample preparation, all participants are using identical food blenders which have been centrally supplied after testing for contamination. For quality control of the analyses, six NIST SRMs covering a range of matrices with certified and reference values for the elements of interest have been distributed. A new Japanese reference diet material has also been developed. These quality assurance procedures are summarized here and new data are presented for Cs, I, Sr, Th and U in the NIST SRMs. (author)

  8. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare methods for creating 'precision profiles' (PP) and to clarify their potential for determining the analytical reliability of RIA. Only methods without complicated mathematical calculations have been used. The reproducibility was studied in serums with concentrations of the determinable hormone spanning the whole range of the calibration curve. The radioimmunoassay was performed with a TSH-RIA set (former East Germany), and comparative evaluations with commercial sets from HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (C.V., %) are used, and their corresponding profiles are compared: a preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory, while the method of its construction does not influence the course of the relationship. The PP allows determination of the concentration range giving stable results, which improves the efficiency of the analytical work. 16 refs., 4 figs

  9. Analytical procedures for the determination of strontium radionuclides in environmental materials

    International Nuclear Information System (INIS)

    Harvey, B.R.; Ibbett, R.D.; Lovett, M.B.; Williams, K.J.

    1989-01-01

    As part of its statutory role in the authorisation, monitoring and research relating to radioactive wastes discharged into the aquatic environment, the Aquatic Environment Protection Division of the Directorate of Fisheries Research (DFR), Lowestoft routinely carries out analyses for a substantial number of radionuclides in a wide range of environmental materials. The Ministry of Agriculture, Fisheries and Food has for many years required information about the concentrations of strontium radionuclides in waters, sediments and biological materials. There are no absolute standard methods for such radiochemical analysis; indeed, none are required because methodology is continually developing. A very considerable amount of expertise in the analysis of radiostrontium has been developed at the Laboratory since the late 1950s, when detailed analysis first commenced, and the procedures described in this report have been developed and tested over a long period of time with a view to achieving the highest analytical quality. Full details of the practical, analytical and computational procedures, as currently used, are given in the Appendix. (author)

  10. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. This suite of isotopes stimulates continuing interest in the development of new methods for determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed.

  11. Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom

    Science.gov (United States)

    Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy

    2016-01-01

    The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…

  12. CONSIDERATIONS FOR THE TREATMENT OF COMPUTERIZED PROCEDURES IN HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-07-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  13. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    Science.gov (United States)

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and including patient preparation, sample collection, handling, transportation, processing, and storage until the time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process; it is a major component of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or un-interpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical since this has a direct influence on the quality of results and on their clinical reliability.
The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  14. A procedure for the rapid determination of Pu isotopes and Am-241 in soil and sediment samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In this report, a rapid procedure for the determination of Pu and Am radionuclides in soil and sediment samples is described that can be used in emergency situations. The method provides accurate and reliable results for the activity concentrations of elevated levels of 239,240 Pu, 238 Pu and 241 Am in soil and sediment samples over the course of 24 hours. The procedure has been validated in accordance with ISO guidelines.

  15. Metrological reliability of the calibration procedure in terms of air kerma using the ionization chamber NE2575

    International Nuclear Information System (INIS)

    Guimaraes, Margarete Cristina; Silva, Teogenes Augusto da; Rosado, Paulo H.G.

    2016-01-01

    Metrology laboratories are expected to provide the X radiation beam qualities established by international standardization organizations for the calibration and testing of dosimeters. Reliable and traceable standard dosimeters should be used in the calibration procedure. The aim of this work was to study the reliability of the NE 2575 ionization chamber used as the standard dosimeter for the air kerma calibration procedure adopted in the CDTN Calibration Laboratory. (author)

  16. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  17. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  18. Design-related influencing factors of the computerized procedure system for inclusion into human reliability analysis of the advanced control room

    International Nuclear Information System (INIS)

    Kim, Jaewhan; Lee, Seung Jun; Jang, Seung Cheol; Ahn, Kwang-Il; Shin, Yeong Cheol

    2013-01-01

    This paper presents major design factors of the computerized procedure system (CPS) by task characteristics/requirements, with individual relative weights evaluated by the analytic hierarchy process (AHP) technique, for inclusion into human reliability analysis (HRA) of advanced control rooms. Task characteristics/requirements of an individual procedural step are classified into four categories according to the dynamic characteristics of an emergency situation: (1) a single-static step, (2) a single-dynamic and single-checking step, (3) a single-dynamic and continuous-monitoring step, and (4) a multiple-dynamic and continuous-monitoring step. According to the importance ranking evaluation by the AHP technique, ‘clearness of the instruction for taking action’, ‘clearness of the instruction and its structure for rule interpretation’, and ‘adequate provision of requisite information’ were rated as being of higher importance for all the task classifications. The importance of ‘adequacy of the monitoring function’ and ‘adequacy of representation of the dynamic link or relationship between procedural steps’ is dependent upon task characteristics. The result of the present study gives valuable insight into which design factors of the CPS should be incorporated, and with what relative importance or weight, into HRA of advanced control rooms. (author)
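
The AHP weighting mentioned in this record can be sketched with a toy pairwise comparison matrix. The three factor names are taken from the abstract, but the 3×3 comparison ratings on Saaty's 1-9 scale are invented for illustration.

```python
import math

# Toy pairwise comparison matrix (Saaty 1-9 scale) for three CPS design
# factors; the ratings below are illustrative, not the study's data.
factors = ["instruction clearness", "requisite information", "monitoring function"]
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]

# Geometric-mean (row) approximation of the principal eigenvector gives
# the relative weights used to rank the factors.
gm = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]

for f, w in zip(factors, weights):
    print(f"{f}: {w:.3f}")
```

The geometric-mean approximation is a common shortcut for the principal-eigenvector weights; a fuller AHP treatment would also compute the consistency ratio of the comparison matrix.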

  19. Development of an Analytical Procedure for the Determination of Multiclass Compounds for Forensic Veterinary Toxicology.

    Science.gov (United States)

    Sell, Bartosz; Sniegocki, Tomasz; Zmudzki, Jan; Posyniak, Andrzej

    2018-04-01

    Reported here is a new analytical multiclass method based on the QuEChERS technique, which has proven to be effective in diagnosing fatal poisoning cases in animals. The method has been developed for the determination of analytes in liver samples, comprising rodenticides, carbamate and organophosphorus pesticides, coccidiostats and mycotoxins. The procedure entails the addition of acetonitrile and sodium acetate to 2 g of homogenized liver sample. The mixture was shaken intensively and centrifuged for phase separation, after which the organic phase was transferred into a tube containing sorbents (PSA and C18) and magnesium sulfate and centrifuged; the supernatant was then filtered and analyzed by liquid chromatography tandem mass spectrometry. A validation of the procedure was performed, including determination of repeatability variation coefficients, and the method has been applied in forensic veterinary toxicology cases.

  20. Human reliability analysis of Lingao Nuclear Power Station

    International Nuclear Information System (INIS)

    Zhang Li; Huang Shudong; Yang Hong; He Aiwu; Huang Xiangrui; Zheng Tao; Su Shengbing; Xi Haiying

    2001-01-01

    The necessity of human reliability analysis (HRA) for Lingao Nuclear Power Station is analyzed, and the method and operating procedures of HRA are briefly described. One of the human factors events (HFEs) is analyzed in detail and some questions of HRA are discussed. The authors present the analytical results for 61 HFEs and briefly describe the contribution of HRA to Lingao Nuclear Power Station.

  1. The Analytical Pragmatic Structure of Procedural Due Process: A Framework for Inquiry in Administrative Decision Making.

    Science.gov (United States)

    Fisher, James E.; Sealey, Ronald W.

    The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…

  2. Establishing reliable cognitive change in children with epilepsy: The procedures and results for a sample with epilepsy

    NARCIS (Netherlands)

    van Iterson, L.; Augustijn, P.B.; de Jong, P.F.; van der Leij, A.

    2013-01-01

    The goal of this study was to investigate reliable cognitive change in epilepsy by developing computational procedures to determine reliable change index scores (RCIs) for the Dutch Wechsler Intelligence Scales for Children. First, RCIs were calculated based on stability coefficients from a

  3. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - I: Theory

    International Nuclear Information System (INIS)

    Cacuci, D. G.; Cacuci, D. G.; Ionescu-Bujor, M.

    2008-01-01

    The development of the adjoint sensitivity analysis procedure (ASAP) for generic dynamic reliability models based on Markov chains is presented, together with applications of this procedure to the analysis of several systems of increasing complexity. The general theory is presented in Part I of this work and is accompanied by a paradigm application to the dynamic reliability analysis of a simple binary component, namely a pump functioning on an 'up/down' cycle until it fails irreparably. This paradigm example admits a closed form analytical solution, which permits a clear illustration of the main characteristics of the ASAP for Markov chains. In particular, it is shown that the ASAP for Markov chains presents outstanding computational advantages over other procedures currently in use for sensitivity and uncertainty analysis of the dynamic reliability of large-scale systems. This conclusion is further underscored by the large-scale applications presented in Part II. (authors)
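
The paradigm pump of this record (an up/down repair cycle until an irreparable failure) can be sketched as a small transient Markov model. The rates below are assumed values, and the sensitivity is computed here by brute-force finite differences; the ASAP of the paper obtains such derivatives far more cheaply via adjoint equations.

```python
# Three states: 'up', 'down' (repairable, being repaired), and an
# absorbing 'failed' (irreparable) state. Rates per hour are invented.
def unreliability(T, gamma, lam=1e-3, mu=1e-1, steps=100_000):
    """P(absorbed in the irreparable 'failed' state by time T), explicit Euler."""
    p_up, p_down, p_fail = 1.0, 0.0, 0.0
    dt = T / steps
    for _ in range(steps):
        new_up = p_up + (-(lam + gamma) * p_up + mu * p_down) * dt
        new_down = p_down + (lam * p_up - mu * p_down) * dt
        p_fail += gamma * p_up * dt
        p_up, p_down = new_up, new_down
    return p_fail

gamma0 = 1e-4                      # assumed irreparable failure rate
U = unreliability(1000.0, gamma0)  # unreliability at 1000 h

# Finite-difference sensitivity dU/dgamma (what ASAP gets via adjoints):
h = 1e-6
sens = (unreliability(1000.0, gamma0 + h)
        - unreliability(1000.0, gamma0 - h)) / (2 * h)
print(f"U(1000 h) = {U:.4f}, dU/dgamma = {sens:.0f}")
```

Each finite-difference sensitivity costs two extra model solves per parameter, which is exactly the expense the adjoint procedure avoids for large-scale systems.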

  4. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - I: Theory

    Energy Technology Data Exchange (ETDEWEB)

    Cacuci, D. G. [Commiss Energy Atom, Direct Energy Nucl, Saclay, (France); Cacuci, D. G. [Univ Karlsruhe, Inst Nucl Technol and Reactor Safety, D-76021 Karlsruhe, (Germany); Ionescu-Bujor, M. [Forschungszentrum Karlsruhe, Fus Program, D-76021 Karlsruhe, (Germany)

    2008-07-01

    The development of the adjoint sensitivity analysis procedure (ASAP) for generic dynamic reliability models based on Markov chains is presented, together with applications of this procedure to the analysis of several systems of increasing complexity. The general theory is presented in Part I of this work and is accompanied by a paradigm application to the dynamic reliability analysis of a simple binary component, namely a pump functioning on an 'up/down' cycle until it fails irreparably. This paradigm example admits a closed form analytical solution, which permits a clear illustration of the main characteristics of the ASAP for Markov chains. In particular, it is shown that the ASAP for Markov chains presents outstanding computational advantages over other procedures currently in use for sensitivity and uncertainty analysis of the dynamic reliability of large-scale systems. This conclusion is further underscored by the large-scale applications presented in Part II. (authors)

  5. The validity and reliability of value-added and target-setting procedures with special reference to Key Stage 3

    OpenAIRE

    Moody, Ian Robin

    2003-01-01

    The validity of value-added systems of measurement is crucially dependent upon there being a demonstrably unambiguous relationship between the so-called baseline, or intake measures, and any subsequent measure of performance at a later stage. The reliability of such procedures is dependent on the relationships between these two measures being relatively stable over time. A number of questions arise with regard to both the validity and reliability of value-added procedures at any level in educ...

  6. Establishing Reliable Cognitive Change in Children with Epilepsy: The Procedures and Results for a Sample with Epilepsy

    Science.gov (United States)

    van Iterson, Loretta; Augustijn, Paul B.; de Jong, Peter F.; van der Leij, Aryan

    2013-01-01

    The goal of this study was to investigate reliable cognitive change in epilepsy by developing computational procedures to determine reliable change index scores (RCIs) for the Dutch Wechsler Intelligence Scales for Children. First, RCIs were calculated based on stability coefficients from a reference sample. Then, these RCIs were applied to a…
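
The reliable change index computation described in this record can be sketched in the classic Jacobson-Truax form. The standard deviation, stability coefficient, and test scores below are hypothetical, not the Dutch Wechsler estimates of the study.

```python
from statistics import NormalDist

# Jacobson-Truax style reliable change index with assumed parameters.
sd = 15.0      # test standard deviation (hypothetical)
r_xx = 0.90    # assumed test-retest (stability) coefficient

sem = sd * (1 - r_xx) ** 0.5     # standard error of measurement
se_diff = (2 * sem**2) ** 0.5    # standard error of a difference score

def rci(score_t1, score_t2):
    """Change expressed in SE-of-difference units."""
    return (score_t2 - score_t1) / se_diff

change = rci(95, 110)                   # hypothetical retest gain
critical = NormalDist().inv_cdf(0.975)  # two-sided 5% threshold (1.96)
print(f"RCI = {change:.2f}, reliable: {abs(change) > critical}")
```

A change whose |RCI| exceeds 1.96 is larger than would be expected from measurement error alone at the 5% level, which is the sense in which the change is called "reliable".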

  7. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control, ALO data report/package preparation, review and control, the single-shell tank (SST) project sample tracking system, sample receiving, analytical balances, duties and responsibilities of the sample custodian, sample refrigerator temperature monitoring, security, assignment of staff responsibilities, sample storage, data reporting, and general requirements for glassware.

  8. The effect on reliability and sensitivity to level of training of combining analytic and holistic rating scales for assessing communication skills in an internal medicine resident OSCE.

    Science.gov (United States)

    Daniels, Vijay John; Harley, Dwight

    2017-07-01

    Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs OSCEs for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. Authors analyzed reliability of individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For analytic, holistic, and combined scales, 12, 12, and 11 stations respectively yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given increased validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.
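The decision-study arithmetic behind "how many stations yield a Phi of 0.8" can be sketched for a one-facet design. The variance components below are invented for illustration and are not the study's estimates.

```python
import math

def phi(n_stations, var_person, var_error):
    """Phi (absolute) coefficient for a one-facet D study with n stations:
    Phi = var_p / (var_p + var_err / n)."""
    return var_person / (var_person + var_error / n_stations)

def stations_for_phi(target, var_person, var_error):
    """Smallest n with Phi(n) >= target, from n >= (target/(1-target)) * var_err/var_p."""
    n = (target / (1.0 - target)) * (var_error / var_person)
    return math.ceil(n)

# Hypothetical variance components (person and residual error).
vp, ve = 0.05, 0.11
n = stations_for_phi(0.8, vp, ve)
print(n, round(phi(n, vp, ve), 3))
```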

  9. An analytical procedure for computing smooth transitions between two specified cross sections with applications to blended wing body configuration

    Science.gov (United States)

    Barger, R. L.

    1982-01-01

    An analytical procedure is described for designing smooth transition surfaces for blended wing-body configurations. Starting from two specified cross section shapes, the procedure generates a gradual transition from one cross section shape to the other as an analytic blend of the two shapes. The method utilizes a conformal mapping, with subsequent translation and scaling, to transform the specified end shapes to curves that can be combined more smoothly. A sample calculation is applied to a blended wing-body missile type configuration with a top mounted inlet.

  10. A competency based selection procedure for Dutch postgraduate GP training: a pilot study on validity and reliability.

    Science.gov (United States)

    Vermeulen, Margit I; Tromp, Fred; Zuithoff, Nicolaas P A; Pieters, Ron H M; Damoiseaux, Roger A M J; Kuyvenhoven, Marijke M

    2014-12-01

    Background: Historically, semi-structured interviews (SSI) have been the core of the Dutch selection for postgraduate general practice (GP) training. This paper describes a pilot study of a newly designed competency-based selection procedure that assesses whether candidates have the competencies required to complete GP training. The objective was to explore reliability and validity aspects of the instruments developed. The new selection procedure, comprising the National GP Knowledge Test (LHK), a situational judgement test (SJT), a patterned behaviour descriptive interview (PBDI) and a simulated encounter (SIM), was piloted alongside the current procedure. Forty-seven candidates volunteered for both procedures. The admission decision was based on the results of the current procedure. Study participants hardly differed from the other candidates. The mean scores of the candidates on the LHK and SJT were 21.9% (SD 8.7) and 83.8% (SD 3.1), respectively. The mean self-reported competency scores (PBDI) were higher than the observed competencies (SIM): 3.7 (SD 0.5) and 2.9 (SD 0.6), respectively. Content-related competencies showed low correlations with one another when measured with different instruments, whereas more diverse competencies measured by a single instrument showed strong to moderate correlations. Moreover, a moderate correlation between LHK and SJT was found. The internal consistencies (intraclass correlation, ICC) of LHK and SJT were poor, while the ICCs of PBDI and SIM showed acceptable levels of reliability. Findings on the content validity and reliability of these new instruments are promising for realizing a competency-based procedure. Further development of the instruments and research on predictive validity should be pursued.

  11. Integrating analytical procedures into the continuous audit environment

    Directory of Open Access Journals (Sweden)

    Eija Koskivaara

    2006-12-01

    The objective of this article is to show how to embed analytical procedures (APs) into the continuous audit environment. The audit environment is discussed in terms of audit phases, where the role of APs is to obtain evidence for auditors. The article addresses different characteristics of AP techniques and compares four of them for forming expectations of monthly sales values. Two are simple quantitative techniques: the previous year's value and the mean of the previous years' values. The advanced quantitative techniques are regression analysis and an artificial neural network (ANN)-based model. In a comparison of the prediction results, the regression analysis and the ANN model turn out to be equally good. The development of these kinds of tools is crucial to the continuous audit environment, especially as most data transmission between companies and their stakeholders moves into electronic form.
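Three of the four expectation models compared in this record (the previous year's value, the mean of prior values, and regression; the ANN is omitted here) can be sketched on synthetic monthly sales data. The sales figures are invented, with a deliberately clean trend so that the regression expectation wins.

```python
def ols_fit(xs, ys):
    """Closed-form simple linear regression y = a + b*x (ordinary least squares)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical 24 months of sales with a steady trend; predict month 25.
months = list(range(1, 25))
sales = [100 + 2 * m for m in months]
actual_next = 100 + 2 * 25

pred_prev_year = sales[25 - 12 - 1]       # same month, previous year
pred_mean = sum(sales) / len(sales)       # mean of all prior values
a, b = ols_fit(months, sales)
pred_ols = a + b * 25                     # regression expectation

for name, p in [("prev-year", pred_prev_year), ("mean", pred_mean), ("ols", pred_ols)]:
    print(name, round(abs(p - actual_next), 1))
```

On trending data the naive benchmarks lag systematically, which is exactly the gap the advanced expectation models are meant to close.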

  12. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. The paper discusses the numerous factors that potentially have a degrading effect on system reliability and the ways in which those factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  13. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
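One of the NHPP models the book covers, here taken to be the classic Goel-Okumoto form (an assumption; the book treats a family of such models), can be sketched in a few lines. The parameter values are illustrative only.

```python
import math

def m(t, a, b):
    """Goel-Okumoto NHPP mean value function: expected faults detected by time t,
    with a = expected total faults and b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def conditional_reliability(x, t, a, b):
    """Probability of no failure in (t, t+x] given testing up to time t."""
    return math.exp(-(m(t + x, a, b) - m(t, a, b)))

# Hypothetical parameters: 100 expected total faults, detection rate 0.05/day.
a, b = 100.0, 0.05
print(round(m(50, a, b), 1), round(conditional_reliability(10, 50, a, b), 3))
```

In practice a and b are estimated from the project's fault-detection history (e.g. by maximum likelihood) before the reliability projection is made.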

  14. General analytical procedure for determination of acidity parameters of weak acids and bases.

    Science.gov (United States)

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new, convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for the components of mixtures of weak organic acids and bases of diverse chemical classes in water solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement has been obtained between the determined pKa values and the reference literature data for the compounds studied.
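For a single monoprotic acid, the core relation behind pH-metric pKa determination is Henderson-Hasselbalch; a minimal sketch is shown below on synthetic titration data (the paper's actual numerical processing handles multi-component mixtures and is far more involved).

```python
import math

def pka_from_titration(points):
    """Estimate the pKa of a monoprotic weak acid from (fraction_titrated, pH)
    pairs via Henderson-Hasselbalch: pKa = pH - log10(f / (1 - f)).
    Only mid-range points are used, where the relation is well conditioned."""
    estimates = [ph - math.log10(f / (1.0 - f))
                 for f, ph in points if 0.2 <= f <= 0.8]
    return sum(estimates) / len(estimates)

# Synthetic, noise-free data generated from pKa = 4.76 (acetic acid);
# a real titration would carry noise and activity corrections.
pts = [(f, 4.76 + math.log10(f / (1 - f))) for f in (0.25, 0.5, 0.75)]
print(round(pka_from_titration(pts), 2))
```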

  15. Reliability and validity of procedure-based assessments in otolaryngology training.

    Science.gov (United States)

    Awad, Zaid; Hayden, Lindsay; Robson, Andrew K; Muthuswamy, Keerthini; Tolley, Neil S

    2015-06-01

    To investigate the reliability and construct validity of procedure-based assessment (PBA) in assessing performance and progress in otolaryngology training. Retrospective database analysis using a national electronic database. We analyzed PBAs of otolaryngology trainees in North London from core trainees (CTs) to specialty trainees (STs). The tool contains six multi-item domains: consent, planning, preparation, exposure/closure, technique, and postoperative care, rated as "satisfactory" or "development required," in addition to an overall performance rating (pS) of 1 to 4. The individual domain score, overall calculated score (cS), and number of "development-required" items were calculated for each PBA. Receiver operating characteristic analysis helped determine sensitivity and specificity. There were 3,152 otolaryngology PBAs from 46 otolaryngology trainees analyzed. PBA reliability was high (Cronbach's α = 0.899), and sensitivity approached 99%. cS correlated positively with pS and level in training (rs = +0.681 and +0.324, respectively). STs had higher cS and pS than CTs (93% ± 0.6 and 3.2 ± 0.03 vs. 71% ± 3.1 and 2.3 ± 0.08, respectively). PBA is reliable and valid for assessing otolaryngology trainees' performance and progress at all levels. It is highly sensitive in identifying competent trainees. The tool is used in a formative and feedback capacity. The technical domain is the best predictor and should be given close attention. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  16. Development of quantitative analytical procedures on two-phase flow in tight-lattice fuel bundles for reduced-moderation light-water reactors

    International Nuclear Information System (INIS)

    Ohnuki, A.; Kureta, M.; Takae, K.; Tamai, H.; Akimoto, H.; Yoshida, H.

    2004-01-01

    The research project to investigate thermal-hydraulic performance in tight-lattice rod bundles for the Reduced-Moderation Water Reactor (RMWR) started at the Japan Atomic Energy Research Institute (JAERI) in 2002. The RMWR is a light water reactor for which a conversion ratio higher than one can be expected. In order to attain this higher conversion ratio, triangular tight-lattice fuel bundles whose gap spacing between fuel rods is around 1 mm are required. For the thermal design of the RMWR core, conventional analytical methods are inadequate because the conventional constitutive equations cannot predict the RMWR core behavior with high accuracy. Therefore, new quantitative analytical procedures were developed, constructed from model experiments and advanced two-phase flow analysis codes. This paper describes the results of the model experiments and the analytical results obtained with the developed analysis codes. (authors)

  17. Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

    Separation procedures in drug distribution centers (DCs) are manual activities prone to failures such as shipping exchanged, expired, or broken drugs to the customer. Two interventions appear promising for improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of the potential failure modes incurred by the selected operators. This article integrates learning curves (LCs) and FMEA (Failure Mode and Effect Analysis) aimed at reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable generating an index to identify the operators recommended to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. It also deploys the traditional FMEA severity index into two sub-indexes related to financial issues and damage to the company's image in order to characterize the severity of failures. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
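The FMEA side of this approach, with the severity index deployed into financial and image sub-indexes, can be sketched as a weighted risk priority number. The failure modes, ratings, and equal weights below are hypothetical, and the learning-curve operator index is omitted.

```python
def rpn(occurrence, detection, sev_financial, sev_image, w_fin=0.5, w_img=0.5):
    """Risk priority number with the severity index split into two sub-indexes
    (financial loss and damage to company image), combined as a weighted mean."""
    severity = w_fin * sev_financial + w_img * sev_image
    return occurrence * severity * detection

# Hypothetical failure modes in the manual drug-separation procedure (1-10 scales).
modes = {
    "expired drug shipped": rpn(occurrence=3, detection=4, sev_financial=8, sev_image=9),
    "broken drug shipped":  rpn(occurrence=5, detection=2, sev_financial=4, sev_image=3),
}
ranked = sorted(modes, key=modes.get, reverse=True)
print(ranked, [modes[m] for m in ranked])
```

Ranking by this number tells the analyst which failure mode to attack first; the weights let the company bias the ranking toward financial exposure or reputational damage.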

  18. Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.

    Science.gov (United States)

    Blake, Christopher J

    2007-09-01

    Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor water-soluble vitamin content in fortified foods for compliance monitoring, as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for the analysis of B vitamins. However, they are no longer considered to be the gold standard in vitamin analysis, as many studies have revealed their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular, it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview is given of multivitamin extractions and analyses for foods and supplements.

  19. Differences in metabolite profiles caused by pre-analytical blood processing procedures.

    Science.gov (United States)

    Nishiumi, Shin; Suzuki, Makoto; Kobayashi, Takashi; Yoshida, Masaru

    2018-05-01

    Recently, the use of metabolomic analysis of human serum and plasma for biomarker discovery and disease diagnosis in clinical studies has been increasing. The feasibility of using a metabolite biomarker for disease diagnosis is strongly dependent on the metabolite's stability during pre-analytical blood processing procedures, such as serum or plasma sampling and sample storage prior to centrifugation. However, the influence of blood processing procedures on the stability of metabolites has not been fully characterized. In the present study, we compared the levels of metabolites in matched human serum and plasma samples using gas chromatography coupled with mass spectrometry and liquid chromatography coupled with mass spectrometry. In addition, we evaluated the changes in plasma metabolite levels induced by storage at room temperature or at a cold temperature prior to centrifugation. As a result, it was found that 76 metabolites exhibited significant differences between their serum and plasma levels. Furthermore, the pre-centrifugation storage conditions significantly affected the plasma levels of 45 metabolites. These results highlight the importance of blood processing procedures during metabolome analysis, which should be considered during biomarker discovery and the subsequent use of biomarkers for disease diagnosis. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  20. SHARP1: A revised systematic human action reliability procedure

    International Nuclear Information System (INIS)

    Wakefield, D.J.; Parry, G.W.; Hannaman, G.W.; Spurgin, A.J.

    1990-12-01

    Individual plant examinations (IPE) are being performed by utilities to evaluate plant-specific vulnerabilities to severe accidents. A major tool in performing an IPE is a probabilistic risk assessment (PRA). The importance of human interactions in determining the plant response in past PRAs is well documented. The modeling and quantification of the probabilities of human interactions have been the subjects of considerable research by the Electric Power Research Institute (EPRI). A revised framework, SHARP1, for incorporating human interactions into PRA is summarized in this report. SHARP1 emphasizes that the process stages are iterative and directed at specific goals rather than being performed sequentially in a stepwise procedure. This expanded summary provides the reader with a flavor of the full report content. Excerpts from the full report are presented, following the same outline as the full report. In the full report, the interface of the human reliability analysis with the plant logic model development in a PRA is given special attention. In addition to describing a methodology framework, the report also discusses the types of human interactions to be evaluated, and how to formulate a project team to perform the human reliability analysis. A concise description and comparative evaluation of the selected existing methods of quantification of human error are also presented. Four case studies are also provided to illustrate the SHARP1 process

  1. Reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  2. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1991-01-01

    The Spent Fuel Identification for Flask Loading (SFIFL) procedure designed by COGEMA is analysed and its reliability calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system: one in which two successive failures could result in a success, from the point of view of the system's mission. A technique that describes the system in terms of its maximal cuts (states) is used for the calculations. A maximal cut that contains more than one failure can be split into two sub-cuts (sub-states). Cut splitting enables non-coherent systems with independent basic components to be analysed in a systematic way. (author)
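The non-coherence described here (two successive failures combining into a success) can be sketched with a toy model: a mis-identification followed by a second, compensating mix-up still results in approved fuel being loaded. The structure and the error probabilities below are entirely hypothetical and are not COGEMA's model or figures.

```python
def shipment_success(p1, p2):
    """Per-shipment probability of loading only approved fuel in a toy
    non-coherent model: a mis-identification (prob p1) followed by a second,
    compensating mix-up (prob p2) still yields a correct load, so two
    failures can combine into a success."""
    return (1.0 - p1) + p1 * p2

def procedure_reliability(p1, p2, n_shipments):
    """Probability that all n shipments carry only approved fuel,
    assuming shipments are independent."""
    return shipment_success(p1, p2) ** n_shipments

p1, p2 = 1e-4, 1e-3   # illustrative error probabilities
print(procedure_reliability(p1, p2, 100))
```

The telltale sign of non-coherence is visible in the algebra: adding the second failure mode (p2 > 0) *raises* the success probability, which a coherent structure function could never do.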

  3. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1990-01-01

    The Spent Fuel Identification for Flask Loading (SFIFL) procedure designed by COGEMA is analysed and its reliability is calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system: one in which two successive failures could result in a success, from the point of view of the system's mission. A technique that describes the system in terms of its maximal cuts (states) is used for the calculations. A maximal cut that contains more than one failure can be split into two sub-cuts (sub-states). Cut splitting enables non-coherent systems with independent basic components to be analysed in a systematic way. (author)

  4. Definition, development, and demonstration of analytical procedures for the structured assessment approach. Final report

    International Nuclear Information System (INIS)

    1979-01-01

    Analytical procedures were refined for the Structured Assessment Approach for assessing the material control and accounting systems at facilities that contain special nuclear material. Requirements were established for an efficient, feasible algorithm to be used in evaluating system performance measures that involve the probability of detection. Algorithm requirements to calculate the probability of detection for a given type of adversary and target set are described.

  5. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    Science.gov (United States)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
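The basic probabilistic effect of a passed proof test, truncating away the weak lower tail of the strength distribution, can be sketched for a normal strength model. This is a minimal illustration of the idea, not the paper's RBDO formulation, and all numbers are hypothetical.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def failure_prob(mu_s, sig_s, load, proof=None):
    """P(strength < load) for normally distributed strength; if a proof load is
    given, condition on having survived it (the lower tail is truncated away)."""
    p_load = norm_cdf((load - mu_s) / sig_s)
    if proof is None:
        return p_load
    p_proof = norm_cdf((proof - mu_s) / sig_s)
    if load <= proof:
        return 0.0   # a survivor of the proof load cannot fail below it
    return (p_load - p_proof) / (1.0 - p_proof)

# Hypothetical component: strength ~ N(100, 10), service load 75, proof load 80.
before = failure_prob(100, 10, 75)
after = failure_prob(100, 10, 75, proof=80)
print(before, after)
```

Because the service load sits below the proof load, the conditional failure probability drops to zero for components that pass, which is the mechanism the optimization exploits to shed weight while holding reliability.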

  6. An analytical inductor design procedure for three-phase PWM converters in power factor correction applications

    DEFF Research Database (Denmark)

    Kouchaki, Alireza; Niroumand, Farideh Javidi; Haase, Frerk

    2015-01-01

    This paper presents an analytical method for designing the inductor of three-phase power factor correction converters (PFCs). The complex behavior of the inductor current complicates the inductor design procedure as well as the core loss and copper loss calculations. Therefore, this paper analyzes ... circuit is used to provide the inductor current harmonic spectrum. Using the harmonic spectrum, the low and high frequency copper losses are calculated. The high frequency minor B-H loops in one switching cycle are also analyzed. Then, the loss map provided by the measurement setup is used ... to calculate the core loss in the PFC application. To investigate the impact of the dc link voltage level, two inductors for different dc voltage levels are designed and the results are compared.

  7. A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network

    International Nuclear Information System (INIS)

    Zubair, Muhammad

    2014-01-01

    Using the analytic hierarchy process (AHP) and Bayesian networks (BN), the present research examines the technical and non-technical issues of nuclear accidents. The study showed that technical faults were one major cause of these accidents. From another point of view, it becomes clear that human factors such as dishonesty, insufficient training, and selfishness also play a key role in causing these accidents. In this study, a hybrid approach for reliability analysis based on AHP and BN to increase nuclear power plant (NPP) safety has been developed. Using AHP, the best alternatives for improving safety, design, and operation, and for allocating budget across all technical and non-technical factors related to nuclear safety, were investigated. We use a special structure of BN based on the AHP method. The graph of the BN and the probabilities associated with its nodes are designed to translate the knowledge of experts on the selection of the best alternative. The results show that improvement in regulatory authorities will decrease failure probabilities and increase safety and reliability in the industrial area.
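The AHP half of such a hybrid can be sketched with the geometric-mean approximation of the principal eigenvector of a pairwise comparison matrix. The alternatives and judgments below are invented for illustration and are not the study's data.

```python
import math

def ahp_priorities(pairwise):
    """Priority weights from a pairwise comparison matrix via the geometric-mean
    approximation of the principal eigenvector."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical 3x3 comparison of safety-improvement alternatives:
# regulatory oversight vs operator training vs design upgrades (Saaty 1-9 scale).
A = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_priorities(A)
print([round(x, 3) for x in w])
```

A full AHP analysis would also compute the consistency ratio of the judgment matrix before trusting the weights; here the matrix is small and reasonably consistent by construction.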

  8. Optimisation (sampling strategies and analytical procedures) for site specific environment monitoring at the areas of uranium production legacy sites in Ukraine - 59045

    International Nuclear Information System (INIS)

    Voitsekhovych, Oleg V.; Lavrova, Tatiana V.; Kostezh, Alexander B.

    2012-01-01

    There are many sites in the world where the environment is still affected by contamination related to uranium production carried out in the past. The authors' experience shows that a lack of site characterization data and incomplete or unreliable environment monitoring studies can significantly limit the quality of safety assessment procedures and of the priority-action analyses needed for remediation planning. During recent decades, the analytical laboratories of many of the enterprises currently responsible for establishing site-specific environment monitoring programs have significantly improved their technical sampling and analytical capacities. However, a lack of experience in optimal site-specific sampling strategy planning, together with insufficient experience in applying the required analytical techniques, such as modern alpha-beta radiometers, gamma and alpha spectrometry, and liquid-scintillation methods for the determination of U-Th series radionuclides in the environment, does not allow these laboratories to develop and conduct monitoring programs efficiently as a basis for further safety assessment in decision-making procedures. This paper gives some conclusions gained from the experience of establishing monitoring programs in Ukraine and proposes some practical steps for optimizing the sampling strategy planning and analytical procedures to be applied to areas requiring safety assessment and justification for their potential remediation and safe management. (authors)

  9. Analytical method comparisons for the accurate determination of PCBs in sediments

    Energy Technology Data Exchange (ETDEWEB)

    Numata, M.; Yarita, T.; Aoyagi, Y.; Yamazaki, M.; Takatsu, A. [National Metrology Institute of Japan, Tsukuba (Japan)

    2004-09-15

    National Metrology Institute of Japan in the National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) has been developing several matrix reference materials, for example sediments, water and biological tissues, for the determination of heavy metals and organometallic compounds. The matrix compositions of those certified reference materials (CRMs) are similar to the compositions of actual samples, and they are useful for validating analytical procedures. 'Primary methods of measurement' are essential to obtain accurate and SI-traceable certified values for the reference materials, because such methods have the highest quality of measurement. However, inappropriate analytical operations, such as incomplete extraction of analytes or cross-contamination during analytical procedures, will cause errors in analytical results, even if one of the primary methods, isotope dilution, is utilized. To avoid possible procedural bias in the certification of reference materials, we employ more than two analytical methods which have been optimized beforehand. Because the accurate determination of trace POPs in the environment is important to evaluate their risk, reliable CRMs are required by environmental chemists. Therefore, we have also been preparing matrix CRMs for the determination of POPs. To establish accurate analytical procedures for the certification of POPs, extraction is one of the critical steps, as described above. In general, conventional extraction techniques for the determination of POPs, such as Soxhlet extraction (SOX) and saponification (SAP), have been well characterized and introduced as official methods for environmental analysis. On the other hand, emerging techniques, such as microwave-assisted extraction (MAE), pressurized fluid extraction (PFE) and supercritical fluid extraction (SFE), give higher recovery yields of analytes with relatively short extraction times and small amounts of solvent, by reason of the high

  10. A procedure for the determination of Po-210 in water samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004 the Environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In the case of 210Po, this started with the collection and review of about 130 papers from the scientific literature. Based on this review, two candidate methods for the chemical separation of 210Po from water samples were selected for testing, refinement and validation in accordance with ISO guidelines. A comprehensive methodology for the calculation of results, including quantification of measurement uncertainty, was also developed. This report presents the final procedure, which was developed based on that work.

  11. Validity, reliability, feasibility, acceptability and educational impact of direct observation of procedural skills (DOPS).

    Science.gov (United States)

    Naeem, Naghma

    2013-01-01

    Direct observation of procedural skills (DOPS) is a new workplace assessment tool. The aim of this narrative review of the literature is to summarize the available evidence about the validity, reliability, feasibility, acceptability and educational impact of DOPS. A PubMed and Google search of the literature on DOPS published from January 2000 to January 2012 was conducted, yielding 30 articles. Thirteen articles were selected for full-text reading and review. In the reviewed literature, DOPS was found to be a useful tool for the assessment of procedural skills, but further research is required to prove its utility as a workplace-based assessment instrument.

  12. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to argue that integration between probabilistic and psychological approaches to human reliability should be attempted. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity and, secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context support the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints on the activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis.

  13. Procedures for treating common cause failures in safety and reliability studies: Procedural framework and examples

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1988-01-01

    This report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures whose causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework comprises four major stages: (1) System Logic Model Development; (2) Identification of Common Cause Component Groups; (3) Common Cause Modeling and Data Analysis; and (4) System Quantification and Interpretation of Results. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. 22 figs., 34 tabs.
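    The report's parametric models are not reproduced in this record. As an illustration only, stage (3) can be sketched with the widely used beta-factor model (an assumed stand-in here, not necessarily a model from the procedural framework), in which a fraction beta of each component's failure rate is assigned to a common cause event that fails the whole group:

    ```python
    import math

    # Beta-factor sketch: unavailability of an n-fold redundant component group.
    def group_unavailability(lam_total, beta, t, n):
        lam_i = (1 - beta) * lam_total      # independent failure rate
        lam_c = beta * lam_total            # common cause failure rate
        q_i = 1 - math.exp(-lam_i * t)      # independent failure probability over t
        q_c = 1 - math.exp(-lam_c * t)      # common cause failure probability over t
        # the group fails if all n components fail independently,
        # or a common cause event takes them all out
        return q_i ** n + (1 - q_i ** n) * q_c
    ```

    Even a small beta dominates the result for highly redundant groups, which is why stage (2), the identification of common cause component groups, matters.
    
    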

  14. Measurement of HDO Products Using GC-TCD: Towards Obtaining Reliable Analytical Data

    Directory of Open Access Journals (Sweden)

    Zuas Oman

    2018-03-01

    Full Text Available This paper reports the development and validation of a gas chromatography with thermal conductivity detector (GC-TCD) method for the measurement of the gaseous products of hydrodeoxygenation (HDO). The method validation parameters include selectivity, precision (repeatability and reproducibility), accuracy, linearity, limit of detection (LoD), limit of quantitation (LoQ), and robustness. The results showed that the developed method was able to separate the target components (H2, CO2, CH4 and CO) from their mixtures without any special sample treatment. The validated method was selective, precise, accurate, and robust. Application of the developed and validated GC-TCD method to the measurement of by-product components of the HDO of bio-oil revealed good performance, with relative standard deviations (RSD) below 1.0% for all target components, implying that the process of method development and validation provides a trustworthy way of obtaining reliable analytical data.

  15. Methodological procedures and analytical instruments to evaluate an indicators integrated archive for urban management

    International Nuclear Information System (INIS)

    Del Ciello, R.; Napoleoni, S.

    1998-01-01

    This guide presents the results of research carried out at the ENEA (National Agency for New Technology, Energy and the Environment) Casaccia center (Rome, Italy) aimed at defining the methodological procedures and analytical instruments needed to build an integrated archive of indicators for urban management. The guide also defines the scheme of a negotiation process aimed at sharing and exchanging data and information among governmental and local administrations, non-governmental organizations and scientific bodies. [it]

  16. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, we present a basic rule for solving simultaneous Boolean equations. Next, we show the analysis procedure for a three-component system with external supports. Third, more detailed discussion is given of how logical loop relations are established. Finally, we take up two typical structures which include more than one logical loop; their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.
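    The paper's exact rule is not given in this abstract, but the flavor of such loop structures can be sketched as a least-fixed-point iteration over a hypothetical two-component support loop (the component layout is an assumption for illustration, not the paper's example):

    ```python
    # Two components that support each other: A works if its own source a works
    # or B supports it, and symmetrically for B. Starting from "all failed" and
    # iterating yields the least fixed point of the simultaneous Boolean
    # equations, i.e. the loop cannot "bootstrap" itself into operation.
    def solve_loop(a, b):
        A = B = False
        while True:
            A_new = a or B
            B_new = b or A
            if (A_new, B_new) == (A, B):
                return A, B
            A, B = A_new, B_new
    ```

    With one external source working, support propagates around the loop; with neither working, the self-consistent solution is that both components are failed.
    
    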

  17. Analytical modeling of nuclear power station operator reliability

    International Nuclear Information System (INIS)

    Sabri, Z.A.; Husseiny, A.A.

    1979-01-01

    The operator-plant interface is a critical component of power stations which requires the formulation of mathematical models to be applied in plant reliability analysis. The human model introduced here is based on cybernetic interactions and allows for use of available data from psychological experiments, hot and cold training and normal operation. The operator model is identified and integrated in the control and protection systems. The availability and reliability are given for different segments of the operator task and for specific periods of the operator life: namely, training, operation and vigilance or near retirement periods. The results can be easily and directly incorporated in system reliability analysis. (author)

  18. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS – APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    Full Text Available Carvedilol is a nonselective beta blocker/alpha-1 blocker used for the treatment of essential hypertension, chronic stable angina, unstable angina and ischemic left ventricular dysfunction. The aim of this study was to describe a carvedilol population pharmacokinetic (PK) analysis as well as the validation of the analytical procedure, which is an important step in this approach. In contemporary clinical practice, population PK analysis is often more important than the standard PK approach in setting up a mathematical model that describes the PK parameters, because it includes variables of particular importance for the drug's pharmacokinetics, such as sex, body mass, dosage, pharmaceutical form, pathophysiological state, concomitant disease, or the presence of a specific polymorphism in an isoenzyme important for biotransformation of the drug. One of the most frequently used approaches in population PK analysis is nonlinear mixed-effects modeling (NONMEM). The analytical method used in the data collection period is of great importance for the implementation of a population PK analysis of carvedilol in order to obtain reliable data that can be useful in clinical practice. High-performance liquid chromatography (HPLC) analysis of carvedilol is used to confirm the identity of the drug, provide quantitative results, and monitor the efficacy of therapy. Analytical procedures used in other studies could not be fully implemented in our research, so certain modifications and a validation of the method were necessary before the obtained results could be used for a population pharmacokinetic analysis. Validation is the logical terminal phase of analytical procedure development and establishes the applicability of the procedure itself; its goal is to ensure the consistency of the method and the accuracy of the results, and to confirm the selection of the analytical method for a given sample.

  19. Procedure prediction from symbolic Electronic Health Records via time intervals analytics.

    Science.gov (United States)

    Moskovitch, Robert; Polubriaginof, Fernanda; Weiss, Aviram; Ryan, Patrick; Tatonetti, Nicholas

    2017-11-01

    Prediction of medical events, such as clinical procedures, is essential for preventing disease, understanding disease mechanisms, and increasing patient quality of care. Although longitudinal clinical data from Electronic Health Records provide opportunities to develop predictive models, the use of these data faces significant challenges. Primarily, while the data are longitudinal and represent thousands of conceptual events having duration, they are also sparse, complicating the application of traditional analysis approaches. The framework presented here takes advantage of the events' durations and the gaps between them. International standards for electronic healthcare data represent data elements, such as procedures, conditions, and drug exposures, using eras, or time intervals. Such eras contain both an event and a duration and enable the application of time intervals mining, a relatively new subfield of data mining. In this study, we present Maitreya, a framework for time intervals analytics in longitudinal clinical data. Maitreya discovers frequent time interval related patterns (TIRPs), which we use as prognostic markers for modelling clinical events. We introduce three novel TIRP metrics that are normalized versions of the horizontal support, which represents the number of TIRP instances per patient. We evaluate Maitreya on 28 frequent and clinically important procedures, using the three novel TIRP representation metrics in comparison to no temporal representation and previous TIRP metrics. We also evaluate the epsilon value that makes Allen's relations more flexible, with settings of 30, 60, 90 and 180 days in comparison to the default of zero. For twenty-two of these procedures, the use of temporal patterns as predictors was superior to non-temporal features, and the use of the vertically normalized horizontal support metric to represent TIRPs as features was most effective. The use of an epsilon value of thirty days was slightly better than the default of zero.
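    The epsilon mentioned above relaxes the boundary cases of Allen's temporal relations. A simplified sketch (the relation names follow the seven relations commonly used in TIRP mining, but this handling is an assumption, not Maitreya's actual implementation):

    ```python
    # Classify the Allen relation between intervals (s1, e1) and (s2, e2),
    # assuming s1 <= s2, treating endpoints within eps days of each other as equal.
    def allen_relation(s1, e1, s2, e2, eps=0.0):
        eq = lambda x, y: abs(x - y) <= eps
        if eq(s1, s2) and eq(e1, e2):
            return "equals"
        if e1 < s2 - eps:
            return "before"
        if eq(e1, s2):
            return "meets"
        if eq(s1, s2):
            return "starts"
        if eq(e1, e2):
            return "finished-by"
        if e1 > e2:
            return "contains"
        return "overlaps"   # s2 < e1 < e2
    ```

    With eps=30, a 15-day gap between two eras is absorbed into "meets" rather than "before", which is exactly the flexibility evaluated in the study.
    
    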

  20. Simple and reliable procedure for the evaluation of short-term dynamic processes in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D P

    1986-10-01

    An efficient approach is presented to the solution of the short-term dynamics model in power systems. It consists of an adequate algebraic treatment of the original system of nonlinear differential equations, using linearization, decomposition and Cauchy's formula. The simple difference equations obtained in this way are incorporated into a model of the electrical network, which is of low order compared with those usually used. Newton's method is applied to the model formed in this way, leading to a simple and reliable iterative procedure. The characteristics of the procedure developed are demonstrated on examples of transient stability analysis of real power systems. 12 refs.

  1. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming at the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample-to-detector distance. The multi-element standard concentration (comparator standard), sample mass and irradiation time were kept constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted in the validation of INAA methods at the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
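    For reference, the main effect of a factor in a 2^k design is the mean response at its high level minus the mean response at its low level. A minimal sketch with invented responses (not the study's INAA measurements; the factor names are assumptions):

    ```python
    # responses: dict mapping coded factor levels in {-1, +1}^k to a measured value.
    def main_effects(responses, k):
        effects = []
        for j in range(k):
            hi = [y for lv, y in responses.items() if lv[j] == +1]
            lo = [y for lv, y in responses.items() if lv[j] == -1]
            effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
        return effects

    # Hypothetical 2^2 design, factors standing in for e.g. decay time
    # and counting time, each at a low (-1) and high (+1) level:
    runs = {(-1, -1): 10.0, (+1, -1): 14.0, (-1, +1): 11.0, (+1, +1): 15.0}
    ```

    Here the first factor shifts the response by 4 units on average and the second by 1, which is the kind of screening the study uses to select the conditions that matter.
    
    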

  2. Analytical procedure for the titrimetric determination of uranium in concentrates

    International Nuclear Information System (INIS)

    Florence, T.M.; Pakalns, P.

    1989-01-01

    In 1964 Davies and Gray published a titrimetric method for uranium which does not require column reductors, electronic instruments or inert atmospheres, and is sufficiently selective to enable uranium to be determined without prior separation. The method involves reduction of uranium(VI) to uranium(IV) by ferrous sulphate in a concentrated phosphoric acid medium. The excess iron(II) is then selectively oxidised by nitric acid using a molybdenum catalyst. After addition of sulphuric acid and dilution with water, the uranium(IV) is titrated with standard potassium dichromate, using barium diphenylamine sulphonate as indicator. This method has been found to be simple, precise and reliable, and applicable to a wide range of uranium-containing materials. The method given here for determining uranium in concentrates is essentially that of Davies and Gray. Its applications, apparatus, reagents, procedures, and accuracy and precision are discussed. 10 refs
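    The stoichiometry behind the final titration step is straightforward to back-calculate: U(IV) → U(VI) releases two electrons while dichromate accepts six, so three moles of uranium are oxidised per mole of K2Cr2O7. A hedged sketch of the back-calculation (the working formulas and corrections of the actual standard procedure may differ):

    ```python
    M_U = 238.03  # g/mol, molar mass of natural uranium

    def uranium_percent(v_titrant_ml, c_dichromate_mol_per_l, sample_mass_g):
        # 3 mol U(IV) oxidised per mol Cr2O7^2- (6 e- gained vs 2 e- lost)
        mol_u = 3 * c_dichromate_mol_per_l * v_titrant_ml / 1000.0
        return 100.0 * mol_u * M_U / sample_mass_g
    ```

    For example, a 10 mL titre of 0.01 M dichromate against a 0.1 g sample corresponds to roughly 71% uranium by mass.
    
    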

  3. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable, or even discordant. This is due, at least in part, to the whole set of circumstances related to the preparation of the patient prior to blood sampling, the blood sampling procedure itself, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  4. Reliability of stellar inclination estimated from asteroseismology: analytical criteria, mock simulations and Kepler data analysis

    Science.gov (United States)

    Kamiaka, Shoya; Benomar, Othman; Suto, Yasushi

    2018-05-01

    Advances in the asteroseismology of solar-like stars now provide a unique method to estimate the stellar inclination i⋆. This makes it possible to evaluate the spin-orbit angle of transiting planetary systems, in a complementary fashion to the Rossiter-McLaughlin effect, a well-established method to estimate the projected spin-orbit angle λ. Although the asteroseismic method has been broadly applied to the Kepler data, its reliability has yet to be assessed intensively. In this work, we evaluate the accuracy of i⋆ from asteroseismology of solar-like stars using 3000 simulated power spectra. We find that the low signal-to-noise ratio of the power spectra induces a systematic under-estimate (over-estimate) bias for stars with high (low) inclinations. We derive analytical criteria for a reliable asteroseismic estimate, which indicate that reliable measurements are possible in the range 20° ≲ i⋆ ≲ 80° only for stars with a high signal-to-noise ratio. We also analyse and measure the stellar inclination of 94 Kepler main-sequence solar-like stars, among which 33 are planetary hosts. According to our reliability criteria, a third of them (9 with planets, 22 without) have accurate stellar inclinations. Comparison of our asteroseismic estimates of v sin i⋆ against spectroscopic measurements indicates that the latter suffer from a large uncertainty, possibly due to the modeling of macro-turbulence, especially for stars with projected rotation speed v sin i⋆ ≲ 5 km/s. This reinforces earlier claims, and the stellar inclination estimated from the combination of spectroscopic measurements and photometric variation for slowly rotating stars needs to be interpreted with caution.

  5. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this purpose, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment; the nature of human error; classification of errors in man-machine systems; practical aspects; human reliability modelling in complex situations; quantification and examination of human reliability; judgement-based approaches; holistic techniques; and decision analytic approaches. (UK)

  6. Evaluation of the Most Reliable Procedure of Determining Jump Height During the Loaded Countermovement Jump Exercise: Take-Off Velocity vs. Flight Time.

    Science.gov (United States)

    Pérez-Castilla, Alejandro; García-Ramos, Amador

    2018-07-01

    Pérez-Castilla, A and García-Ramos, A. Evaluation of the most reliable procedure of determining jump height during the loaded countermovement jump exercise: Take-off velocity vs. flight time. J Strength Cond Res 32(7): 2025-2030, 2018. This study aimed to compare the reliability of jump height between the 2 standard procedures of analyzing force-time data (take-off velocity [TOV] and flight time [FT]) during the loaded countermovement jump (CMJ) exercise performed with a free-weight barbell and in a Smith machine. The jump height of 17 men (age: 22.2 ± 2.2 years, body mass: 75.2 ± 7.1 kg, and height: 177.0 ± 6.0 cm) was tested in 4 sessions (twice for each CMJ type) against external loads of 17, 30, 45, 60, and 75 kg. Jump height reliability was comparable between the TOV (coefficient of variation [CV]: 6.42 ± 2.41%) and FT (CV: 6.53 ± 2.17%) procedures during the free-weight CMJ, but it was higher for the FT when the CMJ was performed in a Smith machine (CV: 11.34 ± 3.73% for TOV and 5.95 ± 1.12% for FT). Bland-Altman plots revealed trivial differences (≤0.27 cm) and no heteroscedasticity of the errors (R ≤ 0.09) for the jump height obtained by the TOV and FT procedures, whereas the random error between both procedures was higher for the CMJ performed in the Smith machine (2.02 cm) compared with the free-weight barbell (1.26 cm). Based on these results, we recommend the FT procedure to determine jump height during the loaded CMJ performed in a Smith machine, whereas the TOV and FT procedures provide similar reliability during the free-weight CMJ.
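    The two procedures compared above reduce to two textbook projectile formulas, h = v_to²/(2g) from take-off velocity and h = g·t_f²/8 from flight time, with the coefficient of variation as the reliability statistic. An illustrative sketch (not the authors' code):

    ```python
    G = 9.81  # m/s^2

    def height_from_takeoff_velocity(v_to):
        return v_to ** 2 / (2 * G)

    def height_from_flight_time(t_f):
        # assumes identical take-off and landing positions
        return G * t_f ** 2 / 8

    def coefficient_of_variation(values):
        # CV (%) across repeated trials: sample SD relative to the mean
        mean = sum(values) / len(values)
        sd = (sum((x - mean) ** 2 for x in values) / (len(values) - 1)) ** 0.5
        return 100 * sd / mean
    ```

    For an ideal projectile the two formulas agree exactly (t_f = 2·v_to/g); the reliability differences reported above come from how noise in the force-time signal propagates differently into the two quantities.
    
    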

  7. Integration of human reliability analysis into the probabilistic risk assessment process: Phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1984-10-01

    A research program was initiated to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process, in order to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data that systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second.

  8. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    Energy Technology Data Exchange (ETDEWEB)

    Bernini, Patrizia; Bertini, Ivano, E-mail: bertini@cerm.unifi.it; Luchinat, Claudio [University of Florence, Magnetic Resonance Center (CERM) (Italy); Nincheri, Paola; Staderini, Samuele [FiorGen Foundation (Italy); Turano, Paola [University of Florence, Magnetic Resonance Center (CERM) (Italy)

    2011-04-15

    1H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  9. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks.

    Science.gov (United States)

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-04-01

    1H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  10. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    International Nuclear Information System (INIS)

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-01-01

    1H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0−4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  11. Analytical Procedures for Testability.

    Science.gov (United States)

    1983-01-01

    Beat Internal Classifications", AD: A018516. "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics", AD: 786284...6 LIST OF TALS .. 1. Truth Table ......................................... 49 2. Covering Problem .............................. 93 3. Primary and...quential classification procedure in a coronary care ward is evaluated. In the toxicology field "A System of Computer Aided Diagnosis with Blood Serum

  12. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

    Radioimmunoassay is an analytic method which combines the sensitivity of radioactive measurement with the specificity of the antigen-antibody reaction. Substances down to a concentration of a few picograms per ml of serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test-tube chemistry). The main field of application at present is endocrinology; further possibilities of application lie in pharmaceutical research, environmental protection, forensic medicine, and general analytical work. Radioactive sources are used only in vitro and in the nanocurie range, i.e. radiation exposure is negligible.

  13. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    Science.gov (United States)

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects, analytical procedures applied to furan determination in food samples, described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other, and the applicability of combining grouping and ranking is discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
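    The ranking half of such a methodology can be illustrated with a minimal TOPSIS sketch (vector normalization, benefit-type criteria only; the study's actual criteria, weights and cost-criterion handling are not reproduced here):

    ```python
    def topsis(matrix, weights):
        """matrix: alternatives x criteria (all benefit-type); returns closeness scores."""
        m = len(matrix[0])
        norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(m)]
        v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
        ideal = [max(col) for col in zip(*v)]       # positive ideal solution
        anti = [min(col) for col in zip(*v)]        # negative ideal solution
        scores = []
        for row in v:
            d_pos = sum((x - p) ** 2 for x, p in zip(row, ideal)) ** 0.5
            d_neg = sum((x - n) ** 2 for x, n in zip(row, anti)) ** 0.5
            scores.append(d_neg / (d_pos + d_neg))  # 1 = closest to ideal
        return scores
    ```

    Each alternative (here, an analytical procedure) is scored by its relative closeness to the ideal alternative, which is what produces the final ranking.
    
    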

  14. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming at the validation of analytical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: rpetroni@ipen.br, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample-to-detector distance. The multi-element standard concentration (comparator standard), sample mass and irradiation time were kept constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted in the validation of INAA methods at the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)

  15. In situ impulse test: an experimental and analytical evaluation of data interpretation procedures

    International Nuclear Information System (INIS)

    1975-08-01

    Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study ''close-in'' wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test differs from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10^-1 and 10^-3 percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed ''close-in'' data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results.

  16. A review of simple multiple criteria decision making analytic procedures which are implementable on spreadsheet packages

    Directory of Open Access Journals (Sweden)

    T.J. Stewart

    2003-12-01

    Full Text Available A number of modern multi-criteria decision making aids for the discrete choice problem are reviewed, with particular emphasis on those which can be implemented on standard commercial spreadsheet packages. Three broad classes of procedures are discussed, namely the analytic hierarchy process, reference point methods, and outranking methods. The broad principles are summarised in a consistent framework and on a spreadsheet. LOTUS spreadsheets implementing these methods are available from the author.

  17. An analytical procedure to evaluate electronic integrals for molecular quantum mechanical calculations

    International Nuclear Information System (INIS)

    Mundim, Kleber C.

    2004-01-01

    Full text: We propose an alternative methodology for the calculation of electronic integrals, based on an analytical function derived from the generalized Gaussian function (q-Gaussian), in which a single q-Gaussian replaces the usual linear combination of Gaussian functions for different basis sets. Moreover, the integrals become analytical functions of the interatomic distances. Therefore, when estimating quantities such as the molecular energy, the q-Gaussian avoids recomputation of the integrals: they are simply new values of the corresponding function. The procedure proposed here is particularly advantageous, when compared with the usual one, because it drastically reduces the number of two-electron integrals used in the construction of the Fock matrix, enabling the use of quantum mechanics in the description of macromolecular systems. This advantage grows as the molecular systems become larger and more complex. While in the usual approach the CPU time increases as n^4, in the one proposed here the CPU time scales linearly with n. This catastrophic n^4 growth in the number of two-electron integrals entering the Hamiltonian or Fock matrix is a severe bottleneck for petaFLOPS computing. It is important to emphasize that this methodology is equally applicable to systems of any size, including biomolecules, solid materials and solutions, within the HF, post-HF and DFT theories. (author)
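
    The q-Gaussian referred to in this record is commonly defined via the Tsallis q-exponential of -alpha r^2. A minimal sketch (function names and parameter values are illustrative, not from the paper) showing that it recovers the ordinary Gaussian as q -> 1:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]_+^(1/(1-q)); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    # For q < 1 the function has compact support: it is zero where the base is negative.
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_gaussian(r, alpha, q):
    """Generalized Gaussian exp_q(-alpha r^2); q = 1 gives the usual exp(-alpha r^2)."""
    return q_exp(-alpha * r * r, q)
```

    For q = 2 the q-Gaussian becomes the Lorentzian-like 1/(1 + alpha r^2), which is one way a single q-Gaussian can mimic a linear combination of ordinary Gaussians.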

  18. Integration of human reliability analysis into the probabilistic risk assessment process: phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1985-01-01

    The US Nuclear Regulatory Commission and Pacific Northwest Laboratory initiated a research program in 1984 to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  19. Analytical quality, performance indices and laboratory service

    DEFF Research Database (Denmark)

    Hilden, Jørgen; Magid, Erik

    1999-01-01

    analytical error, bias, cost effectiveness, decision-making, laboratory techniques and procedures, mass screening, models, statistical, quality control

  20. Case study on the use of PSA methods: Human reliability analysis

    International Nuclear Information System (INIS)

    1991-04-01

    The overall objective of treating human reliability in a probabilistic safety analysis is to ensure that the key human interactions of typical crews are accurately and systematically incorporated into the study in a traceable manner. An additional objective is to make the human reliability analysis (HRA) as realistic as possible, taking into account the emergency procedures, the man-machine interface, the focus of the training process, and the knowledge and experience of the crews. Section 3 of the paper gives an overview of this analytical process, which leads to three more detailed example problems described in Section 4. Section 5 discusses a peer review process. References that are useful in performing HRAs are presented. In addition, appendices are provided for definitions, selected data and a generic list of performance shaping factors. 35 refs, figs and tabs

  1. The DYLAM approach to systems safety and reliability assessment

    International Nuclear Information System (INIS)

    Amendola, A.

    1988-01-01

    A survey of the principal features and applications of DYLAM (Dynamic Logical Analytical Methodology) is presented. Its basic principles can be summarized as follows: after a particular modelling of the component states, computerized heuristic procedures generate stochastic configurations of the system, while the resulting physical processes are simultaneously simulated, both to account for the possible interactions between the physics and the states and to search for dangerous system configurations and their related probabilities. The association of probabilistic techniques for describing the states with physical equations for describing the process results in a very powerful tool for the safety and reliability assessment of systems potentially subject to dangerous accidental transients. A comprehensive picture of DYLAM's capability for manifold applications can be obtained from the review of the case studies analyzed (LMFBR core accident, systems reliability assessment, accident simulation, man-machine interaction analysis, chemical reactor safety, etc.)

  2. Solution standards for quality control of nuclear-material analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.

    1981-01-01

    Analytical chemistry measurement control depends upon reliable solution standards. At the Savannah River Plant Control Laboratory, over a thousand analytical measurements are made daily for process control, product specification, accountability, and nuclear safety. Large quantities of solution standards are required for a measurement quality control program covering the many different analytical chemistry methods. Uranium, plutonium, neptunium, and americium metals or oxides produced at the Savannah River Plant are dissolved to prepare stock solutions for working or Quality Control Standards (QCS). Because extensive analytical effort is required to characterize or confirm these solutions, they are prepared in large quantities. These stock solutions are diluted and blended with different chemicals and/or each other to synthesize QCS that match the matrices of different process streams. The target uncertainty of a standard's reference value is 10% of the limit of error of the methods used for routine measurements. Standard Reference Materials from NBS are used according to special procedures to calibrate the methods used in measuring the uranium and plutonium standards so that traceability can be established. Special precautions are required to minimize the effects of temperature, radiolysis, and evaporation. Standard reference values are periodically corrected to eliminate systematic errors caused by evaporation or decay products. Measurement control is achieved by requiring analysts to analyze a blind QCS each shift a measurement system is used on plant samples. Computer evaluation determines whether or not a measurement is within the ±3σ control limits. Monthly evaluations of the QCS measurements are made to determine current bias correction factors for accountability measurements and to detect significant changes in the bias and precision statistics. The evaluations are also used to plan activities for improving the reliability of the analytical chemistry measurements

  3. Analytical quality control in environmental analysis - Recent results and future trends of the IAEA's analytical quality control programme

    Energy Technology Data Exchange (ETDEWEB)

    Suschny, O; Heinonen, J

    1973-12-01

    The significance of analytical results depends critically on the degree of their reliability; an assessment of this reliability is indispensable if the results are to have any meaning at all. Environmental radionuclide analysis is a relatively new analytical field in which new methods are continuously being developed and into which many new laboratories have entered during the last ten to fifteen years. The scarcity of routine methods and the lack of experience of the new laboratories have made the need for the assessment of the reliability of results particularly urgent in this field. Since 1962, the IAEA has provided assistance to its member states by making available to their laboratories analytical quality control services in the form of standard samples, reference materials and the organization of analytical intercomparisons. The scope of this programme has increased over the years and now includes, in addition to environmental radionuclides, non-radioactive environmental contaminants which may be analysed by nuclear methods, materials for forensic neutron activation analysis, bioassay materials and nuclear fuel. The results obtained in recent intercomparisons demonstrate the continued need for these services. (author)

  4. Design-reliability assurance program application to ACP600

    International Nuclear Information System (INIS)

    Zhichao, Huang; Bo, Zhao

    2012-01-01

    ACP600 is a new nuclear power plant technology developed by CNNC in China, based on Generation III NPP design experience and general safety goals. The ACP600 Design Reliability Assurance Program (D-RAP) is implemented as an integral part of the ACP600 design process. A RAP is a formal management system which assures the collection of important characteristic information about plant performance throughout each phase of its life and directs the use of this information in the implementation of analytical and management processes which are specifically designed to meet two objectives: confirming the plant goals and identifying cost-effective improvements. In general, a typical reliability assurance program has four broad functional elements: 1) goals and performance criteria; 2) a management system and implementing procedures; 3) analytical tools and investigative methods; and 4) information management. In this paper we apply the D-RAP technical and risk-informed requirements and establish RAM and PSA models to optimize the ACP600 design. Compared with the previous design process, D-RAP is better suited to the higher design targets and requirements, allowing more creativity through an easier adoption of technical breakthroughs. By using D-RAP, the plant goals, system goals, performance criteria and safety criteria are easier to realize, and the design can be optimized and made more rational

  5. General Procedure for the Easy Calculation of pH in an Introductory Course of General or Analytical Chemistry

    Science.gov (United States)

    Cepriá, Gemma; Salvatella, Luis

    2014-01-01

    All pH calculations for simple acid-base systems used in introductory courses on general or analytical chemistry can be carried out by using a general procedure requiring the use of predominance diagrams. In particular, the pH is calculated as the sum of an independent term equaling the average pKa values of the acids involved in the…
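
    One familiar special case of such an average-pKa term is the amphiprotic species, for which pH ≈ (pKa1 + pKa2)/2. Whether the paper's general procedure reduces to exactly this form is an assumption here; the carbonic acid pKa values below are illustrative textbook numbers:

```python
def amphiprotic_pH(pKa1, pKa2):
    """Classic approximation for an amphiprotic species such as HA-: pH ~ (pKa1 + pKa2) / 2."""
    return (pKa1 + pKa2) / 2.0

# Illustrative: hydrogen carbonate, using textbook pKa values for carbonic acid
print(amphiprotic_pH(6.35, 10.33))  # -> 8.34
```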

  6. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  7. Analytical Chemistry Laboratory (ACL) procedure compendium. Volume 1, Administrative

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control; ALO data report/package preparation, review and control; the single shell tank (PST) project sample tracking system; sample receiving; analytical balances; duties and responsibilities of the sample custodian; sample refrigerator temperature monitoring; security; assignment of staff responsibilities; sample storage; data reporting; and general requirements for glassware.

  8. Th-U-PbT dating by Electron Probe Microanalysis, Part I. Monazite: analytical procedures and data treatment

    International Nuclear Information System (INIS)

    Vlach, Silvio Roberto Farias

    2010-01-01

    Dating methodology by the electron probe microanalyser (EPMA) of (Th, U)-bearing minerals, highlighting monazite, has acquired ever greater importance in the literature, particularly due to its superior spatial resolution, as well as its versatility, which allow correlating petrological processes at times registered only in micro-scales in minerals and rocks with absolute ages. Although the accuracy is inferior to that achieved with conventional isotopic methods by up to an order of magnitude, EPMA is the instrument that allows the best spatial resolution, reaching a few μm³ in some conditions. Quantification of minor and trace elements with suitable precision and accuracy involves instrumental and analytical set-ups and data treatment strategies significantly more rigorous than those applied in conventional analyses. Th-U-PbT dating is an example of these cases. Each EPMA is a unique machine as regards its instrumental characteristics and respective automation system. In such a way, analytical procedures ought to be adjusted to the specificities of each laboratory. The analytical strategies and data treatment adopted in the Electronic Microprobe Laboratory of the Instituto de Geociencias of Universidade de Sao Paulo, Brazil, with a JEOL JXA8600S EPMA and a ThermoNoran-Voyager 4.3 automation system, are presented and compared with those used in other laboratories. The influence of instrumental factors and spectral overlaps on Th, U, and Pb quantification is discussed. Applied procedures for interference correction, error propagation, data treatment, and final chemical age presentation as well as for sampling and analyses are emphasized. Some typical applications are discussed, drawing attention to the most relevant aspects of electron microprobe dating. (author)

  9. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially, CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and evaluated to generate values for the reliability-theoretic functions applied to the model.
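
    CARE's equation repository is not reproduced here, but the kind of redundancy equation it interrelates can be illustrated with the classic triple modular redundancy (TMR) case. The function names and parameter values below are illustrative, not part of CARE:

```python
import math

def series_reliability(*rs):
    """A series system works only if every element works: product of element reliabilities."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def tmr_reliability(r):
    """Triple modular redundancy with a perfect voter: works if >= 2 of 3 modules work,
    giving R_TMR = 3 R^2 - 2 R^3."""
    return 3 * r**2 - 2 * r**3

# Single module with constant failure rate 1e-3 /h over a 100 h mission
r = math.exp(-0.001 * 100)
print(tmr_reliability(r))   # higher than r itself, since r > 0.5
```

    Note the well-known crossover: TMR improves reliability only when the single-module reliability exceeds 0.5; below that, voting among mostly-failed modules makes things worse.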

  10. Stress analysis of R2 pressure vessel. Structural reliability benchmark exercise

    International Nuclear Information System (INIS)

    Vestergaard, N.

    1987-05-01

    The Structural Reliability Benchmark Exercise (SRBE) is sponsored by the EEC as part of the Reactor Safety Programme. The objectives of the SRBE are to evaluate and improve 1) inspection procedures, which use non-destructive methods to locate defects in pressure (reactor) vessels, and 2) analytical damage accumulation models, which predict the time to failure of vessels containing defects. In order to focus attention, an experimental pressure vessel has been inspected, subjected to fatigue loadings and subsequently analysed by several teams using methods of their choice. The present report contains the first part of the analytical damage accumulation analysis. The stress distributions in the welds of the experimental pressure vessel were determined. These stress distributions will be used to determine the driving forces of the damage accumulation models, which will be addressed in a future report. (author)

  11. Reliability based code calibration of fatigue design criteria of nuclear Class-1 piping

    International Nuclear Information System (INIS)

    Mishra, J.; Balasubramaniyan, V.; Chellapandi, P.

    2016-01-01

    Fatigue design of Class-1 piping of NPPs is carried out using Section III of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code. The fatigue design criteria of ASME are based on the concept of a safety factor, which does not provide a means of managing uncertainties for consistently reliable and economical designs. In this regard, work was taken up to estimate the implicit reliability level associated with the fatigue design criteria of Class-1 piping specified by ASME Section III, NB-3650. As the ASME fatigue curve is not in the form of an analytical expression, the reliability level of pipeline fittings and joints is evaluated using the mean fatigue curve developed by Argonne National Laboratory (ANL). The methodologies employed for reliability evaluation are FORM, HORSM and MCS. The limit state function for fatigue damage is found to be sensitive to eight parameters, which are systematically modelled as stochastic variables during reliability estimation. In conclusion, a number of important aspects related to the reliability of various piping products and joints are discussed. A computational example illustrates the developed procedure for a typical pipeline. (author)
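
    The abstract names FORM and MCS among the reliability methods. As a hedged illustration of the Monte Carlo side (with made-up normal resistance/load variables, not the paper's eight fatigue parameters), a crude estimate of the failure probability for the limit state g = R - S can be compared against the exact result, which for this linear normal case is Pf = Phi(-beta):

```python
import math
import random

def mcs_failure_probability(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=1):
    """Crude Monte Carlo estimate of P(g < 0) for the limit state g = R - S
    with independent normally distributed resistance R and load S."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0 for _ in range(n))
    return fails / n

# FORM is exact here: Pf = Phi(-beta) with beta = (mu_r - mu_s) / sqrt(sd_r^2 + sd_s^2)
beta = (300.0 - 200.0) / math.sqrt(40.0**2 + 30.0**2)   # = 2.0
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2))         # ~ 0.0228
pf_mc = mcs_failure_probability(300.0, 40.0, 200.0, 30.0)
```

    For nonlinear limit states like the eight-parameter fatigue damage function, FORM linearizes at the design point while MCS remains exact up to sampling error, which is why studies such as this one typically report both.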

  12. Toxicologic evaluation of analytes from Tank 241-C-103

    International Nuclear Information System (INIS)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives would be to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison with established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propane nitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found

  13. A procedure to obtain reliable pair distribution functions of non-crystalline materials from diffraction data

    International Nuclear Information System (INIS)

    Hansen, F.Y.; Carneiro, K.

    1977-01-01

    A simple numerical method, which unifies the calculation of structure factors from X-ray or neutron diffraction data with the calculation of reliable pair distribution functions, is described. The objective of the method is to eliminate systematic errors in the normalizations and corrections of the intensity data, and to provide measures for elimination of truncation errors without losing information about the structure. This is done through an iterative procedure, which is easy to program for computers. The applications to amorphous selenium and diatomic liquids are briefly reviewed. (Auth.)
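
    The paper's iterative error-elimination scheme is not reproduced here, but the Fourier inversion it is built around is standard: the pair distribution function g(r) is obtained from the structure factor S(Q) by a sine transform. A minimal numerical sketch (grid sizes and the number density are illustrative assumptions):

```python
import numpy as np

def _trapezoid(y, x):
    """Trapezoidal rule along the last axis (kept local for NumPy 1.x/2.x portability)."""
    return np.sum((y[..., 1:] + y[..., :-1]) * np.diff(x) / 2.0, axis=-1)

def pair_distribution(Q, S, r, rho):
    """Standard Fourier inversion of the structure factor S(Q):
    g(r) = 1 + (1 / (2 pi^2 rho r)) * integral_0^Qmax Q (S(Q) - 1) sin(Q r) dQ."""
    integrand = Q * (S - 1.0) * np.sin(np.outer(r, Q))     # shape (len(r), len(Q))
    return 1.0 + _trapezoid(integrand, Q) / (2.0 * np.pi**2 * rho * r)

Q = np.linspace(1e-6, 30.0, 3000)   # momentum-transfer grid (hypothetical units)
S_ideal = np.ones_like(Q)           # S(Q) = 1: no structure, so g(r) must equal 1
r = np.linspace(0.5, 10.0, 20)      # distances
g = pair_distribution(Q, S_ideal, r, rho=0.03)
```

    Truncating the integral at a finite Qmax is exactly the source of the termination ripples that the paper's iterative procedure is designed to suppress.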

  14. HASL procedures manual

    International Nuclear Information System (INIS)

    Harley, J.H.

    1977-08-01

    Additions and corrections to the following sections of the HASL Procedures Manual are provided: General, Sampling, Field Measurements; General Analytical Chemistry, Chemical Procedures, Data Section, and Specifications

  15. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  16. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    Science.gov (United States)

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we perform a ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by application of different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable, although the details of the ranking results differ between them. A second run of rankings was done for scenarios that include only metrological, economic or environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
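
    For readers unfamiliar with PROMETHEE, a minimal PROMETHEE II net-flow ranking can be sketched as follows. The scores, weights and the use of the "usual" (strict) preference function are illustrative assumptions, not the paper's actual data or settings:

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Minimal PROMETHEE II with the 'usual' (strict) preference function.
    scores: (n_alternatives, n_criteria) matrix; returns net outranking flows phi."""
    X = np.where(maximize, scores, -scores)   # turn every criterion into 'larger is better'
    n = X.shape[0]
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pref_ab = np.sum(weights * (X[a] > X[b]))   # aggregated preference pi(a, b)
            pref_ba = np.sum(weights * (X[b] > X[a]))
            phi[a] += (pref_ab - pref_ba) / (n - 1)     # net flow contribution
    return phi

# Three hypothetical procedures scored on recovery (maximize) and cost (minimize)
scores = np.array([[0.95, 10.0], [0.90, 4.0], [0.80, 8.0]])
weights = np.array([0.6, 0.4])
maximize = np.array([True, False])
phi = promethee_ii(scores, weights, maximize)   # rank by descending net flow
```

    Changing the weight vector is exactly how the paper's metrological, economic and environmental scenarios would be realized: the same score matrix, re-ranked under different weights.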

  17. Processes and Procedures for Estimating Score Reliability and Precision

    Science.gov (United States)

    Bardhoshi, Gerta; Erford, Bradley T.

    2017-01-01

    Precision is a key facet of test development, with score reliability determined primarily according to the types of error one wants to approximate and demonstrate. This article identifies and discusses several primary forms of reliability estimation: internal consistency (i.e., split-half, KR-20, α), test-retest, alternate forms, interscorer, and…
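
    One of the internal-consistency estimates mentioned, coefficient α (Cronbach's alpha), is straightforward to compute from item scores. A minimal sketch (the item data in the test below are hypothetical):

```python
def cronbach_alpha(items):
    """Coefficient alpha: (k / (k - 1)) * (1 - sum of item variances / variance of totals).
    items: one inner list per item, each holding the scores of the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

    Perfectly parallel items yield α = 1, while an item with no variance contributes nothing to the total-score variance and drags α toward 0.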

  18. Reliability demonstration test planning using bayesian analysis

    International Nuclear Information System (INIS)

    Chandran, Senthil Kumar; Arul, John A.

    2003-01-01

    In nuclear power plants, the reliability of all safety systems is critical from the safety viewpoint, and it is essential that the required reliability be met while satisfying the design constraints. From practical experience, it is found that the reliability of complex systems such as the Safety Rod Drive Mechanism is of the order of 10^-4 with an uncertainty factor of 10. Demonstrating the reliability of such systems is prohibitive in terms of cost and time, as the number of tests needed is very large. The purpose of this paper is to develop a Bayesian reliability demonstration testing procedure for exponentially distributed failure times with a gamma prior distribution on the failure rate, which can be easily and effectively used to demonstrate component/subsystem/system reliability conformance to stated requirements. The important questions addressed in this paper are: with zero failures, how long should one perform the tests, and how many components are required, to conclude with a given degree of confidence that the component under test meets the reliability requirement? The procedure is explained with an example and can also be extended to cases with a larger number of failures. The approach presented is applicable for deriving test plans for demonstrating component failure rates of nuclear power plants, as failure data for similar components are becoming available in existing plants elsewhere. The advantages of this procedure are that the criterion upon which it is based is simple and pertinent, that the fitting of the prior distribution is an integral part of the procedure and is based on information regarding two percentiles of this distribution, and finally that the procedure is straightforward and easy to apply in practice. (author)
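
    Under the stated model (exponentially distributed failure times, gamma prior on the failure rate), the zero-failure question has a closed form when the prior shape parameter is 1: after total test time T with no failures, the posterior for λ is gamma(1, b + T), so the required T follows directly. This is a sketch of that special case only, not the paper's full two-percentile fitting procedure:

```python
import math

def zero_failure_test_time(lam_req, confidence, prior_b=0.0):
    """Total test time (unit-hours) with zero failures needed so that the posterior
    satisfies P(lambda <= lam_req) >= confidence, assuming a gamma(1, prior_b)
    (i.e. exponential) prior on the failure rate lambda. With 0 failures the
    posterior is gamma(1, prior_b + T), whose CDF at lam_req is
    1 - exp(-lam_req * (prior_b + T)); solving for T gives the formula below."""
    return max(0.0, -math.log(1.0 - confidence) / lam_req - prior_b)

# e.g. demonstrate lambda <= 1e-4 per hour at 90% credibility with a vague prior
T = zero_failure_test_time(1e-4, 0.90)   # ~ 23,026 unit-hours
```

    The required time can be split across components (e.g. 24 units on test for ~960 h each), which is how the "how many components" question in the abstract trades off against test duration.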

  19. Comparison of Three Analytical Methods for Separation of Mineral and Chelated Fraction from an Adulterated Zn-EDTA Fertilizer

    International Nuclear Information System (INIS)

    Khan, M.S.; Qazi, M.A.; Khan, N.A.; Mian, S.M.; Ahmed, N.; Ahmed, N.

    2013-01-01

    Summary: Different analytical procedures are employed worldwide to quantify the chelated portion of a Zn-EDTA fertilizer. The Agriculture Department, Government of the Punjab, follows Shahid's analytical method in this regard. This method is based on ion chromatography (IC), which separates the mineral zinc (Zn) from an adulterated Zn-EDTA fertilizer sample, i.e. a mixture of mineral and chelated Zn fractions. To determine its effectiveness and suitability, this comparative study analyzed adulterated and non-adulterated Zn-EDTA standards and Zn-EDTA samples taken from the market, in triplicate, following three methods: Shahid's (IC) analytical method; an atomic absorption spectrophotometric (AAS) method, based on the principle of precipitating the mineral Zn fraction at high pH with an alkali solution of suitable concentration and analyzing the filtrate containing only the chelated fraction; and the Association of Official Analytical Chemists (AOAC) method FM-841. Adulterated Zn-EDTA samples were prepared by mixing a known quantity of mineral Zn with a chelated Zn-EDTA standard. The results showed that Shahid's analytical method and the AAS method both successfully estimated the chelated fraction. The AOAC FM-841 method could not discriminate the mineral fraction and hence did not furnish reliable results. Shahid's analytical method was selected as it was equally effective in producing reliable results for both solid and liquid Zn-EDTA samples; the AAS method was comparable only for liquid samples. (author)

  20. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    Science.gov (United States)

    Duffy, Stephen F.

    1997-01-01

    Ceramic matrix composites (CMC) and intermetallic materials (e.g., single crystal nickel aluminide) are high performance materials that exhibit attractive mechanical, thermal and chemical properties. These materials are critically important in advancing certain performance aspects of gas turbine engines. From an aerospace engineer's perspective, the new generation of ceramic composites and intermetallics offers a significant potential for raising the thrust/weight ratio and reducing NO(x) emissions of gas turbine engines. These aspects have increased interest in utilizing these materials in the hot sections of turbine engines. However, as these materials evolve and their performance characteristics improve, a persistent need exists for state-of-the-art analytical methods that predict the response of components fabricated from CMC and intermetallic material systems. This need provided the motivation for the technology developed under this research effort. Continuous ceramic fiber composites exhibit an increase in work of fracture, which allows for "graceful" rather than catastrophic failure. When loaded in the fiber direction, these composites retain substantial strength capacity beyond the initiation of transverse matrix cracking despite the fact that neither of their constituents would exhibit such behavior if tested alone. As additional load is applied beyond first matrix cracking, the matrix tends to break in a series of cracks bridged by the ceramic fibers. Any additional load is borne increasingly by the fibers until the ultimate strength of the composite is reached. Thus modeling efforts supported under this research effort have focused on predicting this sort of behavior. For single crystal intermetallics, the issues that motivated the technology development involved questions relating to material behavior and component design. Thus the research effort supported by this grant had to determine the statistical nature and source of fracture in a high strength, Ni

  1. A column exchange chromatographic procedure for the automated purification of analytical samples in nuclear spent fuel reprocessing and plutonium fuel fabrication

    International Nuclear Information System (INIS)

    Zahradnik, P.; Swietly, H.; Doubek, N.; Bagliano, G.

    1992-11-01

    A column exchange chromatographic procedure using tri-n-octyl-phosphine oxide (TOPO) as the stationary phase has been in routine use at SAL since 1984 on nuclear spent fuel reprocessing and Pu product samples, prior to alpha and mass spectrometric analysis. This standard procedure was subsequently modified in view of its automation in a glove box; the resulting new procedure is described in this paper. Laboratory Robot Compatible (LRC) disposable columns were selected because their dimensions are particularly favorable and reproducible. A less corrosive HNO3-HI mixture substituted for the former HCl-HI plutonium eluant. The inorganic support of the stationary phase used to test the above-mentioned changes was unexpectedly withdrawn from the market, so another support had to be selected and the procedure reoptimized accordingly. The resulting procedure was tested with the robot and validated against the manual procedure taken as reference: the comparison showed that the modified procedure meets the analytical requirements and has the same performance as the original procedure. (author). Refs, figs and tabs

  2. Th-U-PbT dating by electron probe microanalysis, Part I. Monazite: analytical procedures and data treatment

    Energy Technology Data Exchange (ETDEWEB)

    Vlach, Silvio Roberto Farias [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Inst. de Geociencias. Dept. de Mineralogia e Geotectonica], e-mail: srfvlach@usp.br

    2010-03-15

    Dating methodology by the electron probe microanalyser (EPMA) of (Th, U)-bearing minerals, highlighting monazite, has acquired ever greater importance in the literature, particularly due to its superior spatial resolution, as well as its versatility, which allow correlating petrological processes at times registered only in micro-scales in minerals and rocks with absolute ages. Although the accuracy is inferior to that achieved with conventional isotopic methods by up to an order of magnitude, EPMA is the instrument that allows the best spatial resolution, reaching a few μm³ in some conditions. Quantification of minor and trace elements with suitable precision and accuracy involves instrumental and analytical set-ups and data treatment strategies significantly more rigorous than those applied in conventional analyses. Th-U-PbT dating is an example of these cases. Each EPMA is a unique machine as regards its instrumental characteristics and respective automation system. In such a way, analytical procedures ought to be adjusted to the specificities of each laboratory. The analytical strategies and data treatment adopted in the Electronic Microprobe Laboratory of the Instituto de Geociencias of Universidade de Sao Paulo, Brazil, with a JEOL JXA8600S EPMA and a ThermoNoran-Voyager 4.3 automation system, are presented and compared with those used in other laboratories. The influence of instrumental factors and spectral overlaps on Th, U, and Pb quantification is discussed. Applied procedures for interference correction, error propagation, data treatment, and final chemical age presentation as well as for sampling and analyses are emphasized. Some typical applications are discussed, drawing attention to the most relevant aspects of electron microprobe dating. (author)

  3. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1993-01-01

    This volume contains the interim change notice for the safety operation procedure for the hot cell. It covers the master-slave manipulators, dry waste removal, cell transfers, hoists, cask handling, the liquid waste system, and physical characterization of fluids.

  4. Analytical procedure for characterization of medieval wall-paintings by X-ray fluorescence spectrometry, laser ablation inductively coupled plasma mass spectrometry and Raman spectroscopy

    International Nuclear Information System (INIS)

    Syta, Olga; Rozum, Karol; Choińska, Marta; Zielińska, Dobrochna; Żukowska, Grażyna Zofia; Kijowska, Agnieszka; Wagner, Barbara

    2014-01-01

    An analytical procedure for the comprehensive chemical characterization of samples from medieval Nubian wall-paintings by means of portable X-ray fluorescence (pXRF), laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) and Raman spectroscopy (RS) is proposed in this work. The procedure was used for elemental and molecular investigations of samples from archeological excavations in Nubia (modern southern Egypt and northern Sudan). Numerous remains of churches with painted decorations dating back to the 7th–14th centuries have been excavated in the region of the medieval kingdoms of Nubia, but many aspects of this art and its technology are still unknown. Samples from the selected archeological sites (Faras, Old Dongola and Banganarti) were analyzed in the form of transfers (n = 26), small fragments collected during the excavations (n = 35) and cross sections (n = 15). XRF was used to collect data about elemental composition, LA-ICPMS allowed mapping of selected elements, while RS was used to obtain molecular information about the samples. The preliminary results indicated the usefulness of the proposed analytical procedure for distinguishing the substances from both the surface and sub-surface domains of the wall-paintings. The ability to identify raw materials from the wall-paintings will be used in further systematic, archeometric studies devoted to the detailed comparison of various historic Nubian centers. - Highlights: • The analytical procedure for examination of unique wall paintings was proposed. • Identification of pigments and supporting layers of wall-paintings was obtained. • Heterogeneous samples were mapped with the use of LA-ICPMS. • Anatase in the sub-surface regions of samples was detected by Raman spectroscopy

  5. Analytical procedure for characterization of medieval wall-paintings by X-ray fluorescence spectrometry, laser ablation inductively coupled plasma mass spectrometry and Raman spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Syta, Olga; Rozum, Karol; Choińska, Marta [Faculty of Chemistry, University of Warsaw, Pasteura 1, 02-093 Warsaw (Poland); Zielińska, Dobrochna [Institute of Archaeology, University of Warsaw, Krakowskie Przedmieście 26/28, 00-927 Warsaw (Poland); Żukowska, Grażyna Zofia [Chemical Faculty, Warsaw University of Technology, Noakowskiego 3, 00-664 Warsaw (Poland); Kijowska, Agnieszka [National Museum in Warsaw, Aleje Jerozolimskie 3, 00-495 Warsaw (Poland); Wagner, Barbara, E-mail: barbog@chem.uw.edu.pl [Faculty of Chemistry, University of Warsaw, Pasteura 1, 02-093 Warsaw (Poland)

    2014-11-01

    An analytical procedure for the comprehensive chemical characterization of samples from medieval Nubian wall-paintings by means of portable X-ray fluorescence (pXRF), laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) and Raman spectroscopy (RS) is proposed in this work. The procedure was used for elemental and molecular investigations of samples from archeological excavations in Nubia (modern southern Egypt and northern Sudan). Numerous remains of churches with painted decorations dating back to the 7th–14th centuries have been excavated in the region of the medieval kingdoms of Nubia, but many aspects of this art and its technology are still unknown. Samples from the selected archeological sites (Faras, Old Dongola and Banganarti) were analyzed in the form of transfers (n = 26), small fragments collected during the excavations (n = 35) and cross sections (n = 15). XRF was used to collect data about elemental composition, LA-ICPMS allowed mapping of selected elements, while RS was used to obtain molecular information about the samples. The preliminary results indicated the usefulness of the proposed analytical procedure for distinguishing the substances from both the surface and sub-surface domains of the wall-paintings. The ability to identify raw materials from the wall-paintings will be used in further systematic, archeometric studies devoted to the detailed comparison of various historic Nubian centers. - Highlights: • The analytical procedure for examination of unique wall paintings was proposed. • Identification of pigments and supporting layers of wall-paintings was obtained. • Heterogeneous samples were mapped with the use of LA-ICPMS. • Anatase in the sub-surface regions of samples was detected by Raman spectroscopy.

  6. Background Contamination by Coplanar Polychlorinated Biphenyls (PCBS) in Trace Level High Resolution Gas Chromatography/High Resolution Mass Spectrometry (HRGC/HRMS) Analytical Procedures

    Science.gov (United States)

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for t...

  7. AN ANALYTICAL FRAMEWORK FOR ASSESSING RELIABLE NUCLEAR FUEL SERVICE APPROACHES: ECONOMIC AND NON-PROLIFERATION MERITS OF NUCLEAR FUEL LEASING

    International Nuclear Information System (INIS)

    Kreyling, Sean J.; Brothers, Alan J.; Short, Steven M.; Phillips, Jon R.; Weimar, Mark R.

    2010-01-01

    The goal of international nuclear policy since the dawn of nuclear power has been the peaceful expansion of nuclear energy while controlling the spread of enrichment and reprocessing technology. Numerous initiatives have been undertaken in the intervening decades to develop international agreements on providing nuclear fuel supply assurances, or reliable nuclear fuel services (RNFS), in an attempt to control the spread of sensitive nuclear materials and technology. In order to inform the international debate and the development of government policy, PNNL has been developing an analytical framework to holistically evaluate the economics and non-proliferation merits of alternative approaches to managing the nuclear fuel cycle (i.e., cradle-to-grave). This paper provides an overview of the analytical framework and discusses preliminary results of an economic assessment of one RNFS approach: full-service nuclear fuel leasing. The specific focus of this paper is the metrics under development to systematically evaluate the non-proliferation merits of fuel-cycle management alternatives. Also discussed is the utility of an integrated assessment of the economics and non-proliferation merits of nuclear fuel leasing.

  8. An overview of the IAEA Safety Series on procedures for evaluating the reliability of predictions made by environmental transfer models

    International Nuclear Information System (INIS)

    Hoffman, F.W.; Hofer, E.

    1987-10-01

    The International Atomic Energy Agency is preparing a Safety Series publication on practical approaches for evaluating the reliability of the predictions made by environmental radiological assessment models. This publication identifies factors that affect the reliability of these predictions and discusses methods for quantifying uncertainty. Emphasis is placed on understanding the quantity of interest specified by the assessment question and on distinguishing between stochastic variability and lack of knowledge about either the true value or the true distribution of values for the quantity of interest. Among the many approaches discussed, model testing using independent data sets (model validation) is considered the best method for evaluating the accuracy of model predictions. Analytical and numerical methods for propagating the uncertainties in model parameters are presented, and the strengths and weaknesses of model intercomparison exercises are also discussed. It is recognized that subjective judgment is employed throughout the entire modelling process, and quantitative reliability statements must be subjectively obtained when models are applied to situations different from those under which they have been tested. (6 refs.)
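
    The numerical propagation of parameter uncertainty discussed in the publication can be illustrated with a crude Monte Carlo sketch for a hypothetical deposition-transfer-intake model; the model form, distributions and parameter values below are illustrative assumptions, not taken from the Safety Series:

```python
import random, statistics

random.seed(1)

# Hypothetical transfer model: dose = deposition * transfer factor * intake.
# Parameter uncertainty is expressed through lognormal distributions whose
# parameters are purely illustrative.
def sample_dose():
    deposition = random.lognormvariate(0.0, 0.5)    # e.g. kBq/m2
    transfer = random.lognormvariate(-2.0, 0.8)     # (Bq/kg) per (kBq/m2)
    intake = 100.0                                  # kg/yr, held fixed
    return deposition * transfer * intake

doses = sorted(sample_dose() for _ in range(20000))
mean = statistics.fmean(doses)
p95 = doses[int(0.95 * len(doses))]
print(f"mean = {mean:.1f}, 95th percentile = {p95:.1f} (arbitrary units)")
```

    Reporting a percentile alongside the mean is one way of expressing the reliability statement the publication calls for, rather than a single point prediction.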

  9. Pilot testing of SHRP 2 reliability data and analytical products: Washington.

    Science.gov (United States)

    2014-07-30

    The second Strategic Highway Research Program (SHRP 2) addresses the challenges of moving people and goods efficiently and safely on the nation's highways. In its Reliability focus area, the research emphasizes improving the reliability of highway ...

  10. Reliability analysis using network simulation

    International Nuclear Information System (INIS)

    Engi, D.

    1985-01-01

    The models that can be used to provide estimates of the reliability of nuclear power systems operate at many different levels of sophistication. The least sophisticated models treat failure processes that entail only time-independent phenomena (such as demand failure). More advanced models treat processes that also include time-dependent phenomena such as run failure and possibly repair. However, many of these dynamic models are deficient in some respects, either because they disregard the time-dependent phenomena that cannot be expressed in closed-form analytic terms or because they treat these phenomena in quasi-static terms. The next level of modeling requires a dynamic approach that incorporates not only procedures for treating all significant time-dependent phenomena but also procedures for treating these phenomena when they are conditionally linked or characterized by arbitrarily selected probability distributions. The required level of sophistication is provided by a dynamic, Monte Carlo modeling approach. A computer code that uses a dynamic, Monte Carlo modeling approach is Q-GERT (Graphical Evaluation and Review Technique - with Queueing), and the present study has demonstrated the feasibility of using Q-GERT for modeling time-dependent, unconditionally and conditionally linked phenomena that are characterized by arbitrarily selected probability distributions
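
    The kind of treatment described, time-dependent failure and repair with arbitrarily selected probability distributions, can be illustrated by a direct Monte Carlo sketch for a single repairable component. This is a generic illustration of the approach, not Q-GERT itself, and the distributions and parameters are hypothetical:

```python
import random

random.seed(7)

MISSION = 1000.0  # mission time, h

def up_at(t_end):
    """Simulate one alternating up/down history; return True if up at t_end."""
    t, up = 0.0, True
    while t < t_end:
        if up:
            t += random.weibullvariate(1200.0, 1.5)   # time to run failure, h
        else:
            t += random.lognormvariate(2.0, 0.5)      # repair duration, h
        if t < t_end:
            up = not up
    return up

N = 20000
unavail = sum(not up_at(MISSION) for _ in range(N)) / N
print(f"point unavailability at {MISSION:.0f} h ~= {unavail:.4f}")
```

    Because each history is simulated event by event, any distribution, and any conditional linkage between events, can be substituted without changing the estimation procedure.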

  11. Time-variant reliability assessment through equivalent stochastic process transformation

    International Nuclear Information System (INIS)

    Wang, Zequn; Chen, Wei

    2016-01-01

    Time-variant reliability measures the probability that an engineering system successfully performs its intended functions over a certain period of time under various sources of uncertainty. In practice, it is computationally prohibitive to propagate uncertainty in time-variant reliability assessment based on expensive or complex numerical models. This paper presents an equivalent stochastic process transformation approach for cost-effective prediction of reliability deterioration over the life cycle of an engineering system. To reduce the high dimensionality, a time-independent reliability model is developed by translating random processes and time parameters into random parameters in order to equivalently cover all potential failures that may occur during the time interval of interest. With the time-independent reliability model, an instantaneous failure surface is attained by using a Kriging-based surrogate model to identify all potential failure events. To enhance the efficacy of failure surface identification, a maximum confidence enhancement method is utilized to update the Kriging model sequentially. Then, the time-variant reliability is approximated using Monte Carlo simulations of the Kriging model, where system failures over a time interval are predicted by the instantaneous failure surface. The results of two case studies demonstrate that the proposed approach is able to accurately predict the time evolution of system reliability while requiring much less computational effort compared with the existing analytical approach. - Highlights: • Developed a new approach for time-variant reliability analysis. • Proposed a novel stochastic process transformation procedure to reduce the dimensionality. • Employed Kriging models with confidence-based adaptive sampling scheme to enhance computational efficiency. • The approach is effective for handling random process in time-variant reliability analysis. • Two case studies are used to demonstrate the efficacy
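
    The interval character of time-variant reliability, where failure anywhere in [0, T] counts, can be illustrated with a crude Monte Carlo sketch over a discretized time grid. The limit state, degradation law and distributions below are invented for illustration, and the paper's Kriging surrogate and adaptive sampling are omitted for brevity:

```python
import random

random.seed(3)

# Illustrative limit state: resistance degrades linearly in time while the
# load is a random process sampled on a coarse grid (all values invented).
def fails_on_interval(horizon=10.0, steps=50):
    r0 = random.gauss(6.0, 0.4)          # initial resistance
    k = random.gauss(0.2, 0.02)          # degradation rate per unit time
    for i in range(steps + 1):
        t = horizon * i / steps
        load = random.gauss(3.0, 0.3)    # discretized load process
        if r0 - k * t - load <= 0.0:     # instantaneous failure event
            return True
    return False

N = 20000
pf = sum(fails_on_interval() for _ in range(N)) / N
print(f"failure probability over [0, 10] ~= {pf:.3f}")
```

    Every sample here requires dozens of limit-state evaluations, which is exactly the cost the paper's surrogate-based transformation is designed to avoid when the limit state is an expensive numerical model.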

  12. Reliability of hospital cost profiles in inpatient surgery.

    Science.gov (United States)

    Grenda, Tyler R; Krell, Robert W; Dimick, Justin B

    2016-02-01

    With increased policy emphasis on shifting risk from payers to providers through mechanisms such as bundled payments and accountable care organizations, hospitals are increasingly in need of metrics to understand their costs relative to peers. However, it is unclear whether Medicare payments for surgery can be used to reliably compare hospital costs. We used national Medicare data to assess patients undergoing colectomy, pancreatectomy, and open incisional hernia repair from 2009 to 2010 (n = 339,882 patients). We first calculated risk-adjusted hospital total episode payments for each procedure. We then used hierarchical modeling techniques to estimate the reliability of total episode payments for each procedure and explored the impact of hospital caseload on payment reliability. Finally, we quantified the number of hospitals meeting published reliability benchmarks. Mean risk-adjusted total episode payments ranged from $13,262 (standard deviation [SD] $14,523) for incisional hernia repair to $25,055 (SD $22,549) for pancreatectomy. The reliability of hospital episode payments varied widely across procedures and depended on sample size. For example, mean episode payment reliability for colectomy (mean caseload, 157) was 0.80 (SD 0.18), whereas for pancreatectomy (mean caseload, 13) the mean reliability was 0.45 (SD 0.27). Many hospitals met published reliability benchmarks for each procedure. For example, 90% of hospitals met reliability benchmarks for colectomy, 40% for pancreatectomy, and 66% for incisional hernia repair. Episode payments for inpatient surgery are a reliable measure of hospital costs for commonly performed procedures, but are less reliable for lower volume operations. These findings suggest that hospital cost profiles based on Medicare claims data may be used to benchmark efficiency, especially for more common procedures. Copyright © 2016 Elsevier Inc. All rights reserved.
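
    The reliability statistic underlying such profiles is the usual hierarchical signal-to-noise ratio: between-hospital variance divided by between-hospital variance plus within-hospital variance shrunk by caseload. A minimal sketch with hypothetical variance components (not the paper's estimates), showing why low-caseload procedures such as pancreatectomy score lower:

```python
# Reliability of a hospital-level mean payment with caseload n:
#   reliability = sigma2_between / (sigma2_between + sigma2_within / n)
# Variance components below are hypothetical, for illustration only.
def reliability(sigma2_between, sigma2_within, caseload):
    return sigma2_between / (sigma2_between + sigma2_within / caseload)

s2_b, s2_w = 4.0e6, 2.0e8   # dollars^2, hypothetical
for n in (13, 50, 157):      # pancreatectomy-like vs colectomy-like caseloads
    print(n, round(reliability(s2_b, s2_w, n), 2))
```

    With these illustrative components, reliability rises from about 0.21 at a caseload of 13 to about 0.76 at 157, mirroring the pancreatectomy-versus-colectomy pattern reported above.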

  13. Analytical procedures used by the uranium - radon - radium geochemistry group; Methodes d'analyses utilisees par la section de geochimie uranium, radon, radium

    Energy Technology Data Exchange (ETDEWEB)

    Berthollet, P [Commissariat a l' Energie Atomique, Fontenay-aux-Roses (France). Centre d' Etudes Nucleaires

    1968-07-01

    The analytical methods described are applied to the geochemical prospecting of uranium. The nature of the material under investigation, which may be soil, alluvium, rock, plant or water, and the particular requirements of geochemical exploration, have prompted us to adjust the widely used conventional methods to the demands of large scale operation, without lowering their standards of accuracy and reliability. These procedures are explained in great detail. Though most of this technical information may appear superfluous to the chemical engineer well versed in trace element determination, it will, however, serve a useful purpose both with the operator in charge of routine testing and with the chemist called upon to interpret results. (author)

  14. Preparation of methodology for reliability analysis of selected digital segments of the instrumentation and control systems of NPPs. Pt. 1

    International Nuclear Information System (INIS)

    Hustak, S.; Patrik, M.; Babic, P.

    2000-12-01

    The report is structured as follows: (i) Introduction; (ii) Important notions relating to the safety and dependability of software systems for nuclear power plants (selected notions from IAEA Technical Report No. 397; safety aspects of software application; reliability/dependability aspects of digital systems); (iii) Peculiarities of digital systems and ways to achieve dependable performance of the required function (failures in the system and principles of defence against them; ensuring resistance of digital systems against failures at various hardware and software levels); (iv) The issue of analytical procedures to assess the safety and reliability of safety-related digital systems (safety and reliability assessment at an early stage of the project; general framework of reliability analysis of complex systems; choice of an appropriate quantitative measure of software reliability); (v) Selected qualitative and quantitative information about the reliability of digital systems (the use of relations between the incidence of various types of faults); and (vi) Conclusions and recommendations. (P.A.)

  15. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants; the GO methodology is one of these methods. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In the GO methodology, a signal is either on-to-off or off-to-on; the GO therefore finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. In order to overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given

  16. Benchmark of systematic human action reliability procedure

    International Nuclear Information System (INIS)

    Spurgin, A.J.; Hannaman, G.W.; Moieni, P.

    1986-01-01

    Probabilistic risk assessment (PRA) methodology has emerged as one of the most promising tools for assessing the impact of human interactions on plant safety and for understanding the importance of the man/machine interface. Human interactions are considered to be one of the key elements in the quantification of accident sequences in a PRA. The approach to quantification of human interactions in past PRAs has not been very systematic. The Electric Power Research Institute sponsored the development of SHARP (Systematic Human Action Reliability Procedure) to aid analysts in developing a systematic approach for the evaluation and quantification of human interactions in a PRA. The SHARP process has been extensively peer reviewed and has been adopted by the Institute of Electrical and Electronics Engineers as the basis of a draft guide for the industry. By carrying out a benchmark process in which SHARP is an essential ingredient, it appears possible to assess the strengths and weaknesses of SHARP and so aid human reliability analysts in carrying out human reliability analysis as part of a PRA

  17. Analytical toxicology of emerging drugs of abuse--an update.

    Science.gov (United States)

    Meyer, Markus R; Peters, Frank T

    2012-12-01

    The steady increase of new drugs of abuse on the illicit drug market is a great challenge for analytical toxicologists. Because most of these new drugs or drug classes are not included in established analytical methods targeting classic drugs of abuse, analytical procedures must be adapted or new procedures must be developed to cover such new compounds. This review summarizes procedures for analysis of these drugs of abuse published from January 2009 to January 2012, covering the following classes of emerging drugs of abuse: β-keto-amphetamines, pyrrolidinophenones, tryptamines, and synthetic cannabinoids.

  18. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or during the summation of many very different numbers. The basic objective of this paper is to find a procedure which eliminates the errors a computer makes when calculations close to its accuracy limit are executed. A highly reliable system is represented by a directed acyclic graph composed of terminal nodes, i.e. highly reliable input elements, internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is implemented in MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for exact summation of such numbers, is designed in the paper. The summation procedure benefits from a special number system with base 2{sup 32}. Computational efficiency of the new computing methodology is compared with advanced simulation software. Various calculations on systems from references are performed to emphasize merits of the methodology.
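
    The failure mode the paper addresses, small terms vanishing when summed with a much larger one in floating point, is easy to reproduce. The sketch below demonstrates it and then carries out the sum in exact arithmetic instead; Python rationals stand in here for the paper's own scheme, which uses a positional number system with base 2{sup 32}:

```python
from fractions import Fraction

# Terms spanning ~36 orders of magnitude, as in unavailability sums
# over many highly reliable elements.
terms = [2.0 ** -e for e in (1, 30, 60, 90, 120)]

naive = 0.0
for x in sorted(terms, reverse=True):
    naive += x          # terms below the 53-bit mantissa of the big one vanish

exact = sum(Fraction(x) for x in terms)   # binary floats convert exactly

print(naive == float(exact))              # rounding back hides the loss...
print(exact - Fraction(naive))            # ...but the exact residual is nonzero
```

    The exact sum differs from the floating-point one by the small terms that were silently dropped, which is precisely the error an exact-summation scheme is designed to eliminate.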

  19. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    International Nuclear Information System (INIS)

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L.

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed, and it found a statistically significant factor-of-two bias on average
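
    The first bias test can be sketched as a simple binomial comparison of the observed failure fraction (45 of 4071) against an assumed mean ASEP HEP. The mean HEP below is a hypothetical stand-in, not the report's figure:

```python
from math import comb

# Observed data from the requalification examinations
n_tasks, n_failed = 4071, 45
mean_hep = 0.02                      # hypothetical average ASEP HEP

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), by direct summation."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

observed = n_failed / n_tasks
p_low = binom_cdf(n_failed, n_tasks, mean_hep)   # P(X <= 45 | p = mean HEP)
print(f"observed fraction = {observed:.4f}")
print(f"P(X <= 45 | p = {mean_hep}) = {p_low:.3g}")
```

    A very small tail probability indicates that the observed failure count is statistically inconsistent with the assumed mean HEP, i.e. that the assumed HEP is conservatively biased, which is the direction of bias the report describes.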

  20. ASSESSING GOING CONCERN ASSUMPTION BY USING RATING VALUATION MODELS BASED UPON ANALYTICAL PROCEDURES IN CASE OF FINANCIAL INVESTMENT COMPANIES

    OpenAIRE

    Tatiana Danescu; Ovidiu Spatacean; Paula Nistor; Andrea Cristina Danescu

    2010-01-01

    Designing and performing analytical procedures aimed to assess the rating of the Financial Investment Companies are essential activities both in the phase of planning a financial audit mission and in the phase of issuing conclusions regarding the suitability of the going concern assumption used by the management and other persons responsible for governance as the basis for preparation and disclosure of financial statements. The paper aims to examine the usefulness of recognized models used in the practice o...

  1. High-reliability microcontroller nerve stimulator for assistance in regional anaesthesia procedures.

    Science.gov (United States)

    Ferri, Carlos A; Quevedo, Antonio A F

    2017-07-01

    In the last decades, the use of nerve stimulators to aid in regional anaesthesia has been shown to benefit the patient, since it allows better localization of the nerve plexus, leading to correct positioning of the needle through which the anaesthetic is applied. However, most of the nerve stimulators available on the market for this purpose do not have the minimum recommended features for a good stimulator, and this can pose risks to the patient. Thus, this study aims to develop a device, using embedded electronics, that meets all the characteristics required for a successful blockade. The system is made of modules for generation and overall control of the current pulse, and for the patient and user interfaces. The results show that the designed system meets the required specifications for a good and reliable nerve stimulator. Linearity proved satisfactory, ensuring accuracy of the electrical current amplitude for a wide range of body impedances. Field tests have proven very successful. The anaesthesiologist who used the system reported that, in all cases, plexus blockade was achieved with higher quality and faster anaesthetic diffusion, and without the need for an additional dose, when compared with the same procedure performed without the device.

  2. A new modular procedure for industrial plant simulations and its reliable implementation

    International Nuclear Information System (INIS)

    Carcasci, C.; Marini, L.; Morini, B.; Porcelli, M.

    2016-01-01

    Modeling of industrial plants, and especially of energy systems, has become increasingly important in industrial engineering, and the need for accurate information on their behavior has grown along with the complexity of the industrial processes. Consequently, accurate and flexible simulation tools have become essential, leading to the development of modular codes. The aim of this work is to propose a new modular mathematical model for industrial plant simulation and its reliable numerical implementation. Regardless of their layout, a large class of plant configurations is modeled by a library of elementary parts; then the physical properties, compositions of the working fluid, and plant performance are estimated. Each plant component is represented by equations modeling fundamental mechanical and thermodynamic laws, giving rise to a system of algebraic nonlinear equations; remarkably, suitable restrictions on the variables of these nonlinear equations are imposed to guarantee physically meaningful solutions. The proposed numerical procedure combines an outer iterative process, which refines the plant's characteristic parameters, and an inner one, which solves the arising nonlinear systems and consists of a trust-region solver for bound-constrained nonlinear equalities. The new procedure has been validated by performing simulations against an existing modular tool on two compression train arrangements with both series- and parallel-mounted compressors. - Highlights: • A numerical modular tool for industrial plants simulation is presented. • Mathematical modeling is thoroughly described. • Solution of the nonlinear system is performed by a trust-region Gauss–Newton solver. • A detailed explanation of the optimization solver named TRESNEI is provided. • Code flexibility and robustness are investigated through numerical simulations.
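
    The inner solve, a nonlinear system with bounds that keep the solution physically meaningful, can be sketched in miniature. The stand-in below uses a damped Newton step with projection onto the bounds rather than the trust-region Gauss-Newton solver (TRESNEI) the paper uses, and the two "component" equations are hypothetical:

```python
# Toy bound-constrained 2x2 nonlinear system (hypothetical equations):
#   f1 = x*y - 6 = 0,  f2 = x + y - 5 = 0,  with x, y >= 0.

def F(x, y):
    return (x * y - 6.0, x + y - 5.0)

def solve(x=1.0, y=4.0, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        if abs(f1) + abs(f2) < tol:
            break
        # Jacobian of F: [[y, x], [1, 1]]; Newton step via Cramer's rule
        det = y - x
        if abs(det) < 1e-12:
            x += 1e-6              # nudge off the singular line x == y
            continue
        dx = (-f1 + f2 * x) / det
        dy = (f1 - f2 * y) / det
        x = max(0.0, x + dx)       # project iterate onto the bounds
        y = max(0.0, y + dy)
    return x, y

x, y = solve()
print(x, y)                        # converges to x = 2, y = 3 from this start
```

    A trust-region method additionally limits the step length adaptively, which makes convergence far more robust than this bare Newton iteration when starting far from the solution.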

  3. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during operation, with the purpose of improving the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed over the last 20 years, and they have to be continuously refined to meet growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed, based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used, especially in the analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied, and procedures for the implementation of importance sampling are suggested. (author)
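
    Importance sampling, the variance reduction technique suggested above, can be sketched for a rare-event probability: sample from a density shifted toward the failure region and reweight each hit by the likelihood ratio. The target P(X > 4) for standard normal X is purely illustrative:

```python
import random, math

random.seed(11)

N = 50000
true_p = 0.5 * math.erfc(4 / math.sqrt(2))   # P(X > 4), X ~ N(0,1), ~3.2e-5

# Crude Monte Carlo: almost no samples land in the failure region.
crude = sum(random.gauss(0.0, 1.0) > 4.0 for _ in range(N)) / N

# Importance sampling: draw from N(4,1) and reweight by phi(y)/phi(y-4).
total = 0.0
for _ in range(N):
    y = random.gauss(4.0, 1.0)
    if y > 4.0:
        total += math.exp(-4.0 * y + 8.0)    # likelihood ratio, exp(-4y+8)
is_est = total / N

print(f"exact={true_p:.2e}  crude={crude:.2e}  importance={is_est:.2e}")
```

    With the same sample size, the importance-sampling estimate lands within a few percent of the exact value, while the crude estimate rests on only a handful of hits, which is the speed-up the report refers to.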

  4. Reliability of the fuel identification procedure used by COGEMA during cask loading for shipment to LA HAGUE

    International Nuclear Information System (INIS)

    Pretesacque, P.; Eid, M.; Zachar, M.

    1993-01-01

    This study has been carried out to demonstrate the reliability of the spent fuel identification system used by COGEMA and NTL prior to shipment to the reprocessing plant of La Hague. This was a prerequisite for the French competent authority to accept the 'burnup credit' assumption in the criticality assessment of spent fuel packages. The probability of loading a non-irradiated and non-specified fuel assembly was considered acceptable if our identification and irradiation-status measurement procedures were used. Furthermore, the task analysis enabled us to improve the working conditions at reactor sites and the quality of the working documentation, and consequently the reliability of the system. NTL's experience as consignor of transporting more than 10,000 fuel assemblies to La Hague since the implementation of our system in 1984, without any non-conformance in fuel identification, validated the formalism of this study as well as our assumptions on basic event probabilities. (J.P.N.)

  5. Dynamic reliability networks with self-healing units

    International Nuclear Information System (INIS)

    Jenab, K.; Seyed Hosseini, S.M.; Dhillon, B.S.

    2008-01-01

    This paper presents an analytical approach to dynamic reliability networks used for the failure-limit strategy in maintenance optimization. The proposed approach uses the moment generating function (MGF) and the flow-graph concept to depict the functional and reliability diagrams of a system comprising series, parallel or mixed configurations of self-healing units. A self-healing unit features embedded failure detection and recovery mechanisms, represented by a self-loop in the flow-graph network. The newly developed analytical approach provides the probability of system failure and time-to-failure data, i.e. the mean and standard deviation of the time to failure, used for maintenance optimization.
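
    The MGF/flow-graph idea can be illustrated with a toy GERT-style sketch (assumptions: one absorbing failure state, cycle-based timing, recovery modelled as a self-loop, and transmittances normalized so W(0) = 1; this is not the paper's exact formulation). Mason's rule collapses the self-loop into a closed-form transmittance, series stages multiply because their times add, and moments of the time to failure come from derivatives at s = 0.

```python
import math

def unit_transmittance(p_fail, t_cycle):
    """One self-healing unit as a flow-graph branch: each cycle of length
    t_cycle ends in failure (p_fail) or in detection-and-recovery
    (1 - p_fail), which re-enters the cycle via a self-loop.  Mason's rule
    collapses the loop: W(s) = p*e^{st} / (1 - (1-p)*e^{st})."""
    p, h = p_fail, 1.0 - p_fail
    return lambda s: p * math.exp(s * t_cycle) / (1.0 - h * math.exp(s * t_cycle))

def series(*units):
    """Sequential-stage transmittance: stage times add, so MGFs multiply."""
    return lambda s: math.prod(W(s) for W in units)

def ttf_moments(W, ds=1e-5):
    """Mean and standard deviation of time to failure from numeric
    derivatives of the transmittance at s = 0 (assumes W(0) = 1)."""
    m1 = (W(ds) - W(-ds)) / (2 * ds)
    m2 = (W(ds) - 2.0 * W(0.0) + W(-ds)) / ds ** 2
    return m1, math.sqrt(max(m2 - m1 * m1, 0.0))

u = unit_transmittance(p_fail=0.01, t_cycle=1.0)  # geometric: MTTF = 100 cycles
mean, std = ttf_moments(u)
print(round(mean, 1), round(std, 1))              # 100.0 99.5

two_stage = series(u, unit_transmittance(p_fail=0.02, t_cycle=1.0))
print(round(ttf_moments(two_stage)[0], 1))        # stage means add: 150.0
```

With p_fail + p_heal = 1 the single unit reduces to a geometric number of cycles, which is a convenient sanity check on the derivative-based moments.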

  6. AK-SYS: An adaptation of the AK-MCS method for system reliability

    International Nuclear Information System (INIS)

    Fauriat, W.; Gayton, N.

    2014-01-01

    A great deal of research work over the last two decades has addressed evaluating the probability of failure of a structure involving a very time-consuming mechanical model. Surrogate-model approaches based on Kriging, such as the Efficient Global Reliability Analysis (EGRA) or the Active learning and Kriging-based Monte-Carlo Simulation (AK-MCS) methods, are very efficient, and each has advantages of its own. EGRA is well suited to evaluating small probabilities, as the surrogate can be used to classify any population. AK-MCS is built in relation to a given population and requires no optimization program for the active learning procedure to be performed. It is therefore easier to implement and more likely to spend computational effort on areas with significant probability content. When assessing system reliability, analytical approaches and first-order approximations are widely used in the literature. In the present paper, however, we focus on sampling techniques and, considering the recent adaptation of the EGRA method for systems, a strategy is presented to adapt the AK-MCS method for system reliability. The AK-SYS method, "Active learning and Kriging-based SYStem reliability method", is presented. Its high efficiency and accuracy are illustrated via various examples.
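
    A compact sketch of the AK-MCS loop under simplifying assumptions (one random variable, a hypothetical cheap limit state standing in for the expensive model, a toy Kriging surrogate with Gaussian correlation and unit process variance, and the common U >= 2 stopping rule). It is meant to show the structure of the method, not to reproduce the paper's examples.

```python
import math
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def kriging(xs, ys, theta=0.5, nugget=1e-8):
    """Toy 1-D Kriging surrogate returning a predictor (mean, std dev)."""
    n = len(xs)
    mu = sum(ys) / n
    K = [[math.exp(-theta * (xs[i] - xs[j]) ** 2) + (nugget if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    Kinv = [solve(K, [1.0 if i == j else 0.0 for i in range(n)]) for j in range(n)]
    alpha = [sum(Kinv[i][j] * (ys[j] - mu) for j in range(n)) for i in range(n)]
    def predict(x):
        k = [math.exp(-theta * (x - xi) ** 2) for xi in xs]
        m = mu + sum(k[i] * alpha[i] for i in range(n))
        q = sum(k[i] * sum(Kinv[i][j] * k[j] for j in range(n)) for i in range(n))
        return m, math.sqrt(max(1.0 - q, 1e-12))
    return predict

g = lambda x: 1.5 - x               # stand-in "expensive" model; failure if g < 0
rng = random.Random(0)
pop = [rng.gauss(0.0, 1.0) for _ in range(500)]  # fixed Monte Carlo population
doe_x = [-5.0, -3.0, -1.0, 1.0, 3.0, 5.0]        # initial design of experiments
doe_y = [g(x) for x in doe_x]
for _ in range(25):                               # cap on enrichment steps
    predict = kriging(doe_x, doe_y)
    scores = [(abs(m) / s, x) for x in pop for m, s in [predict(x)]]
    u_min, x_best = min(scores)
    if u_min >= 2.0:          # U >= 2 everywhere: classification is confident
        break
    doe_x.append(x_best)      # evaluate the true model only where it matters
    doe_y.append(g(x_best))
pf = sum(1 for x in pop if predict(x)[0] < 0.0) / len(pop)
print("Pf ~", pf, "with", len(doe_x), "evaluations of g")
```

The learning function U = |m|/s drives new evaluations to the points of the population whose sign classification by the surrogate is least certain, which is what lets AK-MCS spend its budget near the limit state.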

  7. Different Analytical Procedures for the Study of Organic Residues in Archeological Ceramic Samples with the Use of Gas Chromatography-mass Spectrometry.

    Science.gov (United States)

    Kałużna-Czaplińska, Joanna; Rosiak, Angelina; Kwapińska, Marzena; Kwapiński, Witold

    2016-01-01

    The analysis of the composition of organic residues present in pottery is an important source of information for historians and archeologists. Chemical characterization of the materials provides information on diets, habits, technologies, and the original use of the vessels. This review presents the problems of analytical studies of archeological materials, with special emphasis on organic residues. Current methods used in the determination of different organic compounds in archeological ceramics are presented. Particular attention is paid to the sample preparation procedures applied to archeological ceramic samples prior to gas chromatography-mass spectrometry. Advantages and disadvantages of different extraction methods and the application of proper quality assurance/quality control procedures are discussed.

  8. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of the posited mechanisms that shape human performance, the methods of characterizing and analytically modeling human behavior, and the techniques employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized, or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments.

  9. Analytical aids in land management planning

    Science.gov (United States)

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  10. RADCHEM - Radiochemical procedures for the determination of Sr, U, Pu, Am and Cm

    Energy Technology Data Exchange (ETDEWEB)

    Sidhu, R. [Inst. for Energy Technology (Norway)

    2006-04-15

    An accurate determination of radionuclides from various sources in the environment is essential for assessment of the potential hazards and suitable countermeasures, both in case of accidents and authorised releases and for routine surveillance. Reliable radiochemical separation and detection techniques are needed for accurate determination of alpha and beta emitters. Rapid analytical methods are needed in case of an accident for early decision-making. The objective of this project has been to compare and evaluate radiochemical procedures used at Nordic laboratories for the determination of strontium, uranium, plutonium, americium and curium. To gather detailed information on the procedures in use, a questionnaire regarding various aspects of radionuclide determination was developed and distributed to all (sixteen) relevant laboratories in the Nordic countries. The responses and the procedures used by each laboratory were then discussed among those who answered the questionnaire. This report summarises the findings and gives recommendations on suitable practice. (au)

  11. RADCHEM - Radiochemical procedures for the determination of Sr, U, Pu, Am and Cm

    International Nuclear Information System (INIS)

    Sidhu, R.

    2006-04-01

    An accurate determination of radionuclides from various sources in the environment is essential for assessment of the potential hazards and suitable countermeasures, both in case of accidents and authorised releases and for routine surveillance. Reliable radiochemical separation and detection techniques are needed for accurate determination of alpha and beta emitters. Rapid analytical methods are needed in case of an accident for early decision-making. The objective of this project has been to compare and evaluate radiochemical procedures used at Nordic laboratories for the determination of strontium, uranium, plutonium, americium and curium. To gather detailed information on the procedures in use, a questionnaire regarding various aspects of radionuclide determination was developed and distributed to all (sixteen) relevant laboratories in the Nordic countries. The responses and the procedures used by each laboratory were then discussed among those who answered the questionnaire. This report summarises the findings and gives recommendations on suitable practice. (au)

  12. Seamless Digital Environment – Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-08-01

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the requested analysis, and present the result to the user with minimal or no effort on the user's part. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases for use in developing improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  13. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability-based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM-based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision-theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the procedure recommended by the Joint Committee on Structural Safety (JCSS), CodeCal, for the practical implementation of reliability-based code calibration of LRFD-based design codes.

  14. Establishing the analytical procedure for acetate in water by ion chromatography method

    International Nuclear Information System (INIS)

    Nguyen Thi Hong Thinh; Ha Lan Anh; Vo Thi Anh

    2015-01-01

    In recent studies of the contamination sources of arsenic, ammonium, iron and organic carbon in groundwater, acetate is frequently measured because it is the main decomposition product of organic compounds released from sediment into groundwater. To better support the study of the origin and mobilization mechanism of these pollutants, an analytical method for acetate was developed in the Isotopes Hydrology Laboratory using ion chromatography. The researchers used a DX-600 ion chromatography system, including an IonPac ICE-AS1 column to separate acetate and a CD 25 conductivity detector to quantify acetate in water samples. The study results showed that the team successfully developed an analytical procedure for acetate in water, with a retention time of 12 minutes and a limit of detection (LOD) of 0.01 ppm. The accuracy of the method was established by calculating the precision and bias of 10 replicate analyses of a standard sample at content levels of 1 ppm and 8 ppm. The 10 measurements were satisfactory in terms of precision and bias, with repeatability coefficients of variation (CVR) of 1.3% and 0.2% and recoveries (R) of 99.92% and 101.72%. (author)
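
    The repeatability and recovery figures reported above are straightforward to reproduce. A minimal sketch, using hypothetical replicate values at the 1 ppm level rather than the laboratory's raw data, of how the CV and recovery are computed from replicate analyses of a standard:

```python
import statistics

def validation_stats(measured, nominal):
    """Repeatability CV (%) and recovery (%) from replicate analyses of a
    standard at a known nominal concentration (values in ppm)."""
    mean = statistics.mean(measured)
    cv = 100.0 * statistics.stdev(measured) / mean
    recovery = 100.0 * mean / nominal
    return cv, recovery

# hypothetical replicate run at the 1 ppm level (not the paper's raw data)
reps = [0.99, 1.01, 1.00, 0.98, 1.02, 1.00, 0.99, 1.01, 1.00, 1.00]
cv, rec = validation_stats(reps, nominal=1.0)
print(round(cv, 2), round(rec, 1))   # 1.15 100.0
```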

  15. AN ANALYTICAL STUDY OF EFFICACY OF CORNEAL COLLAGEN CROSSLINKING C3R PROCEDURE IN PROGRESSIVE KERATOCONUS PATIENTS

    Directory of Open Access Journals (Sweden)

    Rajasekar K

    2017-10-01

    BACKGROUND: Keratoconus affects a significant number of the general population, producing a conical, weakened, protruded area of the cornea due to a genetically predetermined weakening of the corneal stroma. Keratoconus is seen as a standalone disease or accompanying other syndromic manifestations. Mainly the inferotemporal cornea is affected, and the conical protrusion causes profound, highly irregular myopic astigmatism as a refractive error, which is very difficult to correct in advanced stages. Especially for patients in the economically productive age group, the poor vision becomes very difficult to live with. The corneal collagen crosslinking procedure is a novel tool in the armamentarium of treatments against this malady. MATERIALS AND METHODS: This analytical study was conducted at the cornea services of the Regional Institute of Ophthalmology and Government Ophthalmic Hospital, Chennai, over a period of 14 months. Forty-five eyes of forty patients with early progressive keratoconus who presented to the cornea services were subjected to riboflavin-UVA collagen crosslinking using a standard protocol, after informed consent was obtained. Response to treatment was assessed during the follow-up period. RESULTS: Of the 40 patients in our series, 23 were male and 17 were female. Most patients were aged between 10 and 25 years. The epi-off procedure was performed in 31 eyes and the epi-on procedure in 14 eyes. Patients with pachymetry of 400-450 microns underwent the epi-on procedure, and those above 450 microns the epi-off C3R procedure. The K values in our series ranged from 49 D to a maximum of 63 D. Topographic flattening was seen in 52% of eyes in the epi-on and epi-off procedures. Vision improved in 57% of eyes following epi-on and 65% following epi-off procedures. CONCLUSION: C3R is a very promising therapeutic modality that may halt the progression of the ectatic process. It is a less invasive mode

  16. Procedures For Microbial-Ecology Laboratory

    Science.gov (United States)

    Huff, Timothy L.

    1993-01-01

    Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in microbiological laboratory to ensure safety, analytical control, and validity of results.

  17. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is a challenging method to implement in a quality

  18. Reference materials for micro-analytical nuclear techniques

    International Nuclear Information System (INIS)

    Valkovic, V.; Zeisler, R.; Bernasconi, G.; Danesi, P.R.

    1994-01-01

    Direct application of many existing reference materials in micro-analytical procedures such as energy-dispersive X-ray fluorescence (EDXRF), particle-induced X-ray emission spectroscopy (PIXE) and ion probe techniques for the determination of trace elements is often impossible or difficult because: 1) other constituents present in large amounts interfere with the determination; 2) trace components are not sufficiently homogeneously distributed in the sample. Therefore, specific natural-matrix reference materials containing very low levels of trace elements and having a high degree of homogeneity are required for many micro-analytical procedures. In this report, the selection of the types of environmental and biological materials suitable for micro-analytical techniques is discussed. (author)

  19. Sample preparation procedures utilized in microbial metabolomics: An overview.

    Science.gov (United States)

    Patejko, Małgorzata; Jacyna, Julia; Markuszewski, Michał J

    2017-02-01

    Bacteria are remarkably diverse in terms of their size, structure and biochemical properties. Due to this fact, it is hard to develop a universal method for handling bacterial cultures during metabolomic analysis. The choice of suitable processing methods constitutes a key element of any analysis, because only an appropriate selection of procedures can provide accurate results leading to reliable conclusions. Because of that, every analytical experiment concerning bacteria requires an individually and very carefully planned research methodology. Although every study varies in terms of sample preparation, there are a few general steps to follow when planning an experiment: sampling, separation of cells from the growth medium, quenching of their metabolism, and extraction. As a result of extraction, all intracellular metabolites should be washed out of the cell environment. Moreover, the extraction method must not cause any chemical decomposition or degradation of the metabolome. Furthermore, the chosen extraction method should be compatible with the analytical technique, so that it does not disturb or prolong subsequent sample preparation steps. For these reasons, we see a need to summarize the sample preparation procedures currently used in microbial metabolomic studies. In the presented overview, papers concerning the analysis of extra- and intracellular metabolites, published over the last decade, are discussed. The presented work gives basic guidelines that might be useful when planning experiments in microbial metabolomics. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Biomass Compositional Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

    NREL develops laboratory analytical procedures (LAPs) for standard biomass analysis. These procedures help scientists and analysts understand more about the chemical composition of raw biomass

  1. An analytical procedure for determination of sulphur species and isotopes in boreal acid sulphate soils and sediments

    Directory of Open Access Journals (Sweden)

    K. BACKLUND

    2008-12-01

    An analytical scheme suitable for boreal acid sulphate (AS) soils and sediments was developed on the basis of existing methods. The presented procedure can be used to quantify and discriminate among acid volatile sulphide, cold chromium-reducible sulphur, hot chromium-reducible sulphur, elemental sulphur, sulphate sulphur, organic sulphur, total reducible sulphur and total sulphur. The sulphur fractions are recovered as either Ag2S or BaSO4 precipitates and can further be used for isotope analysis. Overlaps between sulphur species are common during speciation and must be minimized. Some of these overlaps are caused by poor sampling and storage, inappropriate conditions during the distillation, or natural variations in the sample (e.g. Fe3+ interference and grain size). The procedural impact was determined by conducting tests on both artificial and natural samples containing one or several sulphur species. The method is applied to reduced sediment from an AS soil locality (Överpurmo) and a brackish lake (Larsmo Lake) in western Finland, and the results, including S isotopes, are discussed.

  2. Accuracy of a Classical Test Theory-Based Procedure for Estimating the Reliability of a Multistage Test. Research Report. ETS RR-17-02

    Science.gov (United States)

    Kim, Sooyeon; Livingston, Samuel A.

    2017-01-01

    The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…
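
    The alternate-forms reliability that such a simulation study estimates can be illustrated with a bare-bones CTT sketch (hypothetical parameters, not the report's MST design): observed scores are true scores plus independent errors, and the correlation between two parallel forms recovers the reliability built into the generating model.

```python
import random

def corr(a, b):
    """Pearson correlation of two equal-length lists."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (sa * sb)

def simulate_alternate_forms(n_examinees=20000, rel_true=0.81, seed=7):
    """CTT simulation: observed = true + error on two parallel forms; the
    error variance is set so the theoretical reliability equals rel_true,
    and the between-forms correlation estimates it."""
    rng = random.Random(seed)
    err_sd = (1.0 / rel_true - 1.0) ** 0.5     # true-score variance fixed at 1
    xs, ys = [], []
    for _ in range(n_examinees):
        t = rng.gauss(0.0, 1.0)                # examinee's true score
        xs.append(t + rng.gauss(0.0, err_sd))  # form 1 observed score
        ys.append(t + rng.gauss(0.0, err_sd))  # form 2 observed score
    return corr(xs, ys)

print(round(simulate_alternate_forms(), 2))    # close to the generating 0.81
```

Under CTT, reliability equals true-score variance over observed-score variance, here 1 / (1 + err_sd²) = 0.81, which is exactly what the parallel-forms correlation converges to as the sample grows.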

  3. Corrections for criterion reliability in validity generalization: The consistency of Hermes, the utility of Midas

    Directory of Open Access Journals (Sweden)

    Jesús F. Salgado

    2016-04-01

    Full Text Available There is criticism in the literature about the use of interrater coefficients to correct for criterion reliability in validity generalization (VG studies and disputing whether .52 is an accurate and non-dubious estimate of interrater reliability of overall job performance (OJP ratings. We present a second-order meta-analysis of three independent meta-analytic studies of the interrater reliability of job performance ratings and make a number of comments and reflections on LeBreton et al.s paper. The results of our meta-analysis indicate that the interrater reliability for a single rater is .52 (k = 66, N = 18,582, SD = .105. Our main conclusions are: (a the value of .52 is an accurate estimate of the interrater reliability of overall job performance for a single rater; (b it is not reasonable to conclude that past VG studies that used .52 as the criterion reliability value have a less than secure statistical foundation; (c based on interrater reliability, test-retest reliability, and coefficient alpha, supervisor ratings are a useful and appropriate measure of job performance and can be confidently used as a criterion; (d validity correction for criterion unreliability has been unanimously recommended by "classical" psychometricians and I/O psychologists as the proper way to estimate predictor validity, and is still recommended at present; (e the substantive contribution of VG procedures to inform HRM practices in organizations should not be lost in these technical points of debate.

  4. Reliability of power system with open access

    International Nuclear Information System (INIS)

    Ehsani, A.; Ranjbar, A. M.; Fotuhi Firuzabad, M.; Ehsani, M.

    2003-01-01

    Recently, in many countries, the electric utility industry has been undergoing considerable changes in its structure and regulation. It can be clearly seen that the thrust towards privatization and deregulation or re-regulation of the electric utility industry will introduce numerous reliability problems, which will require new criteria and analytical tools that recognize the residual uncertainties in the new environment. In this paper, different risks and uncertainties in competitive electricity markets are briefly introduced; the approach of customers, operators, planners, generation bodies and network providers to the reliability of the deregulated system is studied; the impact of dispersed generation on system reliability is evaluated; and finally, the reliability cost/reliability worth issues in the new competitive environment are considered.

  5. Development of an analytical procedure for plutonium in the concentration range of femtogram/gram and its application to environmental samples

    International Nuclear Information System (INIS)

    Schuettelkopf, H.

    1981-09-01

    To study the behaviour of plutonium in the environment and to measure plutonium in the vicinity of nuclear facilities, a quick, sensitive analytical method is required which can be applied to all sample materials found in the environment. For a sediment contaminated with plutonium, a boiling-out method using first HNO3/HF and subsequently HNO3/Al(NO3)3 was found to be successful. The leaching solution was then extracted with TOPO and the plutonium back-extracted with ascorbic acid/HCl. Several different purification steps and finally electroplating from ammonium oxalate led to an optimum sample for the α-spectroscopic determination of plutonium. An analytical method was thus worked out for plutonium which can be applied to all materials found in the environment. The sample size is 100 g but may also be much greater. The average chemical yield is between 70 and 80%. The detection limit is 0.1 fCi/g for soil samples and 0.5 fCi/g for plant samples. One technician can perform eight analyses per working day. The analytical procedure was applied to a large number of environmental samples and the results of these analyses are given. (orig./RB) [de

  6. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and the reliability analyst. The report outlines a set of procedures implementing a philosophy that requires increased involvement by the designer in reliability analysis. The discussion of each method reviewed includes examples of its application.

  7. Development of analytical procedures for the determination of hexavalent chromium in corrosion prevention coatings used in the automotive industry.

    Science.gov (United States)

    Séby, F; Castetbon, A; Ortega, R; Guimon, C; Niveau, F; Barrois-Oudin, N; Garraud, H; Donard, O F X

    2008-05-01

    The European directive 2000/53/EC limits the use of Cr(VI) in vehicle manufacturing. Although a maximum of 2 g of Cr(VI) per vehicle was authorised for corrosion prevention coatings of key components, since July 2007 its use has been prohibited except for some particular applications. Therefore, the objective of this work was to develop direct analytical procedures for Cr(VI) determination in the different steel coatings used for screws. Instead of working directly with screws, the optimisation of the procedures was carried out with homogeneously coated metallic plates to improve data comparability. Extraction of Cr(VI) from the metallic parts was performed by sonication. Two extraction solutions were tested: a direct water extraction solution used in standard protocols, and an ammonium/ammonia buffer solution at pH 8.9. The extracts were further analysed for Cr speciation by high-performance liquid chromatography (HPLC) inductively coupled plasma (ICP) atomic emission spectrometry or HPLC-ICP mass spectrometry, depending on the concentration level. When possible, the coatings were also directly analysed by solid speciation techniques (X-ray photoelectron spectroscopy, XPS, and X-ray absorption near-edge structure, XANES) for validation of the results. Very good agreement between the different analytical approaches was obtained for the coating sample made up of a heated paint containing Zn, Al and Cr when using the extracting buffer solution at pH 8.9. After a repeated four-step extraction procedure on the same test portion, and taking into account the depth of the surface layer reached, good agreement with XPS and XANES results was obtained. In contrast, for the coatings composed of an alkaline Zn layer on which Cr(VI) and Cr(III) are deposited, only the extraction procedure using water allowed the detection of Cr(VI). To elucidate the Cr(VI) reduction during extraction at pH 8.9, the reactivity of Cr(VI) towards different species of Zn generally present in the

  8. A Procedure for the Sequential Determination of Radionuclides in Environmental Samples. Liquid Scintillation Counting and Alpha Spectrometry for 90Sr, 241Am and Pu Radioisotopes

    International Nuclear Information System (INIS)

    2014-01-01

    Since 2004, IAEA activities related to the terrestrial environment have aimed at the development of a set of procedures to determine radionuclides in environmental samples. Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of analytical data. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available for reference to both the analyst and the customer. This publication describes a combined procedure for the sequential determination of 90Sr, 241Am and Pu radioisotopes in environmental samples. The method is based on the chemical separation of strontium, americium and plutonium using ion exchange chromatography, extraction chromatography and precipitation, followed by alpha spectrometric and liquid scintillation counting detection. The method was tested and validated in terms of repeatability and trueness in accordance with International Organization for Standardization (ISO) guidelines using reference materials and proficiency test samples. Reproducibility tests were performed later at the IAEA Terrestrial Environment Laboratory. The calculations of the massic activity, uncertainty budget, decision threshold and detection limit are also described in this publication. The procedure is introduced for the determination of 90Sr, 241Am and Pu radioisotopes in environmental samples such as soil, sediment, air filter and vegetation samples. It is expected to be of general use to a wide range of laboratories, including the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network, for routine environmental monitoring purposes.
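
    The decision threshold and detection limit mentioned at the end follow from counting statistics. A minimal Currie-style sketch (simplified paired-blank formulas at 5% error rates; the publication's ISO 11929-based treatment is more elaborate, and all numbers here are illustrative):

```python
import math

def currie_limits(b_counts, k=1.645):
    """Currie's classic paired-blank formulas: decision threshold
    L_C = k*sqrt(2B) and detection limit L_D = k^2 + 2*L_C, both in net
    counts, for equal sample and blank counting times."""
    l_c = k * math.sqrt(2.0 * b_counts)
    l_d = k * k + 2.0 * l_c
    return l_c, l_d

lc, ld = currie_limits(b_counts=100.0)
print(round(lc, 1), round(ld, 1))   # 23.3 49.2 net counts

# illustrative conversion to a massic detection limit for a hypothetical
# efficiency of 0.25, a 60000 s count and a 0.1 kg sample
print(round(ld / (0.25 * 60000 * 0.1), 4), "Bq/kg")   # 0.0328 Bq/kg
```

Dividing the count-domain detection limit by efficiency, counting time and sample mass is the standard route to a massic activity limit; a full uncertainty budget would also propagate yield and calibration uncertainties.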

  9. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    Science.gov (United States)

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite (TNPP), an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from any extracted TNPP. The 4NP is then analysed by LC-MS/MS using the electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and to cross-contamination of extrusion equipment at the converter level when TNPP-containing laminates are processed on the same machine beforehand.

  10. A new, rapid and reliable method for the determination of reduced sulphur (S2-) species in natural water discharges

    International Nuclear Information System (INIS)

    Montegrossi, Giordano; Tassi, Franco; Vaselli, Orlando; Bidini, Eva; Minissale, Angelo

    2006-01-01

    The determination of reduced S species in natural waters is particularly difficult due to their high instability and chemical and physical interferences in the current analytical methods. In this paper a new, rapid and reliable analytical procedure is presented, named the Cd-IC method, for their determination as ΣS 2- via oxidation to SO 4 2- after chemical trapping with an ammonia-cadmium solution that allows precipitation of all the reduced S species as CdS. The S 2- -derived SO 4 is analysed by ion chromatography. The main advantages of this method are: low cost, high stability of the CdS precipitate, absence of interferences, low detection limit (0.01 mg/L as SO 4 for 10 mL of water) and low analytical error (about 5%). The proposed method has been applied to more than 100 water samples from different natural systems (water discharges and cold wells from volcanic and geothermal areas, crater lakes) in central-southern Italy.

  11. Recent trends in analytical procedures in forensic toxicology.

    Science.gov (United States)

    Van Bocxlaer, Jan F

    2005-12-01

    Forensic toxicology is a very demanding discipline, heavily dependent on good analytical techniques. That is why new trends appear continuously. In the past years, LC-MS has revolutionized target compound analysis and has become the trend, also in toxicology. In LC-MS screening analysis, things are less straightforward and several approaches exist. One promising approach based on accurate LC-MS TOF mass measurements and elemental formula based library searches is discussed. This way of screening has already proven its applicability, but at the same time it became obvious that a single accurate mass measurement lacks some specificity when using large compound libraries. CE, too, is a re-emerging approach. The increasingly polar and ionic molecules encountered make it a worthwhile addition to e.g. LC, as illustrated for the analysis of GHB. A third recent trend is the use of MALDI mass spectrometry for small molecules. It is promising for its ease-of-use and high throughput. Unfortunately, reports of disappointment but also accomplishment, e.g. the quantitative analysis of LSD as discussed here, alternate, and it remains to be seen whether MALDI really will establish itself. Indeed, not all new trends will prove themselves, but the mere fact that many appear in the world of analytical toxicology nowadays is, in itself, encouraging for the future of (forensic) toxicology.
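
The specificity problem with single accurate-mass library searches can be seen in a few lines: at a typical mass tolerance, several library entries may match one measurement. A sketch with made-up masses and a hypothetical helper function:

```python
def formula_candidates(measured_mass, library, tol_ppm=5.0):
    """Library entries whose exact mass lies within tol_ppm of the
    measured accurate mass."""
    return [name for name, exact in library.items()
            if abs(measured_mass - exact) / exact * 1e6 <= tol_ppm]

# Illustrative (made-up) exact masses: two entries collide at 5 ppm,
# so the accurate mass alone cannot discriminate between them.
library = {"compound_a": 194.0804, "compound_b": 194.0810, "compound_c": 195.1005}
hits = formula_candidates(194.0806, library)  # both compound_a and compound_b match
```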

  12. 8. All Polish Conference on Analytical Chemistry: Analytical Chemistry for the Community of the 21. Century

    International Nuclear Information System (INIS)

    Koscielniak, P.; Wieczorek, M.; Kozak, J.

    2010-01-01

    The Book of Abstracts contains short descriptions of the lectures, communications and posters presented during the 8th All Polish Conference on Analytical Chemistry (Cracow, 4-9.07.2010). The scientific programme covered: basic analytical problems, sample preparation, chemometrics and metrology, miniaturization of analytical procedures, environmental analysis, medicinal analyses, industrial analyses, food analyses, biochemical analyses, and analysis of relics of the past. Several posters were devoted to radiochemical separations, radiochemical analysis, the environmental behaviour of elements important for nuclear science, and proficiency tests.

  13. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents methods for the definition of important analytical tools, such as the development of sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need of exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. 
This supports the theory

  14. Manual of selected physico-chemical analytical methods. IV

    International Nuclear Information System (INIS)

    Beran, M.; Klosova, E.; Krtil, J.; Sus, F.; Kuvik, V.; Vrbova, L.; Hamplova, M.; Lengyel, J.; Kelnar, L.; Zakouril, K.

    1990-11-01

    The Central Testing Laboratory of the Nuclear Research Institute at Rez has for a decade been participating in the development of analytical procedures and has been providing analyses of samples of different types and origin. The analytical procedures developed have been published in special journals and a number of them in the Manuals of analytical methods, in three parts. The 4th part of the Manual contains selected physico-chemical methods developed or modified by the Laboratory in the years 1986-1990 within the project ''Development of physico-chemical analytical methods''. In most cases, techniques are involved for non-nuclear applications. Some can find wider applications, especially in analyses of environmental samples. Others have been developed for specific cases of sample analyses or require special instrumentation (mass spectrometer), which partly restricts their applicability by other institutions. (author)

  15. Towards Reliable Multi-Hop Broadcast in VANETs: An Analytical Approach

    NARCIS (Netherlands)

    Gholibeigi, Mozhdeh; Baratchi, Mitra; van den Berg, Hans Leo; Heijenk, Geert

    2016-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  16. Towards reliable multi-hop broadcast in VANETs : An analytical approach

    NARCIS (Netherlands)

    Gholibeigi, M.; Baratchi, M.; Berg, J.L. van den; Heijenk, G.

    2017-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  17. Scale Reliability Evaluation with Heterogeneous Populations

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  18. Analytically solvable models of reaction-diffusion systems

    Energy Technology Data Exchange (ETDEWEB)

    Zemskov, E P; Kassner, K [Institut fuer Theoretische Physik, Otto-von-Guericke-Universitaet, Universitaetsplatz 2, 39106 Magdeburg (Germany)

    2004-05-01

    We consider a class of analytically solvable models of reaction-diffusion systems. An analytical treatment is possible because the nonlinear reaction term is approximated by a piecewise linear function. As particular examples we choose front and pulse solutions to illustrate the matching procedure in the one-dimensional case.
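
For a concrete instance of such a matching procedure, take the McKean-type piecewise-linear reaction term f(u) = -u + H(u - a) (an illustrative choice, not necessarily the authors' exact model): matching the exponential solutions of the linear subproblems at the interface u = a gives the front speed in closed form.

```python
import math

def front_speed(a):
    """Front speed c for u_t = u_xx - u + H(u - a), 0 < a < 1/2,
    from matching the exponential branches at the interface u = a."""
    b = 1.0 - 2.0 * a
    return 2.0 * b / math.sqrt(1.0 - b * b)

def matching_residual(a, c):
    """Residual of the matching condition c = (1 - 2a) * sqrt(c**2 + 4);
    zero exactly when c is the front speed."""
    return c - (1.0 - 2.0 * a) * math.sqrt(c * c + 4.0)
```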

  19. A Comparison of Result Reliability for Investigation of Milk Composition by Alternative Analytical Methods in Czech Republic

    Directory of Open Access Journals (Sweden)

    Oto Hanuš

    2014-01-01

    Full Text Available Milk analysis result reliability is important for quality assurance in the foodstuff chain. There are direct and indirect methods for measuring milk composition (fat (F), protein (P), lactose (L) and solids non-fat (SNF) content). The goal was to evaluate some reference and routine milk analytical procedures on the basis of their results. The direct reference analyses were: F, fat content (Röse-Gottlieb method); P, crude protein content (Kjeldahl method); L, lactose (monohydrate, polarimetric method); SNF, solids non-fat (gravimetric method). F, P, L and SNF were also determined by various indirect methods: MIR (infrared (IR) technology with optical filters; 7 instruments in 4 labs); MIR-FT (IR spectroscopy with Fourier transformation; 10 in 6); the ultrasonic method (UM; 3 in 1); and analysis by the blue and red box (BRB; 1 in 1). Ten reference milk samples were used. The coefficient of determination (R2), the correlation coefficient (r) and the standard deviation of the mean of individual differences (MDsd) were evaluated. All correlations (r; for all indirect and alternative methods and all milk components) were significant (P ≤ 0.001). MIR and MIR-FT (conventional methods) explained a considerably higher proportion of the variability in reference results than the UM and BRB methods (alternative). All r average values (x minus 1.64 × sd, for a 95% confidence interval) can be used as standards for calibration quality evaluation (MIR, MIR-FT, UM and BRB): for F 0.997, 0.997, 0.99 and 0.995; for P 0.986, 0.981, 0.828 and 0.864; for L 0.968, 0.871, 0.705 and 0.761; for SNF 0.992, 0.993, 0.911 and 0.872. Similarly for MDsd (x plus 1.64 × sd): for F 0.071, 0.068, 0.132 and 0.101%; for P 0.051, 0.054, 0.202 and 0.14%; for L 0.037, 0.074, 0.113 and 0.11%; for SNF 0.052, 0.068, 0.141 and 0.204%.
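
The comparison statistics used above (correlation r between reference and routine results, and the standard deviation of individual differences, MDsd) are straightforward to compute. A minimal sketch with hypothetical paired results:

```python
import math

def correlation(x, y):
    """Pearson correlation coefficient between paired results."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def mdsd(x, y):
    """Sample standard deviation of the individual differences between
    reference (x) and routine (y) results."""
    d = [a - b for a, b in zip(x, y)]
    md = sum(d) / len(d)
    return math.sqrt(sum((di - md) ** 2 for di in d) / (len(d) - 1))

# Hypothetical fat-content pairs (reference vs. routine, %):
ref = [3.52, 4.01, 3.75, 4.20]
routine = [3.55, 3.98, 3.79, 4.17]
```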

  20. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data...

  1. Reliable quantification of phthalates in environmental matrices (air, water, sludge, sediment and soil): a review.

    Science.gov (United States)

    Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad

    2015-05-15

    Because of their widespread application, phthalates or phthalic acid esters (PAEs) are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques have been found suitable for reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Therefore, accurate PAE analysis in environmental matrices is a challenging task. This paper reviews the extensive literature data on the techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) are reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and the procedures of quality control and quality assurance is presented to overcome the problems of sample contamination and those caused by matrix effects, in order to avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Development of integrated analytical data management system

    International Nuclear Information System (INIS)

    Onishi, Koichi; Wachi, Isamu; Hiroki, Toshio

    1986-01-01

    The Analysis Subsection of the Technical Service Section, Tokai Reprocessing Plant, Tokai Works, is engaged in the analysis activities required for the management of processes and measurements in the plant. Currently, it has been desired to increase the reliability of analytical data and to perform analyses more rapidly to cope with the increasing number of analyses. To this end, on-line data processing has been promoted and advanced analytical equipment has been introduced in order to enhance automation. In the present study, an integrated analytical data management system is developed which serves to improve the reliability of analytical data as well as to allow rapid retrieval and automatic compilation of these data. Fabrication of a basic model of the system has been nearly completed and test operation has already been started. In selecting the hardware to be used, examinations were made of the ease of system extension, the Japanese language processing function for improving the man-machine interface, the large-capacity auxiliary memory system, and the database processing function. The existing analysis works were reviewed in establishing the basic design of the system. According to this basic design, the system can perform such works as analysis of application slips received from clients as well as recording, sending, filing and retrieval of analysis results. (Nogami, K.)

  3. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest, M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556) can be used independently of one another. CROSSER finds the point of equality between the reliability of a system and the common reliability of its components. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
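
The "point of equality" CROSSER computes can be reproduced for a k-out-of-n system with a few lines of bisection. A sketch in Python rather than the program's original C (function names are illustrative):

```python
from math import comb

def system_reliability(p, n, k):
    """Reliability of a k-out-of-n system of independent components,
    each of reliability p (a cumulative binomial tail)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossing_point(n, k, lo=1e-9, hi=1 - 1e-9):
    """Bisection for the common reliability p at which system and
    component reliability are equal (assumes the root is bracketed)."""
    f = lambda p: system_reliability(p, n, k) - p
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

For a 2-out-of-3 system, R(p) = 3p² − 2p³ and the nontrivial crossing is at p = 1/2.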

  4. A Procedure for the Rapid Determination of 226Ra and 228Ra in Drinking Water by Liquid Scintillation Counting

    International Nuclear Information System (INIS)

    2014-01-01

    Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Reliable, comparable and ‘fit for purpose’ results are essential requirements for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of such data. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer. This publication describes a procedure for the rapid determination of 226 Ra and 228 Ra in drinking water. The determination of radium in drinking water is important for protecting human health, since the consumption of drinking water containing radium may lead to an accumulation in the body, contributing to the radiological dose. The method is based on the separation of 226 Ra and 228 Ra from interfering elements using PbSO 4 and Ba(Ra)SO 4 coprecipitation steps. The isotopes 226 Ra and 228 Ra are then determined by liquid scintillation counting. The procedure is expected to be of general use to a wide range of laboratories, including the laboratories of the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network, both in emergency situations and for routine environmental monitoring purposes. The method was established after an extensive review of papers from the scientific literature, and was tested and validated in terms of repeatability and trueness (relative bias) in accordance with International Organization for Standardization guidelines. Reproducibility tests were performed at expert laboratories. The calculation of massic activities, uncertainty budget, decision threshold and detection limit are also described
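
The final step of such a procedure is the conversion of a net count rate into an activity concentration with efficiency, chemical yield and decay corrections. A schematic sketch (symbol names are illustrative; the publication's full uncertainty budget is not reproduced):

```python
import math

HALF_LIFE_RA226_Y = 1600.0  # commonly tabulated value, years

def decay_correction(t_elapsed_y, half_life_y):
    """Factor correcting a measured activity back to the reference date."""
    return math.exp(math.log(2.0) * t_elapsed_y / half_life_y)

def activity_concentration(r_net, eff, chem_yield, volume_l,
                           t_elapsed_y=0.0, half_life_y=HALF_LIFE_RA226_Y):
    """Activity concentration (Bq/L) from a net count rate r_net (1/s),
    counting efficiency eff, chemical yield and sample volume (L)."""
    return (r_net / (eff * chem_yield * volume_l)
            * decay_correction(t_elapsed_y, half_life_y))
```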

  5. Preliminary study for the reliability Assurance on results and procedure of the out-pile mechanical characterization test for a fuel assembly; Lateral Vibration Test (I)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kang Hee; Yoon, Kyung Hee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)

    2007-07-01

    The reliability assurance with respect to the test procedure and results of the out-pile mechanical performance test for a nuclear fuel assembly is an essential task for assuring test quality and for obtaining permission to load the fuel into a commercial reactor core. For the case of a vibration test, proper management and appropriate calibration of the instruments and devices used in the test, and various efforts to minimize possible error during the test and the signal acquisition process, are needed. Additionally, a deep understanding both of the theoretical assumptions and simplifications underlying the signal processing/modal analysis and of the functions of the devices used in the test is highly required. In this study, the overall procedure and results of the lateral vibration test for a fuel assembly's mechanical characterization were briefly introduced, and a series of measures to assure and improve the reliability of the vibration test were discussed.

  6. Preliminary study for the reliability Assurance on results and procedure of the out-pile mechanical characterization test for a fuel assembly; Lateral Vibration Test (I)

    International Nuclear Information System (INIS)

    Lee, Kang Hee; Yoon, Kyung Hee; Kim, Hyung Kyu

    2007-01-01

    The reliability assurance with respect to the test procedure and results of the out-pile mechanical performance test for a nuclear fuel assembly is an essential task for assuring test quality and for obtaining permission to load the fuel into a commercial reactor core. For the case of a vibration test, proper management and appropriate calibration of the instruments and devices used in the test, and various efforts to minimize possible error during the test and the signal acquisition process, are needed. Additionally, a deep understanding both of the theoretical assumptions and simplifications underlying the signal processing/modal analysis and of the functions of the devices used in the test is highly required. In this study, the overall procedure and results of the lateral vibration test for a fuel assembly's mechanical characterization were briefly introduced, and a series of measures to assure and improve the reliability of the vibration test were discussed.

  7. Basic Concepts in Classical Test Theory: Tests Aren't Reliable, the Nature of Alpha, and Reliability Generalization as a Meta-analytic Method.

    Science.gov (United States)

    Helms, LuAnn Sherbeck

    This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…

  8. A new, rapid and reliable method for the determination of reduced sulphur (S{sup 2-}) species in natural water discharges

    Energy Technology Data Exchange (ETDEWEB)

    Montegrossi, Giordano [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy)]. E-mail: giordano@geo.unifi.it; Tassi, Franco [Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Vaselli, Orlando [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy); Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Bidini, Eva [Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Minissale, Angelo [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy)

    2006-05-15

    The determination of reduced S species in natural waters is particularly difficult due to their high instability and chemical and physical interferences in the current analytical methods. In this paper a new, rapid and reliable analytical procedure is presented, named the Cd-IC method, for their determination as {sigma}S{sup 2-} via oxidation to SO{sub 4}{sup 2-} after chemical trapping with an ammonia-cadmium solution that allows precipitation of all the reduced S species as CdS. The S{sup 2-}-SO{sub 4} is analysed by ion-chromatography. The main advantages of this method are: low cost, high stability of CdS precipitate, absence of interferences, low detection limit (0.01mg/L as SO{sub 4} for 10mL of water) and low analytical error (about 5%). The proposed method has been applied to more than 100 water samples from different natural systems (water discharges and cold wells from volcanic and geothermal areas, crater lakes) in central-southern Italy.

  9. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data analysis collection and evaluation issues, and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
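
A staple of the data assessment methods such a handbook covers is conjugate Bayesian updating of a component failure or success probability. A minimal sketch (the Jeffreys prior here is one common choice, not a prescription from the document):

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update of a Beta(alpha, beta) prior on a success
    probability with observed binomial demand data."""
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Jeffreys prior Beta(0.5, 0.5), then 98 successes and 2 failures observed:
a, b = beta_binomial_update(0.5, 0.5, 98, 2)
```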

  10. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Directory of Open Access Journals (Sweden)

    Tanaka Ken-ichi

    2017-01-01

    Full Text Available Reliable information on the radioactivity inventory resulting from radiological characterization is important for decommissioning planning and is also crucial for carrying out decommissioning effectively and safely. The information is referred to in planning the decommissioning strategy and in applications to the regulator. Reliable information on the radioactivity inventory can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for the evaluation of neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with calculation codes, and their validity should be verified with measurements. The evaluation of neutron-activated materials can be divided into two processes: a distribution calculation of the neutron flux, and an activation calculation of the materials. The neutron flux distribution is calculated with neutron transport codes and an appropriate cross-section library to simulate neutron transport phenomena well. Using the neutron flux distribution, we calculate distributions of radioactivity concentration. We also estimate a time-dependent distribution of the radioactivity classification and a radioactive waste classification. The information obtained from the evaluation is utilized in the other preparatory tasks to make the decommissioning plan and its activities safe and rational.

  11. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Science.gov (United States)

    Tanaka, Ken-ichi; Ueno, Jun

    2017-09-01

    Reliable information on the radioactivity inventory resulting from radiological characterization is important for decommissioning planning and is also crucial for carrying out decommissioning effectively and safely. The information is referred to in planning the decommissioning strategy and in applications to the regulator. Reliable information on the radioactivity inventory can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for the evaluation of neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with calculation codes, and their validity should be verified with measurements. The evaluation of neutron-activated materials can be divided into two processes: a distribution calculation of the neutron flux, and an activation calculation of the materials. The neutron flux distribution is calculated with neutron transport codes and an appropriate cross-section library to simulate neutron transport phenomena well. Using the neutron flux distribution, we calculate distributions of radioactivity concentration. We also estimate a time-dependent distribution of the radioactivity classification and a radioactive waste classification. The information obtained from the evaluation is utilized in the other preparatory tasks to make the decommissioning plan and its activities safe and rational.

  12. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
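
The core of the HDT is a component-wise partial order: one procedure precedes another only if it is at least as good in every variable. A minimal sketch of that ordering and of extracting the non-dominated (top) procedures, with hypothetical scores:

```python
def dominates(a, b):
    """a dominates b if a is at least as good in every criterion and
    strictly better in at least one (higher = better here)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def maximal_elements(procedures):
    """Procedures not dominated by any other: the maximal elements
    (top level) of the Hasse diagram."""
    names = list(procedures)
    return [n for n in names
            if not any(dominates(procedures[m], procedures[n])
                       for m in names if m != n)]

# Hypothetical greenness scores (e.g. solvent use, waste, energy):
scores = {"method_a": (3, 2, 2), "method_b": (2, 3, 1), "method_c": (1, 1, 1)}
```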

  13. Analytical quality control [An IAEA service]

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1973-07-01

    In analytical chemistry the determination of small or trace amounts of elements or compounds in different types of materials is increasingly important. The results of these findings have a great influence on different fields of science, and on human life. Their reliability, precision and accuracy must, therefore, be checked by analytical quality control measures. The International Atomic Energy Agency (IAEA) set up an Analytical Quality Control Service (AQCS) in 1962 to assist laboratories in Member States in the assessment of their reliability in radionuclide analysis, and in other branches of applied analysis in which radionuclides may be used as analytical implements. For practical reasons, most analytical laboratories are not in a position to check accuracy internally, as frequently resources are available for only one method; standardized sample material, particularly in the case of trace analysis, is not available and can be prepared by the institutes themselves only in exceptional cases; intercomparisons are organized rather seldom and many important types of analysis are so far not covered. AQCS assistance is provided by the shipment to laboratories of standard reference materials containing known quantities of different trace elements or radionuclides, as well as by the organization of analytical intercomparisons in which the participating laboratories are provided with aliquots of homogenized material of unknown composition for analysis. In the latter case the laboratories report their data to the Agency's laboratory, which calculates averages and distributions of results and advises each laboratory of its performance relative to all the others. Throughout the years several dozens of intercomparisons have been organized and many thousands of samples provided. The service offered, as a consequence, has grown enormously. The programme for 1973 and 1974, which is currently being distributed to Member States, will contain 31 different types of materials.

  14. Analytic continuation by duality estimation of the S parameter

    International Nuclear Information System (INIS)

    Ignjatovic, S. R.; Wijewardhana, L. C. R.; Takeuchi, T.

    2000-01-01

    We investigate the reliability of the analytic continuation by duality (ACD) technique in estimating the electroweak S parameter for technicolor theories. The ACD technique, which is an application of finite energy sum rules, relates the S parameter for theories with unknown particle spectra to known OPE coefficients. We identify the sources of error inherent in the technique and evaluate them for several toy models to see if they can be controlled. The evaluation of errors is done analytically and all relevant formulas are provided in appendixes including analytical formulas for approximating the function 1/s with a polynomial in s. The use of analytical formulas protects us from introducing additional errors due to numerical integration. We find that it is very difficult to control the errors even when the momentum dependence of the OPE coefficients is known exactly. In realistic cases in which the momentum dependence of the OPE coefficients is only known perturbatively, it is impossible to obtain a reliable estimate. (c) 2000 The American Physical Society
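
The polynomial approximation of 1/s that the appendixes treat analytically can also be examined numerically; a least-squares sketch (the paper derives analytical formulas instead, so this is an illustration, not their method):

```python
import numpy as np

def fit_inverse_s(s_min, s_max, degree, n_pts=2000):
    """Least-squares polynomial approximation of 1/s on [s_min, s_max]."""
    s = np.linspace(s_min, s_max, n_pts)
    return np.poly1d(np.polyfit(s, 1.0 / s, degree))

p = fit_inverse_s(1.0, 4.0, 5)
s = np.linspace(1.0, 4.0, 500)
max_rel_err = float(np.max(np.abs(p(s) * s - 1.0)))  # small, but nonzero
```

The fit degrades as the interval approaches the singularity at s = 0, which is one face of the error sources the paper analyses.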

  15. A Procedure for the Sequential Determination of Radionuclides in Phosphogypsum: Liquid Scintillation Counting and Alpha Spectrometry for 210Po, 210Pb, 226Ra, Th and U Radioisotopes

    International Nuclear Information System (INIS)

    2014-01-01

    Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of such analytical data. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. In this publication, a combined procedure for the sequential determination of 210Po, 210Pb, 226Ra, Th and U radioisotopes in phosphogypsum is described. The method is based on the dissolution of small amounts of phosphogypsum by microwave digestion, followed by sequential separation of 210Po, 210Pb, Th and U radioisotopes by selective extraction chromatography using Sr, TEVA and UTEVA resins. Radium-226 is separated from interfering elements using Ba(Ra)SO4 co-precipitation. Lead-210 is determined by liquid scintillation counting. The alpha source of 210Po is prepared by autodeposition on a silver plate. The alpha sources of Th and U are prepared by electrodeposition on a stainless steel plate. A comprehensive methodology for the calculation of results, including the quantification of measurement uncertainty, was also developed. The procedure is introduced as a recommended procedure and validated in terms of trueness, repeatability and reproducibility in accordance with ISO guidelines.
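
As a rough illustration of the uncertainty quantification step, a GUM-style combination of standard uncertainty components in quadrature might look as follows. The component names and values are hypothetical, not the procedure's actual uncertainty budget:

```python
# Hedged sketch of GUM-style uncertainty combination (hypothetical budget):
# standard uncertainty components are combined in quadrature and expanded
# with a coverage factor k = 2 (~95 % confidence).
import math

# Hypothetical relative standard uncertainties for an activity result (%):
components = {
    "counting statistics": 1.8,
    "tracer/chemical yield": 1.2,
    "detector efficiency": 2.0,
    "sample mass": 0.1,
}

u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2.0 * u_combined  # coverage factor k = 2

print(f"combined standard uncertainty: {u_combined:.2f} %")
print(f"expanded uncertainty (k=2):    {U_expanded:.2f} %")
```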

  16. Validation of a new analytical procedure for determination of residual solvents in [18F]FDG by gas chromatography

    International Nuclear Information System (INIS)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D.

    2017-01-01

    Fludeoxyglucose F 18 ([18F]FDG) is the most widely used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [18F]FDG; however, they might not be completely removed during purification steps. The determination of residual solvents in [18F]FDG is required in the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work a simple and fast (4-minute) analytical procedure was developed and validated for determination of residual solvents in [18F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53 mm x 30 m fused-silica column. Validation included the evaluation of various parameters, such as: specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for routine quality control of [18F]FDG. (author)

  17. Fundamentals and applications of systems reliability analysis

    International Nuclear Information System (INIS)

    Boesebeck, K.; Heuser, F.W.; Kotthoff, K.

    1976-01-01

    The lecture gives a survey of the application of reliability analysis methods to assessing the safety of nuclear power plants. Particular attention is paid to the statements reliability analysis can provide in connection with the specifications of the nuclear licensing procedure. Existing specifications of safety criteria are additionally discussed with the help of reliability analysis, using the reliability analysis of a reactor protection system as an example. Beyond the limited application to single safety systems, the significance of reliability analysis for a closed risk concept is explained in the last part of the lecture. (orig./LH) [de

  18. Reliable and Efficient Procedure for Steady-State Analysis of Nonautonomous and Autonomous Systems

    Directory of Open Access Journals (Sweden)

    J. Dobes

    2012-04-01

    Full Text Available The majority of contemporary design tools still do not contain steady-state algorithms, especially for autonomous systems. This is mainly caused by insufficient accuracy of the algorithms for numerical integration, but also by the unreliability of the steady-state algorithms themselves. Therefore, in the paper, a very stable and efficient procedure for the numerical integration of nonlinear differential-algebraic systems is defined first. Afterwards, two improved methods are defined for finding the steady state, which use this integration algorithm in their iteration loops. The first is based on the idea of extrapolation, and the second utilizes nonstandard time-domain sensitivity analysis. The two steady-state algorithms are compared by analyses of a rectifier and a C-class amplifier, and the extrapolation algorithm is primarily selected as the more reliable alternative. Finally, the method based on extrapolation, naturally cooperating with the algorithm for solving the differential-algebraic systems, is thoroughly tested on various electronic circuits: Van der Pol and Colpitts oscillators, a fragment of a large bipolar logic circuit, feedback and distributed microwave oscillators, and a power amplifier. The results confirm that the extrapolation method is faster than classical plain numerical integration, especially for larger circuits with complicated transients.
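
The extrapolation idea can be sketched in miniature: integrating the circuit over one period defines a map g from the state at one period boundary to the next, and the steady state is the fixed point of g. The toy below (a stand-in linear map, not the paper's algorithm) accelerates the fixed-point search with Aitken's delta-squared extrapolation:

```python
# Toy sketch of steady-state finding by extrapolation (not the paper's exact
# algorithm): integrating one period of a circuit maps the state x_n at a
# period boundary to x_{n+1} = g(x_n); the steady state is the fixed point.
# Here g is a stand-in linear contraction, accelerated with Aitken's delta^2.

def g(x):
    """Stand-in for 'integrate the circuit over one period'."""
    return 0.9 * x + 1.0   # fixed point: x* = 10

def aitken_steady_state(x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        x1, x2 = g(x), g(g(x))
        denom = x2 - 2.0 * x1 + x
        if abs(denom) < 1e-15:     # already converged
            return x2
        x_acc = x - (x1 - x) ** 2 / denom   # Aitken extrapolation
        if abs(x_acc - x) < tol:
            return x_acc
        x = x_acc
    return x

x_star = aitken_steady_state(0.0)
print(x_star)  # close to 10, in far fewer g-evaluations than plain iteration
```

For the slowly decaying transients mentioned in the abstract (contraction factor near 1), plain iteration needs many periods while the extrapolated iteration converges in a handful of steps.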

  19. The analytic hierarchy process as a systematic approach to the identification of important parameters for the reliability assessment of passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Cantarella, M.; Cammi, A.

    2003-01-01

    Passive systems play a crucial role in the development of future solutions for nuclear plant technology. A fundamental issue still to be resolved is the quantification of the reliability of such systems. In this paper, we first illustrate a systematic methodology to guide the definition of the failure criteria of a passive system and the evaluation of its probability of occurrence, through the identification of the relevant system parameters and the propagation of their associated uncertainties. Within this methodology, we propose the use of the analytic hierarchy process as a structured and reproducible tool for the decomposition of the problem and the identification of the dominant system parameters. An example of its application to a real passive system is illustrated in detail.
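
The analytic hierarchy process step can be sketched as follows: priority weights are obtained as the normalized principal eigenvector of a pairwise comparison matrix, with a consistency index as a sanity check. The matrix below is hypothetical, not the paper's actual judgements:

```python
# Minimal AHP sketch (hypothetical 3x3 pairwise comparison matrix): priority
# weights are the normalized principal eigenvector, found by power iteration.

def ahp_weights(A, iters=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # principal eigenvalue estimate, used for the consistency check
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)          # consistency index
    return w, ci

# "Which parameter dominates passive-system failure?" (illustrative judgements)
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]

w, ci = ahp_weights(A)
print([round(x, 3) for x in w], round(ci, 4))
```

A consistency index well below 0.1 (relative to the random index for the matrix size) is the usual criterion for accepting the judgements.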

  20. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Science.gov (United States)

    2010-04-01

    ... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.4 Funding of the Electric Reliability Organization. (a) Any... Reliability Organization. 39.4 Section 39.4 Conservation of Power and Water Resources FEDERAL ENERGY...

  1. Reliability and Validity of 10 Different Standard Setting Procedures.

    Science.gov (United States)

    Halpin, Glennelle; Halpin, Gerald

    Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…

  2. Reliability studies in a developing technology

    International Nuclear Information System (INIS)

    Mitchell, L.A.; Osgood, C.; Radcliffe, S.J.

    1975-01-01

    The standard methods of reliability analysis can only be applied if valid failure statistics are available. In a developing technology the statistics which have been accumulated, over many years of conventional experience, are often rendered useless by environmental effects. Thus new data, which take account of the new environment, are required. This paper discusses the problem of optimizing the acquisition of these data when time-scales and resources are limited. It is concluded that the most fruitful strategy in assessing the reliability of mechanisms is to study the failures of individual joints whilst developing, where necessary, analytical tools to facilitate the use of these data. The approach is illustrated by examples from the field of tribology. Failures of rolling element bearings in moist, high-pressure carbon dioxide illustrate the important effects of apparently minor changes in the environment. New analytical techniques are developed from a study of friction failures in sliding joints. (author)

  3. Reliability analysis of prestressed concrete containment structures

    International Nuclear Information System (INIS)

    Jiang, J.; Zhao, Y.; Sun, J.

    1993-01-01

    The reliability analysis of prestressed concrete containment structures subjected to combinations of static and dynamic loads, with consideration of uncertainties in structural and load parameters, is presented. Limit state probabilities for given parameters are calculated using the procedure developed at BNL, while those accounting for parameter uncertainties are calculated by a fast integration method for time-variant structural reliability. The limit state surface of the prestressed concrete containment is constructed directly, incorporating the prestress. The sensitivities of the Cholesky decomposition matrix and the natural vibration characteristics are calculated by simplified procedures. (author)

  4. Life cycle management of analytical methods.

    Science.gov (United States)

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept gains more and more importance. It focusses on the total costs of the process from investment to operation and finally retirement. In recent years, an increasing interest in this concept has also developed for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. It appears that regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, discuss new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This decreases the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, strongly contributing to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients

    Science.gov (United States)

    Andersson, Björn; Xin, Tao

    2018-01-01

    In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…

  6. Reliability assessment of Port Harcourt 33/11kv Distribution System ...

    African Journals Online (AJOL)

    This makes reliability studies an important task besides all the other analyses required for assessing the system performance. The paper presents an analytical approach in the reliability assessment of the Port Harcourt 33/11kV power distribution system. The assessment was performed with the 2009 power outage data ...

  7. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.
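
The Bayesian updating of a state of knowledge described here can be sketched with a conjugate beta-binomial model. The prior and the test data below are hypothetical, not the thesis's model:

```python
# Hedged illustration of the Bayesian view described above: reliability as a
# state of knowledge, updated when new data arrive. A Beta(a, b) prior on the
# per-demand failure probability p is conjugate to binomial test data, so
# updating reduces to adding counts to the parameters.

def update(a, b, failures, demands):
    """Posterior Beta parameters after observing test data."""
    return a + failures, b + (demands - failures)

a, b = 1.0, 99.0                              # prior belief: mean p = 0.01
a, b = update(a, b, failures=0, demands=50)   # 50 successful demands observed

posterior_mean_p = a / (a + b)
reliability = 1.0 - posterior_mean_p
print(f"posterior mean failure probability: {posterior_mean_p:.4f}")
print(f"posterior mean reliability:         {reliability:.4f}")
```

Observing failure-free demands shifts the posterior toward higher reliability, exactly the "learning from failure data" the abstract emphasizes.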

  8. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  9. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures, and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  10. NUMERICAL AND ANALYTIC METHODS OF ESTIMATION BRIDGES’ CONSTRUCTIONS

    Directory of Open Access Journals (Sweden)

    Y. Y. Luchko

    2010-03-01

    Full Text Available In this article, numerical and analytical methods for calculating the stress-strain state of bridge structures are considered. The task of increasing the reliability and accuracy of the numerical method, and its solution by means of calculations in two bases, is formulated. An analytical solution of the differential equation for the deformation of a reinforced-concrete plate under the action of local loads is also obtained.

  11. Assessing the Impact of Imperfect Diagnosis on Service Reliability

    DEFF Research Database (Denmark)

    Grønbæk, Lars Jesper; Schwefel, Hans-Peter; Kjærgaard, Jens Kristian

    2010-01-01

    ... representative diagnosis performance metrics have been defined and their closed-form solutions obtained for the Markov model. These equations enable model parameterization from traces of implemented diagnosis components. The diagnosis model has been integrated in a reliability model assessing the impact ... of the diagnosis functions for the studied reliability problem. In a simulation study we finally analyze trade-off properties of diagnosis heuristics from the literature, map them to the analytic Markov model, and investigate its suitability for service reliability optimization ...

  12. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with other laboratories through external quality control. This provides a tool to verify that the objectives set are being met and, in case of errors, allows corrective actions to be taken to ensure the reliability of the results. This article describes the design and implementation of an internal quality control protocol, as well as its periodic assessment at six-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions.
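
The statistics mentioned above (mean, standard deviation, coefficient of variation, systematic and total error) can be sketched on hypothetical control-material data. The k = 1.65 multiplier is one common convention for total error, not necessarily this protocol's:

```python
# Sketch of the quality-control statistics described above, on hypothetical
# control-material data: mean, SD, CV, bias, and total error against a target.
import statistics

control_results = [5.1, 5.0, 5.3, 4.9, 5.2, 5.0, 5.1, 4.8]  # hypothetical
target = 5.0                                                 # assigned value

mean = statistics.mean(control_results)
sd = statistics.stdev(control_results)
cv = 100.0 * sd / mean                       # random error, %
bias = 100.0 * (mean - target) / target      # systematic error, %
total_error = abs(bias) + 1.65 * cv          # common TE convention (k = 1.65)

print(f"mean={mean:.3f}  CV={cv:.2f}%  bias={bias:.2f}%  TE={total_error:.2f}%")
```

Comparing the total error against the quality specification is what triggers the corrective actions the abstract describes.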

  13. Predictive analytics and child protection: constraints and opportunities.

    Science.gov (United States)

    Russell, Jesse

    2015-08-01

    This paper considers how predictive analytics might inform, assist, and improve decision making in child protection. Predictive analytics represents recent increases in data quantity and data diversity, along with advances in computing technology. While the use of data and statistical modeling is not new to child protection decision making, its use in child protection is experiencing growth, and efforts to leverage predictive analytics for better decision-making in child protection are increasing. Past experiences, constraints and opportunities are reviewed. For predictive analytics to make the most impact on child protection practice and outcomes, it must embrace established criteria of validity, equity, reliability, and usefulness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Structural Reliability Analysis of Wind Turbines: A Review

    Directory of Open Access Journals (Sweden)

    Zhiyu Jiang

    2017-12-01

    Full Text Available The paper presents a detailed review of state-of-the-art research activities on structural reliability analysis of wind turbines between the 1990s and 2017. We describe the reliability methods, including the first- and second-order reliability methods and the simulation-based reliability methods, and show the procedure for, and application areas of, structural reliability analysis of wind turbines. Further, we critically review the various structural reliability studies on rotor blades, bottom-fixed support structures, floating systems and mechanical and electrical components. Finally, future applications of structural reliability methods to wind turbine designs are discussed.
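
The first-order and simulation methods mentioned in the review can be contrasted on a toy limit state g = R - S with independent normal resistance and load. All parameter values below are hypothetical:

```python
# Minimal sketch of the two method families reviewed above, on a toy limit
# state g = R - S (resistance minus load) with independent normal R and S:
# the first-order reliability index gives P_f in closed form (exactly, since
# g is linear in normal variables), and a seeded Monte Carlo run should agree.
import math
import random

mu_R, sd_R = 10.0, 1.5   # hypothetical resistance
mu_S, sd_S = 6.0, 1.0    # hypothetical load effect

# First-order reliability method:
beta = (mu_R - mu_S) / math.sqrt(sd_R**2 + sd_S**2)   # reliability index
pf_form = 0.5 * math.erfc(beta / math.sqrt(2.0))      # Phi(-beta)

# Simulation method:
rng = random.Random(42)
n = 200_000
fails = sum(1 for _ in range(n)
            if rng.gauss(mu_R, sd_R) - rng.gauss(mu_S, sd_S) < 0.0)
pf_mc = fails / n

print(f"beta = {beta:.3f}, P_f(FORM) = {pf_form:.2e}, P_f(MC) = {pf_mc:.2e}")
```

For nonlinear limit states the two estimates diverge, which is where second-order corrections and more elaborate simulation schemes come in.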

  15. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  16. Analytic tests and their relation to jet fuel thermal stability

    Energy Technology Data Exchange (ETDEWEB)

    Heneghan, S.P.; Kauffman, R.E. [Univ. of Dayton Research Institute, OH (United States)

    1995-05-01

    The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results covering a range of flow and temperature conditions show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.

  17. Reliability centred maintenance of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Kovacs, Zoltan; Novakova, Helena; Hlavac, Pavol; Janicek, Frantisek

    2011-01-01

    A method for the optimization of preventive maintenance of nuclear power plant equipment, i.e. reliability centred maintenance, is described. The method enables procedures and procedure schedules to be defined such that the maintenance cost is minimized without compromising operational safety or reliability. Also, combinations of facilities which remain available and ensure reliable operation of the reactor unit during the maintenance of other pieces of equipment are identified. The condition-based maintenance concept is used in this process, thereby preventing unnecessary operator interventions into the equipment, which are often associated with human errors. Where probabilistic safety assessment is available, the most important structures, systems and components with the highest maintenance priority can be identified. (orig.)

  18. The analytical quality control programme of the IAEA

    Energy Technology Data Exchange (ETDEWEB)

    Suschny, O; Richman, D M [International Atomic Energy Agency, Division of Research and Laboratories, Seibersdorf (Austria)

    1973-10-01

    The International Atomic Energy Agency has distributed calibrated radioisotope solutions, standard reference materials and intercomparison materials in the nuclear and radioisotope fields since the early 1960's. The purpose of this activity was to help laboratories in the Member States to assess and, if necessary, to improve the reliability of their analytical work and to enable them, in this way, to render better service in a large number of areas ranging from nuclear technology to isotope applications in medicine and environmental sciences. The usefulness and the need for this service was demonstrated by the results of many intercomparisons which proved that without continued analytical quality control adequate reliability of analytical data could not be taken for granted. The scope and the size of the future programme of the Agency in this field has been delineated by recommendations made by several Panels of Experts. They have all agreed on the importance of it and made detailed recommendations in their areas of expertise.

  19. The analytical quality control programme of the IAEA

    International Nuclear Information System (INIS)

    Suschny, O.; Richman, D.M.

    1973-10-01

    The International Atomic Energy Agency has distributed calibrated radioisotope solutions, standard reference materials and intercomparison materials in the nuclear and radioisotope fields since the early 1960's. The purpose of this activity was to help laboratories in the Member States to assess and, if necessary, to improve the reliability of their analytical work and to enable them, in this way, to render better service in a large number of areas ranging from nuclear technology to isotope applications in medicine and environmental sciences. The usefulness and the need for this service was demonstrated by the results of many intercomparisons which proved that without continued analytical quality control adequate reliability of analytical data could not be taken for granted. The scope and the size of the future programme of the Agency in this field has been delineated by recommendations made by several Panels of Experts. They have all agreed on the importance of it and made detailed recommendations in their areas of expertise.

  20. Pilot testing of SHRP 2 reliability data and analytical products: Florida.

    Science.gov (United States)

    2015-01-01

    Transportation agencies have realized the importance of performance estimation, measurement, and management. The Moving Ahead for Progress in the 21st Century Act legislation identifies travel time reliability as one of the goals of the federal highw...

  1. Analytical procedures for the determination of disperse azo dyes

    Energy Technology Data Exchange (ETDEWEB)

    Betowski, L.D.; Jones, T.L. (Environmental Protection Agency, Las Vegas, NV (USA)); Munslow, W.; Nunn, N.J. (Lockheed Engineering and Management Services Co., Las Vegas, NV (USA))

    1988-09-01

    Disperse Blue 79 is the most widely used azo dye in the US, and it is of great economic importance to the dye and textile industries. Because of its use and potential for degradation to aromatic amines, this compound has been chosen for testing by the Interagency Testing Committee. The authors' laboratory has been developing methods for the analytical determination of Disperse Blue 79 and any possible degradation products in wastewater. This work has been taking place in conjunction with the study of the fate of azo dyes in wastewater treatment processes by the Water Engineering Research Laboratory of the US EPA in Cincinnati. This analytical development involved several phases. The first step involved purifying the commercial material, or presscake, to obtain a standard for quantitative determination. A combination of HPLC, TLC and mass spectrometric methods was used to determine purity after extraction and column cleanup. Phase two involved the extraction of the dye from the matrices involved. The third phase was the actual testing of Disperse Blue 79 in the waste activated sludge system and anaerobic digester. Recovery of the dye and any degradation products at each sampling point (e.g., secondary effluent, waste activated sludge) was the goal of this phase.

  2. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  3. Reliability of thermal interface materials: A review

    International Nuclear Information System (INIS)

    Due, Jens; Robinson, Anthony J.

    2013-01-01

    Thermal interface materials (TIMs) are used extensively to improve thermal conduction across two mating parts. They are particularly crucial in electronics thermal management, since excessive junction-to-ambient thermal resistances can cause elevated temperatures which can negatively influence device performance and reliability. Of particular interest to electronic package designers is the thermal resistance of the TIM layer at the end of its design life. Estimating this allows the package to be designed to perform adequately over its entire useful life. To this end, TIM reliability studies have been performed using accelerated stress tests. This paper reviews the body of work which has been performed on TIM reliability. It focuses on the various test methodologies, with commentary on the results which have been obtained for the different TIM materials. Based on the information available in the open literature, a test procedure is proposed for TIM selection based on beginning- and end-of-life performance. - Highlights: ► This paper reviews the body of work which has been performed on TIM reliability. ► Test methodologies for reliability testing are outlined. ► Reliability results for the different TIM materials are discussed. ► A test procedure is proposed for TIM selection based on beginning-of-life and end-of-life performance.

  4. Reliability analysis of self-actuated shutdown system

    International Nuclear Information System (INIS)

    Itooka, S.; Kumasaka, K.; Okabe, A.; Satoh, K.; Tsukui, Y.

    1991-01-01

    An analytical study was performed for the reliability of a self-actuated shutdown system (SASS) under the unprotected loss of flow (ULOF) event in a typical loop-type liquid metal fast breeder reactor (LMFBR) by the use of the response surface Monte Carlo analysis method. Dominant parameters for the SASS, such as Curie point characteristics, subassembly outlet coolant temperature, electromagnetic surface condition, etc., were selected and their probability density functions (PDFs) were determined by the design study information and experimental data. To get the response surface function (RSF) for the maximum coolant temperature, transient analyses of ULOF were performed by utilizing the experimental design method in the determination of analytical cases. Then, the RSF was derived by the multi-variable regression analysis. The unreliability of the SASS was evaluated as a probability that the maximum coolant temperature exceeded an acceptable level, employing the Monte Carlo calculation using the above PDFs and RSF. In this study, sensitivities to the dominant parameters were compared. The dispersion of subassembly outlet coolant temperature near the SASS was found to be one of the most sensitive parameters. Fault tree analysis was performed using this value for the SASS in order to evaluate the shutdown system reliability. As a result of this study, the effectiveness of the SASS on the reliability improvement in the LMFBR shutdown system was analytically confirmed. This study has been performed as a part of joint research and development projects for DFBR under the sponsorship of the nine Japanese electric power companies, Electric Power Development Company and the Japan Atomic Power Company. (author)
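
    The response-surface Monte Carlo step described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the quadratic coefficients, the standard-normal parameter PDFs and the temperature limit are all assumed stand-ins for the paper's regression fit and design data.

    ```python
    import random

    # Hypothetical response surface: maximum coolant temperature (deg C) as a
    # quadratic function of two normalized parameters (x1 ~ Curie point offset,
    # x2 ~ outlet-temperature dispersion). Coefficients are illustrative only,
    # standing in for a multi-variable regression fit to transient analyses.
    def response_surface(x1, x2):
        return 620.0 + 15.0 * x1 + 25.0 * x2 + 8.0 * x2 * x2

    def unreliability(limit, n=100_000, seed=1):
        """Monte Carlo estimate of P(T_max > limit) under assumed N(0,1) PDFs."""
        rng = random.Random(seed)
        exceed = sum(
            1 for _ in range(n)
            if response_surface(rng.gauss(0, 1), rng.gauss(0, 1)) > limit
        )
        return exceed / n

    p = unreliability(limit=700.0)
    ```

    Replacing the expensive transient code with a fitted RSF is what makes a 100,000-sample estimate affordable.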

  5. Reliability performance of standby equipment with periodic testing

    International Nuclear Information System (INIS)

    Sim, S.H.

    1985-11-01

    In this report, the reliability performance of standby equipment subjected to periodic testing is studied. Analytical expressions have been derived for reliability measures, such as the mean accumulated operating time to failure, the expected number of tests between two consecutive failures, the mean time to failure following an emergency start-up and the probability of failing to complete an emergency mission of a specified duration. These results are useful for the reliability assessment of standby equipment such as combustion turbine units of the emergency power supply system, and of the Class III power system at a nuclear generating station.
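
    Two of the measures named above have well-known closed forms for a periodically tested standby unit with constant failure rate; a minimal sketch follows. The rates and intervals are illustrative, not taken from the report.

    ```python
    import math

    def average_unavailability(failure_rate, test_interval):
        """Time-averaged unavailability of a periodically tested standby unit,
        assuming failures are revealed only at tests (standard approximation,
        roughly lambda*T/2 for small lambda*T)."""
        lam_t = failure_rate * test_interval
        return 1.0 - (1.0 - math.exp(-lam_t)) / lam_t

    def mission_failure_probability(failure_rate, test_interval, mission_time):
        """Probability of failing an emergency mission: the unit must be
        available at the demand and then survive the mission duration."""
        q = average_unavailability(failure_rate, test_interval)
        return 1.0 - (1.0 - q) * math.exp(-failure_rate * mission_time)

    # Example: lambda = 1e-4 per hour, monthly test (720 h), 24 h mission
    q = average_unavailability(1e-4, 720.0)
    m = mission_failure_probability(1e-4, 720.0, 24.0)
    ```

    For combustion turbine units, such expressions let the test interval be traded off directly against the demand-failure probability.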

  6. Analytic webs support the synthesis of ecological data sets.

    Science.gov (United States)

    Ellison, Aaron M; Osterweil, Leon J; Clarke, Lori; Hadley, Julian L; Wise, Alexander; Boose, Emery; Foster, David R; Hanson, Allen; Jensen, David; Kuzeja, Paul; Riseman, Edward; Schultz, Howard

    2006-06-01

    A wide variety of data sets produced by individual investigators are now synthesized to address ecological questions that span a range of spatial and temporal scales. It is important to facilitate such syntheses so that "consumers" of data sets can be confident that both input data sets and synthetic products are reliable. Necessary documentation to ensure the reliability and validation of data sets includes both familiar descriptive metadata and formal documentation of the scientific processes used (i.e., process metadata) to produce usable data sets from collections of raw data. Such documentation is complex and difficult to construct, so it is important to help "producers" create reliable data sets and to facilitate their creation of required metadata. We describe a formal representation, an "analytic web," that aids both producers and consumers of data sets by providing complete and precise definitions of scientific processes used to process raw and derived data sets. The formalisms used to define analytic webs are adaptations of those used in software engineering, and they provide a novel and effective support system for both the synthesis and the validation of ecological data sets. We illustrate the utility of an analytic web as an aid to producing synthetic data sets through a worked example: the synthesis of long-term measurements of whole-ecosystem carbon exchange. Analytic webs are also useful validation aids for consumers because they support the concurrent construction of a complete, Internet-accessible audit trail of the analytic processes used in the synthesis of the data sets. Finally we describe our early efforts to evaluate these ideas through the use of a prototype software tool, SciWalker. We indicate how this tool has been used to create analytic webs tailored to specific data-set synthesis and validation activities, and suggest extensions to it that will support additional forms of validation. The process metadata created by SciWalker is

  7. Investigating Reliabilities of Intraindividual Variability Indicators

    Science.gov (United States)

    Wang, Lijuan; Grimm, Kevin J.

    2012-01-01

    Reliabilities of the two most widely used intraindividual variability indicators, "ISD[superscript 2]" and "ISD", are derived analytically. Both are functions of the sizes of the first and second moments of true intraindividual variability, the size of the measurement error variance, and the number of assessments within a burst. For comparison,…

  8. Validation of a new analytical procedure for determination of residual solvents in [{sup 18}F]FDG by gas chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D., E-mail: flaviabiomedica@yahoo.com.br [Centro de Desenvolvimento da Tecnologia Nuclear (UPPR/CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Unidade de Pesquisa e Produção de Radiofármacos

    2017-07-01

    Fludeoxyglucose F 18 ([{sup 18}F]FDG) is the most used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [{sup 18}F]FDG; however, they might not be completely removed during purification steps. The determination of residual solvents in [{sup 18}F]FDG is required in the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work a simple and fast (4-minute) analytical procedure was developed and validated for determination of residual solvents in [{sup 18}F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53-mm x 30 m fused-silica column. Validation included the evaluation of various parameters, such as: specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for routine quality control of [{sup 18}F]FDG. (author)

  9. Quality assurance and quality control of nuclear analytical techniques

    International Nuclear Information System (INIS)

    Cincu, Emanuela

    2001-01-01

    Test and analytical laboratories in East and Central European countries need to prove the reliability and credibility of their economic, environmental, medical and legal decisions and their capacity to issue reliable, verifiable reports. These demands, imposed by the European Union, aim at avoiding a possible barrier to trade for the developing countries. In June 1999, in order to help Member States to develop according to EU objectives and the overall situation of the European market, IAEA launched a new co-operation programme designed to help the nuclear analytical laboratories in nuclear institutions and universities of Member States by training in the use of some Nuclear Analytical Techniques (NAT) that include: alpha, beta and gamma-ray spectrometry, radiochemical and neutron activation analysis, total reflection X-ray fluorescence. The Regional IAEA Project, named 'Quality Assurance/Quality Control of Nuclear Analytical Techniques' (NAT), aims at implementing the QA principles via a system of defined consecutive steps leading to a level on which the QA system is self-sustainable for formal accreditation or certification and satisfies the EU technical performance criteria; the requirements are in accordance with the new ISO/IEC 17025 Standard/Dec.1999 'General requirements for the competence of testing and calibration laboratories' - First edition. The Horia Hulubei National Institute for Nuclear Physics and Engineering, IFIN-HH, was admitted for participation in the IAEA Project in June 1999, in view of its experience in the QA and metrology fields and its performance in the fields of beta and gamma-ray spectrometry, and radiochemical and neutron activation analysis, employed in both basic research and applications for external clients. Two working groups of specialists with the QA and Standardization and Metrology Departments and six analytical groups with the departments of Nuclear Applied Physics, Life Physics and Ionising Radiation Metrology are

  10. The application of analytical procedures in the audit process: A ...

    African Journals Online (AJOL)

    kirstam

    collected through interviews with senior audit managers at large audit ... providing a perspective of why and how South African auditors apply analytical ... and includes the objectives of each study, the data collection method used, and a ... It is recommended that scholars use the findings of this study to perform further.

  11. Verification and validation of a numeric procedure for flow simulation of a 2x2 PWR rod bundle

    International Nuclear Information System (INIS)

    Santos, Andre A.C.; Barros Filho, Jose Afonso; Navarro, Moyses A.

    2011-01-01

    Before Computational Fluid Dynamics (CFD) can be considered as a reliable tool for the analysis of flow through rod bundles there is a need to establish the credibility of the numerical results. Procedures must be defined to evaluate the error and uncertainty due to aspects such as mesh refinement, turbulence model, wall treatment and appropriate definition of boundary conditions. These procedures are referred to as Verification and Validation (V and V) processes. In 2009 a standard was published by the American Society of Mechanical Engineers (ASME) establishing detailed procedures for V and V of CFD simulations. This paper presents a V and V evaluation of a numerical methodology applied to the simulation of a PWR rod bundle segment with a split vane spacer grid based on the ASME standard. In this study six progressively refined meshes were generated to evaluate the numerical uncertainty through the verification procedure. Experimental and analytical results available in the literature were used in this study for validation purposes. The results show that the ASME verification procedure can give highly variable predictions of uncertainty depending on the mesh triplet used for the evaluation. However, the procedure can give good insight towards optimization of the mesh size and overall result quality. Although the experimental results used for the validation were not ideal, through the validation procedure the deficiencies and strengths of the presented modeling could be detected and reasonably evaluated. Even though it is difficult to obtain reliable estimates of the uncertainty of flow quantities in the turbulent flow, this study shows that the V and V process is a necessary step in a CFD analysis of a spacer grid design. (author)
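
    The mesh-triplet verification the abstract refers to is typically a Richardson-extrapolation / Grid Convergence Index (GCI) calculation. A minimal sketch, with illustrative pressure-drop values rather than the paper's data, and assuming monotone convergence and a constant refinement ratio:

    ```python
    import math

    def observed_order(f1, f2, f3, r):
        """Observed order of accuracy from solutions on fine (f1), medium (f2)
        and coarse (f3) meshes with constant refinement ratio r
        (Richardson extrapolation, monotone convergence assumed)."""
        return math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)

    def gci_fine(f1, f2, r, p, fs=1.25):
        """Grid Convergence Index on the fine mesh; fs is the safety factor
        recommended for three-mesh studies."""
        e21 = abs((f1 - f2) / f1)
        return fs * e21 / (r ** p - 1.0)

    # Illustrative pressure-drop values from three progressively refined meshes
    p_order = observed_order(12.10, 12.34, 12.82, r=2.0)
    gci = gci_fine(12.10, 12.34, r=2.0, p=p_order)
    ```

    The sensitivity of `p_order` to which three meshes form the triplet is exactly why the abstract reports highly variable uncertainty predictions.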

  12. Environmental Measurements Laboratory (EML) procedures manual

    International Nuclear Information System (INIS)

    Chieco, N.A.; Bogen, D.C.; Knutson, E.O.

    1990-11-01

    Volume 1 of this manual documents the procedures and existing technology that are currently used by the Environmental Measurements Laboratory. A section devoted to quality assurance has been included. These procedures have been updated and revised and new procedures have been added. They include: sampling; radiation measurements; analytical chemistry; radionuclide data; special facilities; and specifications. 228 refs., 62 figs., 37 tabs. (FL)

  13. Reliability of structural systems subject to fatigue

    International Nuclear Information System (INIS)

    Rackwitz, R.

    1984-01-01

    Concepts and computational procedures for the reliability calculation of structural systems subject to fatigue are outlined. Systems are dealt with by approximately computing componential times to first failure. So-called first-order reliability methods are then used to formulate dependencies between componential failures and to evaluate the system failure probability. (Author) [pt
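
    The first-order reliability method mentioned above reduces, for a linear limit state with independent normal variables, to a reliability index and a normal tail probability. A minimal sketch with illustrative resistance/load statistics (not from the record):

    ```python
    import math

    def phi(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def form_failure_probability(mu_r, sigma_r, mu_s, sigma_s):
        """First-order failure probability for the linear limit state
        g = R - S with independent normal resistance R and load S:
        beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), Pf = Phi(-beta)."""
        beta = (mu_r - mu_s) / math.hypot(sigma_r, sigma_s)
        return beta, phi(-beta)

    beta, pf = form_failure_probability(mu_r=100.0, sigma_r=10.0,
                                        mu_s=60.0, sigma_s=15.0)
    ```

    For fatigue, the componential time to first failure plays the role of R, and correlations between componential failures are then layered on top of this first-order picture.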

  14. Lifetime Reliability Assessment of Concrete Slab Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    A procedure for lifetime assessment of the reliability of short concrete slab bridges is presented in the paper. Corrosion of the reinforcement is the deterioration mechanism used for estimating the reliability profiles for such bridges. The importance of using sensitivity measures is stressed....... Finally the procedure is illustrated on 6 existing UK bridges....

  15. A Modified GC-MS Analytical Procedure for Separation and Detection of Multiple Classes of Carbohydrates

    Directory of Open Access Journals (Sweden)

    Yong-Gang Xia

    2018-05-01

    Full Text Available A modified GC-MS analytical procedure based on trimethylsilyl-dithioacetal (TMSD) derivatization has been established for a simultaneous determination of thirteen carbohydrates. Different from previous approaches, the current GC-MS method was featured by a powerful practicability for simultaneous detection of aldoses, uronic acids, ketoses, and amino sugars; simplifying GC-MS chromatograms and producing a single peak for each derivatized sugar, as well as high resolution, sensitivity, and repeatability. An additional liquid-liquid extraction from derivatization mixtures was performed not only to increase the detection sensitivity of amino sugars but also to decrease the by-products of derivatization. Without this extraction, three amino sugars were detected at very low intensity or not detected at all. The effect of time on the monosaccharide mercaptalation reaction was systematically investigated. The effect of trimethylsilylation on the formation of TMSD was also optimized. The established GC-MS method based on TMSD derivatization was suitable for complex carbohydrate analysis and has been successfully applied for the detection of free carbohydrates in water extracts of Anemarrhena asphodeloides roots and determination of monosaccharides in Glossy ganoderma polysaccharides.

  16. System analysis procedures for conducting PSA of nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho.

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide the guidelines to analyze the target system consistently and technically in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques for fault tree analysis (FTA) used in system analysis. Normally the main objective of system analysis is to assess the reliability of systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for the system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to the mechanical failures of the components, Common Cause Failures (CCFs), human errors and outages for testing and maintenance. After the construction of the fault tree is completed, system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analysis is performed through the process stated above. As mentioned above, the procedures for system analysis are based on PSA procedures and methods which have been applied to the safety assessments of NPPs constructed in the country. Accordingly, the method of FTA stated in this procedures guide will be applicable to PSA for the NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs
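
    Once minimal cut sets have been extracted from the fault tree (the role KIRAP's CUT module plays in the guide), system unavailability is commonly approximated by summing cut set probabilities. A minimal sketch; the event names and unavailabilities below are invented for illustration:

    ```python
    # Basic-event unavailabilities (illustrative values, not from the guide)
    q = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "ccf_pumps": 1e-5}

    # Minimal cut sets: each inner list fails the system if all its events occur
    cut_sets = [["pump_A", "pump_B"], ["valve"], ["ccf_pumps"]]

    def cut_set_probability(cs):
        p = 1.0
        for event in cs:
            p *= q[event]
        return p

    def system_unavailability(cut_sets):
        """Rare-event (first-order) approximation: sum of minimal cut set
        probabilities, standard when basic-event unavailabilities are small."""
        return sum(cut_set_probability(cs) for cs in cut_sets)

    Q_sys = system_unavailability(cut_sets)
    ```

    Note how the single-event cut sets (the valve and the common-cause failure of both pumps) dominate the double failure, which is why CCFs receive explicit treatment in the guide.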

  17. System analysis procedures for conducting PSA of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide the guidelines to analyze the target system consistently and technically in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques for fault tree analysis (FTA) used in system analysis. Normally the main objective of system analysis is to assess the reliability of systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for the system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to the mechanical failures of the components, Common Cause Failures (CCFs), human errors and outages for testing and maintenance. After the construction of the fault tree is completed, system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analysis is performed through the process stated above. As mentioned above, the procedures for system analysis are based on PSA procedures and methods which have been applied to the safety assessments of NPPs constructed in the country. Accordingly, the method of FTA stated in this procedures guide will be applicable to PSA for the NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs.

  18. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de
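
    For small block diagrams, the path-based approach the record alludes to can be computed exactly by inclusion-exclusion over the minimal path sets. A minimal sketch, with an invented two-train example and assumed independent components:

    ```python
    from itertools import combinations

    def path_reliability(paths, r):
        """Exact system reliability from minimal path sets via
        inclusion-exclusion; components assumed independent.
        Feasible only for modest numbers of paths (2^n terms)."""
        total = 0.0
        for k in range(1, len(paths) + 1):
            for combo in combinations(paths, k):
                union = set().union(*combo)       # components in any chosen path
                term = 1.0
                for comp in union:
                    term *= r[comp]
                total += (-1) ** (k + 1) * term
        return total

    # Illustrative system: two redundant trains, each a series of two components
    r = {"a1": 0.9, "a2": 0.9, "b1": 0.95, "b2": 0.95}
    paths = [{"a1", "a2"}, {"b1", "b2"}]
    R_sys = path_reliability(paths, r)
    ```

    The exponential growth of the inclusion-exclusion sum is precisely why "relevant path" approximations are needed for large complex systems.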

  19. Reliable Communication in Wireless Meshed Networks using Network Coding

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Paramanathan, Achuthan; Hundebøll, Martin

    2012-01-01

    The advantages of network coding have been extensively studied in the field of wireless networks. Integrating network coding with the existing IEEE 802.11 MAC layer is a challenging problem. The IEEE 802.11 MAC does not provide any reliability mechanisms for overheard packets. This paper addresses...... this problem and suggests different mechanisms to support reliability as part of the MAC protocol. Analytical expressions for this problem are given to qualify the performance of the modified network coding. These expressions are confirmed by numerical results. While the suggested reliability mechanisms...

  20. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - II: Application to IFMIF reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cacuci, D. G. [Commiss Energy Atom, Direct Energy Nucl, Saclay, (France); Cacuci, D. G.; Balan, I. [Univ Karlsruhe, Inst Nucl Technol and Reactor Safety, Karlsruhe, (Germany); Ionescu-Bujor, M. [Forschungszentrum Karlsruhe, Fus Program, D-76021 Karlsruhe, (Germany)

    2008-07-01

    In Part II of this work, the adjoint sensitivity analysis procedure developed in Part I is applied to perform sensitivity analysis of several dynamic reliability models of systems of increasing complexity, culminating with the consideration of the International Fusion Materials Irradiation Facility (IFMIF) accelerator system. Section II presents the main steps of a procedure for the automated generation of Markov chains for reliability analysis, including the abstraction of the physical system, construction of the Markov chain, and the generation and solution of the ensuing set of differential equations; all of these steps have been implemented in a stand-alone computer code system called QUEFT/MARKOMAG-S/MCADJSEN. This code system has been applied to sensitivity analysis of dynamic reliability measures for a paradigm '2-out-of-3' system comprising five components and also to a comprehensive dynamic reliability analysis of the IFMIF accelerator system facilities for the average availability and, respectively, the system's availability at the final mission time. The QUEFT/MARKOMAG-S/MCADJSEN has been used to efficiently compute sensitivities to 186 failure and repair rates characterizing components and subsystems of the first-level fault tree of the IFMIF accelerator system. (authors)
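
    The "generation and solution of the ensuing set of differential equations" for a Markov reliability model can be illustrated on the smallest possible case, a single repairable component with up/down states. This sketch is not the QUEFT/MARKOMAG-S/MCADJSEN code; rates and step size are illustrative:

    ```python
    def availability_markov(lam, mu, t_end, dt=0.01):
        """Forward-Euler integration of the two-state (up/down) Markov chain
        dP/dt = P*Q for a repairable component with failure rate lam and
        repair rate mu; returns P(up) at t_end."""
        p_up = 1.0                      # start in the up state
        steps = int(t_end / dt)
        for _ in range(steps):
            p_down = 1.0 - p_up         # probabilities must sum to one
            p_up += dt * (-lam * p_up + mu * p_down)
        return p_up

    # lam = 0.01/h, mu = 0.5/h; long-run availability tends to mu/(lam+mu)
    a = availability_markov(0.01, 0.5, t_end=50.0)
    steady = 0.5 / 0.51
    ```

    The adjoint machinery of Part I then delivers the derivatives of such availability measures with respect to each of the 186 failure and repair rates without re-solving the system per parameter.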

  1. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - II: Application to IFMIF reliability assessment

    International Nuclear Information System (INIS)

    Cacuci, D. G.; Cacuci, D. G.; Balan, I.; Ionescu-Bujor, M.

    2008-01-01

    In Part II of this work, the adjoint sensitivity analysis procedure developed in Part I is applied to perform sensitivity analysis of several dynamic reliability models of systems of increasing complexity, culminating with the consideration of the International Fusion Materials Irradiation Facility (IFMIF) accelerator system. Section II presents the main steps of a procedure for the automated generation of Markov chains for reliability analysis, including the abstraction of the physical system, construction of the Markov chain, and the generation and solution of the ensuing set of differential equations; all of these steps have been implemented in a stand-alone computer code system called QUEFT/MARKOMAG-S/MCADJSEN. This code system has been applied to sensitivity analysis of dynamic reliability measures for a paradigm '2-out-of-3' system comprising five components and also to a comprehensive dynamic reliability analysis of the IFMIF accelerator system facilities for the average availability and, respectively, the system's availability at the final mission time. The QUEFT/MARKOMAG-S/MCADJSEN has been used to efficiently compute sensitivities to 186 failure and repair rates characterizing components and subsystems of the first-level fault tree of the IFMIF accelerator system. (authors)

  2. Monte Carlo simulation based reliability evaluation in a multi-bilateral contracts market

    International Nuclear Information System (INIS)

    Goel, L.; Viswanath, P.A.; Wang, P.

    2004-01-01

    This paper presents a time sequential Monte Carlo simulation technique to evaluate customer load point reliability in a multi-bilateral contracts market. The effects of bilateral transactions, reserve agreements, and the priority commitments of generating companies on customer load point reliability have been investigated. A generating company with bilateral contracts is modelled as an equivalent time varying multi-state generation (ETMG). A procedure to determine load point reliability based on ETMG has been developed. The developed procedure is applied to a reliability test system to illustrate the technique. Representing each bilateral contract by an ETMG provides flexibility in determining the reliability at various customer load points. (authors)
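
    The core of a time-sequential simulation is drawing alternating exponential up/down durations along a timeline. A minimal single-unit sketch (the paper's multi-state ETMG and contract logic are far richer; rates here are invented):

    ```python
    import random

    def simulate_unavailability(lam, mu, horizon, seed=7):
        """Time-sequential Monte Carlo for one two-state unit: draw
        alternating Exp(lam) up-times and Exp(mu) repair times and
        accumulate the fraction of the horizon spent down."""
        rng = random.Random(seed)
        t, down_time, up = 0.0, 0.0, True
        while t < horizon:
            dur = rng.expovariate(lam if up else mu)
            dur = min(dur, horizon - t)     # clip the last interval
            if not up:
                down_time += dur
            t += dur
            up = not up
        return down_time / horizon

    # lam = 0.01/h (MTTF 100 h), mu = 0.5/h (MTTR 2 h): expect ~lam/(lam+mu)
    u = simulate_unavailability(lam=0.01, mu=0.5, horizon=1_000_000.0)
    ```

    Because the simulation walks through time explicitly, chronological effects such as time-varying bilateral contracts can be overlaid on the sampled state sequence.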

  3. A guide to reliability data collection, validation and storage

    International Nuclear Information System (INIS)

    Stevens, B.

    1986-01-01

    The EuReDatA Working Group produced a basic document that addressed many of the problems associated with the design of a suitable data collection scheme to achieve pre-defined objectives. The book that resulted from this work describes the need for reliability data, data sources and collection procedures, component description and classification, form design, data management, updating and checking procedures, the estimation of failure rates, availability and utilisation factors, and uncertainties in reliability parameters. (DG)

  4. On the NPP structural reliability

    International Nuclear Information System (INIS)

    Klemin, A.I.; Polyakov, E.F.

    1980-01-01

    Reviewed are the main statements peculiarities and possibilities of the first branch guiding technical material (GTM) ''The methods of calculation of structural reliability of NPP and its systems at the stage of projecting''. It is stated, that in GTM presented are recomendations on the calculation of reliability of such specific systems, as the system of the reactor control and protection the system of measuring instruments and automatics and safe systems. GTM are based on analytical methods of modern theory of realibility with the Use of metodology of minimal cross sections of complex systems. It is stressed, that the calculations on the proposed methods permit to calculate a wide complex of reliability parameters, reflecting separately or together prorerties of NPP dependability and maintainability. For NPP, operating by a variable schedule of leading, aditionally considered are parameters, characterizing reliability with account of the proposed regime of power change, i.e. taking into account failures, caused by decrease of the obtained power lower, than the reguired or increase of the required power higher, than the obtained

  5. Structural reliability analysis and seismic risk assessment

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability in the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of a stationary Gaussian process with a zero mean and a Kanai-Tajimi spectrum. The seismic hazard at the site, represented by a hazard curve, is also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and ground earthquake acceleration are presented and a fragility curve for PRA studies is also constructed
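
    Fragility curves for PRA are conventionally parameterized as lognormal in peak ground acceleration. A minimal sketch of that standard form (the median capacity and log-standard deviation below are illustrative, not the paper's containment results):

    ```python
    import math

    def fragility(a, a_m, beta):
        """Lognormal fragility: conditional probability of reaching the limit
        state at peak ground acceleration a, given median capacity a_m and
        composite logarithmic standard deviation beta."""
        return 0.5 * (1.0 + math.erf(math.log(a / a_m) / (beta * math.sqrt(2.0))))

    # Illustrative containment capacity: median 0.9 g, beta = 0.35
    points = [(a / 10.0, fragility(a / 10.0, 0.9, 0.35)) for a in range(1, 16)]
    p_at_median = fragility(0.9, 0.9, 0.35)   # exactly 0.5 by construction
    ```

    Convolving such a curve with the site hazard curve yields the lifetime limit state probability the abstract refers to.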

  6. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1993-01-01

    This volume contains the interim change notice for sample preparation methods. Covered are: acid digestion for metals analysis, fusion of Hanford tank waste solids, water leach of sludges/soils/other solids, extraction procedure toxicity (simulate leach in landfill), sample preparation for gamma spectroscopy, acid digestion for radiochemical analysis, leach preparation of solids for free cyanide analysis, aqueous leach of solids for anion analysis, microwave digestion of glasses and slurries for ICP/MS, toxicity characteristic leaching extraction for inorganics, leach/dissolution of activated metal for radiochemical analysis, extraction of single-shell tank (SST) samples for semi-VOC analysis, preparation and cleanup of hydrocarbon- containing samples for VOC and semi-VOC analysis, receiving of waste tank samples in onsite transfer cask, receipt and inspection of SST samples, receipt and extrusion of core samples at 325A shielded facility, cleaning and shipping of waste tank samplers, homogenization of solutions/slurries/sludges, and test sample preparation for bioassay quality control program

  7. Play vs. Procedures

    DEFF Research Database (Denmark)

    Hammar, Emil

    Through the theories of play by Gadamer (2004) and Henricks (2006), I will show how the relationship between play and game can be understood as dialectic and disruptive, thus challenging understandings of how the procedures of games determine player activity and vice versa. As such, I posit some...... analytical consequences for understandings of digital games as procedurally fixed (Bogost, 2006; Flanagan, 2009; Brathwaite & Sharp, 2010). That is, if digital games are argued to be procedurally fixed and if play is an appropriative and dialectic activity, then it could be argued that the latter affects...... and alters the former, and vice versa. Consequently, if the appointed procedures of a game are no longer fixed and rigid in their conveyance of meaning, qua the appropriative and dissolving nature of play, then understandings of games as conveying a fixed meaning through their procedures are inadequate...

  8. A Reliability Based Model for Wind Turbine Selection

    Directory of Open Access Journals (Sweden)

    A.K. Rajeevan

    2013-06-01

    Full Text Available A wind turbine generator output at a specific site depends on many factors, particularly cut-in, rated and cut-out wind speed parameters. Hence power output varies from turbine to turbine. The objective of this paper is to develop a mathematical relationship between reliability and wind power generation. The analytical computation of monthly wind power is obtained from a Weibull statistical model using the cubic mean cube root of wind speed. Reliability calculation is based on failure probability analysis. There are many different types of wind turbines commercially available in the market. From a reliability point of view, to get optimum reliability in power generation, it is desirable to select a wind turbine generator which is best suited for a site. The mathematical relationship developed in this paper can be used for site-matching turbine selection from a reliability point of view.
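
    The cubic mean cube root of a Weibull wind-speed distribution has a closed form, and feeding it through a simplified power curve gives the flavor of the site-matching calculation. A sketch with assumed Weibull and turbine parameters (not the paper's data):

    ```python
    import math

    def cubic_mean_cube_root(c, k):
        """Cubic mean cube root of a Weibull(shape k, scale c) wind speed:
        (E[v^3])^(1/3) = c * Gamma(1 + 3/k)^(1/3)."""
        return c * math.gamma(1.0 + 3.0 / k) ** (1.0 / 3.0)

    def turbine_power(v, v_in, v_rated, v_out, p_rated):
        """Simplified power curve: zero outside [cut-in, cut-out], rated power
        above rated speed, cubic rise between cut-in and rated speed."""
        if v < v_in or v > v_out:
            return 0.0
        if v >= v_rated:
            return p_rated
        return p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)

    # Assumed site (c = 8 m/s, k = 2) and a hypothetical 2 MW turbine
    v_c = cubic_mean_cube_root(c=8.0, k=2.0)
    p = turbine_power(v_c, v_in=3.5, v_rated=12.0, v_out=25.0, p_rated=2000.0)
    ```

    Evaluating several candidate turbines' power curves at the same site statistic, alongside their failure probabilities, is the essence of reliability-based site matching.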

  9. Job analysis of the electrician position for the nuclear power plant maintenance personnel reliability model

    International Nuclear Information System (INIS)

    Federman, P.J.; Bartter, W.D.; Siegel, A.I.

    1984-02-01

    This report presents the methods, procedures, and results of the fourth and final of a series of job analytic studies which characterize maintenance positions in nuclear power plants. The electrician position is the subject of the present report. The characterization of the electrician position takes the form of detailed information about: (1) the frequency of performing various tasks, (2) the time required for performing each task, (3) the training required for adequate performance of each task, and (4) the perceived consequences resulting from inadequate task performance. Additionally, information is presented about the intellective and the perceptual-motor loading imposed by each task. This information contributes to job design and training requirements derivation as well as to the assessment of human performance reliability in nuclear power plants

  10. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The reliability worth's precision can be affected greatly by the customer interruption cost model used. The choice of the cost models can change system and load point reliability indices....... In this study, a cascade correlation neural network is adopted to further develop two cost models comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  11. Portfolio assessment during medical internships: How to obtain a reliable and feasible assessment procedure?

    Science.gov (United States)

    Michels, Nele R M; Driessen, Erik W; Muijtjens, Arno M M; Van Gaal, Luc F; Bossaert, Leo L; De Winter, Benedicte Y

    2009-12-01

    A portfolio is used to mentor and assess students' clinical performance at the workplace. However, students and raters often perceive the portfolio as a time-consuming instrument. In this study, we investigated whether assessment during medical internship by a portfolio can combine reliability and feasibility. The domain-oriented reliability of 61 double-rated portfolios was measured, using a generalisability analysis with portfolio tasks and raters as sources of variation in measuring the performance of a student. We obtained a reliability (Phi coefficient) of 0.87 with this internship portfolio containing 15 double-rated tasks. The generalisability analysis showed that an acceptable level of reliability (Phi = 0.80) was maintained when the number of portfolio tasks was decreased to 13 or 9 using one and two raters, respectively. Our study shows that a portfolio can be a reliable method for the assessment of workplace learning. The possibility of reducing the number of tasks or raters while maintaining a sufficient level of reliability suggests an increase in the feasibility of portfolio use for both students and raters.
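
    The decision-study projection behind such Phi coefficients can be sketched generically: the Phi (dependability) coefficient is the person variance divided by itself plus the absolute error variance, where each facet's variance component is divided by the number of conditions sampled. The variance components and sample sizes below are hypothetical, not the study's actual estimates.

```python
def phi_coefficient(var_p, components, n):
    """D-study Phi coefficient: var_p / (var_p + absolute error variance).

    components: dict mapping facet tuples, e.g. ('t',), ('r',), ('t', 'r'),
    to their estimated variance components.
    n: dict of sample sizes per facet, e.g. {'t': 15, 'r': 2}."""
    abs_err = 0.0
    for facets, var in components.items():
        denom = 1.0
        for f in facets:
            denom *= n[f]
        abs_err += var / denom
    return var_p / (var_p + abs_err)
```

    Re-evaluating phi_coefficient with smaller n['t'] or n['r'] reproduces the kind of trade-off the study reports: fewer tasks or raters lowers Phi, and the design is acceptable while Phi stays at or above 0.80.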

  12. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    International Nuclear Information System (INIS)

    Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh

    2007-01-01

    This paper presents a technique to evaluate reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)
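
    A minimal sketch of the multi-state sampling idea (not the paper's reliability-network-equivalent or pseudo-sequential algorithm): each unit is drawn from a discrete multi-state capacity model and a loss-of-load probability is estimated by non-sequential Monte Carlo. All unit data below are invented for illustration.

```python
import random

def sample_system_capacity(units, rng):
    """units: list of (capacities, probabilities) multi-state unit models.
    Returns one sampled total system capacity."""
    total = 0.0
    for caps, probs in units:
        total += rng.choices(caps, weights=probs)[0]
    return total

def lolp(units, load, n=20000, seed=1):
    """Loss-of-load probability: fraction of sampled system states whose
    total capacity falls below the load."""
    rng = random.Random(seed)
    short = sum(1 for _ in range(n) if sample_system_capacity(units, rng) < load)
    return short / n
```

    The reliability-network-equivalent and pseudo-sequential refinements in the paper aim precisely at cutting the number of such samples needed while preserving chronological market behaviour.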

  13. Reliable and valid assessment of Lichtenstein hernia repair skills

    DEFF Research Database (Denmark)

    Carlsen, C G; Lindorff Larsen, Karen; Funch-Jensen, P

    2014-01-01

    PURPOSE: Lichtenstein hernia repair is a common surgical procedure and one of the first procedures performed by a surgical trainee. However, formal assessment tools developed for this procedure are few and sparsely validated. The aim of this study was to determine the reliability and validity...... of an assessment tool designed to measure surgical skills in Lichtenstein hernia repair. METHODS: Key issues were identified through a focus group interview. On this basis, an assessment tool with eight items was designed. Ten surgeons and surgical trainees were video recorded while performing Lichtenstein hernia...... a significant difference between the three groups, which indicates construct validity (p ...). Skills can be assessed blindly by a single rater in a reliable and valid fashion with the new procedure-specific assessment tool. We recommend this tool for future assessment...

  14. Improving the safety and reliability of Monju

    International Nuclear Information System (INIS)

    Itou, Kazumoto; Maeda, Hiroshi; Moriyama, Masatoshi

    1998-01-01

    Comprehensive safety review has been performed at Monju to determine why the Monju secondary sodium leakage accident occurred. We investigated how to improve the situation based on the results of the safety review. The safety review focused on five aspects of whether the facilities for dealing with the sodium leakage accident were adequate: the reliability of the detection method, the reliability of the method for preventing the spread of a sodium leakage accident, whether the documented operating procedures are adequate, whether the quality assurance system, program, and actions were properly performed, and so on. As a result, we established for Monju better methods of dealing with sodium leakage accidents: rapid detection of sodium leakage, improvement of the sodium drain facilities, and ways to reduce damage to Monju systems after an accident. We also improved the operating procedures and quality assurance actions to increase the safety and reliability of Monju. (author)

  15. Training benefits of research on operator reliability

    International Nuclear Information System (INIS)

    Worledge, D.H.

    1989-01-01

    The purpose of the EPRI Operator Reliability Experiments (ORE) Program is to collect data for use in reliability and safety studies of nuclear power plant operation that more realistically take credit for operator performance in preventing core damage. The three objectives in fulfilling this purpose are: to obtain quantitative and qualitative performance data on operating crew responses in the control room for potential accident sequences by using plant simulators; to test the human cognitive reliability (HCR) correlation; and to develop a data collection and analysis procedure. This paper discusses the background to this program, the data collection and analysis, and the quantitative and qualitative insights stemming from initial work. Special attention is paid to how this program bears on simulator use and the assessment of simulator fidelity, and to the use of the data collection procedures to assist training departments in assessing the quality of their training programs
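
    The HCR correlation mentioned above relates crew non-response probability to time available, typically via a Weibull-type curve in time normalized by the median response time. The sketch below shows only that generic functional shape; the parameter names and values are illustrative placeholders, not the calibrated ORE coefficients.

```python
import math

def noncompletion_prob(t, t_half, gamma, eta, beta):
    """Weibull-type time-reliability curve of the HCR form:
    P(crew has not responded by time t) = exp(-x**beta),
    with x = (t / t_half - gamma) / eta, where t_half is the median
    crew response time. Parameter names here are illustrative."""
    x = (t / t_half - gamma) / eta
    if x <= 0:
        return 1.0  # before the normalized delay, non-response is certain
    return math.exp(-x ** beta)
```

    The curve is monotonically decreasing in t: the more time a crew has relative to its median response time, the lower the assessed probability of non-response.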

  16. Review of cause-based decision tree approach for the development of domestic standard human reliability analysis procedure in low power/shutdown operation probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2003-01-01

    We reviewed the Cause-Based Decision Tree (CBDT) approach to decide whether to incorporate it into the development of a domestic standard Human Reliability Analysis (HRA) procedure for low power/shutdown operation Probabilistic Safety Assessment (PSA). In this paper, we introduce the CBDT approach, quantify human errors using it, and identify its merits and demerits in comparison with the previously used THERP. The review results show that it is difficult to incorporate the CBDT method into the development of a domestic standard HRA procedure for low power/shutdown PSA because the CBDT method requires the subjective judgment of the HRA analyst, as THERP does. However, it is expected that incorporating the CBDT method only for the comparison of quantitative HRA results will relieve the burden of developing a detailed HRA procedure and will help maintain consistent quantitative HRA results

  17. Analytical validation of an ultraviolet-visible procedure for determining lutein concentration and application to lutein-loaded nanoparticles.

    Science.gov (United States)

    Silva, Jéssica Thaís do Prado; Silva, Anderson Clayton da; Geiss, Julia Maria Tonin; de Araújo, Pedro Henrique Hermes; Becker, Daniela; Bracht, Lívia; Leimann, Fernanda Vitória; Bona, Evandro; Guerra, Gustavo Petri; Gonçalves, Odinei Hess

    2017-09-01

    Lutein is a carotenoid with known anti-inflammatory and antioxidant properties. Lutein-rich diets have been associated with neurological improvement as well as a reduced risk of vision loss due to Age-Related Macular Degeneration (AMD). Micro- and nanoencapsulation have been demonstrated to be effective techniques for protecting lutein against degradation and for improving its bioavailability. However, the actual lutein concentration inside the capsules and the encapsulation efficiency are key parameters that must be precisely known when designing in vitro and in vivo tests. In this work an analytical procedure was validated for determining the actual lutein content in zein nanoparticles using ultraviolet-visible spectroscopy. Method validation followed the International Conference on Harmonisation (ICH) guidelines, which evaluate linearity, detection limit, quantification limit, accuracy and precision. The validated methodology was applied to characterize lutein-loaded nanoparticles. Copyright © 2017 Elsevier Ltd. All rights reserved.
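
    The ICH validation figures of merit named above can be sketched from a calibration line: fit absorbance against concentration by least squares, then estimate detection and quantification limits from the residual standard deviation and the slope (LOD = 3.3·σ/S, LOQ = 10·σ/S). The data in the test are synthetic, not the study's measurements.

```python
def linfit(x, y):
    """Ordinary least-squares line y = intercept + slope*x; also returns
    the residual standard deviation (n - 2 degrees of freedom)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return slope, intercept, sigma

def lod_loq(sigma, slope):
    """ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```
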

  18. A platform analytical quality by design (AQbD) approach for multiple UHPLC-UV and UHPLC-MS methods development for protein analysis.

    Science.gov (United States)

    Kochling, Jianmei; Wu, Wei; Hua, Yimin; Guan, Qian; Castaneda-Merced, Juan

    2016-06-05

    A platform analytical quality by design (AQbD) approach to method development is presented in this paper. The approach is not limited to method development following the same logical AQbD process; it is also exploited across a range of applications with commonality in equipment and procedures. As demonstrated by the development of three methods, the systematic strategy offers a thorough understanding of a method's scientific strength. The knowledge gained from the UHPLC-UV peptide mapping method can be easily transferred to the UHPLC-MS oxidation method and the UHPLC-UV C-terminal heterogeneity method for the same protein. In addition, the platform AQbD development strategy ensures that method robustness is built in during development. In early phases, a good method can generate reliable data for product development, allowing confident decision making. Methods generated following the AQbD approach have great potential to avoid extensive post-approval analytical method changes, while in the commercial phase high-quality data ensure timely data release, reduced regulatory risk, and lowered lab operational cost. Moreover, the large, reliable database and the knowledge gained during AQbD method development provide strong justifications during regulatory filing for the selection of important parameters or for parameter change needs in method validation, and help justify the removal of unnecessary tests from product specifications. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Human reliability in high dose rate afterloading radiotherapy based on FMECA

    International Nuclear Information System (INIS)

    Deng Jun; Fan Yaohua; Yue Baorong; Wei Kedao; Ren Fuli

    2012-01-01

    Objective: To put forward reasonable and feasible recommendations for the procedures with relatively high risk during high dose rate (HDR) afterloading radiotherapy, so as to enhance its clinical application safety, by studying human reliability in the process of carrying out HDR afterloading radiotherapy. Methods: Basic data were collected by on-site investigation and process analysis as well as expert evaluation. Failure mode, effect and criticality analysis (FMECA) was employed to study human reliability in the execution of HDR afterloading radiotherapy. Results: The FMECA model of human reliability for HDR afterloading radiotherapy was established, through which 25 procedures with relatively high risk indices were found, accounting for 14.1% of the total 177 procedures. Conclusions: The FMECA method is feasible for human reliability studies of HDR afterloading radiotherapy. Countermeasures are put forward to reduce human error, so as to provide an important basis for enhancing the clinical application safety of HDR afterloading radiotherapy. (authors)
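
    The criticality ranking step of an FMECA can be sketched with the familiar Risk Priority Number, the product of ordinal severity, occurrence and detection scores; procedures with the highest RPN are flagged for countermeasures. The step names and scores below are hypothetical, not the study's 177 actual procedures.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number on ordinal (e.g. 1-10) scales."""
    return severity * occurrence * detection

def rank_steps(steps):
    """steps: dict mapping a procedure name to its (S, O, D) scores.
    Returns the names sorted by descending RPN, i.e. highest risk first."""
    return sorted(steps, key=lambda s: rpn(*steps[s]), reverse=True)
```
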

  20. Atmospheric Deposition: Sampling Procedures, Analytical Methods, and Main Recent Findings from the Scientific Literature

    Directory of Open Access Journals (Sweden)

    M. Amodio

    2014-01-01

    Full Text Available The atmosphere is a carrier on which natural and anthropogenic organic and inorganic chemicals are transported, and wet and dry deposition events are the most important processes removing those chemicals and depositing them on soil and water. A wide variety of different collectors have been tested to evaluate the site-specificity, seasonality and daily variability of settleable particle concentrations. Deposition fluxes of POPs showed spatial and seasonal variations, and diagnostic ratios of PAHs on deposited particles allowed discrimination between pyrolytic and petrogenic sources. Congener pattern analysis and bulk deposition fluxes in rural sites confirmed long-range atmospheric transport of PCDDs/Fs. More and more sophisticated and newly designed deposition samplers have been used for the characterization of deposited mercury, demonstrating the importance of rain scavenging and the relatively high magnitude of Hg deposition from Chinese anthropogenic sources. Recently, biological monitors demonstrated that PAH concentrations in lichens were comparable with concentrations measured by a conventional active sampler in an outdoor environment. In this review the authors explore the methodological approaches used for the assessment of atmospheric deposition, from the sampling methods to the analytical procedures for the chemical characterization of pollutants and the main results from the scientific literature.

  1. Optimization of organic contaminant and toxicity testing analytical procedures for estimating the characteristics and environmental significance of natural gas processing plant waste sludges

    International Nuclear Information System (INIS)

    Novak, N.

    1990-10-01

    The Gas Plant Sludge Characterization Phase IIB program is a continuation of the Canadian Petroleum Association's (CPA) initiatives to characterize sludge generated at gas processing plants. The objectives of the Phase IIB project were to develop an effective procedure for screening waste sludges or centrifugates/leachates generated from sludge samples for volatile, solvent-soluble and water-soluble organics; verify the reproducibility of the three aquatic toxicity tests recommended as the battery of tests for determining the environmental significance of sludge centrifugates or leachates; assess the performance of two terrestrial toxicity tests in determining the environmental significance of whole sludge samples applied to soil; and assess and discuss the reproducibility and cost-effectiveness of the sampling and analytical techniques proposed for the overall sludge characterization procedure. Conclusions and recommendations are provided for sludge collection, preparation and distribution, organic analyses, toxicity testing, project management, and procedure standardization. The three aquatic and two terrestrial toxicity tests proved effective in indicating the toxicity of complex mixtures. 27 refs., 3 figs., 59 tabs

  2. Reliability-Based Optimization of Series Systems of Parallel Systems

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    Reliability-based design of structural systems is considered. Especially systems where the reliability model is a series system of parallel systems are analysed. A sensitivity analysis for this class of problems is presented. Direct and sequential optimization procedures to solve the optimization...

  3. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high......). A collection of methodologies based on Physics-of-Failure (PoF) approach and mission profile analysis are presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study...... on a 2.3 MW wind power converter is discussed with emphasis on the reliability critical components IGBTs. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed....
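
    The Physics-of-Failure lifetime estimation mentioned above is often built on a Coffin-Manson-Arrhenius model for IGBT bond-wire and solder fatigue, combined with Miner's rule over a mission profile's thermal cycles. The sketch below uses that generic model form; all coefficient values are illustrative assumptions, not fitted module data.

```python
import math

def cycles_to_failure(delta_tj, tj_mean_k, a=3.0e14, n=5.0, ea_ev=0.1,
                      k_b=8.617e-5):
    """Coffin-Manson-Arrhenius form: Nf = A * dTj**-n * exp(Ea/(kB*Tjm)),
    with junction temperature swing dTj (K) and mean junction
    temperature Tjm (K). Coefficients here are placeholders."""
    return a * delta_tj ** (-n) * math.exp(ea_ev / (k_b * tj_mean_k))

def consumed_life(cycle_counts):
    """Miner's rule over a mission profile: damage = sum(n_i / Nf_i).
    cycle_counts: list of (delta_tj, tj_mean_k, n_cycles)."""
    return sum(n_i / cycles_to_failure(dT, Tm) for dT, Tm, n_i in cycle_counts)
```

    A damage sum approaching 1 indicates end of predicted life, which is how a mission profile (e.g. a wind speed time series translated into junction temperature cycles) maps onto a converter lifetime estimate.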

  4. Reliability Generalization: An Examination of the Positive Affect and Negative Affect Schedule

    Science.gov (United States)

    Leue, Anja; Lange, Sebastian

    2011-01-01

    The assessment of positive affect (PA) and negative affect (NA) by means of the Positive Affect and Negative Affect Schedule has received a remarkable popularity in the social sciences. Using a meta-analytic tool--namely, reliability generalization (RG)--population reliability scores of both scales have been investigated on the basis of a random…

  5. Course on Advanced Analytical Chemistry and Chromatography

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Fristrup, Peter; Nielsen, Kristian Fog

    2011-01-01

    Methods of analytical chemistry constitute an integral part of decision making in chemical research, and students must master a high degree of knowledge, in order to perform reliable analysis. At DTU departments of chemistry it was thus decided to develop a course that was attractive to master...... students of different direction of studies, to Ph.D. students and to professionals that need an update of their current state of skills and knowledge. A course of 10 ECTS points was devised with the purpose of introducing students to analytical chemistry and chromatography with the aim of including theory...

  6. Analytical quality control concept in the Euratom on-site laboratories

    International Nuclear Information System (INIS)

    Mayer, K.; Duinslaeger, L.; Cromboom, O.; Ottmar, H.; Wojnowski, D.; Vegt, H. van der

    2001-01-01

    Full text: Two on-site laboratories have been developed, installed, commissioned and put into routine operation by the Euratom safeguards office (ESO), jointly with the Institute for Transuranium Elements (ITU). These laboratories are operated by ITU staff and provide verification measurement results on samples taken by Euratom inspectors. The analysts work in weekly changing shift teams, manage the laboratories and operate the various analytical techniques. Operating such a laboratory at a remote location, without a senior scientist immediately available in case of problems, challenges the robustness of the entire laboratory, i.e. both staff and instrumentation. In order to continuously ensure a high degree of reliability of the measurement results, a stringent quality control system was implemented. The quality control concept for the two on-site laboratories was developed at a very early stage and implemented in the pre-OSL training facility at ITU. This made it possible to thoroughly test and further develop the concept; at the same time the analysts became acquainted with the quality control procedures in place and were instilled with their principles. The quality control concept makes use of a fully computerized data management and data acquisition system. All measurement devices, including balances, density meters, mass spectrometers, the passive neutron counter, the hybrid K-edge instrument, gamma spectrometers and alpha spectrometers, are networked, and data exchange is performed electronically. A specifically developed laboratory information management system collects individual measurement data, calculates intermediate and final results, and shares the information with a quality control module. In order to ensure the reliability of the results, which are reported to the ESO inspectorate, five levels of quality control were implemented. The present paper describes in detail the different levels of quality control, which check the

  7. Guided-inquiry laboratory experiments to improve students' analytical thinking skills

    Science.gov (United States)

    Wahyuni, Tutik S.; Analita, Rizki N.

    2017-12-01

    This study aims to improve the quality of experiment implementation and the analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. The study was a classroom action research project conducted in three cycles, carried out with 38 undergraduate students in the second semester of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets and the undergraduate students' experimental procedures. Research data were analyzed using a quantitative-descriptive method. The increase in analytical thinking skills was measured using the normalized gain score and a statistical paired t-test. The results showed that the guided-inquiry laboratory experiment model was able to improve both the quality of experiment implementation and analytical thinking skills. The N-gain score for analytical thinking skills increased, although only by 0.03 (a low-increase category), as indicated by the experimental reports. Some undergraduate students had difficulties in detecting the relation of one part to another and to an overall structure. The findings suggest that giving feedback on procedural knowledge and experimental reports is important. Revising the experimental procedure, supplemented by some scaffolding questions, was also needed.
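
    The normalized gain score referred to above is commonly computed as Hake's gain, the achieved improvement divided by the maximum possible improvement. A minimal sketch (scores below are invented, not the study's data):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: g = (post - pre) / (max_score - pre).
    Values near 0 indicate little of the possible improvement was
    achieved; values near 1 indicate nearly all of it was."""
    return (post - pre) / (max_score - pre)
```

    On this scale a gain of 0.03, as reported in the abstract, means only 3% of the available headroom was realized, which is why it falls in the low-increase category.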

  8. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and to enhance their detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  9. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used to evaluate analytical chemistry measurement quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features that are not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision), to evaluate the measurement system, and to provide direction for the modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort
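
    The bias and precision estimates central to such a measurement control program can be sketched from repeated measurements of a known control standard: relative bias is the deviation of the mean from the reference value, and precision is the relative standard deviation. The numbers in the test are invented for illustration.

```python
def bias_and_precision(measured, reference):
    """Relative bias and relative standard deviation (precision) from
    repeated measurements of a control standard of known value."""
    n = len(measured)
    mean = sum(measured) / n
    bias = (mean - reference) / reference
    sd = (sum((m - mean) ** 2 for m in measured) / (n - 1)) ** 0.5
    return bias, sd / mean
```

    Tracking these two quantities per instrument over time is what allows real-time flagging of an analytical method drifting out of control.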

  10. Analytical procedure for experimental quantification of carrier concentration in semiconductor devices by using electric scanning probe microscopy

    International Nuclear Information System (INIS)

    Fujita, Takaya; Matsumura, Koji; Itoh, Hiroshi; Fujita, Daisuke

    2014-01-01

    Scanning capacitance microscopy (SCM) is based on a contact-mode variant of atomic force microscopy, which is used for imaging two-dimensional carrier (electron and hole) distributions in semiconductor devices. We introduced a method for quantifying the carrier concentration using experimentally deduced calibration curves, which were prepared for semiconductor materials such as silicon and silicon carbide. The analytical procedure was circulated to research organizations in a round-robin test. The effectiveness of the method was confirmed for practical analysis and for the comparability among users expected in industrial pre-standardization. It was also applied to other electric scanning probe microscopy techniques such as scanning spreading resistance microscopy and scanning nonlinear dielectric microscopy. Their depth profiles of carrier concentration were found to be in good agreement with those characterized by SCM. These results suggest that our proposed method will be compatible with future next-generation microscopy. (paper)

  11. A sensitive analytical procedure for monitoring acrylamide in environmental water samples by offline SPE-UPLC/MS/MS.

    Science.gov (United States)

    Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène

    2015-05-01

    The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L(-1)) compatible with toxicity threshold values. The solid phase extraction (SPE) step was optimized with respect to the nature of the extraction phases, the sampling volumes and the elution solvent, and was validated according to ISO Standard ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces high variability during the SPE step and therefore requires the use of 13C-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty about 25 % (k = 2) at the limit of quantification). The specificity of the method and the stability of acrylamide were studied for these environmental media, and the method was shown to be suitable for measuring acrylamide in environmental studies.
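
    The internal-standard quantification underlying such a method can be sketched generically: the analyte-to-internal-standard peak area ratio is mapped to concentration through a calibration line fitted on spiked standards, which cancels much of the SPE recovery variability. Slope, intercept and areas below are hypothetical values.

```python
def concentration(area_analyte, area_istd, slope, intercept=0.0):
    """Internal-standard quantification: the response ratio
    A_analyte / A_istd is converted to concentration via a calibration
    line ratio = slope * conc + intercept."""
    ratio = area_analyte / area_istd
    return (ratio - intercept) / slope
```

    Because the isotope-labeled standard experiences the same losses as the analyte during extraction, the ratio-based result is far more robust than quantifying on the analyte area alone.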

  12. How to Measure the Onset of Babbling Reliably?

    Science.gov (United States)

    Molemans, Inge; van den Berg, Renate; van Severen, Lieve; Gillis, Steven

    2012-01-01

    Various measures for identifying the onset of babbling have been proposed in the literature, but a formal definition of the exact procedure and a thorough validation of the sample size required for reliably establishing babbling onset is lacking. In this paper the reliability of five commonly used measures is assessed using a large longitudinal…

  13. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One result of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective was to report the features of IBMWA and to discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered predictions similar to those of several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.

  14. Pilot testing of SHRP 2 reliability data and analytical products: Southern California.

    Science.gov (United States)

    2015-01-01

    The second Strategic Highway Research Program (SHRP 2) has been investigating the critical subject of travel time reliability for several years. As part of this research, SHRP 2 supported multiple efforts to develop products to evaluate travel time r...

  15. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft; it is therefore very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Following the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields a system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in that matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, which also implements the computation schemes for the reliability matrix and for multi-path task reliability. Using this tool, we analyzed several cases on typical architectures. The analytic results indicate that a redundant architecture has better reliability performance than a basic one; in practice, a dual redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
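
    The multi-path task reliability computation can be sketched with standard series/parallel relations: a path's reliability is the product of its component reliabilities, and a task served by independent redundant paths succeeds unless every path fails. This is a generic sketch of the relation, not the tool's actual algorithm; the component values below are assumptions.

```python
def path_reliability(component_reliabilities):
    """Series path: the path works only if every component (links,
    routers, nodes) on it works, so reliabilities multiply."""
    r = 1.0
    for ri in component_reliabilities:
        r *= ri
    return r

def task_reliability(paths):
    """Redundant paths assumed independent:
    R_task = 1 - prod(1 - R_path) over all usable paths."""
    q = 1.0
    for p in paths:
        q *= 1.0 - path_reliability(p)
    return 1.0 - q
```

    The numbers make the paper's qualitative conclusion concrete: duplicating a 0.81-reliable path raises task reliability to about 0.964, which is why dual redundancy is adopted for key units.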

  16. A semi-analytical modelling of multistage bunch compression with collective effects

    International Nuclear Information System (INIS)

    Zagorodnov, Igor; Dohlus, Martin

    2010-07-01

    In this paper we introduce an analytical solution (up to the third order) for a multistage bunch compression and acceleration system without collective effects. The solution for the system with collective effects is found by an iterative procedure based on this analytical result. The developed formalism is applied to the FLASH facility at DESY. Analytical estimations of RF tolerances are given. (orig.)

  17. A semi-analytical modelling of multistage bunch compression with collective effects

    Energy Technology Data Exchange (ETDEWEB)

    Zagorodnov, Igor; Dohlus, Martin

    2010-07-15

    In this paper we introduce an analytical solution (up to the third order) for a multistage bunch compression and acceleration system without collective effects. The solution for the system with collective effects is found by an iterative procedure based on this analytical result. The developed formalism is applied to the FLASH facility at DESY. Analytical estimations of RF tolerances are given. (orig.)

  18. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

    High reliability is required of waste management facilities within the fuel cycle of nuclear power stations; it can be achieved by providing intermediate storage facilities and reserve capacities. This report describes a model, based on the theory of Markov processes, that allows computation of the reliability characteristics of waste management facilities containing intermediate storage. The application of the model is demonstrated by an example. (orig.) [de
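A minimal sketch of the Markov-process idea, with an invented three-state chain (the report's actual model and rates are not given in this record): a plant that fails and is repaired, with an intermediate store that keeps the downstream supplied for a while during repairs. The steady-state probabilities follow from solving the balance equations.

```python
# Illustrative continuous-time Markov chain (all rates are assumptions):
# state 0 = plant up; state 1 = plant down, store buffering;
# state 2 = plant down, store exhausted (facility unavailable).

lam = 0.01  # plant failure rate [1/h]
mu = 0.1    # plant repair rate [1/h]
nu = 0.05   # store exhaustion rate [1/h]

# Generator matrix Q (rows sum to zero).
Q = [
    [-lam,        lam,   0.0],
    [  mu, -(mu + nu),    nu],
    [  mu,        0.0,   -mu],
]

def steady_state(Q):
    """Solve pi @ Q = 0 with sum(pi) = 1 by Gaussian elimination."""
    n = len(Q)
    A = [[Q[j][i] for j in range(n)] for i in range(n)]  # Q transposed
    A[-1] = [1.0] * n                                    # normalisation row
    b = [0.0] * (n - 1) + [1.0]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / A[r][r]
    return x

pi = steady_state(Q)
availability = pi[0] + pi[1]  # supplied as long as the store is not exhausted
```

With these rates the intermediate store lifts availability from mu/(lam+mu) ≈ 0.909 for the plant alone to about 0.970 for the facility as a whole, which is the qualitative point of providing buffer storage.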

  19. Study of systematic errors in the determination of total Hg levels in the range -5% in inorganic and organic matrices with two reliable spectrometrical determination procedures

    International Nuclear Information System (INIS)

    Kaiser, G.; Goetz, D.; Toelg, G.; Max-Planck-Institut fuer Metallforschung, Stuttgart; Knapp, G.; Maichin, B.; Spitzy, H.

    1978-01-01

    In the determination of Hg at ng/g and pg/g levels, systematic errors are due to faults in the analytical methods, such as sample intake, preparation and decomposition. The sources of these errors have been studied both with 203Hg radiotracer techniques and with two multi-stage procedures developed for the determination of trace levels. The emission spectrometric (OES-MIP) procedure includes incineration of the sample in a microwave-induced oxygen plasma (MIP), isolation and enrichment on a gold adsorbent, and excitation in an argon plasma (MIP). The emitted Hg radiation (253.7 nm) is evaluated photometrically with a semiconductor element. The detection limit of the OES-MIP procedure was found to be 0.01 ng, the coefficient of variation 5% for 1 ng Hg. The second procedure combines a semi-automated wet digestion method (HClO3/HNO3) with a reduction-aeration step (ascorbic acid/SnCl2) and the flameless atomic absorption technique (253.7 nm). The detection limit of this procedure was found to be 0.5 ng, the coefficient of variation 5% for 5 ng Hg. (orig.) [de

  20. Analytical measurements of fission products during a severe nuclear accident

    Science.gov (United States)

    Doizi, D.; Reymond la Ruinaz, S.; Haykal, I.; Manceron, L.; Perrin, A.; Boudon, V.; Vander Auwera, J.; Kwabia Tchana, F.; Faye, M.

    2018-01-01

    The Fukushima accident emphasized that ways to monitor the evolution of a nuclear reactor in real time during a severe accident remain to be developed. No fission products were monitored for twelve days; only dose rates were measured, which is not sufficient to carry out an online diagnosis of the event. The first measurements were announced with little reliability for low-volatility fission products. In order to improve the safety of nuclear plants and minimize the industrial, ecological and health consequences of a severe accident, it is necessary to develop new reliable measurement systems, operating as early and as close to the emission source of fission products as possible. Through the French ANR program « Projet d'Investissement d'Avenir », the aim of the DECA-PF project (diagnosis of core degradation from fission product measurements) is to monitor in real time the release of the major fission products (krypton, xenon, and gaseous forms of iodine and ruthenium) outside the nuclear reactor containment. These products are released at different times during a nuclear accident and at different stages of core degradation. Thus, monitoring these fission products gives information on the situation inside the containment and helps to apply the Severe Accident Management procedures. Analytical techniques have been proposed and evaluated; the results are discussed here.

  1. Analytical measurements of fission products during a severe nuclear accident

    Directory of Open Access Journals (Sweden)

    Doizi D.

    2018-01-01

    Full Text Available The Fukushima accident emphasized that ways to monitor the evolution of a nuclear reactor in real time during a severe accident remain to be developed. No fission products were monitored for twelve days; only dose rates were measured, which is not sufficient to carry out an online diagnosis of the event. The first measurements were announced with little reliability for low-volatility fission products. In order to improve the safety of nuclear plants and minimize the industrial, ecological and health consequences of a severe accident, it is necessary to develop new reliable measurement systems, operating as early and as close to the emission source of fission products as possible. Through the French ANR program « Projet d'Investissement d'Avenir », the aim of the DECA-PF project (diagnosis of core degradation from fission product measurements) is to monitor in real time the release of the major fission products (krypton, xenon, and gaseous forms of iodine and ruthenium) outside the nuclear reactor containment. These products are released at different times during a nuclear accident and at different stages of core degradation. Thus, monitoring these fission products gives information on the situation inside the containment and helps to apply the Severe Accident Management procedures. Analytical techniques have been proposed and evaluated; the results are discussed here.

  2. Evaluation of the quality of results obtained in institutions participating in interlaboratory experiments and of the reliability characteristics of the analytical methods used on the basis of certification of standard soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Parshin, A.K.; Obol'yaninova, V.G.; Sul'dina, N.P.

    1986-08-20

    Rapid monitoring of the level of pollution of the environment, and especially of soils, necessitates the preparation of standard samples (SS) close in properties and material composition to the objects to be analyzed. During 1978-1982, four sets (three types of samples in each) of State Standard Samples of different soils were developed: soddy-podzolic sandy loam, typical chernozem, krasnozem, and calcareous sierozem. The certification studies of the soil SS were carried out in accordance with the classical scheme of an interlaboratory experiment (ILE). More than 100 institutions were involved in the ILE, and the total number of independent analytical results was of the order of 10^4. With such a volume of analytical information at their disposal, the authors were able to identify some general characteristics intrinsic to certification studies, to assess the quality of work of the ILE participants with due regard for their specialization, and to evaluate the reliability characteristics of the analytical methods used.
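One common way to grade participating laboratories in such interlaboratory certification rounds is the z-score against the assigned value; this is standard proficiency-testing practice (per ISO 13528), not necessarily the exact procedure of this study, and the numbers below are invented:

```python
# Hypothetical z-score grading for one certified soil property [mg/kg].
# assigned = certified value; sigma_p = standard deviation for proficiency.

def z_scores(lab_means, assigned, sigma_p):
    """z = (x - assigned) / sigma_p; |z| <= 2 satisfactory, |z| > 3 action."""
    return {lab: (x - assigned) / sigma_p for lab, x in lab_means.items()}

# Invented reported means from four participating labs.
labs = {"lab_A": 10.2, "lab_B": 9.7, "lab_C": 11.9, "lab_D": 10.0}
z = z_scores(labs, assigned=10.0, sigma_p=0.5)
flagged = [lab for lab, v in z.items() if abs(v) > 3]  # labs needing action
```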

  3. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    Science.gov (United States)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.
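The failure criterion underlying such a fracture-reliability problem can be illustrated with a plain Monte Carlo estimate (not the paper's PFEM/second-moment machinery): failure when the stress intensity factor K = Y·σ·√(πa) reaches the toughness. All distributions and parameter values below are assumptions for illustration.

```python
import math
import random

random.seed(1)

Y = 1.0        # geometry factor (assumed)
N = 200_000    # Monte Carlo samples
failures = 0
for _ in range(N):
    sigma = random.gauss(150.0, 20.0)   # applied stress [MPa] (assumed)
    a = max(random.gauss(0.04, 0.008), 1e-9)  # crack half-length [m] (assumed)
    k_ic = random.gauss(60.0, 5.0)      # fracture toughness [MPa*sqrt(m)] (assumed)
    k = Y * sigma * math.sqrt(math.pi * a)    # stress intensity factor
    if k >= k_ic:
        failures += 1

pf = failures / N  # estimated probability of fracture
```

A first-order second-moment calculation on the same data would give a comparable failure probability at far lower cost, which is the motivation for the PFEM approach in the record above.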

  4. Reliability evaluation of deregulated electric power systems for planning applications

    International Nuclear Information System (INIS)

    Ehsani, A.; Ranjbar, A.M.; Jafari, A.; Fotuhi-Firuzabad, M.

    2008-01-01

    In a deregulated electric power utility industry in which a competitive electricity market can influence system reliability, market risks cannot be ignored. This paper (1) proposes an analytical probabilistic model for reliability evaluation of competitive electricity markets and (2) develops a methodology for incorporating the market reliability problem into HLII reliability studies. A Markov state space diagram is employed to evaluate the market reliability. Since the market is a continuously operated system, the concept of absorbing states is applied to it in order to evaluate the reliability. The market states are identified by using market performance indices, and the transition rates are calculated by using historical data. The key point in the proposed method is the concept that the reliability level of a restructured electric power system can be calculated using the availability of the composite power system (HLII) and the reliability of the electricity market. Two case studies are carried out on the Roy Billinton Test System (RBTS) to illustrate interesting features of the proposed methodology.
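The absorbing-state idea can be sketched on a toy chain (the paper's actual states, indices and rates are not given in this record): a "normal / stressed" market with an absorbing "collapsed" state, where the mean time to absorption solves the linear system Q_TT·t = -1 over the transient block of the generator.

```python
# Hedged illustration; all transition rates are invented.
# Transient states: 0 = normal, 1 = stressed; absorbing state: collapsed.

a = 0.02   # normal -> stressed    [1/day]
b = 0.30   # stressed -> normal    [1/day]
c = 0.01   # stressed -> collapsed [1/day]

# Transient sub-generator Q_TT (each row sums to minus its exit rate).
q00, q01 = -a, a
q10, q11 = b, -(b + c)

# Solve [[q00, q01], [q10, q11]] @ [t0, t1] = [-1, -1] by Cramer's rule.
det = q00 * q11 - q01 * q10
t0 = (-1 * q11 - q01 * -1) / det    # mean days to collapse from "normal"
t1 = (q00 * -1 - (-1) * q10) / det  # mean days to collapse from "stressed"
```

A first-step-analysis check confirms the result: t0 = 1/a + t1 and t1 = 1/(b+c) + (b/(b+c))·t0 give t0 = 1650 and t1 = 1600 days for these rates.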

  5. Human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-08-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (T&M) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses these contributions on a comparative basis. The aim of this analysis was to compare the procedures, modelling techniques and quantification methods used; to obtain insight into the causes and magnitude of the variability observed in the results; to try to identify preferred human reliability assessment approaches; and to get an understanding of the current state of the art in the field, identifying the limitations that are still inherent in the different approaches.

  6. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, the method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply the Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual cumulative degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration, minimizing the asymptotic variance of the decision variable under constraints on sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability within a feasible test duration for products with long service lives. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs (aimed at more accurate reliability estimation) in both its objective function and its constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
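The demonstration logic above can be sketched as follows: degradation follows a Gamma process (independent Gamma-distributed increments, hence strictly monotonic paths), and the test passes if the cumulative degradation observed over the test stays below an allowable level derived from the required reliability. All parameter values and the allowable level below are invented, not the paper's.

```python
import random

random.seed(7)

k = 0.5        # shape rate per hour (assumption)
theta = 0.01   # scale parameter (assumption)
dt = 1.0       # inspection interval [h]
T = 1000       # test duration [h]

def gamma_path(k, theta, dt, T):
    """One Gamma-process sample path: Gamma(k*dt, theta) increments."""
    x, path = 0.0, []
    for _ in range(int(T / dt)):
        x += random.gammavariate(k * dt, theta)  # shape, scale
        path.append(x)
    return path

path = gamma_path(k, theta, dt, T)
observed = path[-1]            # cumulative degradation at end of test
allowable = 6.0                # allowable level from required reliability (assumption)
passed = observed <= allowable # demonstration decision
```

The mean cumulative degradation is k·theta·T = 5.0 here, so a well-margined unit should normally pass against the assumed allowable level of 6.0.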

  7. Analytical approximations for wide and narrow resonances

    International Nuclear Information System (INIS)

    Suster, Luis Carlos; Martinez, Aquilino Senra; Silva, Fernando Carvalho da

    2005-01-01

    This paper aims at developing analytical expressions for the adjoint neutron spectrum in the resonance energy region, taking into account both narrow and wide resonance approximations, in order to reduce the numerical computations involved. These analytical expressions, besides reducing computing time, are very simple from a mathematical point of view. The results obtained with this analytical formulation were compared to a reference solution obtained with a numerical method previously developed to solve the neutron balance adjoint equations. Narrow and wide resonances of U-238 were treated, and the analytical procedure gave satisfactory results as compared with the reference solution for the resonance energy range. The adjoint neutron spectrum is useful to determine the neutron resonance absorption, so that multigroup adjoint cross sections used by the adjoint diffusion equation can be obtained. (author)

  8. Analytical approximations for wide and narrow resonances

    Energy Technology Data Exchange (ETDEWEB)

    Suster, Luis Carlos; Martinez, Aquilino Senra; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear]. E-mail: aquilino@lmp.ufrj.br

    2005-07-01

    This paper aims at developing analytical expressions for the adjoint neutron spectrum in the resonance energy region, taking into account both narrow and wide resonance approximations, in order to reduce the numerical computations involved. These analytical expressions, besides reducing computing time, are very simple from a mathematical point of view. The results obtained with this analytical formulation were compared to a reference solution obtained with a numerical method previously developed to solve the neutron balance adjoint equations. Narrow and wide resonances of U-238 were treated, and the analytical procedure gave satisfactory results as compared with the reference solution for the resonance energy range. The adjoint neutron spectrum is useful to determine the neutron resonance absorption, so that multigroup adjoint cross sections used by the adjoint diffusion equation can be obtained. (author)

  9. Analytic chemistry of molybdenum

    International Nuclear Information System (INIS)

    Parker, G.A.

    1983-01-01

    Electrochemical, colorimetric, gravimetric, spectroscopic, and radiochemical methods for the determination of molybdenum are summarized in this book. Some laboratory procedures are described in detail, while literature citations are given for others. The reader is also referred to older comprehensive reviews of the analytical chemistry of molybdenum. Contents, abridged: Gravimetric methods. Titrimetric methods. Colorimetric methods. X-ray fluorescence. Voltammetry. Catalytic methods. Molybdenum in non-ferrous alloys. Molybdenum compounds

  10. Strategy for continuous improvement in IC manufacturability, yield, and reliability

    Science.gov (United States)

    Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary

    1993-01-01

    Continual improvements in yield, reliability and manufacturability measure a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability, failed bit map analysis, analytical tools, inline monitoring, cross functional teams and a defect engineering group. The strategy requires the fastest detection, identification and implementation of possible corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. Payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.

  11. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    Science.gov (United States)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of highly reliable fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  12. Gearbox Reliability Collaborative Analytic Formulation for the Evaluation of Spline Couplings

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yi [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keller, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Errichello, Robert [GEARTECH, Houston, TX (United States); Halse, Chris [Romax Technology, Nottingham (United Kingdom)

    2013-12-01

    Gearboxes in wind turbines have not been achieving their expected design life; however, they commonly meet and exceed the design criteria specified in current standards in the gear, bearing, and wind turbine industry as well as third-party certification criteria. The cost of gearbox replacements and rebuilds, as well as the down time associated with these failures, has elevated the cost of wind energy. The National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) was established by the U.S. Department of Energy in 2006; its key goal is to understand the root causes of premature gearbox failures and improve their reliability using a combined approach of dynamometer testing, field testing, and modeling. As part of the GRC program, this paper investigates the design of the spline coupling often used in modern wind turbine gearboxes to connect the planetary and helical gear stages. Aside from transmitting the driving torque, another common function of the spline coupling is to allow the sun to float between the planets. The amount the sun can float is determined by the spline design and the sun shaft flexibility subject to the operational loads. Current standards address spline coupling design requirements in varying detail. This report provides additional insight beyond these current standards to quickly evaluate spline coupling designs.

  13. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  14. Measurement methods to assess diastasis of the rectus abdominis muscle (DRAM): A systematic review of their measurement properties and meta-analytic reliability generalisation.

    Science.gov (United States)

    van de Water, A T M; Benjamin, D R

    2016-02-01

    Systematic literature review. Diastasis of the rectus abdominis muscle (DRAM) has been linked with low back pain and abdominal and pelvic dysfunction. Measurement is used either to screen for DRAM or to monitor DRAM width; determining which methods are suitable for each purpose is of clinical value. The objective was to identify the best methods to screen for DRAM presence and monitor DRAM width. AMED, Embase, Medline, PubMed and CINAHL databases were searched for measurement property studies of DRAM measurement methods. Population characteristics, measurement methods/procedures and measurement information were extracted from the included studies. The quality of all studies was evaluated using 'quality rating criteria'. When possible, reliability generalisation was conducted to provide combined reliability estimations. Thirteen studies evaluated measurement properties of the 'finger width' method, tape measure, calipers, ultrasound, CT and MRI; ultrasound was the most evaluated. The methodological quality of these studies varied widely. Pearson's correlations of r = 0.66-0.79 were found between caliper and ultrasound measurements. Calipers and ultrasound had intraclass correlation coefficients (ICC) of 0.78-0.97 for test-retest, inter- and intra-rater reliability. The 'finger width' method had weighted kappas of 0.73-0.77 for test-retest reliability, but moderate agreement (63%; weighted kappa = 0.53) between raters. Comparing calipers and ultrasound, low measurement error was found (above the umbilicus), and the methods had good agreement (83%; weighted kappa = 0.66) for discriminative purposes. The available information supports ultrasound and calipers as adequate methods to assess DRAM. For other methods, limited measurement information of low to moderate quality is available, and further evaluation of their measurement properties is required. Copyright © 2015 Elsevier Ltd. All rights reserved.
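The review reports agreement as weighted kappa; as an illustration of the statistic, here is linearly weighted Cohen's kappa computed from a small hypothetical 3-category rating table (the counts are invented, not the review's data):

```python
def weighted_kappa(table):
    """Linearly weighted Cohen's kappa for a square contingency table of counts."""
    n_cat = len(table)
    total = sum(sum(row) for row in table)
    row_m = [sum(row) / total for row in table]
    col_m = [sum(table[r][c] for r in range(n_cat)) / total for c in range(n_cat)]
    max_d = n_cat - 1
    po = pe = 0.0  # weighted observed and expected agreement
    for i in range(n_cat):
        for j in range(n_cat):
            w = 1.0 - abs(i - j) / max_d   # linear disagreement weight
            po += w * table[i][j] / total
            pe += w * row_m[i] * col_m[j]
    return (po - pe) / (1.0 - pe)

# Hypothetical DRAM ratings (none / moderate / severe) by two raters.
table = [
    [20, 5, 0],
    [4, 15, 3],
    [1, 2, 10],
]
kappa = weighted_kappa(table)  # about 0.68 for this invented table
```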

  15. Improvement of level-1 PSA computer code package - Modeling and analysis for dynamic reliability of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hoon; Baek, Sang Yeup; Shin, In Sup; Moon, Shin Myung; Moon, Jae Phil; Koo, Hoon Young; Kim, Ju Shin [Seoul National University, Seoul (Korea, Republic of); Hong, Jung Sik [Seoul National Polytechnology University, Seoul (Korea, Republic of); Lim, Tae Jin [Soongsil University, Seoul (Korea, Republic of)

    1996-08-01

    The objective of this project is to develop a methodology for the dynamic reliability analysis of NPPs. The first year's research focused on developing a procedure for analyzing failure data of running components and a simulator for estimating the reliability of series-parallel structures. The second year's research concentrated on estimating the lifetime distribution and PM effect of a component from its failure data in various cases, and the lifetime distribution of a system with a particular structure. Computer codes for performing these jobs were also developed. The objective of the third year's research is to develop models for analyzing special failure types (CCFs, standby redundant structures) that were not considered in the first two years, and to complete a methodology of dynamic reliability analysis for nuclear power plants. The analysis of component failure data and related research supporting the simulator must precede these tasks to provide proper input to the simulator. Thus this research is divided into three major parts. 1. Analysis of the time-dependent life distribution and the PM effect. 2. Development of a simulator for system reliability analysis. 3. Related research supporting the simulator: accelerated simulation analytic approach using PH-type distributions, and analysis of dynamic repair effects. 154 refs., 5 tabs., 87 figs. (author)

  16. Analytical procedures for identifying anthocyanins in natural extracts

    International Nuclear Information System (INIS)

    Marco, Paulo Henrique; Poppi, Ronei Jesus; Scarminio, Ieda Spacino

    2008-01-01

    Anthocyanins are among the most important plant pigments. Due to their potential benefits for human health, there is considerable interest in these natural pigments. Nonetheless, it is difficult to find a technique that can both identify structurally similar compounds and estimate the number and concentration of the species present. Many techniques have been tried in the search for the best methodology to extract information from these systems. In this paper, a review of the most important procedures is given, from extraction to the identification of anthocyanins in natural extracts. (author)

  17. Structural Optimization with Reliability Constraints

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1986-01-01

    During the last 25 years considerable progress has been made in the fields of structural optimization and structural reliability theory. In classical deterministic structural optimization all variables are assumed to be deterministic. Due to the unpredictability of loads and strengths of actual......]. In this paper we consider only structures which can be modelled as systems of elasto-plastic elements, e.g. frame and truss structures. In section 2 a method to evaluate the reliability of such structural systems is presented. Based on a probabilistic point of view a modern structural optimization problem...... is formulated in section 3. The formulation is a natural extension of the commonly used formulations in deterministic structural optimization. The mathematical form of the optimization problem is briefly discussed. In section 4 two new optimization procedures especially designed for the reliability...

  18. Analytical evaluation of nonlinear distortion effects on multicarrier signals

    CERN Document Server

    Araújo, Theresa

    2015-01-01

    Due to their ability to support reliable, high quality of service as well as spectral and power efficiency, multicarrier modulation systems have found increasing use in modern communications services. However, one of the main drawbacks of these systems is their vulnerability to nonlinear distortion effects. Analytical Evaluation of Nonlinear Distortion Effects on Multicarrier Signals details a unified approach to well-known analytical results on memoryless nonlinearities that takes advantage of the Gaussian behavior of multicarrier signals. Sharing new insights into the behavior of nonlinearly d

  19. Reliability Analysis of Money Habitudes

    Science.gov (United States)

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
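The study analyzed each Habitude domain with Cronbach's alpha; a minimal implementation of the coefficient, run on invented item scores rather than the survey's data, is sketched here:

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of per-item score lists (one per item)."""
    k = len(items)                 # number of items in the domain
    n = len(items[0])              # number of respondents

    def var(xs):                   # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var = sum(var(item) for item in items)
    return (k / (k - 1)) * (1.0 - item_var / var(totals))

# Invented 4-item, 5-respondent example on a 1-5 scale.
scores = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
    [3, 5, 3, 4, 2],
]
alpha = cronbach_alpha(scores)  # about 0.89 for this invented data
```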

  20. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 3: HARP Graphics Oriented (GO) input user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of highly reliable fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.

  1. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    Science.gov (United States)

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  2. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology

    Directory of Open Access Journals (Sweden)

    Mafalda Jesus

    2016-01-01

    Full Text Available Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  3. On the reliability evaluation of communication equipment for SMART using FMEA

    International Nuclear Information System (INIS)

    Kim, D. H.; Suh, Y. S.; Koo, I. S.; Song, Ki Sang; Han, Byung Rae

    2000-07-01

    This report describes the reliability analysis method for communication equipment using FMEA and FTA. The major pieces of equipment applicable to SMART communication networks are the repeater, bridge, router and gateway, and the FMEA or FTA technique can be applied to them. The FMEA process includes analysis of the tagged system, deciding the level of analysis of the target system, drawing a reliability block diagram according to function, deciding the failure modes, writing down the fault causes, filling in the FMEA sheet and deciding the FMEA level. With FTA, it is possible to determine the top-event causes and the system reliability. With these in mind, we performed FMEA and FTA for the NIC, hub, client server and router. We also suggested an integrated network model for a nuclear power plant and have shown the reliability analysis procedure according to FTA. If any proprietary communication device is developed, its reliability can be easily determined with the proposed procedures.
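
    The FTA step described in the report boils down to propagating basic-event probabilities through AND/OR gates to a top event. A minimal sketch, assuming independent failures; the device names and probabilities below are hypothetical, not figures from the report:

```python
# Hedged sketch of fault-tree top-event calculation via AND/OR gates.
# Device names and failure probabilities are hypothetical examples.

def gate_and(*probs):
    # AND gate: the output event occurs only if all inputs fail (independence assumed)
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(*probs):
    # OR gate: the output event occurs if any input fails (independence assumed)
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

p_nic, p_hub, p_router = 1e-3, 5e-4, 2e-4
# Two redundant hubs (AND gate), in series with the NIC and router (OR gate):
p_top = gate_or(p_nic, gate_and(p_hub, p_hub), p_router)
print(f"{p_top:.6e}")
```

    Real fault trees are evaluated via minimal cut sets, but the gate-by-gate propagation above is the underlying arithmetic.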

  4. On the reliability evaluation of communication equipment for SMART using FMEA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Suh, Y. S.; Koo, I. S.; Song, Ki Sang; Han, Byung Rae

    2000-07-01

    This report describes the reliability analysis method for communication equipment using FMEA and FTA. The major pieces of equipment applicable to SMART communication networks are the repeater, bridge, router and gateway, and the FMEA or FTA technique can be applied to them. The FMEA process includes analysis of the tagged system, deciding the level of analysis of the target system, drawing a reliability block diagram according to function, deciding the failure modes, writing down the fault causes, filling in the FMEA sheet and deciding the FMEA level. With FTA, it is possible to determine the top-event causes and the system reliability. With these in mind, we performed FMEA and FTA for the NIC, hub, client server and router. We also suggested an integrated network model for a nuclear power plant and have shown the reliability analysis procedure according to FTA. If any proprietary communication device is developed, its reliability can be easily determined with the proposed procedures.

  5. Reliability of assessment of adherence to an antimicrobial treatment guideline

    NARCIS (Netherlands)

    Mol, PGM; Gans, ROB; Panday, PVN; Degener, JE; Laseur, M; Haaijer-Ruskamp, FM

    Assessment procedures for adherence to a guideline must be reliable and credible. The aim of this study was to explore the reliability of assessment of adherence, taking account of the professional backgrounds of the observers. A secondary analysis explored the impact of case characteristics on…

  6. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    Science.gov (United States)

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years old) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent were produced, including hostile and neutral; along with 3 emotion measures, focused on negative emotional states; 8 response evaluation measures; and 4 response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys seemed to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, and assertiveness and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems a valuable alternative for evaluating social information processing, even if it is essential to continue investigation into its internal and external validity. (c) 2016 APA, all rights reserved.

  7. Statistically qualified neuro-analytic failure detection method and system

    Science.gov (United States)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the adapted deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  8. Analytical standards for accountability of uranium hexafluoride - 1972

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    An analytical standard for the accountability of uranium hexafluoride is presented that includes procedures for subsampling, determination of uranium, determination of metallic impurities and isotopic analysis by gas and thermal ionization mass spectrometry

  9. Modeling reliability measurement of interface on information system: Towards the forensic of rules

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan

    2018-02-01

    Today almost all machines depend on software, and a software and hardware system also depends on the rules, i.e., the procedures, for its use. If a procedure or program can be reliably characterized using the concepts of graphs, logic, and probability, then regulatory strength can also be measured accordingly. Therefore, this paper initiates an enumeration model to measure the reliability of interfaces, based on the case of information systems supported by rules of use issued by the relevant agencies. An enumeration model is obtained based on software reliability calculation.

  10. Analytical capabilities of laser-probe mass spectrometry

    International Nuclear Information System (INIS)

    Kovalev, I.D.; Madsimov, G.A.; Suchkov, A.I.; Larin, N.V.

    1978-01-01

    The physical bases and quantitative analytical procedures of laser-probe mass spectrometry are considered in this review. A comparison is made of the capabilities of static and dynamic mass spectrometers. Techniques are studied for improving the analytical characteristics of laser-probe mass spectrometers. The advantages, for quantitative analysis, of the Q-switched mode over the normal pulse mode for lasers are: (a) the possibility of analysing metals, semiconductors and insulators without the use of standards; and (b) the possibility of layer-by-layer and local analysis. (Auth.)

  11. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBP), soft controls (SCs), and integrated information systems are being adopted in main control rooms (MCR) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From the observations of human factors engineering verification and validation experiments, we have drawn some major important characteristics on operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. Firstly, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms including especially CBP and SCs. In the case of the computer-based procedure rather than the paper-based procedure, the structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of the soft controllers, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Secondly, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of the general computer-based designs

  12. A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry

    Science.gov (United States)

    Adami, Gianpiero

    2006-01-01

    A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest for analytical chemistry and environmental sciences and…

  13. The growing need for analytical quality control

    International Nuclear Information System (INIS)

    Suschny, O.; Richman, D.M.

    1974-01-01

    Technological development in a country is directly dependent upon its analytical chemistry or measurement capability, because it is impossible to achieve any level of technological sophistication without the ability to measure. Measurement capability is needed to determine both technological competence and technological consequence. But measurement itself is insufficient. There must be a standard or a reference for comparison. In the complicated world of chemistry, the need for reference materials grows with successful technological development. The International Atomic Energy Agency has been distributing calibrated radioisotope solutions, standard reference materials and intercomparison materials since the early 1960's. The purpose of this activity has been to help laboratories in its Member States to assess and, if necessary, to improve the reliability of their analytical work. The value and continued need of this service have been demonstrated by the results of many intercomparisons, which proved that without continuing analytical quality control activities, adequate reliability of analytical data cannot be taken for granted. Analytical chemistry, lacking the glamour of other aspects of the physical sciences, has not attracted the attention it deserves, but in terms of practical importance it warrants high priority in any developing technological scheme, because without it there is little chance to evaluate technological success or failure, or opportunity to identify the reasons for success or failure. The scope and size of the future programme of the IAEA in this field have been delineated by recommendations made by several Panels of Experts; all have agreed on the importance of this programme and made detailed recommendations in their areas of expertise. The Agency's resources are limited and it cannot on its own undertake the preparation and distribution of all the materials needed. It can, however, offer a focal point to bring together different…

  14. Research on Operating Procedure Development in View of RCM Theory

    International Nuclear Information System (INIS)

    Shi, J.

    2015-01-01

    The operation of NPPs (nuclear power plants) is closely related to SSC (structure, system and component) function implementations and failure recoveries, and strictly follows operating procedures. The philosophy of RCM (Reliability Centered Maintenance), a widely used systematic engineering approach in industry focusing on such facility functions and the effectiveness of maintenance, is adopted in the related analysis of NPP operation in this paper. Based on the theory of RCM, the paper discusses the general logic of operating procedure development and framework optimization, combined with NPP engineering design. Since the quality of operating procedures has a significant impact on the safe and reliable operation of NPPs, the paper provides a proposed operating procedure development logic diagram for reference for the procedure optimization task ahead. (author)

  15. Use of reference materials for validating analytical methods. Applied to the determination of As, Co, Na, Hg, Se and Fe using neutron activation analysis

    International Nuclear Information System (INIS)

    Munoz, L; Andonie, O; Kohnenkamp, I

    2000-01-01

    The main purpose of an analytical laboratory is to provide reliable information on the nature and composition of the materials submitted for analysis. This purpose can only be attained if analytical methodologies that have the attributes of accuracy, precision, specificity and sensitivity, among others, are used. The process by which these attributes are evaluated is called validation of the analytical method. The Chilean Nuclear Energy Commission's Neutron Activation Analysis Laboratory is applying a quality guarantee program to ensure the quality of its analytical results, which aims, as well, to attain accreditation for some of its measurements. Validation of the analytical methodologies used is an essential part of applying this program. There are many forms of validation, from comparison with reference techniques to participation in inter-comparison rounds. Certified reference materials were used in this work in order to validate the application of neutron activation analysis in determining As, Co, Na, Hg, Se and Fe in shellfish samples. The use of reference materials was chosen because it is a simple option that easily detects sources of systematic errors. Neutron activation analysis is an instrumental analytical method that does not need chemical treatment and that is based on processes which take place in the nuclei of atoms, making the matrix effects unimportant and different biological reference materials can be used. The following certified reference materials were used for validating the method used: BCR human hair 397, NRCC dogfish muscle DORM-2, NRCC -dogfish liver DOLT-2, NIST - oyster tissue 1566, NIES - mussel 6 and BCR - tuna fish 464. The reference materials were analyzed using the procedure developed for the shellfish samples and the above-mentioned elements were determined. 
With the results obtained, the parameters of accuracy, precision, detection limit, quantification limit and uncertainty associated with the method were determined for each…

  16. Reliable and valid assessment of Lichtenstein hernia repair skills.

    Science.gov (United States)

    Carlsen, C G; Lindorff-Larsen, K; Funch-Jensen, P; Lund, L; Charles, P; Konge, L

    2014-08-01

    Lichtenstein hernia repair is a common surgical procedure and one of the first procedures performed by a surgical trainee. However, formal assessment tools developed for this procedure are few and sparsely validated. The aim of this study was to determine the reliability and validity of an assessment tool designed to measure surgical skills in Lichtenstein hernia repair. Key issues were identified through a focus group interview. On this basis, an assessment tool with eight items was designed. Ten surgeons and surgical trainees were video recorded while performing Lichtenstein hernia repair (four experts, three intermediates, and three novices). The videos were blindly and individually assessed by three raters (surgical consultants) using the assessment tool. Based on these assessments, validity and reliability were explored. The internal consistency of the items was high (Cronbach's alpha = 0.97). The inter-rater reliability was very good with an intra-class correlation coefficient (ICC) = 0.93. Generalizability analysis showed a coefficient above 0.8 even with one rater. The coefficient improved to 0.92 if three raters were used. One-way analysis of variance found a significant difference between the three groups, which indicates construct validity (p < 0.05). Skills can thus be assessed in a reliable and valid fashion with the new procedure-specific assessment tool. We recommend this tool for future assessment of trainees performing Lichtenstein hernia repair to ensure that the objectives of competency-based surgical training are met.
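
    The intra-class correlation reported in such rater studies can be computed from a two-way ANOVA decomposition of the rating table. A sketch assuming a two-way random-effects model (Shrout-Fleiss ICC(2,1) and ICC(2,k)); the ratings below are hypothetical, not the study's data:

```python
# Hedged sketch of two-way random-effects intraclass correlations.
# Rows are rated videos, columns are raters; the ratings are hypothetical.

def icc2(scores):
    """scores: list of rows (one per subject), each with one rating per rater.
    Returns (ICC(2,1) for a single rater, ICC(2,k) for the average of k raters)."""
    n, k = len(scores), len(scores[0])
    gm = sum(sum(r) for r in scores) / (n * k)
    row_means = [sum(r) / k for r in scores]
    col_means = [sum(r[j] for r in scores) / n for j in range(k)]
    ss_total = sum((x - gm) ** 2 for r in scores for x in r)
    ss_rows = k * sum((m - gm) ** 2 for m in row_means)
    ss_cols = n * sum((m - gm) ** 2 for m in col_means)
    msr = ss_rows / (n - 1)                      # between-subject mean square
    msc = ss_cols / (k - 1)                      # between-rater mean square
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    icc_single = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
    icc_avg = (msr - mse) / (msr + (msc - mse) / n)
    return icc_single, icc_avg

ratings = [[8, 7, 8], [5, 5, 6], [9, 9, 9], [4, 3, 4], [7, 6, 7]]
single, avg = icc2(ratings)
print(round(single, 3), round(avg, 3))  # average of raters is more reliable than one
```

    As in the study, averaging over more raters raises the coefficient, which is why a generalizability analysis reports reliability as a function of the number of raters.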

  17. Psychometrics Matter in Health Behavior: A Long-term Reliability Generalization Study.

    Science.gov (United States)

    Pickett, Andrew C; Valdez, Danny; Barry, Adam E

    2017-09-01

    Despite numerous calls for increased understanding and reporting of reliability estimates, social science research, including the field of health behavior, has been slow to respond and adopt such practices. Therefore, we offer a brief overview of reliability and common reporting errors; we then perform analyses to examine and demonstrate the variability of reliability estimates by sample and over time. Using meta-analytic reliability generalization, we examined the variability of coefficient alpha scores for a well-designed, consistent, nationwide health study, covering a span of nearly 40 years. For each year and sample, reliability varied. Furthermore, reliability was predicted by a sample characteristic that differed among age groups within each administration. We demonstrated that reliability is influenced by the methods and individuals from which a given sample is drawn. Our work echoes previous calls that psychometric properties, particularly reliability of scores, are important and must be considered and reported before drawing statistical conclusions.

  18. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Camara Vincent A. R.

    1998-01-01

    Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changing of the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results. It is shown that empirical Bayes reliability functions are in general sensitive to the choice of the loss function, and that the squared error loss does not always yield the best empirical Bayes reliability estimate.
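
    As a concrete illustration of the squared-error-loss case (a standard textbook setting, not the paper's specific derivation): for exponential lifetimes with a Gamma posterior on the failure rate, the Bayes estimate of the reliability function under squared-error loss is the posterior mean of exp(-lambda*t), which has a closed form:

```python
# Illustrative sketch, not the paper's derivation: Bayes estimation of
# R(t) = exp(-lambda * t) under squared-error loss, with a Gamma(alpha, beta)
# posterior on the exponential failure rate lambda. The posterior mean of
# exp(-lambda * t) is the Gamma MGF at -t: (beta / (beta + t)) ** alpha.
import math

def bayes_reliability_sqerr(t, alpha, beta):
    return (beta / (beta + t)) ** alpha

def plugin_reliability(t, alpha, beta):
    # naive plug-in of the posterior mean rate alpha/beta, for comparison
    return math.exp(-(alpha / beta) * t)

alpha, beta = 5.0, 1000.0   # hypothetical posterior after observed failure data
for t in (50.0, 200.0):
    print(t,
          round(bayes_reliability_sqerr(t, alpha, beta), 4),
          round(plugin_reliability(t, alpha, beta), 4))
```

    The two estimates differ, and under another loss function (Higgins-Tsokos, Harris, logarithmic) the optimal estimate differs again, which is the sensitivity the paper studies.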

  19. Parametric statistical techniques for the comparative analysis of censored reliability data: a review

    International Nuclear Information System (INIS)

    Bohoris, George A.

    1995-01-01

    This paper summarizes part of the work carried out to date on seeking analytical solutions to the two-sample problem with censored data in the context of reliability and maintenance optimization applications. For this purpose, parametric two-sample tests for failure and censored reliability data are introduced and their applicability/effectiveness in common engineering problems is reviewed

  20. A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies

    Directory of Open Access Journals (Sweden)

    Elodie Jobard

    2016-12-01

    Full Text Available The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or by delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that the delay and the storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies.

  1. Numerical and semi-analytical modelling of the process induced distortions in pultrusion

    DEFF Research Database (Denmark)

    Baran, Ismet; Carlone, P.; Hattel, Jesper Henri

    2013-01-01

    …the transient distortions are inferred by adopting a semi-analytical procedure, i.e., post-processing numerical results by means of analytical methods. The predictions of the process-induced distortion development using the aforementioned methods are found to be qualitatively close to each other…

  2. Food irradiation. An update of legal and analytical aspects

    International Nuclear Information System (INIS)

    Masotti, P.; Zonta, F.

    1999-01-01

    A new European directive concerning the ionising radiation treatment of foodstuffs has recently been adopted, although national laws may continue to be applied at least until 31 December 2000. A brief updated review dealing with the legal and analytical aspects of food irradiation is presented. The legal status of the food irradiation issue presently in force in Italy, in the European Union and in the USA is discussed. Some of the most used and reliable analytical methods for detecting irradiated foodstuffs, with special reference to the standardised methods of the European Committee for Standardization, are listed [it]

  3. Accounting for measurement reliability to improve the quality of inference in dental microhardness research: a worked example.

    Science.gov (United States)

    Sever, Ivan; Klaric, Eva; Tarle, Zrinka

    2016-07-01

    Dental microhardness experiments are influenced by unobserved factors related to varying tooth characteristics that affect measurement reproducibility. This paper explores appropriate analytical tools for modeling different sources of unobserved variability, to reduce the biases encountered and increase the validity of microhardness studies. The enamel microhardness of human third molars was measured by Vickers diamond. The effects of five bleaching agents (10, 16, and 30% carbamide peroxide, and 25 and 38% hydrogen peroxide) were examined, as well as the effects of artificial saliva and amorphous calcium phosphate. To account for both between- and within-tooth heterogeneity in evaluating treatment effects, the statistical analysis was performed in the mixed-effects framework, which also included an appropriate weighting procedure to adjust for confounding. The results were compared to those of the standard ANOVA model usually applied. The weighted mixed-effects model produced parameter estimates of different magnitude and significance than the standard ANOVA model. The results of the former model were more intuitive, with more precise estimates and better fit. Confounding could seriously bias study outcomes, highlighting the need for more robust statistical procedures in dental research that account for measurement reliability. The presented framework is more flexible and informative than existing analytical techniques and may improve the quality of inference in dental research. Reported results could be misleading if the underlying heterogeneity of microhardness measurements is not taken into account. Confidence in treatment outcomes could be increased by applying the presented framework.

  4. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Vincent A. R. Camara

    1999-01-01

    Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changing of the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results.

  5. Seamless Digital Environment - Plan for Data Analytics Use Case Study

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Bly, Aaron Douglas

    2016-01-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed in order to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses things to consider when building an architecture to support the increasing needs and demands of data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick and reliable manner. A common method is to create a "one stop shop" application that a user can go to for all the data they need. This leads to the need to create a Seamless Digital Environment (SDE) to integrate all the "siloed" data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics, employing information from computer-based-procedure-enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to…

  6. One Iota Fills the Quota: A Paradox in Multifacet Reliability Coefficients.

    Science.gov (United States)

    Conger, Anthony J.

    1983-01-01

    A paradoxical phenomenon of decreases in reliability as the number of elements averaged over increases is shown to be possible in multifacet reliability procedures (intraclass correlations or generalizability coefficients). Conditions governing this phenomenon are presented along with implications and cautions. (Author)

  7. A critique of reliability prediction techniques for avionics applications

    Directory of Open Access Journals (Sweden)

    Guru Prasad PANDIAN

    2018-01-01

    Full Text Available Avionics (aeronautics and aerospace) industries must rely on components and systems of demonstrated high reliability. For this, handbook-based methods have traditionally been used to design for reliability, develop test plans, and define maintenance requirements and sustainment logistics. However, these methods have been criticized as flawed and as leading to inaccurate and misleading results. In its recent report on enhancing defense system reliability, the U.S. National Academy of Sciences discredited these methods, judging the Military Handbook (MIL-HDBK-217) and its progeny invalid and inaccurate. This paper discusses the issues that arise with the use of handbook-based methods in commercial and military avionics applications. Alternative approaches to reliability design (and its demonstration) are also discussed, including similarity analysis, testing, physics-of-failure, and data analytics for prognostics and systems health management.

  8. Reliability analyses of safety systems for WWER-440 nuclear power plants

    International Nuclear Information System (INIS)

    Dusek, J.; Hojny, V.

    1985-01-01

    The UJV in Rez near Prague studied the reliability of the emergency core cooling system and of the system for suppressing pressure in the sealed area of the nuclear power plant in the event of a loss-of-coolant accident. The reliability of the systems was evaluated by fault tree analysis. Simulation and analytical calculation programs were developed and used for the reliability analysis. The results are briefly presented of the reliability analyses of the passive system for immediate short-term flooding of the reactor core, the active low-pressure emergency core cooling system, the spray system, the bubble-vacuum system, and the system of emergency supply of the steam generators. (E.S.)

  9. Component reliability for electronic systems

    CERN Document Server

    Bajenescu, Titu-Marius I

    2010-01-01

    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  10. Intercalibration of analytical methods on marine environmental samples

    International Nuclear Information System (INIS)

    1988-06-01

    The pollution of the seas by various chemical substances is nowadays one of the principal concerns of mankind. The International Atomic Energy Agency has organized in past years several intercomparison exercises in the framework of its Analytical Quality Control Service. The present intercomparison had a double aim: first, to give the participating laboratories an opportunity to check their analytical performance; second, to use the results of this intercomparison to produce a reference material, made of fish tissue, accurately certified with respect to many trace elements. Such a material could be used by analytical chemists to check the validity of new analytical procedures. In total, 53 laboratories from 29 countries reported results (585 laboratory means for 48 elements). 5 refs, 52 tabs

  11. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.

  12. How to conduct External Quality Assessment Schemes for the pre-analytical phase?

    Science.gov (United States)

    Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre

    2014-01-01

    In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for registration of errors and subsequent feedback to the participants have been conducted for decades concerning the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of this paper is to present an overview of different types of EQA schemes for the pre-analytical phase, and to give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most EQA organizations do not offer pre-analytical EQA schemes (EQAS). It is more difficult to perform and standardize pre-analytical EQAS, and accreditation bodies do not ask the laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three types: collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or registering actual laboratory errors and relating these to quality indicators. These three types have different focus and different challenges regarding implementation, and a combination of the three is probably necessary to be able to detect and monitor the wide range of errors occurring in the pre-analytical phase.

  13. Human reliability analysis using event trees

    International Nuclear Information System (INIS)

    Heslinga, G.

    1983-01-01

    The shut-down procedure of a technologically complex installation such as a nuclear power plant consists of many human actions, some of which have to be performed several times. The procedure is regarded as a chain of modules of specific actions, some of which are analyzed separately. The analysis is carried out by making a Human Reliability Analysis event tree (HRA event tree) of each action, breaking down each action into small elementary steps. The application of event trees in human reliability analysis entails more difficulties than in the case of technical systems, where event trees have mainly been used until now. The most important reason is that the operator is able to recover from a wrong performance; memory influences play a significant role. In this study these difficulties are dealt with theoretically. The following conclusions can be drawn: (1) in principle, event trees may be used in human reliability analysis; (2) although in practice the operator will recover from his fault only partly, theoretically this can be described as restarting the whole event tree; (3) compact formulas have been derived by which the probability of reaching a specific failure consequence on passing through the HRA event tree after several recoveries can be calculated. (orig.)
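The recovery-as-restart idea described in the abstract can be sketched numerically. The per-pass probabilities, the restart model, and the treatment of the last allowed pass below are illustrative assumptions, not Heslinga's actual formulas:

```python
def failure_probability(f, r, k):
    """Probability of ending in the failure consequence when each pass
    through an HRA event tree ends in unrecoverable failure with
    probability f, in a recoverable error (modelled as restarting the
    whole tree) with probability r, and in success otherwise.
    At most k recoveries (restarts) are allowed; a recoverable error on
    the last allowed pass is counted as failure."""
    # failure on pass i requires i prior recoverable errors: r**i * f
    p = sum(r**i * f for i in range(k + 1))
    # recoverable error on every pass, including the last allowed one
    return p + r**(k + 1)

# With f = 0.01 and r = 0.2, allowing more recoveries drives the result
# down toward the unlimited-recovery value f / (1 - r) = 0.0125
p0 = failure_probability(0.01, 0.2, 0)  # no recovery allowed
p3 = failure_probability(0.01, 0.2, 3)  # up to three restarts
```

The geometric structure (each restart multiplies the path probability by r) is what makes the closed-form "compact formulas" mentioned in the abstract possible.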

  14. Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs

    Directory of Open Access Journals (Sweden)

    Woo-Yong Choi

    2009-01-01

    Full Text Available We propose an efficient reliable multicast MAC protocol based on the connectivity information among the recipients. Enhancing the BMMM (Batch Mode Multicast MAC) protocol, the proposed reliable multicast MAC protocol significantly reduces the RAK (Request for ACK) frame transmissions in a reasonable computational time and enhances the MAC performance. The throughputs of the BMMM protocol and our proposed MAC protocol are derived by analytical performance analysis. Numerical examples show that our proposed MAC protocol increases the reliable multicast MAC performance for IEEE 802.11 wireless LANs.

  15. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    Science.gov (United States)

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells is a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA for nuclear-based cell demarcation or with those which react with proteins for image analysis based on whole cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter will also provide troubleshooting guidelines for some of the common problems associated with these aspects of HCI.

  16. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557) are used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
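The record says NEWTONP finds the probability required to yield a given system reliability. The original is written in C and its interface is not documented here; a minimal Python sketch of the underlying computation (the k-out-of-n cumulative binomial inverted by bisection, with illustrative parameters) might look like:

```python
import math

def k_out_of_n_reliability(p, k, n):
    """System reliability: probability that at least k of n identical,
    independent components (each with reliability p) work."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

def required_component_reliability(k, n, target, tol=1e-10):
    """Bisection for the component reliability p that yields the target
    system reliability (the function is monotone increasing in p)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if k_out_of_n_reliability(mid, k, n) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Example: component reliability needed for a 2-out-of-3 system
# to reach 0.999 system reliability
p = required_component_reliability(2, 3, 0.999)
```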

  17. Proposal of resolution to create an inquiry commission on the French nuclear power plants' reliability in case of earthquakes and on the safety, information and warning procedures in case of incidents

    International Nuclear Information System (INIS)

    2003-01-01

    This short paper presents the reasons for the creation of a parliamentary inquiry commission of 30 members on the reliability of nuclear power plants in France in case of earthquakes and on the safety, information and warning procedures in case of accidents. (A.L.B.)

  18. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed using two metric-based software reliability analysis methods: a state-transition-diagram-based method and a test-coverage-based method. The procedures for the software reliability analysis using the two methods, and the analysis results, are provided in this report. It is found that the two methods are complementary, and further research on combining the two methods, so that software reliability analysis benefits from this complementary effect, is therefore recommended.

  19. Reliability in maintenance and design of elastomer sealed closures

    International Nuclear Information System (INIS)

    Lake, W.H.

    1978-01-01

    The methods of reliability are considered for maintenance and design of elastomer-sealed containment closures. Component reliability is used to establish a replacement schedule for system maintenance. Reliability data on elastomer seals are used to evaluate the common practice of annual replacement, and to calculate component reliability values for several typical shipment time periods. System reliability methods are used to examine the relative merits of typical closure designs. These include single-component and redundant-seal closures, with and without closure verification testing. The paper presents a general method of quantifying the merits of closure designs through the use of reliability analysis, which is a probabilistic technique. The reference list offers a general source of information in the field of reliability, and should offer the opportunity to extend the procedures discussed in this paper to other design safety applications
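The closure-design comparison described above (single versus redundant seals, with and without verification testing) reduces to elementary parallel-reliability arithmetic. The numbers and the simple correction model below are hypothetical, not taken from the paper:

```python
def single_seal(R):
    """Single-seal closure: containment reliability equals seal reliability."""
    return R

def redundant_seals(R):
    """Two independent seals in parallel: containment holds if either seals."""
    return 1 - (1 - R) ** 2

def redundant_with_verification(R, p_detect):
    """Redundant closure preceded by a verification test that detects an
    initially failed seal with probability p_detect; a detected failure
    is assumed to be always corrected before shipment."""
    R_eff = R + (1 - R) * p_detect  # effective per-seal reliability
    return 1 - (1 - R_eff) ** 2

R = 0.98  # hypothetical per-seal reliability for one shipment period
r_single = single_seal(R)
r_dual = redundant_seals(R)               # approx. 0.9996
r_dual_v = redundant_with_verification(R, 0.9)
```

Even this toy calculation shows the qualitative ranking the paper quantifies: redundancy helps, and verification testing compounds the benefit.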

  20. Proposal of a new analytical procedure for the measurement of water absorption by stone. Preliminary study for an alternative to the Italian technical normative NORMAL 07-81

    Directory of Open Access Journals (Sweden)

    Plattner Susanne

    2012-06-01

    Full Text Available Abstract. Background: Italian technical normatives in the field of cultural heritage are often considered insufficient or unsuitable in practice; efforts are therefore necessary to design new ones and/or improve those already existing. Results: In this paper an alternative to the analytical procedure for the determination of water absorption by stone material (by full immersion) described in the NORMAL 07-81 document is proposed. Improvements concern method accuracy and reduction of sample size; density data are also obtained. Conclusions: The new procedure was applied to three different marble samples and the outcomes are encouraging, but further testing is under way to better understand to what extent sample size can be reduced without worsening the accuracy of results, taking into account that stone is a very heterogeneous material.

  1. Human performance evaluation: The procedures of ultimate response guideline for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Kang-Hung, E-mail: alvinks@iner.gov.tw [Institute of Nuclear Energy Research, Atomic Engery Council, No. 1000, Whenhua Road, Jiaan Village, Longtan Township, Taoyuan County, Taiwan (China); Hwang, Sheue-Ling, E-mail: slhwang@ie.nthu.edu.tw [Department of Industrial Engineering and Engineering Management, National Tsing Hua University, No. 101, Section 2, Kuang-Fu Road, Hsinchu, Taiwan 30013, Taiwan (China)

    2014-07-01

    Highlights: • This study adopts SPAR-H to evaluate HEPs in the URG procedures. • The involvement of URG procedures could reduce CDF significantly. • Upgrading the training level of staff will enhance the reliability effectively. • Aiding the plant manager in making URG decisions will enhance the reliability. - Abstract: In the nuclear accident which occurred in Japan on March 11, 2011, several units of the Fukushima conventional BWR plant experienced a total loss of power and water supply, triggered by a heavy earthquake and subsequent tsunami which were outside design models. In the past, when an accident occurred, operators in nuclear power plants (NPP) followed emergency operating procedures (EOPs) or severe accident management guidance (SAMG). However, EOPs and SAMG are symptom-based procedures to cope with severe transients and accidents, depending on real-time operational parameters. Ultimate response guidelines (URG), a plant-specific interim remedy action plan, were developed to manage accidents caused by compound disasters which exceed design models. The URG guides the plant operators' conduct of reactor depressurization, core cooling water injection, and containment venting. This study adopts NUREG/CR-6883 (Standardized Plant Analysis Risk Human Reliability Analysis, SPAR-H) to evaluate human error probabilities (HEPs) of action and diagnosis in the current URG procedures. We found that the human reliability of the URG procedures, as analyzed by SPAR-H, is about 85% (depending on the decision maker). Upgrading the training level of staff, or enhancing plant managers' ability to decide whether to execute the URG, will enhance the human reliability of the URG procedures.

  2. Using Analytic Hierarchy Process in Textbook Evaluation

    Science.gov (United States)

    Kato, Shigeo

    2014-01-01

    This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…
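The AHP procedure sketched above (pairwise comparisons integrated into criterion weights) can be illustrated with a small hypothetical example. The criteria names and matrix values below are invented, and the row geometric-mean method is used as a common stand-in for Saaty's principal-eigenvector solution:

```python
import math

# Reciprocal pairwise comparison matrix (Saaty 1-9 scale) for three
# hypothetical textbook evaluation criteria: content, design, price.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
n = len(A)

# Priority vector via the row geometric-mean method
gm = [math.prod(row) ** (1 / n) for row in A]
weights = [g / sum(gm) for g in gm]

# Consistency check: approximate the principal eigenvalue lambda_max,
# then the consistency ratio CR (CR < 0.1 is conventionally acceptable)
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
RI = 0.58  # Saaty's random index for n = 3
CR = CI / RI
```

The resulting `weights` rank the criteria; in a full textbook evaluation, the same machinery would be applied at each level of the criteria hierarchy and the weights combined multiplicatively.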

  3. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    Science.gov (United States)

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to several significant advantages, such as simplicity of use, and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Microbial ecology laboratory procedures manual NASA/MSFC

    Science.gov (United States)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  5. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are reinterpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
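The contrast the abstract draws between categorical RPN scoring and the probabilistic modification can be made concrete with a rough sketch. The failure modes and all numbers below are invented for illustration, not taken from the paper:

```python
# Hypothetical failure modes for an analytical procedure.
# O, D, S: categorical occurrence, detection, severity scores (1-10);
# occ_freq: estimated relative frequency of occurrence per run;
# det_prob: estimated probability the failure is detected when it occurs.
failure_modes = {
    "wrong sample dilution": {"O": 4, "D": 3, "S": 8,
                              "occ_freq": 0.02, "det_prob": 0.90},
    "instrument drift":      {"O": 6, "D": 5, "S": 5,
                              "occ_freq": 0.05, "det_prob": 0.60},
    "transcription error":   {"O": 3, "D": 7, "S": 6,
                              "occ_freq": 0.01, "det_prob": 0.30},
}

for name, fm in failure_modes.items():
    # Traditional FMEA: categorical Risk Priority Number
    rpn = fm["O"] * fm["D"] * fm["S"]
    # Probabilistic modification: estimated frequency of the failure mode
    # occurring AND escaping detection (severity stays categorical)
    undetected = fm["occ_freq"] * (1 - fm["det_prob"])
    print(f"{name}: RPN={rpn}, undetected frequency={undetected:.4f}")

# Frequency of at least one undetected failure per run, assuming the
# failure modes are independent
p_ok = 1.0
for fm in failure_modes.values():
    p_ok *= 1 - fm["occ_freq"] * (1 - fm["det_prob"])
p_any_undetected = 1 - p_ok
```

Note how the probabilistic view composes across failure modes: `p_any_undetected` is a quantitative statement about the full procedure, which the categorical RPN cannot provide.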

  6. A Note on the Score Reliability for the Satisfaction with Life Scale: An RG Study

    Science.gov (United States)

    Vassar, Matt

    2008-01-01

    The purpose of the present study was to meta-analytically investigate the score reliability for the Satisfaction With Life Scale. Four-hundred and sixteen articles using the measure were located through electronic database searches and then separated to identify studies which had calculated reliability estimates from their own data. Sixty-two…

  7. Operator reliability assessment system (OPERAS)

    International Nuclear Information System (INIS)

    Singh, A.; Spurgin, A.J.; Martin, T.; Welsch, J.; Hallam, J.W.

    1991-01-01

    OPERAS is personal-computer (PC) based software to collect and process simulator data on control-room operators' responses during requalification training scenarios. The data collection scheme is based upon an approach developed earlier during the EPRI Operator Reliability Experiments project. The software allows automated data collection from the simulator, thus minimizing the simulator staff time and resources needed to collect, maintain and process data that can be useful in monitoring, assessing and enhancing crew reliability and effectiveness. The system is designed to provide the data and output information in the form of user-friendly charts, tables and figures for use by plant staff. OPERAS prototype software has been implemented at the Diablo Canyon (PWR) and Millstone (BWR) plants and is currently being used to collect operator response data. Data collected from the simulator include plant-state variables such as reactor pressure and temperature, malfunctions, times at which annunciators are activated, operator actions, and observations of crew behavior by training staff. The data and systematic analytical results provided by the OPERAS system can help the utility's probabilistic risk analysis (PRA) and training staff monitor and assess the reliability of their crews more objectively.

  8. A Turkish Version of the Critical-Care Pain Observation Tool: Reliability and Validity Assessment.

    Science.gov (United States)

    Aktaş, Yeşim Yaman; Karabulut, Neziha

    2017-08-01

    The study aim was to evaluate the validity and reliability of the Critical-Care Pain Observation Tool in critically ill patients. A repeated measures design was used for the study. A convenience sample of 66 patients who had undergone open-heart surgery in the cardiovascular surgery intensive care unit in Ordu, Turkey, was recruited for the study. The patients were evaluated by using the Critical-Care Pain Observation Tool at rest, during a nociceptive procedure (suctioning), and 20 minutes after the procedure while they were conscious and intubated after surgery. The Turkish version of the Critical-Care Pain Observation Tool has shown statistically acceptable levels of validity and reliability. Inter-rater reliability was supported by moderate-to-high-weighted κ coefficients (weighted κ coefficient = 0.55 to 1.00). For concurrent validity, significant associations were found between the scores on the Critical-Care Pain Observation Tool and the Behavioral Pain Scale scores. Discriminant validity was also supported by higher scores during suctioning (a nociceptive procedure) versus non-nociceptive procedures. The internal consistency of the Critical-Care Pain Observation Tool was 0.72 during a nociceptive procedure and 0.71 during a non-nociceptive procedure. The validity and reliability of the Turkish version of the Critical-Care Pain Observation Tool was determined to be acceptable for pain assessment in critical care, especially for patients who cannot communicate verbally. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  9. A reliability program approach to operational safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques is being formulated for potential application in the nuclear power industry. Methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed and a review of current nuclear risk-dominant issues conducted. The need for a reliability approach to address dependent system failures, operating and emergency procedures and human performance, and develop a plant-specific performance data base for safety decision making is demonstrated. Current research has concentrated on developing a Reliability Program approach for the operating phase of a nuclear plant's lifecycle. The approach incorporates performance monitoring and evaluation activities with dedicated tasks that integrate these activities with operation, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the Reliability Program. (orig./HP)

  10. Development of reliability-based safety enhancement technology

    International Nuclear Information System (INIS)

    Kim, Kil Yoo; Han, Sang Hoon; Jang, Seung Cherl

    2002-04-01

    This project aims to develop critical technologies and the necessary reliability DB for maximizing the economics of NPP operation while maintaining safety, using risk (or reliability) information. Toward this goal, first, the four critical technologies for RIR and A (Risk-Informed Tech. Spec. Optimization, Risk-Informed Inservice Testing, On-line Maintenance, Maintenance Rule) have been developed. Second, KIND (Korea Information System for Nuclear Reliability Data) has been developed. Using KIND, component reliability DBs for YGN 3,4 and UCN 3,4 have been established. A reactor trip history DB for all NPPs in Korea has also been developed and analyzed. Finally, a detailed reliability analysis of the RPS/ESFAS for the KNSP has been performed. Based on this analysis, a sensitivity analysis has also been performed to optimize the AOT/STI of the tech. spec. A statistical analysis procedure and computer code have been developed for set point drift analysis

  11. Mission reliability of semi-Markov systems under generalized operational time requirements

    International Nuclear Information System (INIS)

    Wu, Xiaoyue; Hillston, Jane

    2015-01-01

    Mission reliability of a system depends on specific criteria for mission success. To evaluate the mission reliability of some mission systems that do not need to work normally for the whole mission time, two types of mission reliability for such systems are studied. The first type corresponds to the mission requirement that the system must remain operational continuously for a minimum time within the given mission time interval, while the second corresponds to the mission requirement that the total operational time of the system within the mission time window must be greater than a given value. Based on Markov renewal properties, matrix integral equations are derived for semi-Markov systems. Numerical algorithms and a simulation procedure are provided for both types of mission reliability. Two examples are used for illustration purposes. One is a one-unit repairable Markov system, and the other is a cold standby semi-Markov system consisting of two components. By the proposed approaches, the mission reliability of systems with time redundancy can be more precisely estimated to avoid possible unnecessary redundancy of system resources. - Highlights: • Two types of mission reliability under generalized requirements are defined. • Equations for both types of reliability are derived for semi-Markov systems. • Numerical methods are given for solving both types of reliability. • Simulation procedure is given for estimating both types of reliability. • Verification of the numerical methods is given by the results of simulation
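The two mission-success criteria defined above can be illustrated with a Monte Carlo sketch, matching the simulation procedure the abstract mentions. The paper derives matrix integral equations for general semi-Markov systems; the code below instead simulates the simplest special case (a one-unit repairable system with exponential up/down durations) with invented parameters:

```python
import random

def simulate_mission(mission_time, min_continuous, min_total,
                     mean_up=40.0, mean_down=5.0, n_trials=20000, seed=1):
    """Monte Carlo estimate of both types of mission reliability for a
    one-unit repairable system with exponential up/down durations.

    Type 1: the system is up continuously for at least min_continuous
            somewhere within the mission window.
    Type 2: total up time within the window is at least min_total.
    """
    rng = random.Random(seed)
    type1 = type2 = 0
    for _ in range(n_trials):
        t, up = 0.0, True            # assume the mission starts with the unit up
        best_run = total_up = 0.0
        while t < mission_time:
            mean = mean_up if up else mean_down
            dur = min(rng.expovariate(1 / mean), mission_time - t)
            if up:
                best_run = max(best_run, dur)
                total_up += dur
            t += dur
            up = not up
        type1 += best_run >= min_continuous
        type2 += total_up >= min_total
    return type1 / n_trials, type2 / n_trials

r1, r2 = simulate_mission(mission_time=100.0, min_continuous=30.0, min_total=80.0)
```

As the abstract notes, both criteria are weaker than "operational for the whole mission time", so estimating them directly avoids over-provisioning system resources for time-redundant missions.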

  12. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures now available and of the experimental situations and analytical problems they address. The last point is extended by the description of our own development of the fundamental parameter method, which makes the inclusion of nonparallel beam geometries possible. Finally, open problems for the quantification procedures are discussed

  13. The Case for Adopting Server-side Analytics

    Science.gov (United States)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored an alternative to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be at a much lower volume, easier to transport to and store locally by the user, and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is both often required with these data sets and which drives much of the throughput challenges. NASA's Big Data Task Force studied this issue. This paper will present the results of this study including examples of SSAs that are being developed and demonstrated and suggestions for architectures that might be developed for

  14. Automation of analytical processes. A tool for higher efficiency and safety

    International Nuclear Information System (INIS)

    Groll, P.

    1976-01-01

    The analytical laboratory of a radiochemical facility is usually faced with the fact that numerous analyses of a similar type must be carried out routinely. Automation of such routine analytical procedures helps increase the efficiency and safety of the work. A review of the requirements for automation and its advantages is given and demonstrated with three examples. (author)

  15. 40Ar/39Ar geochronology at the Instituto de Geociências, USP: instrumentation, analytical procedures, and calibration

    Directory of Open Access Journals (Sweden)

    PAULO M. VASCONCELOS

    2002-06-01

    Full Text Available Laser-heating 40Ar/39Ar geochronology provides high analytical precision and accuracy, μm-scale spatial resolution, and statistically significant data sets for the study of geological and planetary processes. A newly commissioned 40Ar/39Ar laboratory at CPGeo/USP, São Paulo, Brazil, equips the Brazilian scientific community with a powerful new tool applicable to the study of geological and cosmochemical processes. Detailed information about laboratory layout, environmental conditions, and instrumentation provides the necessary parameters for evaluating the suitability of the CPGeo/USP 40Ar/39Ar laboratory to a diverse range of applications. Details about analytical procedures, including mineral separation, irradiation at the IPEN/CNEN reactor at USP, and mass spectrometric analysis, enable potential researchers to design the sampling and sample preparation program suitable to the objectives of their study. Finally, the results of calibration tests using Ca and K salts and glasses, international mineral standards, and in-house mineral standards show that the accuracy and precision obtained at the 40Ar/39Ar laboratory at CPGeo/USP are comparable to results obtained in the most respected laboratories internationally. The extensive calibration and standardization procedures undertaken ensure that the results of analytical studies carried out in our laboratories will gain immediate international credibility, enabling Brazilian students and scientists to conduct forefront research in earth and planetary sciences.

  16. Appendix 1: Analytical Techniques (Online supplementary material ...

    Indian Academy of Sciences (India)

    HP

    Further details of analytical techniques are given in http://www.actlabs.com. Zircon U–Pb dating and trace element analysis. The zircons were separated using standard procedures including crushing (in iron mortar and pestle), sieving (375 to 75 micron), tabling, heavy liquid separation (bromoform and methylene iodide) ...

  17. Quality management at the Safeguards Analytical Laboratory of IAEA

    International Nuclear Information System (INIS)

    Aigner, H.; Doherty, P.; Donohue, D.; Kuno, Y.

    2001-01-01

    Full text: In the year 2000, SAL's quality management system was certified as conforming with the requirements of the international standard ISO-9002:1994. The certification incurred considerable effort, both in manpower and capital investment. The expected benefits of a formal quality management system do not directly target the correctness and reliability of analytical results. SAL believes that it was already performing well in this respect, even before re-shaping its quality system according to the reference model. Systematic QA and QC procedures have been applied since the beginning of SAL's operations in the mid-70s. The management framework specified in ISO-9002:1994 complements these technical measures. Besides its value of being internationally recognised, and thus perhaps enhancing the credibility of the quality of SAL's services, the quality management system in this form provides additional advantages for the customer of SAL's services, i.e. the Department of Safeguards of the IAEA, but also for the control and management of SAL's internal 'business' processes. The paper discusses whether these expected additional benefits are indeed obtained and whether or not their value is in balance with operational and initial investment costs. (author)

  18. Neutron logging reliability techniques and apparatus

    International Nuclear Information System (INIS)

    Johnstone, C.W.

    1978-01-01

    This invention relates in general to neutron logging of earth formations, and in particular, to novel apparatus and procedures for determining the validity, or reliability, of data derived at least in part by logging neutron characteristics of earth formations and, if desired, for affording verifiably accurate indications of such data

  19. Neutron logging reliability techniques and apparatus

    International Nuclear Information System (INIS)

    Johnstone, C.W.

    1974-01-01

    This invention relates in general to neutron logging of earth formations, and in particular, to novel apparatus and procedures for determining the validity, or reliability, of data derived at least in part by logging neutron characteristics of earth formations and, if desired, for affording verifiably accurate indications of such data. (author)

  20. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  1. Reliability of risk-adjusted outcomes for profiling hospital surgical quality.

    Science.gov (United States)

    Krell, Robert W; Hozain, Ahmed; Kao, Lillian S; Dimick, Justin B

    2014-05-01

    Quality improvement platforms commonly use risk-adjusted morbidity and mortality to profile hospital performance. However, given small hospital caseloads and low event rates for some procedures, it is unclear whether these outcomes reliably reflect hospital performance. To determine the reliability of risk-adjusted morbidity and mortality for hospital performance profiling using clinical registry data. A retrospective cohort study was conducted using data from the American College of Surgeons National Surgical Quality Improvement Program, 2009. Participants included all patients (N = 55,466) who underwent colon resection, pancreatic resection, laparoscopic gastric bypass, ventral hernia repair, abdominal aortic aneurysm repair, and lower extremity bypass. Outcomes included risk-adjusted overall morbidity, severe morbidity, and mortality. We assessed reliability (0-1 scale: 0, completely unreliable; and 1, perfectly reliable) for all 3 outcomes. We also quantified the number of hospitals meeting minimum acceptable reliability thresholds (>0.70, good reliability; and >0.50, fair reliability) for each outcome. For overall morbidity, the most common outcome studied, the mean reliability depended on sample size (ie, how high the hospital caseload was) and the event rate (ie, how frequently the outcome occurred). For example, mean reliability for overall morbidity was low for abdominal aortic aneurysm repair (reliability, 0.29; sample size, 25 cases per year; and event rate, 18.3%). In contrast, mean reliability for overall morbidity was higher for colon resection (reliability, 0.61; sample size, 114 cases per year; and event rate, 26.8%). Colon resection (37.7% of hospitals), pancreatic resection (7.1% of hospitals), and laparoscopic gastric bypass (11.5% of hospitals) were the only procedures for which any hospitals met a reliability threshold of 0.70 for overall morbidity. Because severe morbidity and mortality are less frequent outcomes, their mean
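    The dependence of outcome reliability on caseload and event rate described above can be sketched with the common signal-to-noise definition of reliability. The between-hospital variance and the caseload figures below are illustrative assumptions, not the study's data:

```python
# Reliability of a hospital's risk-adjusted outcome, using the common
# signal-to-noise definition: reliability = between-hospital variance
# divided by (between-hospital variance + within-hospital sampling noise).
# The variance and caseload figures here are illustrative, not study data.

def outcome_reliability(between_var: float, event_rate: float, caseload: int) -> float:
    """0 = completely unreliable, 1 = perfectly reliable."""
    noise_var = event_rate * (1.0 - event_rate) / caseload  # binomial noise
    return between_var / (between_var + noise_var)

# A low-caseload procedure scores far lower than a high-caseload one,
# even at a broadly similar event rate.
low = outcome_reliability(between_var=0.002, event_rate=0.183, caseload=25)
high = outcome_reliability(between_var=0.002, event_rate=0.268, caseload=114)
print(f"25 cases/yr: {low:.2f}   114 cases/yr: {high:.2f}")
```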

  2. Reliability analysis for dynamic configurations of systems with three failure modes

    International Nuclear Information System (INIS)

    Pham, Hoang

    1999-01-01

    Analytical models for computing the reliability of dynamic configurations of systems, such as majority and k-out-of-n, assuming that units and systems are subject to three types of failures: stuck-at-0, stuck-at-1, and stuck-at-x are presented in this paper. Formulas for determining the optimal design policies that maximize the reliability of dynamic k-out-of-n configurations subject to three types of failures are defined. The comparisons of the reliability modeling functions are also obtained. The optimum system size and threshold value k that minimize the expected cost of dynamic k-out-of-n configurations are also determined
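    As a baseline for the k-out-of-n configurations discussed above, a minimal sketch of the classic single-failure-mode k-out-of-n:G reliability formula (the paper's stuck-at-0/1/x model generalizes this binary case):

```python
from math import comb

def k_out_of_n_reliability(n: int, k: int, p: float) -> float:
    """Reliability of a k-out-of-n:G system of i.i.d. units with a single
    binary failure mode: the system works if at least k of n units work.
    (The paper's model generalizes this to stuck-at-0/1/x failures.)"""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 2-out-of-3 majority voting with unit reliability 0.9:
r = k_out_of_n_reliability(3, 2, 0.9)
print(f"{r:.3f}")  # 3 * 0.81 * 0.1 + 0.729 = 0.972
```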

  3. Seamless Digital Environment – Plan for Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron Douglas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project report Digital Architecture Planning Model (Oxstrand et al., 2016) discusses considerations when building an architecture to support the increasing needs and demands for data throughout the plant. Once the plant is able to support the data demands, it still needs to provide the data in an easy, quick and reliable manner. A common approach is to create a “one stop shop” application that a user can go to for all the data they need. This leads to the need for a Seamless Digital Environment (SDE) to integrate all the “siloed” data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics, employing information from computer-based procedure enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team’s control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to

  4. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.

  5. Reliability concepts applied to cutting tool change time

    International Nuclear Information System (INIS)

    Patino Rodriguez, Carmen Elena; Francisco Martha de Souza, Gilberto

    2010-01-01

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
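    The series-configuration argument above can be sketched as follows. The two-parameter Weibull tool-wear model and its beta/eta values are illustrative assumptions, not the paper's fitted distribution:

```python
from math import exp

def weibull_reliability(t: float, beta: float, eta: float) -> float:
    """Probability that a cutting tool survives t minutes of cutting time,
    under a two-parameter Weibull wear model (beta, eta are illustrative)."""
    return exp(-((t / eta) ** beta))

def process_reliability(cut_times, beta: float = 2.0, eta: float = 60.0) -> float:
    """Series configuration: the part is good only if every operation's tool
    survives its cutting time, so the operation reliabilities multiply."""
    r = 1.0
    for t in cut_times:
        r *= weibull_reliability(t, beta, eta)
    return r

# Turning + drilling + finishing cut times, in minutes:
r_sys = process_reliability([10.0, 15.0, 8.0])
print(f"{r_sys:.4f}")
```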

  6. Subset simulation for structural reliability sensitivity analysis

    International Nuclear Information System (INIS)

    Song Shufang; Lu Zhenzhou; Qiao Hongwei

    2009-01-01

    Based on two procedures for efficiently generating conditional samples, i.e. Markov chain Monte Carlo (MCMC) simulation and importance sampling (IS), two reliability sensitivity (RS) algorithms are presented. On the basis of reliability analysis of Subset simulation (Subsim), the RS of the failure probability with respect to the distribution parameter of the basic variable is transformed as a set of RS of conditional failure probabilities with respect to the distribution parameter of the basic variable. By use of the conditional samples generated by MCMC simulation and IS, procedures are established to estimate the RS of the conditional failure probabilities. The formulae of the RS estimator, its variance and its coefficient of variation are derived in detail. The results of the illustrations show high efficiency and high precision of the presented algorithms, and it is suitable for highly nonlinear limit state equation and structural system with single and multiple failure modes

  7. A clean laboratory for ultratrace analysis: the ultratrace analytical facility (UTAF)

    International Nuclear Information System (INIS)

    Jadhav, S.G.; Sounderajan, Suvarna; Kumar, Sanjukta A.; Udas, A.C.; Ramanathan, M.; Palrecha, M.M.; Sudersanan, M.

    2003-06-01

    There has been an increasing demand for the quantification of various elements at extremely low concentrations in a variety of samples such as high-purity materials and environmental and biological samples. The need for a controlled environment to obtain reliable and reproducible data necessitates strategies and practices to minimize contamination during the analytical procedure. This report describes the protocol observed in our clean laboratory to eliminate contamination and ensure low laboratory blanks, and some of the methodologies developed to carry out the analysis. The analysis is carried out by Graphite Furnace Atomic Absorption Spectrometry and electrochemical techniques such as Anodic/Cathodic/Adsorptive Stripping Voltammetry. Characterisation of 5N (total impurities 10 ppm) arsenic is routinely carried out. Al in the serum of patients suffering from end-stage renal failure is also analyzed. Pine leaves, spinach, carrot puree and milk powder have been characterized for Al and Hg content, and bovine serum has been characterized for Cu, Zn, Na and K as part of intercomparison exercises. (author)

  8. Influence Of Inspection Intervals On Mechanical System Reliability

    International Nuclear Information System (INIS)

    Zilberman, B.

    1998-01-01

    In this paper a methodology for the reliability analysis of mechanical systems with latent failures is described. Reliability analysis of such systems must include appropriate use of check intervals for latent failure detection. The methodology suggests that, based on system logic, the analyst decides at the outset whether a system can fail actively or latently and propagates this approach through all system levels. All inspections are assumed to be perfect (all failures are detected and repaired, and no new failures are introduced as a result of the maintenance). Additional assumptions are that mission time is much smaller than the check intervals and that all components have constant failure rates. Analytical expressions for the reliability calculations are provided, based on fault tree and Markov modeling techniques (for two- and three-redundant systems with inspection intervals). The proposed methodology yields more accurate results than are obtained by ignoring check intervals or by using half the check interval time. The conventional analysis, which assumes that at the beginning of each mission the system is as good as new, gives an optimistic prediction of system reliability. Some examples of reliability calculations for mechanical systems with latent failures and of establishing optimum check intervals are provided
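    Under the stated assumptions (constant failure rate, perfect periodic inspections), the time-averaged unavailability of a latently failing component has a simple closed form. The failure rate and inspection interval below are illustrative values, not the paper's:

```python
from math import exp

def mean_unavailability(lam: float, tau: float) -> float:
    """Time-averaged unavailability of a component with constant failure
    rate lam whose latent failures are revealed only by perfect inspections
    every tau hours: (1/tau) * integral_0^tau (1 - e^(-lam*t)) dt."""
    return 1.0 - (1.0 - exp(-lam * tau)) / (lam * tau)

lam, tau = 1.0e-4, 720.0  # illustrative: 1e-4 per hour, monthly inspections
q = mean_unavailability(lam, tau)
print(f"exact: {q:.5f}   rare-event approximation lam*tau/2: {lam * tau / 2:.5f}")
```

For small lam*tau the exact value approaches the familiar lam*tau/2 approximation from below.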

  9. Component aging and reliability trends in Loviisa Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jankala, K.E.; Vaurio, J.K.

    1989-01-01

    A plant-specific reliability data collection and analysis system has been developed at the Loviisa Nuclear Power Plant to perform tests for component aging and analysis of reliability trends. The system yields both mean values an uncertainty distribution information for reliability parameters to be used in the PSA project underway and in living-PSA applications. Several different trend models are included in the reliability analysis system. Simple analytical expressions have been derived from the parameters of these models, and their variances have been obtained using the information matrix. This paper is focused on the details of the learning/aging models and the estimation of their parameters and statistical accuracies. Applications to the historical data of the Loviisa plant are presented. The results indicate both up- and down-trends in failure rates as well as individuality between nominally identical components

  10. Reliability-cost models for the power switching devices of wind power converters

    DEFF Research Database (Denmark)

    Ma, Ke; Blaabjerg, Frede

    2012-01-01

    In order to satisfy the growing reliability requirements for the wind power converters with more cost-effective solution, the target of this paper is to establish a new reliability-cost model which can connect the relationship between reliability performances and corresponding semiconductor cost...... temperature mean value Tm and fluctuation amplitude ΔTj of power devices, are presented. With the proposed reliability-cost model, it is possible to enable future reliability-oriented design of the power switching devices for wind power converters, and also an evaluation benchmark for different wind power...... for power switching devices. First the conduction loss, switching loss as well as thermal impedance models of power switching devices (IGBT module) are related to the semiconductor chip number information respectively. Afterwards simplified analytical solutions, which can directly extract the junction...

  11. 18 CFR 375.303 - Delegations to the Director of the Office of Electric Reliability.

    Science.gov (United States)

    2010-04-01

    ... Director of the Office of Electric Reliability. 375.303 Section 375.303 Conservation of Power and Water... Delegations § 375.303 Delegations to the Director of the Office of Electric Reliability. The Commission... Electric Reliability Organization or Regional Entity rules or procedures; (ii) Reject an application...

  12. 42 CFR 431.814 - Sampling plan and procedures.

    Science.gov (United States)

    2010-10-01

    ... reliability of the reduced sample. (4) The sample selection procedure. Systematic random sampling is... sampling, and yield estimates with the same or better precision than achieved in systematic random sampling... 42 Public Health 4 2010-10-01 2010-10-01 false Sampling plan and procedures. 431.814 Section 431...

  13. Is a pre-analytical process for urinalysis required?

    Science.gov (United States)

    Petit, Morgane; Beaudeux, Jean-Louis; Majoux, Sandrine; Hennequin, Carole

    2017-10-01

    For the reliable urinary measurement of calcium, phosphate and uric acid, a pre-analytical step of adding acid or base to urine samples at the laboratory is recommended in order to dissolve precipitated solutes. Several studies on different kinds of samples and analysers have previously shown that such a pre-analytical treatment is unnecessary. The objective was to study the necessity of pre-analytical treatment of urine samples collected using the V-Monovette® (Sarstedt) system and measured on the Architect C16000 analyser (Abbott Diagnostics). Sixty urinary samples from hospitalized patients were selected (n=30 for calcium and phosphate, and n=30 for uric acid). After acidification of urine samples for the measurement of calcium and phosphate, and alkalinisation for the measurement of uric acid, differences between results before and after the pre-analytical treatment were compared to acceptable limits recommended by the French Society of Clinical Biology (SFBC). No difference in concentration between before and after pre-analytical treatment of urine samples exceeded the acceptable limits from the SFBC for the measurement of calcium and uric acid. For phosphate, only one sample exceeded these acceptable limits, showing a result paradoxically lower after acidification. In conclusion, in agreement with previous studies, our results show that acidification or alkalinisation of urine samples, whether from 24 h collections or from single urinations, is not a pre-analytical necessity for the measurement of calcium, phosphate and uric acid.

  14. Addressing the reliability issues of intelligent well systems

    International Nuclear Information System (INIS)

    Drakeley, Brian; Douglas, Neil

    2000-01-01

    New Technology receives its fair share of 'risk aversion' both in good and not so good economic times from oil and gas operators evaluating application opportunities. This paper presents details of a strategy developed and implemented to bring to market an Intelligent Well system designed from day one to maximize system reliability, while offering the customer a high degree of choice in system functionality. A team of engineers and scientists skilled in all aspects of Reliability Analysis and Assessment analyzed the Intelligent Well system under development, gathered reliability performance data from other sources and using various analytical techniques developed matrices of system survival probability estimates for various scenarios. Interaction with the system and design engineers has been an on-going process as designs are modified to maximize reliability predictions and extensive qualification test programs developed from the component to the overall system level. The techniques used in the development project will be presented. A comparative model now exists that facilitates the evaluation of future design alternative considerations and also contains databases that can be readily updated with actual field data etc. (author)

  15. Optimization of wet digestion procedure of blood and tissue for selenium determination by means of 75Se tracer

    International Nuclear Information System (INIS)

    Holynska, B.; Lipinska, K.

    1977-01-01

    Selenium-75 tracer has been used for the optimization of an analytical procedure for selenium determination in blood and tissue. The wet digestion procedure and the reduction of selenium to its elemental form with tellurium as coprecipitant have been tested. Recovery of selenium obtained with the optimized analytical procedure amounts to 95%, and the precision is 4.2%. (author)

  16. Designing Glass Panels for Economy and Reliability

    Science.gov (United States)

    Moore, D. M.

    1983-01-01

    Analytical method determines probability of failure of rectangular glass plates subjected to uniformly distributed loads such as those from wind, earthquake, snow, and deadweight. Developed as aid in design of protective glass covers for solar-cell arrays and solar collectors, method is also useful in estimating the reliability of large windows in buildings exposed to high winds and is adapted to nonlinear stress analysis of simply supported plates of any elastic material.

  17. Statistical reliability assessment of UT round-robin test data for piping welds

    International Nuclear Information System (INIS)

    Kim, H.M.; Park, I.K.; Park, U.S.; Park, Y.W.; Kang, S.C.; Lee, J.H.

    2004-01-01

    Ultrasonic NDE is one of the important technologies in the life-time maintenance of nuclear power plants. An ultrasonic inspection system consists of the operator, the equipment and the procedure, and its reliability is determined by the capability of each of these elements. A performance demonstration round robin was conducted to quantify the capability of ultrasonic in-service inspection. Several teams, employing procedures that met or exceeded ASME Sec. XI code requirements, inspected nuclear power plant piping containing various cracks to evaluate detection and sizing capability. In this paper, a statistical reliability assessment of ultrasonic nondestructive inspection data using probability of detection (POD) is presented. The POD result obtained using a logistic model was useful for the reliability assessment of NDE hit/miss data. (orig.)
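    A logistic POD model of the kind mentioned above can be sketched as follows. The coefficients b0 and b1 are illustrative placeholders; in practice they would be fitted by maximum likelihood to the round-robin hit/miss data:

```python
from math import exp, log

def pod(a_mm: float, b0: float = -2.0, b1: float = 3.0) -> float:
    """Logistic probability-of-detection curve over crack size a (mm):
    POD(a) = 1 / (1 + exp(-(b0 + b1*ln(a)))). The coefficients b0, b1 are
    placeholders; in practice they are fitted by maximum likelihood to the
    round-robin hit/miss data."""
    return 1.0 / (1.0 + exp(-(b0 + b1 * log(a_mm))))

# POD rises monotonically with crack size:
for a in (0.5, 1.0, 2.0, 5.0):
    print(f"a = {a:4.1f} mm -> POD = {pod(a):.3f}")
```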

  18. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD: Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes, run on two COBAS INTEGRA 800 instruments, performed in accordance with the optimum and the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance
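    The monthly-median monitoring idea can be sketched as follows. The analyte, target value, allowable bias and patient results below are illustrative assumptions, not the study's data:

```python
from statistics import median

def flag_drift(monthly_results: dict, target: float, allowable_bias_pct: float):
    """Compare each month's median of patient results with a target value and
    flag months whose relative deviation exceeds the allowable analytical
    bias (derived, as in the paper, from biological variation)."""
    flags = []
    for month, results in monthly_results.items():
        m = median(results)
        dev_pct = 100.0 * (m - target) / target
        flags.append((month, m, abs(dev_pct) > allowable_bias_pct))
    return flags

# Illustrative sodium-like results (mmol/L), target 140, 0.9% allowable bias:
data = {"Jan": [138, 140, 141, 139, 140], "Feb": [143, 144, 142, 145, 143]}
print(flag_drift(data, target=140.0, allowable_bias_pct=0.9))
```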

  19. Experiments with Cloze Procedure

    Science.gov (United States)

    Evans, Gordon; Haastrup, Kirsten

    1976-01-01

    The Nordic Test Development Group prepared proficiency tests of English designed to provide reliable information on which to base decisions as to whether a candidate would be able to function in a job as described or whether he could be trained to do so. Two subtests used a modified cloze procedure. (Author/CFM)

  20. Human Reliability Data Bank: evaluation results

    International Nuclear Information System (INIS)

    Comer, M.K.; Donovan, M.D.; Gaddy, C.D.

    1985-01-01

    The US Nuclear Regulatory Commission (NRC), Sandia National Laboratories (SNL), and General Physics Corporation are conducting a research program to determine the practicality, acceptability, and usefulness of a Human Reliability Data Bank for nuclear power industry probabilistic risk assessment (PRA). As part of this program, a survey was conducted of existing human reliability data banks from other industries, and a detailed concept of a Data Bank for the nuclear industry was developed. Subsequently, a detailed specification for implementing the Data Bank was developed. An evaluation of this specification was conducted and is described in this report. The evaluation tested data treatment, storage, and retrieval using the Data Bank structure, as modified from NUREG/CR-2744, and detailed procedures for data processing and retrieval, developed prior to this evaluation and documented in the test specification. The evaluation consisted of an Operability Demonstration and Evaluation of the data processing procedures, a Data Retrieval Demonstration and Evaluation, a Retrospective Analysis that included a survey of organizations currently operating data banks for the nuclear power industry, and an Internal Analysis of the current Data Bank System

  1. Technology development of maintenance optimization and reliability analysis for safety features in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Choi, Seong Soo; Lee, Dong Gue; Kim, Young Il

    1999-12-01

    The reliability data management system (RDMS) for the safety systems of PHWR-type plants has been developed and utilized in the reliability analysis of the special safety systems of Wolsong Units 1 and 2, whose plant overhaul period has been lengthened. The RDMS was developed for efficient periodic reliability analysis of the safety systems of Wolsong Units 1 and 2. In addition, the system provides functions for analyzing the effects on safety system unavailability when the test period of a test procedure changes, and for optimizing the test periods of safety-related test procedures. The RDMS can be utilized to respond actively to requests from the regulatory institute with regard to the reliability validation of safety systems. (author)

  2. A method of predicting the reliability of CDM coil insulation

    International Nuclear Information System (INIS)

    Kytasty, A.; Ogle, C.; Arrendale, H.

    1992-01-01

    This paper presents a method of predicting the reliability of the Collider Dipole Magnet (CDM) coil insulation design. The method proposes a probabilistic treatment of electrical test data, stress analysis, material properties variability and loading uncertainties to give the reliability estimate. The approach taken to predict reliability of design related failure modes of the CDM is to form analytical models of the various possible failure modes and their related mechanisms or causes, and then statistically assess the contributions of the various contributing variables. The probability of the failure mode occurring is interpreted as the number of times one would expect certain extreme situations to combine and randomly occur. One of the more complex failure modes of the CDM will be used to illustrate this methodology
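    One standard way to combine stress analysis, material-property variability and loading uncertainty into a failure-mode probability is stress-strength interference. The sketch below assumes independent normal distributions and illustrative values, not CDM data:

```python
from math import erf, sqrt

def interference_reliability(mu_s: float, sd_s: float, mu_l: float, sd_l: float) -> float:
    """Stress-strength interference for independent normal strength S and
    load L: P(S > L) = Phi(beta), with beta = (mu_S - mu_L) /
    sqrt(sd_S^2 + sd_L^2), where Phi is the standard normal CDF."""
    beta = (mu_s - mu_l) / sqrt(sd_s**2 + sd_l**2)
    return 0.5 * (1.0 + erf(beta / sqrt(2.0)))

# Illustrative: dielectric strength 30 +/- 3 kV vs. worst-case stress 18 +/- 2 kV
print(f"{interference_reliability(30.0, 3.0, 18.0, 2.0):.6f}")
```

The probability of failure is then one minus this value, interpreted as how often extreme load and weak material randomly coincide.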

  3. Orders- Versus Encounters-Based Image Capture: Implications Pre- and Post-Procedure Workflow, Technical and Build Capabilities, Resulting, Analytics and Revenue Capture: HIMSS-SIIM Collaborative White Paper.

    Science.gov (United States)

    Cram, Dawn; Roth, Christopher J; Towbin, Alexander J

    2016-10-01

    The decision to implement an orders-based versus an encounters-based imaging workflow has various implications for image capture and storage. The impacts include workflows before and after an imaging procedure, electronic health record build, technical infrastructure, analytics, resulting, and revenue. Orders-based workflows tend to favor some imaging specialties, while others require an encounters-based approach. The intent of this HIMSS-SIIM white paper is to offer lessons learned from early adopting institutions to physician champions and informatics leadership who are developing strategic planning and operational rollouts for specialties capturing clinical multimedia.

  4. Development of Radioanalytical and Microanalytical Procedures for the Determination of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    Macsik, Zsuzsanna; Vajda, Nora; Bene, Balazs; Varga, Zsolt

    2008-01-01

    A radio-analytical procedure has been developed for the simultaneous determination of actinides in swipe samples by alpha-spectrometry after the separation of the actinides by extraction chromatography. The procedure is based on the complete decomposition of the sample by destruction with microwave digestion or ashing in furnace. Actinides are separated on an extraction chromatographic column filled with TRU resin (product of Eichrom Industries Inc.). Alpha sources prepared from the separated fractions of americium, plutonium, thorium and uranium are counted by alpha spectrometry. Micro-analytical procedure is being developed for the location and identification of individual particles containing fissile material using solid state nuclear track detectors. The parameters of alpha and fission track detection have been optimized and a procedure has been elaborated to locate the particles on the sample by defining the coordinates of the tracks created by the particles on the track detector. Development of a procedure is planned to separate the located particles using micromanipulator and these particles will be examined individually by different micro- and radio-analytical techniques. (authors)

  5. Development of Radioanalytical and Microanalytical Procedures for the Determination of Actinides in Environmental Samples

    Energy Technology Data Exchange (ETDEWEB)

    Macsik, Zsuzsanna [Institute of Nuclear Techniques, Moegyetem rakpart 9, H-1111 Budapest (Hungary); Vajda, Nora [RadAnal Ltd., Bimbo ut 119/a, H-1026 Budapest (Hungary); Bene, Balazs [National Institute of Standards and Technology, Gaithersburg, MD 20899 (United States); Varga, Zsolt [Institute of Isotopes, Konkoly-Thege M. ut 29-33, H-1121 Budapest (Hungary)

    2008-07-01

    A radio-analytical procedure has been developed for the simultaneous determination of actinides in swipe samples by alpha-spectrometry after separation of the actinides by extraction chromatography. The procedure is based on complete decomposition of the sample by microwave digestion or ashing in a furnace. Actinides are separated on an extraction chromatographic column filled with TRU resin (product of Eichrom Industries Inc.). Alpha sources prepared from the separated americium, plutonium, thorium and uranium fractions are counted by alpha spectrometry. A micro-analytical procedure is being developed for the location and identification of individual particles containing fissile material using solid state nuclear track detectors. The parameters of alpha and fission track detection have been optimized, and a procedure has been elaborated to locate particles on the sample by defining the coordinates of the tracks they create on the track detector. A further procedure is planned to separate the located particles using a micromanipulator; these particles will then be examined individually by different micro- and radio-analytical techniques. (authors)

  6. Towards analytical mix design for large-stone asphalt mixes.

    CSIR Research Space (South Africa)

    Rust, FC

    1992-08-01

    Full Text Available This paper addresses the development of an analytically based design procedure for large-aggregate asphalt and its application in thirteen trial sections. The physical and engineering properties of the various materials are discussed and related...

  7. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well-known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
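    The EM idea the abstract describes can be sketched outside SPSS. Below is a minimal EM estimator of the mean and covariance of multivariate-normal data with missing entries, from which an EM correlation matrix can be formed as factor-analysis input. This is an illustration of the technique, not the authors' SPSS macros, and the tiny dataset is made up:

    ```python
    import numpy as np

    def em_mvn(X, n_iter=200):
        """EM estimates of the mean vector and (ML) covariance matrix for
        multivariate-normal data with missing entries coded as NaN."""
        X = np.asarray(X, dtype=float)
        n, p = X.shape
        mu = np.nanmean(X, axis=0)
        sigma = np.diag(np.nanvar(X, axis=0))
        for _ in range(n_iter):
            Xf = np.where(np.isnan(X), mu, X)      # E-step: filled-in data
            C = np.zeros((p, p))                   # accumulated conditional covariances
            for i in range(n):
                m = np.isnan(X[i])
                if m.any() and (~m).any():
                    o = ~m
                    reg = sigma[np.ix_(m, o)] @ np.linalg.inv(sigma[np.ix_(o, o)])
                    Xf[i, m] = mu[m] + reg @ (X[i, o] - mu[o])
                    C[np.ix_(m, m)] += sigma[np.ix_(m, m)] - reg @ sigma[np.ix_(o, m)]
            mu = Xf.mean(axis=0)                   # M-step
            d = Xf - mu
            sigma = (d.T @ d + C) / n
        return mu, sigma

    # EM correlations for use as factor-analysis input
    # (tiny made-up dataset with one missing value)
    X = np.array([[1.0, 2.0], [2.0, np.nan], [3.0, 5.0], [4.0, 7.0]])
    mu, sigma = em_mvn(X)
    sd = np.sqrt(np.diag(sigma))
    corr = sigma / np.outer(sd, sd)
    ```

    With no missing values the loop reduces to the ordinary maximum-likelihood mean and covariance after one pass.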

  8. A comparative examination of sample treatment procedures for ICAP-AES analysis of biological tissue

    Science.gov (United States)

    De Boer, J. L. M.; Maessen, F. J. M. J.

    The objective of this study was to contribute to the evaluation of existing sample preparation procedures for ICAP-AES analysis of biological material. Performance characteristics were established for current digestion procedures comprising extraction, solubilization, pressure digestion, and wet and dry ashing methods. Apart from accuracy and precision, a number of criteria of special interest for analytical practice were applied. SRM bovine liver served as the test sample; six elements were determined simultaneously in this material. Results showed that every procedure has its defects and advantages. Hence, standard digestion procedures can be unambiguously recommended only when the specific analytical problem is taken into account.

  9. Reliability assessment and probability based design of reinforced concrete containments and shear walls

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1986-03-01

    This report summarizes work completed under the program entitled "Probability-Based Load Combinations for Design of Category I Structures." Under this program, the probabilistic models for various static and dynamic loads were formulated. The randomness and uncertainties in material strengths and structural resistance were established. Several limit states of concrete containments and shear walls were identified and analytically formulated. Furthermore, the reliability analysis methods for estimating limit state probabilities were established. These reliability analysis methods can be used to evaluate the safety levels of nuclear structures under various combinations of static and dynamic loads. They can also be used to generate analytically the fragility data for PRA studies. In addition to the development of reliability analysis methods, probability-based design criteria for concrete containments and shear wall structures have also been developed. The proposed design criteria are in the load and resistance factor design (LRFD) format. The load and resistance factors are determined for several limit states and target limit state probabilities. Thus, the proposed design criteria are risk-consistent and have a well-established rationale. 73 refs., 18 figs., 16 tabs
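    Limit-state probabilities of the kind discussed above are often estimated by sampling. A crude Monte Carlo sketch for a single limit state g = R - S, where the lognormal resistance and normal load-effect distributions below are purely hypothetical and not the report's models:

    ```python
    import math
    import random

    def limit_state_probability(n=100_000, seed=1):
        """Crude Monte Carlo estimate of the limit-state probability P[g < 0]
        for g = R - S, with hypothetical resistance and load distributions."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(n):
            R = rng.lognormvariate(math.log(5.0), 0.10)  # structural resistance
            S = rng.gauss(3.0, 0.60)                     # combined load effect
            failures += (R - S) < 0.0
        return failures / n
    ```

    The same estimator, evaluated over a grid of load intensities, is one way to generate fragility data numerically.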

  10. Co-ordinated research project on assessment of levels and health-effects of airborne particulate matter in mining, metal refining and metal working industries using nuclear and related analytical techniques. Report on the first research co-ordination meeting (RCM)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The objectives of the CRP are to: (1) improve competence for research on workplace monitoring in terms of proper sampling and analytical procedures, (2) obtain relevant and reliable data on sources and levels of workplace pollution in various countries, (3) promote a better understanding of methods for the interpretation of such data, including occupational health studies, and (4) encourage closer collaboration between analytical scientists and researchers in the field of occupational health in the countries concerned. The CRP focuses on the use of nuclear and related analytical techniques for the following kinds of studies: (1) strategies and techniques for sampling of workplace airborne particulate matter and of human tissues and body fluids (hair, blood, etc.), with sampling of both exposed and non-exposed persons; (2) development of suitable analytical procedures for analysis of such samples; (3) workplace and personal monitoring of airborne particulate matter in the mining, refining and metal working industries, and the health effects of such exposure; and (4) tissue analysis of exposed workers for biological monitoring and health effects studies. This report includes the core and supplementary programme of the CRP; technical aspects of sampling, analysis, data processing, and quality assurance; and organizational aspects. The report also includes 10 papers contributed by the participants. Each individual contribution was indexed and provided with an abstract. Refs, figs, tabs

  11. Co-ordinated research project on assessment of levels and health-effects of airborne particulate matter in mining, metal refining and metal working industries using nuclear and related analytical techniques. Report on the first research co-ordination meeting (RCM)

    International Nuclear Information System (INIS)

    1998-01-01

    The objectives of the CRP are to: (1) improve competence for research on workplace monitoring in terms of proper sampling and analytical procedures, (2) obtain relevant and reliable data on sources and levels of workplace pollution in various countries, (3) promote a better understanding of methods for the interpretation of such data, including occupational health studies, and (4) encourage closer collaboration between analytical scientists and researchers in the field of occupational health in the countries concerned. The CRP focuses on the use of nuclear and related analytical techniques for the following kinds of studies: (1) strategies and techniques for sampling of workplace airborne particulate matter and of human tissues and body fluids (hair, blood, etc.), with sampling of both exposed and non-exposed persons; (2) development of suitable analytical procedures for analysis of such samples; (3) workplace and personal monitoring of airborne particulate matter in the mining, refining and metal working industries, and the health effects of such exposure; and (4) tissue analysis of exposed workers for biological monitoring and health effects studies. This report includes the core and supplementary programme of the CRP; technical aspects of sampling, analysis, data processing, and quality assurance; and organizational aspects. The report also includes 10 papers contributed by the participants. Each individual contribution was indexed and provided with an abstract

  12. Analytical procedure in aseismic design of eccentric structure using response spectrum

    International Nuclear Information System (INIS)

    Takemori, T.; Kuwabara, Y.; Suwabe, A.; Mitsunobu, S.

    1977-01-01

    In this paper, responses are evaluated by the following two methods, using typical torsional analytical models in which the masses, rigidities, eccentricities between their centers, and several actual earthquake waves are taken as parameters: (1) the root mean square of responses obtained from the response spectra derived from the earthquake waves, and (2) time history analysis using the earthquake waves directly. The earthquake waves are chosen to present different frequency contents and response spectrum magnitudes. Typical results of the study are as follows: (a) the response accelerations of the mass center in the input earthquake direction by method (1) agree comparatively well with those by method (2); (b) the response accelerations perpendicular to the input earthquake direction by method (1) are 2 to 3 times those by method (2); (c) the amplifications of the response accelerations at arbitrary points distributed on the spread mass, relative to those at the center of the lumped mass, are remarkably large by method (1) compared with method (2) in both directions. These problems with response spectrum analysis of the above-mentioned eccentric structures are discussed, and an improved analytical method, applying response amplification coefficients derived from this parametric time history analysis, is proposed for actual seismic design using the given design ground response spectrum with the root-mean-square technique

  13. Basics of Bayesian reliability estimation from attribute test data

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Waller, R.A.

    1975-10-01

    The basic notions of Bayesian reliability estimation from attribute lifetest data are presented in an introductory and expository manner. Both Bayesian point and interval estimates of the probability of surviving the lifetest, the reliability, are discussed. The necessary formulas are simply stated, and examples are given to illustrate their use. In particular, a binomial model in conjunction with a beta prior model is considered. Particular attention is given to the procedure for selecting an appropriate prior model in practice. Empirical Bayes point and interval estimates of reliability are discussed and examples are given. 7 figures, 2 tables
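    The beta-binomial machinery described above is compact enough to sketch. Assuming a Beta(a0, b0) prior and s survivors out of n units on test (all numbers illustrative), the posterior mean is exact, and an equal-tailed credible interval can be read off Monte Carlo draws from the posterior:

    ```python
    import random

    def bayes_reliability(survivors, n, a0=1.0, b0=1.0, cred=0.90,
                          draws=50_000, seed=42):
        """Beta-binomial reliability estimate from attribute (pass/fail) life-test
        data. Beta(a0, b0) prior -> Beta(a0 + s, b0 + n - s) posterior.
        Returns the exact posterior mean and an equal-tailed credible interval
        approximated from Monte Carlo posterior draws."""
        a, b = a0 + survivors, b0 + (n - survivors)
        rng = random.Random(seed)
        samples = sorted(rng.betavariate(a, b) for _ in range(draws))
        lo = samples[int((1.0 - cred) / 2.0 * draws)]
        hi = samples[int((1.0 + cred) / 2.0 * draws) - 1]
        return a / (a + b), (lo, hi)

    # 19 of 20 units survive the life test, uniform Beta(1, 1) prior
    mean, (lo, hi) = bayes_reliability(19, 20)
    ```

    With a conjugate beta prior the interval endpoints are really beta quantiles; the Monte Carlo step here just avoids needing an incomplete-beta inverse.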

  14. Implementing self sustained quality control procedures in a clinical laboratory.

    Science.gov (United States)

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory: it maintains the excellence of laboratory standards, supports proper disease diagnosis and patient care, and strengthens the health care system overall. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis, including measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules over a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure that laboratories can perform with minimal technology, expenditure and expertise to improve the reliability and validity of their test reports.
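    The Levey-Jennings screening described above can be sketched with two common Westgard rules, 1-3s (random error) and 2-2s (systematic error). The rule subset and the glucose numbers below are illustrative, not necessarily the exact rules or analytes the laboratory used:

    ```python
    def westgard_flags(values, target_mean, target_sd):
        """Screen a run of daily QC values on a Levey-Jennings chart.
        Flags: '1-3s' = one value beyond 3 SD (suggests random error);
        '2-2s' = two consecutive values beyond 2 SD on the same side
        (suggests systematic error)."""
        z = [(v - target_mean) / target_sd for v in values]
        flags = []
        for i, zi in enumerate(z):
            if abs(zi) > 3:
                flags.append((i, "1-3s"))
            elif i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and zi * z[i - 1] > 0:
                flags.append((i, "2-2s"))
        return flags

    # Glucose control: target 5.0 mmol/L, SD 0.2 (values invented)
    print(westgard_flags([5.1, 4.9, 5.7], 5.0, 0.2))  # [(2, '1-3s')]
    ```

    A flagged day would trigger the "immediate mitigation measures" mentioned in the abstract before any reports are dispatched.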

  15. SPARTex: A Vertex-Centric Framework for RDF Data Analytics

    KAUST Repository

    Abdelaziz, Ibrahim

    2015-08-31

    A growing number of applications require combining SPARQL queries with generic graph search on RDF data. However, the lack of procedural capabilities in SPARQL makes it inappropriate for graph analytics. Moreover, RDF engines focus on SPARQL query evaluation whereas graph management frameworks perform only generic graph computations. In this work, we bridge the gap by introducing SPARTex, an RDF analytics framework based on the vertex-centric computation model. In SPARTex, user-defined vertex centric programs can be invoked from SPARQL as stored procedures. SPARTex allows the execution of a pipeline of graph algorithms without the need for multiple reads/writes of input data and intermediate results. We use a cost-based optimizer for minimizing the communication cost. SPARTex evaluates queries that combine SPARQL and generic graph computations orders of magnitude faster than existing RDF engines. We demonstrate a real system prototype of SPARTex running on a local cluster using real and synthetic datasets. SPARTex has a real-time graphical user interface that allows the participants to write regular SPARQL queries, use our proposed SPARQL extension to declaratively invoke graph algorithms or combine/pipeline both SPARQL querying and generic graph analytics.

  16. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Science.gov (United States)

    2010-04-01

    ... RULES CONCERNING CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.10 Changes to an Electric... Reliability Organization Rule or Regional Entity Rule. 39.10 Section 39.10 Conservation of Power and Water...

  17. Site study plan for geochemical analytical requirements and methodologies: Revision 1

    International Nuclear Information System (INIS)

    1987-12-01

    This site study plan documents the analytical methodologies and procedures that will be used to analyze geochemically the rock and fluid samples collected during Site Characterization. Information relating to the quality aspects of these analyses is also provided, where available. Most of the proposed analytical procedures have been used previously on the program and are sufficiently sensitive to yield high-quality analyses. In a few cases improvements in analytical methodology (e.g., greater sensitivity, fewer interferences) are desired. Suggested improvements to these methodologies are discussed. In most cases these method-development activities have already been initiated. The primary source of rock and fluid samples for geochemical analysis during Site Characterization will be the drilling program, as described in various SRP Site Study Plans. The Salt Repository Project (SRP) Networks specify the schedule under which the program will operate. Drilling will not begin until after site ground water baseline conditions have been established. The Technical Field Services Contractor (TFSC) is responsible for conducting the field program of drilling and testing. Samples and data will be handled and reported in accordance with established SRP procedures. A quality assurance program will be utilized to assure that activities affecting quality are performed correctly and that the appropriate documentation is maintained. 28 refs., 9 figs., 14 tabs

  18. The design and use of reliability data base with analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possible dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs.
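    One standard stationarity check of the kind alluded to above is the Laplace trend test on a component's failure times. This sketch is a generic version of such a test, not necessarily one of the report's own (possibly new) tests:

    ```python
    import math

    def laplace_trend(event_times, horizon):
        """Laplace trend statistic for failure times observed on (0, horizon].
        Approximately N(0, 1) under a homogeneous Poisson process; large
        positive values indicate a deteriorating (increasing-rate) component,
        large negative values an improving one."""
        n = len(event_times)
        centred = sum(event_times) / n - horizon / 2.0
        return centred / (horizon * math.sqrt(1.0 / (12.0 * n)))
    ```

    Failures spread evenly over the observation window give a statistic near zero; failures clustered late in the window push it well above the usual two-sided critical value of about 1.96.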

  19. The design and use of reliability data base with analysis tool

    International Nuclear Information System (INIS)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possible dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs

  20. Selenide isotope generator for the Galileo mission. Reliability program plan

    International Nuclear Information System (INIS)

    1978-10-01

    The reliability program plan for the Selenide Isotope Generator (SIG) program is presented. It delineates the specific tasks that will be accomplished by Teledyne Energy Systems and its suppliers during design, development, fabrication and test of deliverable Radioisotopic Thermoelectric Generators (RTG), Electrically Heated Thermoelectric Generators (ETG) and associated Ground Support Equipment (GSE). The Plan is formulated in general accordance with procedures specified in DOE Reliability Engineering Program Requirements Publication No. SNS-2, dated June 17, 1974. The Reliability Program Plan presented herein defines the total reliability effort without further reference to Government Specifications. The reliability tasks to be accomplished are delineated herein and become the basis for contract compliance to the extent specified in the SIG contract Statement of Work

  1. Research on the reliability of friction system under combined additive and multiplicative random excitations

    Science.gov (United States)

    Sun, Jiaojiao; Xu, Wei; Lin, Zifei

    2018-01-01

    In this paper, the reliability of a non-linearly damped friction oscillator under combined additive and multiplicative Gaussian white noise excitations is investigated. The stochastic averaging method, usually applied to smooth systems, has been extended to study the reliability of the non-smooth friction system. The results indicate that the reliability of the friction system can be improved by Coulomb friction and reduced by random excitations. In particular, the external random excitation affects the reliability more strongly than the parametric random excitation. The validity of the analytical results is verified by the numerical results.
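    Analytical results of this kind are typically cross-checked by direct simulation. A hedged Monte Carlo sketch of first-passage reliability for a Coulomb-friction oscillator under additive white noise only (a simplification of the paper's combined-excitation model; every parameter below is hypothetical):

    ```python
    import math
    import random

    def survival_probability(T=5.0, dt=0.005, trials=400, barrier=2.5,
                             mu=0.2, D=0.2, seed=7):
        """Monte Carlo first-passage reliability for a Coulomb-friction oscillator
        x'' + mu*sgn(x') + x = sqrt(2D)*xi(t) driven by additive white noise,
        integrated by Euler-Maruyama. Returns P[|x(t)| < barrier for all t <= T]."""
        rng = random.Random(seed)
        steps = int(T / dt)
        kick = math.sqrt(2.0 * D * dt)   # noise increment scale per step
        survived = 0
        for _ in range(trials):
            x = v = 0.0
            ok = True
            for _ in range(steps):
                sgn = 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)
                v += (-x - mu * sgn) * dt + kick * rng.gauss(0.0, 1.0)
                x += v * dt
                if abs(x) >= barrier:
                    ok = False
                    break
            survived += ok
        return survived / trials
    ```

    With the noise intensity D set to zero the oscillator never leaves the safe domain, reproducing the trivial limit; increasing D or decreasing the friction coefficient mu lowers the estimated reliability, consistent with the paper's qualitative findings.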

  2. Interlaboratory analytical performance studies; a way to estimate measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Elżbieta Łysiak-Pastuszak

    2004-09-01

    Full Text Available Comparability of data collected within collaborative programmes became the key challenge of analytical chemistry in the 1990s, including monitoring of the marine environment. To obtain relevant and reliable data, the analytical process has to proceed under a well-established Quality Assurance (QA system with external analytical proficiency tests as an inherent component. A programme called Quality Assurance in Marine Monitoring in Europe (QUASIMEME was established in 1993 and evolved over the years as the major provider of QA proficiency tests for nutrients, trace metals and chlorinated organic compounds in marine environment studies. The article presents an evaluation of results obtained in QUASIMEME Laboratory Performance Studies by the monitoring laboratory of the Institute of Meteorology and Water Management (Gdynia, Poland in exercises on nutrient determination in seawater. The measurement uncertainty estimated from routine internal quality control measurements and from results of analytical performance exercises is also presented in the paper.
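    Performance in such interlaboratory proficiency exercises is commonly summarized by z-scores against the assigned value and the target standard deviation. A minimal sketch with the usual interpretation bands (the numbers are made up; the exact scoring scheme QUASIMEME applies may differ):

    ```python
    def z_score(lab_result, assigned, sigma_p):
        """Proficiency-test z-score of a laboratory result against the
        assigned value and the standard deviation for proficiency assessment."""
        return (lab_result - assigned) / sigma_p

    def verdict(z):
        """Common interpretation bands (illustrative convention):
        |z| <= 2 satisfactory, 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
        if abs(z) <= 2:
            return "satisfactory"
        return "questionable" if abs(z) < 3 else "unsatisfactory"

    # Nitrate in seawater: assigned 10.0 umol/L, sigma_p 0.25 (values invented)
    print(verdict(z_score(10.8, 10.0, 0.25)))  # z = 3.2 -> unsatisfactory
    ```

    Repeated z-scores over successive exercises are one input to the uncertainty estimate the abstract mentions, alongside routine internal quality control data.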

  3. Designing the optimal bit: balancing energetic cost, speed and reliability.

    Science.gov (United States)

    Deshpande, Abhishek; Gopalkrishnan, Manoj; Ouldridge, Thomas E; Jones, Nick S

    2017-08-01

    We consider the challenge of operating a reliable bit that can be rapidly erased. We find that both erasing and reliability times are non-monotonic in the underlying friction, leading to a trade-off between erasing speed and bit reliability. Fast erasure is possible at the expense of low reliability at moderate friction, and high reliability comes at the expense of slow erasure in the underdamped and overdamped limits. Within a given class of bit parameters and control strategies, we define 'optimal' designs of bits that meet the desired reliability and erasing time requirements with the lowest operational work cost. We find that optimal designs always saturate the bound on the erasing time requirement, but can exceed the required reliability time if critically damped. The non-trivial geometry of the reliability and erasing time scales allows us to exclude large regions of parameter space as suboptimal. We find that optimal designs are either critically damped or close to critical damping under the erasing procedure.

  4. Optimization of an analytical electron microscope for x-ray microanalysis: instrumental problems

    International Nuclear Information System (INIS)

    Bentley, J.; Zaluzec, N.J.; Kenik, E.A.; Carpenter, R.W.

    1979-01-01

    The addition of an energy dispersive x-ray spectrometer to a modern transmission or scanning transmission electron microscope can provide a powerful tool for the characterization of materials. Unfortunately this seemingly simple modification can lead to a host of instrumental problems with respect to the accuracy, validity, and quality of the recorded information. This tutorial reviews the complications which can arise in performing x-ray microanalysis in current analytical electron microscopes. The first topic treated in depth is fluorescence by uncollimated radiation. The source, distinguishing characteristics, effects on quantitative analysis, and schemes for elimination or minimization as applicable to TEM/STEMs, D-STEMs and HVEMs are discussed. The local specimen environment is considered in the second major section, where again detrimental effects on quantitative analysis and remedial procedures, particularly the use of low-background specimen holders, are highlighted. Finally, the detrimental aspects of specimen contamination, insofar as they affect x-ray microanalysis, are discussed. It is concluded that if the described preventive measures are implemented, reliable quantitative analysis is possible

  5. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments and then, independently, we assess the likely variability in team to team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)

  6. Determination of Total Carbohydrates in Algal Biomass: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    2016-01-13

    This procedure uses two-step sulfuric acid hydrolysis to hydrolyze the polymeric forms of carbohydrates in algal biomass into monomeric subunits. The monomers are then quantified by either HPLC or a suitable spectrophotometric method.

  7. Fuzzy QFD for supply chain management with reliability consideration

    International Nuclear Information System (INIS)

    Sohn, So Young; Choi, In Su

    2001-01-01

    Although many products are made through several tiers of supply chains, a systematic way of handling reliability issues at the various product planning stages has drawn attention only recently, in the context of supply chain management (SCM). The main objective of this paper is to develop a fuzzy quality function deployment (QFD) model in order to convey the fuzzy relationship between customers' needs and design specifications for reliability in the context of SCM. A fuzzy multi-criteria decision-making procedure is proposed and applied to find a set of optimal solutions with respect to the performance of the reliability test needed in CRT design. It is expected that the proposed approach can make significant contributions in the following areas: effectively communicating with technical personnel and users; developing a relatively error-free reliability review system; and creating consistent and complete documentation for design for reliability
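    The fuzzy aggregation step at the heart of such a QFD model can be sketched with triangular fuzzy numbers (TFNs). The scale, weights, and relationship strengths below are invented for illustration and are not taken from the paper:

    ```python
    def tfn_scale(t, w):
        """Multiply a triangular fuzzy number (a, b, c) by a crisp weight w >= 0."""
        a, b, c = t
        return (a * w, b * w, c * w)

    def tfn_add(t1, t2):
        """Add two triangular fuzzy numbers component-wise."""
        return tuple(x + y for x, y in zip(t1, t2))

    def defuzzify(t):
        """Centroid defuzzification of a triangular fuzzy number."""
        return sum(t) / 3.0

    # One design specification scored against three customer needs:
    # crisp need weights, TFN relationship strengths on a 1-9 scale.
    weights = [0.5, 0.3, 0.2]
    relations = [(3, 5, 7), (1, 3, 5), (5, 7, 9)]

    score = (0.0, 0.0, 0.0)
    for w, r in zip(weights, relations):
        score = tfn_add(score, tfn_scale(r, w))

    priority = defuzzify(score)   # crisp priority of this specification
    ```

    Ranking the defuzzified priorities across all design specifications is what lets the QFD matrix translate fuzzy customer needs into an ordered list of reliability requirements.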

  8. Fuzzy QFD for supply chain management with reliability consideration

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, So Young; Choi, In Su

    2001-06-01

    Although many products are made through several tiers of supply chains, a systematic way of handling reliability issues at the various product planning stages has drawn attention only recently, in the context of supply chain management (SCM). The main objective of this paper is to develop a fuzzy quality function deployment (QFD) model in order to convey the fuzzy relationship between customers' needs and design specifications for reliability in the context of SCM. A fuzzy multi-criteria decision-making procedure is proposed and applied to find a set of optimal solutions with respect to the performance of the reliability test needed in CRT design. It is expected that the proposed approach can make significant contributions in the following areas: effectively communicating with technical personnel and users; developing a relatively error-free reliability review system; and creating consistent and complete documentation for design for reliability.

  9. Interim reliability evaluation program, Browns Ferry fault trees

    International Nuclear Information System (INIS)

    Stewart, M.E.

    1981-01-01

    An abbreviated fault tree method is used to evaluate and model Browns Ferry systems in the Interim Reliability Evaluation Program, simplifying the recording and displaying of events while maintaining the system of identifying faults. The level of investigation is not changed, and the analytical thought process inherent in the conventional method is not compromised. But the abbreviated method takes less time, and the fault modes are much more visible
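    Whether abbreviated or conventional, fault-tree evaluation ultimately reduces to gate arithmetic over basic-event probabilities. A minimal sketch for independent events (the tree shape and numbers are invented, not Browns Ferry data):

    ```python
    from math import prod

    def or_gate(probs):
        """Probability of an OR gate: union of independent basic events."""
        return 1.0 - prod(1.0 - p for p in probs)

    def and_gate(probs):
        """Probability of an AND gate: intersection of independent basic events."""
        return prod(probs)

    # Tiny example tree: TOP = OR( AND(pump_A, pump_B), control_fault )
    p_top = or_gate([and_gate([0.01, 0.02]), 0.005])
    print(p_top)  # ~0.0052
    ```

    Nesting these two functions follows the tree structure directly, so an abbreviated tree maps onto the same computation as its fully drawn counterpart.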

  10. Fast analytical model of MZI micro-opto-mechanical pressure sensor

    Science.gov (United States)

    Rochus, V.; Jansen, R.; Goyvaerts, J.; Neutens, P.; O’Callaghan, J.; Rottenberg, X.

    2018-06-01

    This paper presents a fast analytical procedure in order to design a micro-opto-mechanical pressure sensor (MOMPS) taking into account the mechanical nonlinearity and the optical losses. A realistic model of the photonic MZI is proposed, strongly coupled to a nonlinear mechanical model of the membrane. Based on the membrane dimensions, the residual stress, the position of the waveguide, the optical wavelength and the phase variation due to the opto-mechanical coupling, we derive an analytical model which allows us to predict the response of the total system. The effect of the nonlinearity and the losses on the total performance are carefully studied and measurements on fabricated devices are used to validate the model. Finally, a design procedure is proposed in order to realize fast design of this new type of pressure sensor.

  11. An Enhanced Backbone-Assisted Reliable Framework for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Amna Ali

    2010-03-01

    An extremely reliable source-to-sink communication is required for most contemporary WSN applications, especially those pertaining to military, healthcare and disaster recovery. However, due to their intrinsic energy, bandwidth and computational constraints, Wireless Sensor Networks (WSNs) encounter several challenges in reliable source-to-sink communication. In this paper, we present a novel reliable topology that uses reliable hotlines between sensor gateways to boost the reliability of end-to-end transmissions. This reliable and efficient routing alternative reduces the average number of hops from source to sink. We prove, with the help of analytical evaluation, that communication using hotlines is considerably more reliable than traditional WSN routing. We use reliability theory to analyze the cost and benefit of adding gateway nodes to a backbone-assisted WSN. However, in hotline-assisted routing, some scenarios where the source and the sink are just a couple of hops apart might incur more latency; therefore, we present a Signature Based Routing (SBR) scheme. SBR enables the gateways to make intelligent routing decisions based upon the derived signature, hence providing lower end-to-end delay in source-to-sink communication. Finally, we evaluate our proposed hotline-based topology with the help of a simulation tool and show that the proposed topology provides a manifold increase in end-to-end reliability.
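The benefit of reducing hop count can be illustrated with elementary reliability arithmetic, assuming independent per-hop delivery probabilities. All numbers are hypothetical and are not taken from the paper's analytical model:

```python
# Sketch: why fewer hops help. With an independent per-hop delivery
# probability r, an n-hop path succeeds with probability r**n, so a hotline
# that shortens the route raises end-to-end reliability. Values illustrative.

def path_reliability(r_per_hop, hops):
    """End-to-end success probability of a chain of independent hops."""
    return r_per_hop ** hops

normal_route = path_reliability(0.95, 10)          # 10 ordinary hops
hotline_route = path_reliability(0.95, 3) * 0.999  # 3 hops plus one hotline link
print(normal_route, hotline_route)
```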

  12. Reliability-based performance simulation for optimized pavement maintenance

    International Nuclear Information System (INIS)

    Chou, Jui-Sheng; Le, Thanh-Son

    2011-01-01

    Roadway pavement maintenance is essential for driver safety and highway infrastructure efficiency. However, regular preventive maintenance and rehabilitation (M and R) activities are extremely costly. Unfortunately, the funds available for the M and R of highway pavement are often given lower priority compared to other national development policies; therefore, available funds must be allocated wisely. Maintenance strategies are typically implemented by optimizing only the cost whilst the reliability of facility performance is neglected. This study proposes a novel algorithm using multi-objective particle swarm optimization (MOPSO) technique to evaluate the cost-reliability tradeoff in a flexible maintenance strategy based on non-dominated solutions. Moreover, a probabilistic model for regression parameters is employed to assess reliability-based performance. A numerical example of a highway pavement project is illustrated to demonstrate the efficacy of the proposed MOPSO algorithms. The analytical results show that the proposed approach can help decision makers to optimize roadway maintenance plans. - Highlights: → A novel algorithm using multi-objective particle swarm optimization technique. → Evaluation of the cost-reliability tradeoff in a flexible maintenance strategy. → A probabilistic model for regression parameters is employed to assess reliability-based performance. → The proposed approach can help decision makers to optimize roadway maintenance plans.
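The non-dominated filtering that underlies any MOPSO-style cost-reliability tradeoff can be sketched as below, with both objectives minimized (cost, 1 - reliability). Candidate plans and objective values are illustrative, not the paper's case study:

```python
# Sketch: Pareto (non-dominated) filter for a bi-objective maintenance
# problem; each plan is a (cost, unreliability) pair, both to be minimized.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

plans = [(100, 0.10), (120, 0.05), (150, 0.05), (90, 0.20)]
print(pareto_front(plans))
```

A full MOPSO would evolve particle positions and keep such a front as its external archive; this shows only the archiving criterion.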

  13. Reliability-based performance simulation for optimized pavement maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jui-Sheng, E-mail: jschou@mail.ntust.edu.tw [Department of Construction Engineering, National Taiwan University of Science and Technology (Taiwan Tech), 43 Sec. 4, Keelung Rd., Taipei 106, Taiwan (China); Le, Thanh-Son [Department of Construction Engineering, National Taiwan University of Science and Technology (Taiwan Tech), 43 Sec. 4, Keelung Rd., Taipei 106, Taiwan (China)

    2011-10-15

    Roadway pavement maintenance is essential for driver safety and highway infrastructure efficiency. However, regular preventive maintenance and rehabilitation (M and R) activities are extremely costly. Unfortunately, the funds available for the M and R of highway pavement are often given lower priority compared to other national development policies; therefore, available funds must be allocated wisely. Maintenance strategies are typically implemented by optimizing only the cost whilst the reliability of facility performance is neglected. This study proposes a novel algorithm using multi-objective particle swarm optimization (MOPSO) technique to evaluate the cost-reliability tradeoff in a flexible maintenance strategy based on non-dominated solutions. Moreover, a probabilistic model for regression parameters is employed to assess reliability-based performance. A numerical example of a highway pavement project is illustrated to demonstrate the efficacy of the proposed MOPSO algorithms. The analytical results show that the proposed approach can help decision makers to optimize roadway maintenance plans. - Highlights: > A novel algorithm using multi-objective particle swarm optimization technique. > Evaluation of the cost-reliability tradeoff in a flexible maintenance strategy. > A probabilistic model for regression parameters is employed to assess reliability-based performance. > The proposed approach can help decision makers to optimize roadway maintenance plans.

  14. Soviet-designed pressurized water reactor symptomatic emergency operating instruction analytical procedure: approach, methodology, development and application

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1999-01-01

    A symptom approach to the analytical validation of symptom-based EOPs includes: (1) Identification of critical safety functions to the maintenance of fission product barrier integrity; (2) Identification of the symptoms which manifest an impending challenge to critical safety function maintenance; (3) Development of a symptomatic methodology to delineate bounding plant transient response modes; (4) Specification of bounding scenarios; (5) Development of a systematic calculational approach consistent with the objectives of the methodology; (6) Performance of thermal-hydraulic computer code calculations implementing the analytical methodology; (7) Interpretation of the analytical results on the basis of information available to the operator; (8) Application of the results to the validation of the proposed operator actions; (9) Production of a technical basis document justifying the proposed operator actions. (author)

  15. Accurate monitoring developed by EDF for FA-3-EPRTM and UK-EPRTM: chemistry-radiochemistry design and procedures

    International Nuclear Information System (INIS)

    Tigeras, Arancha; Bouhrizi, Sofia; Pierre, Marine; L'Orphelin, Jean-Matthieu

    2012-09-01

    The monitoring of chemistry and radiochemistry parameters is a fundamental need in nuclear power plants in order to ensure: - reactivity control in real time; - barrier integrity surveillance by means of fuel cladding failure detection and primary pressure-boundary component control; - water quality, to limit radiation build-up and material corrosion, permitting the preparation of maintenance, radioprotection and waste operations; - the efficiency of treatment systems and hence the minimization of discharges of chemical and radiochemical substances. The relevant chemistry and radiochemistry parameters to be monitored are selected depending on the chemistry conditioning of systems, the source term evaluations, the corrosion mechanisms and the radioactivity consequences. In spite of the difficulties of obtaining representative samples under all circumstances, the EPRTM design provides the appropriate provisions and analytical procedures for ensuring reliable and accurate monitoring of parameters in compliance with the specification requirements. The design solutions adopted for Flamanville 3-EPRTM and UK-EPRTM, concerning the sampling conditions and locations, the on-line and analytical equipment, the procedures and the transmission of results to the control room and chemistry laboratory, are supported by ALARP considerations, international experience and research concerning nuclide behavior (corrosion product and actinide solubility, fission product degassing, and impurity and additive reactions). This paper details the means developed by EDF for making successful and meaningful sampling and measurements to achieve the essential objectives associated with the monitoring. (authors)

  16. Application of advanced nuclear and instrumental analytical techniques for characterisation of environmental materials

    International Nuclear Information System (INIS)

    Sudersanan, M.; Pawaskar, P.B.; Kayasth, S.R.; Kumar, S.C.

    2002-01-01

    Full text: Increasing realisation of the toxic effects of metal ions in environmental materials has given an impetus to research on analytical techniques for their characterization. The large number of analytes present at very low levels has necessitated the use of sensitive, selective and element-specific techniques for their characterization. Concern about the precision and accuracy of such analyses, which have a socio-economic bearing, has emphasized the use of Certified Reference Materials and of a multi-technique approach for the unambiguous characterization of analytes. The recent work carried out at the Analytical Chemistry Division, BARC on these aspects is presented in this paper. Increasing use of fossil fuels has led to the generation of large quantities of fly ash, which pose problems of safe disposal. The utilization of these materials for land filling is an attractive option, but the presence of trace amounts of toxic metals like mercury, arsenic and lead may cause environmental problems. In view of the inhomogeneous nature of the material, efficient sample processing is an important factor, in addition to the validation of the results by the use of proper standards. Analysis was carried out on fly ash samples received as reference materials and also as samples from commercial sources, using a combination of nuclear techniques like INAA and RNAA as well as other techniques like AAS, ICPAES, cold-vapour AAS for mercury and the hydride generation technique for arsenic. Similar analysis using nuclear techniques was employed for the characterization of air particulates. Biological materials often serve as sensitive indicator materials for pollution measurements. They are also employed for studies on the uptake of toxic metals like U, Th, Cd, Pb and Hg. The presence of large amounts of organic materials in them necessitates an appropriate sample dissolution procedure. In view of the possibility of loss of certain analytes like Cd, Hg, As, by high

  17. Exact combinatorial reliability analysis of dynamic systems with sequence-dependent failures

    International Nuclear Information System (INIS)

    Xing Liudong; Shrestha, Akhilesh; Dai Yuanshun

    2011-01-01

    Many real-life fault-tolerant systems are subjected to sequence-dependent failure behavior, in which the order in which the fault events occur is important to the system reliability. Such systems can be modeled by dynamic fault trees (DFT) with priority-AND (pAND) gates. Existing approaches for the reliability analysis of systems subjected to sequence-dependent failures are typically state-space-based, simulation-based or inclusion-exclusion-based methods. Those methods either suffer from the state-space explosion problem or require long computation time especially when results with high degree of accuracy are desired. In this paper, an analytical method based on sequential binary decision diagrams is proposed. The proposed approach can analyze the exact reliability of non-repairable dynamic systems subjected to the sequence-dependent failure behavior. Also, the proposed approach is combinatorial and is applicable for analyzing systems with any arbitrary component time-to-failure distributions. The application and advantages of the proposed approach are illustrated through analysis of several examples. - Highlights: → We analyze the sequence-dependent failure behavior using combinatorial models. → The method has no limitation on the type of time-to-failure distributions. → The method is analytical and based on sequential binary decision diagrams (SBDD). → The method is computationally more efficient than existing methods.
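For the special case of independent exponential lifetimes, the probability captured by a priority-AND gate has a simple closed form; a sketch under that assumption, cross-checked by Monte Carlo (the failure rates and mission time are illustrative, and this is not the paper's sequential-BDD algorithm, which handles arbitrary distributions):

```python
# Sketch: probability of the sequence "A fails before B, both within time t",
# the event a priority-AND (pAND) gate encodes, for independent exponential
# lifetimes T_A ~ Exp(la), T_B ~ Exp(lb).
import math
import random

def pand_prob(la, lb, t):
    """Closed form of P(T_A < T_B <= t):
    integral over s of lb*exp(-lb*s) * (1 - exp(-la*s)) for s in [0, t]."""
    return (1 - math.exp(-lb * t)) - lb / (la + lb) * (1 - math.exp(-(la + lb) * t))

def pand_mc(la, lb, t, n=200_000, seed=1):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        ta = rng.expovariate(la)
        tb = rng.expovariate(lb)
        if ta < tb <= t:
            hits += 1
    return hits / n

print(pand_prob(0.5, 0.2, 5.0), pand_mc(0.5, 0.2, 5.0))
```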

  18. Pilot testing of SHRP 2 reliability data and analytical products: Southern California. [supporting datasets

    Science.gov (United States)

    2014-01-01

    The objective of this project was to develop system designs for programs to monitor travel time reliability and to prepare a guidebook that practitioners and others can use to design, build, operate, and maintain such systems. Generally, such travel ...

  19. Wind Turbine Drivetrain Reliability Collaborative Workshop: A Recap

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sheng, Shuangwen [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cotrell, Jason [National Renewable Energy Lab. (NREL), Golden, CO (United States); Greco, Aaron [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-08-01

    The Wind Turbine Drivetrain Reliability Collaborative Workshop was convened by the National Renewable Energy Laboratory (NREL), Argonne National Laboratory, and the U.S. Department of Energy to explore the state of the art in wind turbine drivetrain mechanical system reliability as well as research and development (R&D) challenges that if solved could have significant benefits. The workshop was held at the Research Support Facility on NREL's main campus in Golden, Colorado, from February 16-17, 2016. More than 120 attendees participated from industry, academia, and government. Plenary presentations covered wind turbine drivetrain design, testing, and analysis; tribology -- the science and engineering of interacting surfaces in relative motion -- and failure modes; and condition monitoring and data analytics. In addition to the presentations, workshops were held in each of these areas to discuss R&D challenges. This report serves as a summary of the presentations, workshops, and conclusions on R&D challenges in wind turbine drivetrain reliability.

  20. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control; the quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high that it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC
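Reading the deviation limit described above as an expanded combination of process measurement uncertainty and method bias, a flagging rule for one sample might look like the sketch below. Combining the two terms in quadrature with a coverage factor k is one plausible interpretation, not the authors' exact formula; all values are hypothetical:

```python
# Sketch: flagging a candidate "irregular analytical error" for a single
# sample by comparing its deviation from the reference procedure result
# against an expanded limit built from process uncertainty and method bias.
import math

def is_irregular(result, reference, u_process, bias, k=3.0):
    """Flag when |result - reference| exceeds k times the combined
    (quadrature) process uncertainty and bias -- an assumed reading of
    the 'linear combination' described in the abstract."""
    limit = k * math.sqrt(u_process ** 2 + bias ** 2)
    return abs(result - reference) > limit

print(is_irregular(result=132.0, reference=100.0, u_process=3.0, bias=2.0))
print(is_irregular(result=104.0, reference=100.0, u_process=3.0, bias=2.0))
```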

  1. Some recent developments in the risk- and reliability analysis of structures

    International Nuclear Information System (INIS)

    Bauer, J.; Choi, H.S.; Kappler, H.; Melzer, H.J.; Panggabean, H.; Reichmann, K.H.; Schueller, G.I.; Schwarz, R.F.

    1979-01-01

    This report consists of six contributions divided into four general topics. While Part I concentrates on the development of Analytical Methods in Structural Reliability, Parts II to IV are devoted to the application of these methods to Offshore-, Nuclear- and, generally, to Wind- and Earthquake-Exposed Structures. (orig.) [de

  2. Assessment of analytical techniques for characterization of crystalline clopidogrel forms in patent applications

    Directory of Open Access Journals (Sweden)

    Luiz Marcelo Lira

    2014-04-01

    The aim of this study was to evaluate two important aspects of patent applications for crystalline forms of drugs: (i) the physicochemical characterization of the crystalline forms; and (ii) the procedure for preparing crystals of the blockbuster drug clopidogrel. To this end, searches were conducted using online patent databases. The results showed that: (i) the majority of patent applications for clopidogrel crystalline forms failed to comply with proposed Brazilian Patent Office guidelines, primarily due to an insufficient number of analytical techniques evaluating the crystalline phase; in addition, some patent applications lacked assessment of chemical/crystallographic purity; (ii) the use of more than two analytical techniques is important; and (iii) the crystallization procedure for clopidogrel bisulfate form II was irreproducible based on the procedure given in the patent application.

  3. Software engineering practices for control system reliability

    International Nuclear Information System (INIS)

    S. K. Schaffner; K. S. White

    1999-01-01

    This paper will discuss software engineering practices used to improve Control System reliability. The authors begin with a brief discussion of the Software Engineering Institute's Capability Maturity Model (CMM), which is a framework for evaluating and improving key practices used to enhance software development and maintenance capabilities. The software engineering processes developed and used by the Controls Group at the Thomas Jefferson National Accelerator Facility (Jefferson Lab), using the Experimental Physics and Industrial Control System (EPICS) for accelerator control, are described. Examples are given of how their procedures have been used to minimize control system downtime and improve reliability. While their examples are primarily drawn from their experience with EPICS, these practices are equally applicable to any control system. Specific issues addressed include resource allocation, developing reliable software lifecycle processes and risk management

  4. An integrated reliability-based design optimization of offshore towers

    International Nuclear Information System (INIS)

    Karadeniz, Halil; Togan, Vedat; Vrouwenvelder, Ton

    2009-01-01

    After recognizing the uncertainty in the parameters such as material, loading, geometry and so on in contrast with the conventional optimization, the reliability-based design optimization (RBDO) concept has become more meaningful to perform an economical design implementation, which includes a reliability analysis and an optimization algorithm. RBDO procedures include structural analysis, reliability analysis and sensitivity analysis both for optimization and for reliability. The efficiency of the RBDO system depends on the mentioned numerical algorithms. In this work, an integrated algorithms system is proposed to implement the RBDO of the offshore towers, which are subjected to the extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of towers are as follows: (a) a structural analysis program, SAPOS, (b) an optimization program, SQP and (c) a reliability analysis program based on FORM. A demonstration of an example tripod tower under the reliability constraints based on limit states of the critical stress, buckling and the natural frequency is presented.
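For the simplest case of a linear limit state g = R - S with independent normal resistance R and load S, the FORM reliability index and failure probability reduce to closed form; a sketch with illustrative parameters (not the tripod-tower limit states of the paper):

```python
# Sketch: FORM for a linear limit state g = R - S with independent normal
# R (resistance) and S (load): beta = (mu_R - mu_S) / sqrt(sig_R^2 + sig_S^2)
# and P_f = Phi(-beta). All parameter values are illustrative.
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """Reliability index and failure probability for g = R - S."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)
    return beta, norm_cdf(-beta)

beta, pf = form_linear(mu_r=400.0, sig_r=30.0, mu_s=250.0, sig_s=40.0)
print(beta, pf)
```

For nonlinear limit states, FORM instead iterates (e.g. the Hasofer-Lind-Rackwitz-Fiessler scheme) to find the most probable failure point; the linear case above is the one-step special case.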

  5. An integrated reliability-based design optimization of offshore towers

    Energy Technology Data Exchange (ETDEWEB)

    Karadeniz, Halil [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)], E-mail: h.karadeniz@tudelft.nl; Togan, Vedat [Department of Civil Engineering, Karadeniz Technical University, Trabzon (Turkey); Vrouwenvelder, Ton [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)

    2009-10-15

    After recognizing the uncertainty in the parameters such as material, loading, geometry and so on in contrast with the conventional optimization, the reliability-based design optimization (RBDO) concept has become more meaningful to perform an economical design implementation, which includes a reliability analysis and an optimization algorithm. RBDO procedures include structural analysis, reliability analysis and sensitivity analysis both for optimization and for reliability. The efficiency of the RBDO system depends on the mentioned numerical algorithms. In this work, an integrated algorithms system is proposed to implement the RBDO of the offshore towers, which are subjected to the extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of towers are as follows: (a) a structural analysis program, SAPOS, (b) an optimization program, SQP and (c) a reliability analysis program based on FORM. A demonstration of an example tripod tower under the reliability constraints based on limit states of the critical stress, buckling and the natural frequency is presented.

  6. External Quality Assessment Scheme for Biological Monitoring of Occupational Exposure to Toxic Chemicals

    Directory of Open Access Journals (Sweden)

    Mi-Young Lee

    2011-09-01

    Conclusion: The EQAS has taken a primary role in improving the reliability of analytical data. A total quality assurance scheme is suggested, including the validation of technical documentation for the whole analytical procedure.

  7. Analytic thinking reduces belief in conspiracy theories.

    Science.gov (United States)

    Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian

    2014-12-01

    Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but actual relationships have only been infrequently studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causal role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, which related to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of supporting attempts to promote analytic thinking as a means of countering the widespread acceptance of conspiracy theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Determination of Total Solids and Ash in Algal Biomass: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    2016-01-13

    This procedure describes the methods used to determine the amount of moisture or total solids present in a freeze-dried algal biomass sample, as well as the ash content. A traditional convection oven drying procedure is covered for total solids content, and a dry oxidation method at 575 deg. C is covered for ash content.
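The gravimetric arithmetic behind such a procedure can be sketched as follows; the masses are illustrative examples, not values from the LAP:

```python
# Sketch: gravimetric calculations for an oven-drying / dry-oxidation
# procedure: total solids from the dry/wet mass ratio, ash from the
# residue remaining after oxidation. Masses in grams, illustrative only.

def total_solids_pct(wet_mass, dry_mass):
    """Total solids as a percentage of the as-received (wet) sample mass."""
    return 100.0 * dry_mass / wet_mass

def ash_pct_dry_basis(dry_mass, ash_mass):
    """Ash as a percentage of the dried sample mass."""
    return 100.0 * ash_mass / dry_mass

ts = total_solids_pct(wet_mass=5.000, dry_mass=4.650)
ash = ash_pct_dry_basis(dry_mass=4.650, ash_mass=0.372)
print(ts, ash)
```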

  9. Stochastic procedures for extreme wave induced responses in flexible ships

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher; Andersen, Ingrid Marie Vincent; Seng, Sopheak

    2014-01-01

    Different procedures for estimation of the extreme global wave hydroelastic responses in ships are discussed. Firstly, stochastic procedures for application in detailed numerical studies (CFD) are outlined. The use of the First Order Reliability Method (FORM) to generate critical wave episodes...

  10. The reliability of radiochemical and chemical trace analyses in environmental materials

    International Nuclear Information System (INIS)

    Heinonen, Jorma.

    1977-12-01

    After theoretically exploring the factors which influence the quality of analytical data, as well as the means by which sufficient quality can be assured and controlled, schemes of different kinds have been developed and applied in order to demonstrate analytical quality assurance and control in practice. Methods have been developed for the determination of cesium, bromine and arsenic by neutron activation analysis at the natural ''background'' concentration level in environmental materials. The calibration of the methods is described. The methods were also applied in practical routine analysis, the results of which are briefly reviewed. In the case of Cs, the precision of a comprehensive calibration was found to vary between 5.2-9.2% as a relative standard deviation, which agrees well with the calculated statistical random error of 5.7-8.7%. In the case of Br, the method showed a reasonable precision, about 11% on average, and accuracy. In employing the method to analyze dried samples containing Br from 3 to 12 ppm, a continuous control of precision was performed. The analysis of As demonstrates the many problems and difficulties associated with environmental analysis. In developing the final method, four former intercomparison materials of the IAEA were utilized in the calibration. The tests performed revealed a systematic error. In this case a scheme was developed for the continuous control of both precision and accuracy. The results of radiochemical analyses in environmental materials show a reliability somewhat better than that occurring in the determination of stable trace elements. According to a rough classification, 15% of the results of radiochemical analysis show excellent reliability, 60% show a reliability adequate for certain purposes, and the remainder are good for nothing. The reasons for the often insufficient reliability of results are both organizational and technical. With reasonable effort and
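The precision figures quoted above are relative standard deviations; a minimal sketch of that computation alongside the Poisson counting-statistics estimate it is compared against (replicate values and counts are hypothetical):

```python
# Sketch: precision of replicate determinations expressed as relative
# standard deviation (RSD), next to the Poisson counting-error estimate
# 1/sqrt(N) that sets the statistical floor in activation analysis.
import math
import statistics

def rsd_pct(values):
    """Sample relative standard deviation, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

replicates = [10.2, 9.8, 10.5, 9.9, 10.1]  # e.g. ppm of an analyte
counts = 10_000                             # gross counts in one measurement
counting_rsd = 100.0 / math.sqrt(counts)    # Poisson counting error, %

print(rsd_pct(replicates), counting_rsd)
```

When the observed RSD greatly exceeds the counting estimate, sources of error other than counting statistics dominate.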

  11. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. This concept offers a worthwhile point of departure to encompass three elegant adjustments to a literature model in terms of maintenance time, workforce performance and return-on-workforce investments, which together account for the results of our analysis. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed a better quality of solution from the DE algorithm compared with those of the genetic algorithm and the particle swarm optimisation algorithm, thus demonstrating the superiority of the proposed procedure. Second, the analytical discourse, which was framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is a novelty. The work provides more insights into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and demonstrates the capacity to generate substantially helpful information for practice.
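A bare-bones differential evolution loop of the kind referenced above can be sketched on a toy objective. The population size, mutation factor F, crossover rate CR and the sphere objective are all illustrative; this is not the authors' FGP-DE formulation:

```python
# Sketch: DE/rand/1/bin differential evolution minimizing a toy objective
# under box bounds, with greedy selection. All settings illustrative.
import random

def de_minimize(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=42):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Three distinct donors, none equal to the target index.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = []
            for d in range(dim):
                if rng.random() < CR:
                    # Mutant component, clipped to the box bounds.
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                    lo, hi = bounds[d]
                    v = min(max(v, lo), hi)
                else:
                    v = pop[i][d]
                trial.append(v)
            fc = f(trial)
            if fc <= cost[i]:  # greedy one-to-one replacement
                pop[i], cost[i] = trial, fc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

def sphere(x):
    return sum(v * v for v in x)

x_best, f_best = de_minimize(sphere, [(-5, 5)] * 3)
print(x_best, f_best)
```

In the workforce setting, the objective would instead score a candidate staffing plan against the fuzzy goals; only the evolutionary loop carries over.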

  12. Near-critical carbon dioxide extraction and liquid chromatography determination of UV filters in solid cosmetic samples: a green analytical procedure.

    Science.gov (United States)

    Salvador, Amparo; Chisvert, Alberto; Jaime, Maria-Angeles

    2005-11-01

    Near-critical carbon dioxide extraction of four UV filters used as sunscreens in lipsticks and makeup formulations is reported. The extraction parameters were optimized. Efficient recoveries were obtained after 15 min of dynamic extraction with an 80:20 CO2/ethanol mixture at 300 atm and 54 degrees C, using a 1.8 mL/min flow rate. Extracts were collected in ethanol and appropriately diluted with ethanol and 1% acetic acid to obtain a 70:30 v/v ethanol/1% acetic acid solution. The four UV filters were determined by LC with gradient elution using ethanol/1% acetic acid as mobile phase. The accuracy of the analytical procedure was estimated by comparing the results with those obtained by methods based on classical extraction. The proposed method only requires the use of CO2, ethanol and acetic acid, avoiding the use of more toxic organic solvents; thus it can be considered both operator- and environment-friendly.

  13. Reliability-Centric Analysis of Offloaded Computation in Cooperative Wearable Applications

    Directory of Open Access Journals (Sweden)

    Aleksandr Ometov

    2017-01-01

    Full Text Available Motivated by the unprecedented penetration of mobile communications technology, this work carefully brings into perspective the challenges related to heterogeneous communications and offloaded computation in cases of fault-tolerant communication, computing, and caching. We specifically focus on emerging augmented reality applications that require reliable delegation of computing and caching functionality to proximate resource-rich devices. The corresponding mathematical model proposed in this work becomes of value for assessing system-level reliability in cases where one or more nearby collaborating nodes become temporarily unavailable. Our analytical and simulation results corroborate the asymptotic insensitivity of the stationary reliability of the system in question (under the "fast" recovery of its elements) to the type of the "repair" time distribution, thus supporting fault-tolerant system operation.
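    The claimed insensitivity of stationary reliability to the repair-time distribution under fast recovery can be illustrated with a small alternating-renewal simulation. This sketch (illustrative parameters, not the paper's model) compares an exponential and a degenerate repair law with the same mean: the long-run availability depends on the repair law only through that mean, MTTF/(MTTF + MTTR).

```python
import random

def long_run_availability(mean_up, repair_sampler, cycles=200_000, seed=2):
    """Estimate long-run availability of an alternating renewal process:
    exponential up-times, arbitrary repair-time sampler."""
    rng = random.Random(seed)
    up = down = 0.0
    for _ in range(cycles):
        up += rng.expovariate(1.0 / mean_up)
        down += repair_sampler(rng)
    return up / (up + down)

mean_up, mean_repair = 100.0, 1.0   # "fast" recovery: MTTR << MTTF
a_exp = long_run_availability(mean_up, lambda r: r.expovariate(1.0 / mean_repair))
a_det = long_run_availability(mean_up, lambda r: mean_repair)  # degenerate repair law
# both estimates cluster near MTTF / (MTTF + MTTR) = 100/101
```

    Only the mean of the repair distribution enters the stationary result, which is the renewal-theoretic content of the insensitivity property the abstract refers to.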

  14. Different Approaches for Ensuring Performance/Reliability of Plastic Encapsulated Microcircuits (PEMs) in Space Applications

    Science.gov (United States)

    Gerke, R. David; Sandor, Mike; Agarwal, Shri; Moor, Andrew F.; Cooper, Kim A.

    2000-01-01

    Engineers within the commercial and aerospace industries are using trade-off and risk analysis to aid in reducing spacecraft system cost while increasing performance and maintaining high reliability. In many cases, Commercial Off-The-Shelf (COTS) components, which include Plastic Encapsulated Microcircuits (PEMs), are candidate packaging technologies for spacecraft due to their lower cost, lower weight and enhanced functionality. Establishing and implementing a parts program that effectively and reliably makes use of these potentially less reliable, but state-of-the-art, devices has become a significant portion of the parts engineer's job. Assembling a reliable high-performance electronic system that includes COTS components requires that the end user assume a risk. To minimize this risk, companies have developed methodologies in which accelerated stress testing is used to assess the product and reduce the risk to the total system. Currently, there are no industry-standard procedures for accomplishing this risk mitigation. This paper presents the approaches for reducing the risk of using PEMs devices in space flight systems as developed by two independent laboratories. The JPL procedure is primarily a tailored screening with an accelerated-stress philosophy, while the APL procedure is primarily a lot qualification procedure. Both laboratories have successfully reduced the risk of using the particular devices for their respective systems and mission requirements.

  15. Application of Statistical Methods to Activation Analytical Results near the Limit of Detection

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Wanscher, B.

    1978-01-01

    Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.
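    The idea of combining results that carry individual standard deviations can be sketched as an inverse-variance weighted mean plus a consistency statistic (which, for consistent data, follows a chi-square distribution with n-1 degrees of freedom). The numbers below are illustrative, not the arsenic data of the study.

```python
def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean, its standard deviation, and the
    chi-square-like consistency statistic T used in Analysis of Precision."""
    w = [1.0 / s ** 2 for s in sigmas]
    mean = sum(wi * x for wi, x in zip(w, values)) / sum(w)
    sd = (1.0 / sum(w)) ** 0.5
    t = sum(wi * (x - mean) ** 2 for wi, x in zip(w, values))
    return mean, sd, t

# three low-level results, each with its own counting uncertainty; note that a
# negative raw value is legitimate near the detection limit (illustrative numbers)
mean, sd, t = weighted_mean([0.05, -0.02, 0.08], [0.04, 0.05, 0.04])
```

    The pooled standard deviation is smaller than any individual one, which is why reporting the raw numbers instead of "< limit" preserves information.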

  16. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
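    The spare-robot versus spare-part trade-off rests on simple k-out-of-n reliability arithmetic. A minimal sketch (illustrative reliabilities, not the paper's cost model): with independent robots, a cheaper, less reliable design plus a spare can outperform a smaller team of more reliable robots.

```python
from math import comb

def team_reliability(n_robots, k_required, r_robot):
    """P(at least k of n independent, identical robots survive the mission)."""
    return sum(comb(n_robots, j) * r_robot ** j * (1 - r_robot) ** (n_robots - j)
               for j in range(k_required, n_robots + 1))

# mission needs 2 working robots (illustrative numbers):
expensive = team_reliability(2, 2, 0.95)       # two robots at r = 0.95, no spare
cheap_spared = team_reliability(3, 2, 0.90)    # three cheaper robots at r = 0.90
```

    Here the spared low-reliability team reaches about 0.97 mission reliability versus about 0.90 without the spare, echoing the abstract's finding that redundancy with cheaper components can reduce cost while maintaining reliability.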

  17. Characterization of Analytical Reference Glass-1 (ARG-1)

    International Nuclear Information System (INIS)

    Smith, G.L.

    1993-12-01

    High-level radioactive waste may be immobilized in borosilicate glass at the West Valley Demonstration Project, West Valley, New York, the Defense Waste Processing Facility (DWPF), Aiken, South Carolina, and the Hanford Waste Vitrification Project (HWVP), Richland, Washington. The vitrified waste form will be stored in stainless steel canisters before its eventual transfer to a geologic repository for long-term disposal. Waste Acceptance Product Specifications (WAPS) (DOE 1993), Section 1.1.2, requires that the waste form producers report the measured chemical composition of the vitrified waste in their production records before disposal. Chemical analysis of glass waste forms is receiving increased attention due to qualification requirements of vitrified waste forms. The Pacific Northwest Laboratory (PNL) has been supporting the glass producers' analytical laboratories through a continuing program of multilaboratory analytical testing using interlaboratory "round robin" methods. At the PNL Materials Characterization Center Analytical Round Robin 4 workshop "Analysis of Nuclear Waste Glass and Related Materials," January 16--17, 1990, Pleasanton, California, the meeting attendees decided that simulated nuclear waste analytical reference glasses were needed for use as analytical standards. Use of common standard analytical reference materials would allow the glass producers' analytical laboratories to calibrate procedures and instrumentation, to control laboratory performance and conduct self-appraisals, and to help qualify their various waste forms

  18. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method for highly redundant structures having a large number of collapse modes. This approach exploits the merits of the optimization technique while incorporating the idea of the PNET method. The analytical process involves minimizing the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential solution of a series of NLP (Nonlinear Programming) problems, where the correlation condition of the PNET method pertaining to the representative mode is taken as an additional constraint on the next analysis. Upon succeeding iterations, the final analysis is reached when the collapse probability of the subsequent mode is much smaller than that of the first mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. To confirm the validity of the proposed method, a conventional Monte Carlo simulation is also revised using collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)
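    The PNET idea of summing the failure probabilities of representative modes can be placed against the classical first-order bounds for a series system of collapse modes. The safety indices below are illustrative, not taken from the paper.

```python
from statistics import NormalDist

phi = NormalDist().cdf

# illustrative safety indices of three representative collapse modes
betas = [3.2, 3.5, 4.1]
p_modes = [phi(-b) for b in betas]   # per-mode failure probabilities

# simple first-order bounds for a series system of these modes:
lower = max(p_modes)   # all modes fully correlated
upper = sum(p_modes)   # modes treated as disjoint (crude upper bound)

# PNET-style estimate: sum over representative modes only, each standing in
# for a cluster of highly correlated modes
p_pnet = sum(p_modes)
```

    The grouping by correlation is what keeps the sum from grossly overestimating the system failure probability when many modes are nearly identical.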

  19. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a common cause failure analysis function, an uncertainty analysis function, a common cause failure analysis function with uncertainty, and a system for printing the results of GO-FLOW analysis as figures or tables. These functions are illustrated by analyzing sample systems such as the PWR AFWS and the BWR ECCS. The appendices describe the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in them. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total GO-FLOW system, this methodology has become a powerful tool in a living PSA. (author) 54 refs

  20. A comparison of analytic procedures for measurement of fractional dextran clearances

    NARCIS (Netherlands)

    Hemmelder, MH; de Jong, PE; de Zeeuw, D

    Fractional dextran clearances have been extensively used to study glomerular size selectivity. We report on an analysis of different laboratory procedures involved in measuring fractional dextran clearances. The deproteinization of plasma samples by 20% trichloroacetic acid (TCA) revealed a protein

  1. Analytical investigations closer to the patient.

    OpenAIRE

    Watson, D

    1980-01-01

    Do-it-yourself bioanalytical equipment that requires no analytical skill to operate is currently available for use in intensive care units, operating suites, side wards, health centres, clinics, general practitioners' surgeries, etc. Agreement is needed between the laboratory consultant and doctors and others using laboratory-type equipment and reagents in near-bedside analyses for diagnosis, clinical management, or health screening of their patients. Choice and safety of method procedure, op...

  2. Analytical basis for evaluating the effect of unplanned interventions on the effectiveness of a human-robot system

    International Nuclear Information System (INIS)

    Shah, Julie A.; Saleh, Joseph H.; Hoffman, Jeffrey A.

    2008-01-01

    The increasing prevalence of human-robot systems in a variety of applications raises the question of how to design these systems to best leverage the capabilities of humans and robots. In this paper, we address the relationships between reliability, productivity, and risk to humans from human-robot systems operating in a hostile environment. Objectives for maximizing the effectiveness of a human-robot system are presented, which capture these coupled relationships, and reliability parameters are proposed to characterize unplanned interventions between a human and a robot. The reliability metrics defined here take on an expanded meaning in which the underlying concept of failure in traditional reliability analysis is replaced by the notion of intervention. In the context of human-robot systems, an intervention is not only driven by component failures but includes many other factors that can lead a robotic agent to request, or a human agent to provide, an intervention, as we argue in this paper. The effect of unplanned interventions on the effectiveness of human-robot systems is then investigated analytically using traditional reliability analysis. Finally, we discuss the implications of these analytical trends for the design and evaluation of human-robot systems.

  3. A reliability model of a warm standby configuration with two identical sets of units

    International Nuclear Information System (INIS)

    Huang, Wei; Loman, James; Song, Thomas

    2015-01-01

    This article presents a new reliability model, and the development of its analytical solution, for a warm standby redundant configuration with units that are originally operated in active mode and then, upon turn-on of the originally standby units, are put into warm standby mode. These units can be used later if a standby-turned-active unit fails. Numerical results for an example configuration are presented and discussed, with comparison to other warm standby configurations and to Monte Carlo simulation results obtained from BlockSim software. Results show that the Monte Carlo simulation model gives a virtually identical reliability value when the simulation uses a high number of replications, confirming the developed model. - Highlights: • A new reliability model is developed for warm standby redundancy with two sets of identical units. • The units are subject to state changes from active to standby and back to active mode. • A closed-form analytical solution is developed under the exponential distribution. • To validate the developed model, a Monte Carlo simulation of an exemplary configuration is performed
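    For the simplest related configuration, one active unit backed by a single warm standby with exponential lifetimes and no repair, a closed-form R(t) exists and can be checked against Monte Carlo in the spirit of the validation described above. This is a sketch with illustrative rates, not the paper's two-set model.

```python
import random
from math import exp

def warm_standby_reliability(lam, lam_s, t):
    """Closed-form R(t): one active unit (rate lam) backed by one warm standby
    (standby failure rate lam_s), exponential lifetimes, no repair:
    R(t) = exp(-lam*t) * (1 + (lam/lam_s) * (1 - exp(-lam_s*t)))."""
    return exp(-lam * t) * (1.0 + (lam / lam_s) * (1.0 - exp(-lam_s * t)))

def warm_standby_mc(lam, lam_s, t, n=200_000, seed=3):
    """Monte Carlo estimate of the same quantity."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        x1 = rng.expovariate(lam)        # active unit lifetime
        if x1 >= t:
            ok += 1
            continue
        if rng.expovariate(lam_s) < x1:  # standby failed before it was needed
            continue
        x2 = rng.expovariate(lam)        # promoted unit now under full stress
        if x1 + x2 >= t:
            ok += 1
    return ok / n

lam, lam_s, t = 1e-3, 2e-4, 1000.0
r_cf = warm_standby_reliability(lam, lam_s, t)
r_mc = warm_standby_mc(lam, lam_s, t)
```

    Letting lam_s tend to zero recovers the cold-standby result exp(-lam*t)*(1 + lam*t), a quick sanity check on the formula.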

  4. Sharing the Data along with the Responsibility: Examining an Analytic Scale-Based Model for Assessing School Climate.

    Science.gov (United States)

    Shindler, John; Taylor, Clint; Cadenas, Herminia; Jones, Albert

    This study was a pilot effort to examine the efficacy of an analytic trait scale school climate assessment instrument and democratic change system in two urban high schools. Pilot study results indicate that the instrument shows promising soundness in that it exhibited high levels of validity and reliability. In addition, the analytic trait format…

  5. Methodological procedures and analytical instruments to evaluate an indicators integrated archive for urban management; Guida metodologica per la costruzione di un archivio integrato di indicatori urbani

    Energy Technology Data Exchange (ETDEWEB)

    Del Ciello, R; Napoleoni, S [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-07-01

    This guide presents the results of research carried out at the ENEA (National Agency for New Technology, Energy and the Environment) Casaccia centre (Rome, Italy) to define the methodological procedures and analytical instruments needed to build an integrated archive of indicators for urban management. The guide also defines the scheme of a negotiation process for reaching and exchanging data and information among governmental and local administrations, non-governmental organizations and scientific bodies. [Translated from the Italian abstract:] This work summarizes the results of research conducted at the ENEA Casaccia Research Centre on defining methodological procedures and analysis and processing tools for building an integrated archive of indicators for the management of urban systems. The guide, addressed to those responsible for urban policies, defines a scheme for sharing urban indicators through appropriately organized negotiation tables composed of representatives of local administrations, central government, productive and social categories, and the technical bodies operating in the territory.

  7. Pretreatment procedures applied to samples to be analysed by neutron activation analysis at CDTN/CNEN

    International Nuclear Information System (INIS)

    Francisco, Dovenir; Menezes, Maria Angela de Barros Correia

    2009-01-01

    The neutron activation technique, using several methods, has been applied to 80% of the analytical demand of the Division for Reactor and Analytical Techniques at CDTN/CNEN, Belo Horizonte, Minas Gerais. This scenario emphasizes the Laboratory's responsibility to provide and assure the quality of the measurements. The first step in assuring the quality of results is the preparation of the samples. Therefore, this paper describes the experimental procedures adopted at CDTN/CNEN to standardize the conditions of analysis and to avoid contamination by elements present everywhere. Some of the procedures are based on methods described in the literature; others are based on many years of experience in preparing samples from many kinds of matrices. The procedures described relate to geological materials (soil, sediment, rock, gems, clay, archaeological ceramics and ore), biological materials (hair, fish, plants, food), water, etc. Analytical results for sediment samples are shown as an example, pointing out the efficiency of the experimental procedure. (author)

  8. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  9. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    Science.gov (United States)

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of laboratory errors. To identify pre-pre-analytical errors, this study considered a very common procedure, the oral glucose tolerance test. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. This observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested knowledge of patient preparation. QI-1, the appropriateness of the test result, showed the most errors. Although QI-5, for sample collection, had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely on a yearly basis to identify errors, take corrective action, and facilitate the gradual introduction of these indicators into routine practice.

  10. Improving QST Reliability – More Raters, Tests or Occasions? A Multivariate Generalizability Study

    DEFF Research Database (Denmark)

    O'Neill, Søren; O'Neill, Lotte

    2015-01-01

    The reliability of quantitative sensory testing (QST) is affected by the error attributable to both test occasion and rater (examiner) as well as interactions between them. Most reliability studies only account for one source of error. The present study employed a fully-crossed, multivariate...... threshold, intensity, tolerance and modulation with mechanical, thermal and chemical stimuli. The classical test-retest and inter-rater reliability (0.19... procedures. Reliability was improved more by repeated testing on separate occasions opposed to repeated testing by different raters....

  11. Do strict rules and moving images increase the reliability of sequential identification procedures?

    OpenAIRE

    Valentine, Tim; Darling, Stephen; Memon, Amina

    2007-01-01

    Live identification procedures in England and Wales have been replaced by use of video, which provides a sequential presentation of facial images. Sequential presentation of photographs provides some protection to innocent suspects from mistaken identification when used with strict instructions designed to prevent relative judgements (Lindsay, Lea & Fulford, 1991). However, the current procedure in England and Wales is incompatible with these strict instructions. The reported research investi...

  12. The European industry reliability data bank EIReDA

    International Nuclear Information System (INIS)

    Procaccia, H.; Aufort, P.; Arsenis, S.

    1997-01-01

    EIReDA and its computerized version EIReDA.PC are living databases that aim to satisfy the requirements of risk, safety, and availability studies on industrial systems for documented estimates of the reliability parameters of mechanical, electrical, and instrumentation components. The data updating procedure is based on Bayesian techniques implemented in a specific software package, FIABAYES. Estimates are mostly based on the operational experience of EDF components, but an effort has been made to bring together estimates of equivalent components published in the open literature and so establish generic tables of reliability parameters. (author)
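    In the simplest constant-failure-rate case, the Bayesian updating behind such a procedure is a conjugate gamma-Poisson update. The sketch below uses illustrative prior parameters and operating experience; the actual FIABAYES implementation is not specified in the abstract.

```python
def update_failure_rate(a, b, k, T):
    """Conjugate update for a constant failure rate lambda:
    Gamma(a, b) prior (b in hours), k failures observed in T operating hours
    under a Poisson count model -> Gamma(a + k, b + T) posterior."""
    a_post, b_post = a + k, b + T
    mean = a_post / b_post          # posterior mean of the failure rate (/h)
    return a_post, b_post, mean

# a generic prior with mean 1e-5 /h, refined by 2 failures observed over
# 400,000 component-hours of plant experience (illustrative numbers)
a_post, b_post, lam_hat = update_failure_rate(2.0, 2.0e5, 2, 4.0e5)
```

    The posterior mean shifts from the generic value toward the plant-specific evidence, which is exactly the role of the updating step in a living reliability database.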

  13. Survey of methods used to assess human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  14. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    Full Text Available We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL), Interpersonal Adjective Scales (revised; IAS-R), Inventory of Interpersonal Problems (IIP), Impact Messages Inventory (IMI), Circumplex Scales of Interpersonal Values (CSIV), Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), Team Role Circle (TRC), Competing Values Leadership Instrument (CV-LI), Love Styles, Organizational Culture Assessment Instrument (OCAI), Customer Orientation Circle (COC), and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG)—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments). The general factor accounted for 1% to 48% of the item variance, the axes component for 2% to 30%, and scale-specificity for 1% to 28%. Reliability estimates varied considerably, from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimates axes reliabilities in cases of large scale-specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.

  15. Integrating Water Quality and River Rehabilitation Management - A Decision-Analytical Perspective

    Science.gov (United States)

    Reichert, P.; Langhans, S.; Lienert, J.; Schuwirth, N.

    2009-04-01

    Integrative river management involves difficult decisions about alternative measures to improve the ecological state of rivers. For this reason, it seems useful to apply knowledge from the decision sciences to support river management. We discuss how decision-analytical elements can be employed in designing an integrated river management procedure. An important aspect of this procedure is to clearly separate scientific predictions of the consequences of alternatives from the objectives to be achieved by river management. The key elements of the suggested procedure are (i) the quantitative elicitation of objectives from different stakeholder groups, (ii) the compilation of the current scientific knowledge about the consequences of suggested measures in the form of a probabilistic mathematical model, and (iii) the use of these predictions and valuations to prioritize alternatives, uncover conflicting objectives, support the design of better alternatives, and improve the transparency of communication about the chosen management strategy. The development of this procedure led to insights regarding the steps necessary for rational decision-making in river management, to guidelines on the use of decision-analytical techniques for performing these steps, and also to new insights about the application of decision-analytical techniques in general. In particular, consideration of the spatial distribution of the effects of measures, and of the potential added value of connected rehabilitated river reaches, favours measures that have a positive effect beyond a single river reach. As these effects propagate only within the river network, a river-basin-oriented management concept emerges as a consequence of a rational decision support procedure, rather than as an a priori management paradigm. There are also limitations to the support that can be expected from the decision-analytical perspective. It will not provide the
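    Step (iii), prioritizing alternatives from elicited weights and predicted consequences, can be sketched as a simple additive value model. The alternatives, attributes and weights below are purely illustrative, not those of an actual river study.

```python
# normalised weights elicited from stakeholders (illustrative)
weights = {"ecology": 0.5, "cost": 0.3, "recreation": 0.2}

# predicted attribute scores on a common 0..1 value scale (illustrative)
alternatives = {
    "do nothing":         {"ecology": 0.2, "cost": 1.0, "recreation": 0.3},
    "rehabilitate reach": {"ecology": 0.7, "cost": 0.5, "recreation": 0.6},
    "basin-wide plan":    {"ecology": 0.9, "cost": 0.3, "recreation": 0.8},
}

def additive_value(scores, weights):
    """Weighted additive value of one alternative."""
    return sum(weights[a] * scores[a] for a in weights)

ranking = sorted(alternatives,
                 key=lambda k: additive_value(alternatives[k], weights),
                 reverse=True)
```

    Separating the weights (stakeholder objectives) from the scores (scientific predictions) is precisely the separation the abstract argues for; the ranking changes transparently when either input changes.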

  16. Computerisation of procedures. Lessons learned and future perspectives

    International Nuclear Information System (INIS)

    O'Hara, J.; Pirus, D.; Nilsen, S.; Bisio, R.; Hulsund, J.-E.; Zhang, W.

    2003-07-01

    The computerisation of procedures has been investigated for several years. Even though guidelines for such computerisation have been proposed, there is a need to extend and revise them. In this report, we look at what has been achieved so far, both within the Halden Project and within other organisations related to nuclear power plants. These experiences often relate to testing of particular computerised procedure systems, either in research laboratories or in nuclear utilities. These activities have accumulated a body of general knowledge on the subject, as documented in earlier 'lessons learned' reports. This report extends that accepted body of knowledge. Furthermore, we identify the unresolved problems that need further study to make usable computerised procedures for the future. The report identifies selected qualities that should be reinforced to make computerised procedure systems better. In particular, the integration aspect is emphasised: flexible integration with the operator's tasks and with the remaining interfaces of the control room is important. Unless this integration is accomplished, the computerised procedures will not be functional. Another aspect of integration is combination with other systems, including those that deal with the plant documentation, electronic or paper based. This kind of integration is important to the safe and reliable operation of the plant. Good integration with plant documentation is instrumental in creating reliable QA of the procedures that covers the whole life cycle of the procedure. (Author). 48 refs., 12 figs., 2 tabs

  17. Fast Evaluation of the Reliability of Container Securing Arrangements

    DEFF Research Database (Denmark)

    Mansour, A.E.; Jensen, Jørgen Juncher; Olsen, Anders Smærup

    2004-01-01

    of container failures are considered including racking and corner post failure. The associated probability of failure is determined using a FORM approach. The procedure can easily be programmed in a simple spreadsheet and the calculation time is very short due to the use of analytical transfer functions...
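    The FORM approach mentioned in this (truncated) abstract can be sketched for the simplest special case: a linear limit state g = R − S with independent normal capacity R and load S, where the reliability index has a closed form. The numerical values below are hypothetical, not data from the paper.

```python
import math

# Minimal first-order reliability (FORM) sketch for a linear limit
# state g = R - S with independent normal R (capacity) and S (load).
# For this special case the Hasofer-Lind index is available in closed
# form; general FORM requires an iterative search for the design point.

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sd_r, mu_s, sd_s):
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)  # reliability index
    return beta, phi(-beta)                        # P(failure) = Phi(-beta)

# Hypothetical lashing capacity and load statistics (kN)
beta, pf = form_linear(mu_r=250.0, sd_r=25.0, mu_s=150.0, sd_s=30.0)
```

    Such a closed-form core is what makes a spreadsheet implementation, as described in the abstract, feasible with very short calculation times.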

  18. Chromatographic monitoring procedures in laboratory practice

    Energy Technology Data Exchange (ETDEWEB)

    Kaplina, E G; Belova, O I; Lasunina, N A

    1976-01-01

    The Moscow Coke and Chemical Works consists of three plants in combination, viz., the coking plant, the synthetic ammonia plant using coke-oven-gas hydrogen and the oxygen plant. The plant requirements include daily analyses not only of the coke-oven gas but also of a rich gas and an ethylene fraction. The analyses are carried out in VTI-2 apparatus. The analytical data are used to calculate the calorific values and densities of the gases. The time requirements are very considerable and the laboratory has long been engaged in developing and introducing chromatographic procedures for the major constituents of coke-oven gas, rich gas and ethylene fraction. The procedure developed for the coke-oven and rich gases uses two parallel columns, one packed with molecular sieves and the other with grade KSM silica gel. Hydrogen was determined with argon as the carrier gas, and all other constituents with helium. The procedure was time-consuming and complicated. An attempt was made to separate the gases in an LKhM-7a chromatograph with a programme-controlled 50 to 250 °C heating cycle, but the procedure still had a number of serious defects and could not be recommended for regular quality control. The final variant involved two parallel columns and a procedure based on that in GOST 14920 (''Dry gas. Proximate analysis''). The chromatograph was a type KhL-69 with a 6-way cock in the gas line so that each of the columns could be brought on stream in succession. The analytical column packings were zeolite (in a 2 m column) and diatomaceous brick with 25% n-hexadecane (in a 6 m column).
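    The calorific-value and density calculation the abstract mentions is, in its simplest form, a mole-fraction-weighted sum of pure-component properties. The sketch below uses rounded textbook figures for a handful of components and a simplified coke-oven-gas-like composition; none of the numbers are from the Works' laboratory.

```python
# Hypothetical sketch: lower heating value and density of a gas mixture
# as mole-fraction-weighted sums of pure-component properties.
# Component values are rounded textbook figures at roughly 0 degC, 1 atm.

# component: (lower heating value, MJ/m3; density, kg/m3)
COMPONENTS = {
    "H2":  (10.8, 0.090),
    "CH4": (35.8, 0.717),
    "CO":  (12.6, 1.250),
    "N2":  (0.0,  1.250),
}

def mixture_properties(mole_fractions):
    """Return (LHV in MJ/m3, density in kg/m3) for the given composition."""
    lhv = sum(x * COMPONENTS[c][0] for c, x in mole_fractions.items())
    rho = sum(x * COMPONENTS[c][1] for c, x in mole_fractions.items())
    return lhv, rho

# Simplified coke-oven-gas-like composition (mole fractions sum to 1)
lhv, rho = mixture_properties({"H2": 0.58, "CH4": 0.26, "CO": 0.08, "N2": 0.08})
```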

  19. Strain accumulation in a prototypic LMFBR nozzle: Experimental and analytical correlation

    International Nuclear Information System (INIS)

    Woodward, W.S.; Dhalia, A.K.; Berton, P.A.

    1986-01-01

    At an early stage in the design of the primary inlet nozzle for the Intermediate Heat Exchanger (IHX) of the Fast Flux Test Facility (FFTF), it was predicted that the inelastic strain accumulation during elevated temperature operation (1050 °F/566 °C) would exceed the ASME Code design allowables. Therefore, a proof test of a prototypic FFTF IHX nozzle was performed in the Westinghouse Creep Ratcheting Test Facility (CRTF) to measure the ratchet strain increments during the most severe postulated FFTF plant thermal transients. In addition, analytical procedures similar to those used in the plant design were used to predict strain accumulation in the CRTF nozzle. This paper describes how the proof test was successfully completed, and it shows that both the test measurements and analytical predictions confirm that the FFTF IHX nozzle, subjected to postulated thermal and mechanical loadings, complies with the ASME Code strain limits. Also, these results provide a measure of validation for the analytical procedures used in the design of FFTF as well as demonstrate the structural adequacy of the FFTF IHX primary inlet nozzle.
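    The compliance check described here amounts to summing ratchet strain increments over the postulated transients and comparing the total against a design allowable. The sketch below is a bare-bones illustration of that bookkeeping; the per-transient increments and the 1 % limit are hypothetical placeholders, not FFTF data or the governing Code value for this component.

```python
# Illustrative check of accumulated ratchet strain against a design
# allowable. Increments and the limit are hypothetical placeholders.

STRAIN_LIMIT = 0.01  # assumed allowable accumulated strain (fraction)

def accumulated_strain(increments):
    """Sum the ratchet strain increments measured over the transients."""
    return sum(increments)

# Hypothetical measured increments (fraction strain) per thermal transient
increments = [0.0012, 0.0009, 0.0015, 0.0011]
total = accumulated_strain(increments)
within_limit = total <= STRAIN_LIMIT
```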

  20. Role of modern analytical techniques in the production of uranium metal

    International Nuclear Information System (INIS)

    Hareendran, K.N.; Roy, S.B.

    2009-01-01

    Production of nuclear grade uranium metal conforming to its stringent specification with respect to metallic and non-metallic impurities necessitates implementation of a comprehensive quality control regime. The founding members of the Uranium Metal Plant realised the importance of this aspect of metal production, and a quality control laboratory was set up as part of the production plant. In the initial stages of its existence, the laboratory mainly catered to process control analysis of the plant process samples, while the Spectroscopy Division and Analytical Division of BARC provided analysis of trace metallic impurities in the intermediates as well as in the product uranium metal. This laboratory also provided invaluable R and D support for the optimization of the process involving both calciothermy and magnesiothermy. Prior to 1985, the analytical procedures used were limited to classical methods of analysis with minimal instrumental procedures. The first major analytical instrument, a flame AAS, was installed in 1985, and a beginning was made on trace analysis. During the last 15 years, however, the Quality Control Section has modernized the analytical set-up by acquiring appropriate instruments. Presently the facility has implemented a complete quality control and quality assurance program covering all aspects of uranium metal production, viz. analysis of raw materials, process samples and waste disposal samples, and also determination of all the specification elements in uranium metal. The current analytical practices followed in QCS are presented here.
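    The routine core of such a QA program, comparing measured impurity concentrations against specification limits, can be sketched as follows. Both the element list and the limits are illustrative placeholders, not the actual nuclear-grade specification.

```python
# Hypothetical specification-compliance check of the kind a quality
# control laboratory performs. Element limits (ppm) are illustrative.

SPEC_LIMITS_PPM = {"B": 0.3, "Cd": 0.2, "Fe": 150.0, "Si": 60.0}

def out_of_spec(measured_ppm):
    """Return the elements whose measured concentration exceeds its limit."""
    return {el: v for el, v in measured_ppm.items()
            if v > SPEC_LIMITS_PPM.get(el, float("inf"))}

# Hypothetical measured concentrations (ppm) for one metal sample
sample = {"B": 0.1, "Cd": 0.25, "Fe": 120.0, "Si": 40.0}
failures = out_of_spec(sample)
```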