WorldWideScience

Sample records for reliable analytical procedures

  1. Application of analytical procedure on system reliability, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    2000-01-01

    At the Ship Research Institute, research and development of the GO-FLOW procedure, a system reliability analysis method with various advanced functions that occupies a central part of probabilistic safety assessment (PSA), has been promoted. In this study, aiming at a fundamental upgrade of the GO-FLOW procedure as an important evaluation technique for executing PSA below level 3, a safety assessment system using GO-FLOW was developed, together with an analytical function that couples dynamic behavior analysis with the physical behavior of the system under stochastic changes. In fiscal year 1998, various functions were prepared and verified, such as adding dependence between headings, rearranging headings in time order, assigning the same heading to plural positions, and calculating occurrence frequency as a function of elapsed time. For the accident sequence simulation function, it was confirmed that the analysis covers all main accident sequences of the improved marine reactor MRX. In addition, a function for producing analysis input data nearly automatically was prepared. As a result, the difficulty that analysis results were not always easy to understand for anyone but PSA experts was resolved, and understanding of the accident phenomena, verification of the validity of the analysis, feedback to the analysis, and feedback to design can now be carried out easily. (G.K.)

  2. Analytical procedures for determining the impacts of reliability mitigation strategies.

    Science.gov (United States)

    2013-01-01

    Reliability of transport, especially the ability to reach a destination within a certain amount of time, is a regular concern of travelers and shippers. The definition of reliability used in this research is how travel time varies over time. The vari...

  3. Analytical procedures. Pt. 1

    International Nuclear Information System (INIS)

    Weber, G.

    1985-01-01

    For analytical procedures (Boolean procedures) there is a close relationship between the safety assessment and the reliability assessment of technical facilities. The paper gives an overview of the organization of models, fault trees, the probabilistic evaluation of systems, evaluation with minimal cut sets or minimal paths, the treatment of statistically dependent components, and systems liable to suffer different kinds of outages. (orig.) [de

  4. Procedures for treating common cause failures in safety and reliability studies: Analytical background and techniques

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1989-01-01

    Volume I of this report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. This document, Volume 2, contains a series of appendices that provide additional background and methodological detail on several important topics discussed in Volume I

  5. Analytical procedures. Pt. 4

    International Nuclear Information System (INIS)

    Rackwitz, R.

    1985-01-01

    The semi-analytical procedures are summarized under the heading 'first- or second-order reliability methods'. The asymptotic nature of the theory was repeatedly pointed out. In load-bearing structures, the failure probability of a component is always also a function of the condition of all other components; moreover, it depends on the loading, which affects most components simultaneously. This fact causes a marked reduction of the benefit of redundant component arrangements in the system and requires very special formulations. Although theoretically interesting and practically important developments will leave their mark on the further progress of the theory, the statements obtained by these approaches will continue to depend on how closely the chosen physical relationships and stochastic models can represent the scattering quantities. Sensitivity studies show that these aspects are in part of substantially greater importance for decision criteria than further refinement of the (probabilistic) method. Questions of the relevance and reliability of data and their adequate treatment in reliability analyses seem to rank higher than exaggerated demands on methodology. (orig./HP) [de

  6. Procedures for treating common cause failures in safety and reliability studies: Volume 2, Analytic background and techniques: Final report

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1988-12-01

    This report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. This document, Volume 2, contains a series of appendices that provide additional background and methodological detail on several important topics discussed in Volume 1

  7. Analytical Procedures for Testability.

    Science.gov (United States)

    1983-01-01

    "...Beat Internal Classifications", AD: A018516. "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics", AD: 786284 ... A sequential classification procedure in a coronary care ward is evaluated. In the toxicology field, "A System of Computer Aided Diagnosis with Blood Serum ...

  8. An Analytical Cost Estimation Procedure

    National Research Council Canada - National Science Library

    Jayachandran, Toke

    1999-01-01

    Analytical procedures that can be used to do a sensitivity analysis of a cost estimate, and to perform tradeoffs to identify input values that can reduce the total cost of a project, are described in the report...

  9. Reliability of application of inspection procedures

    Energy Technology Data Exchange (ETDEWEB)

    Murgatroyd, R A

    1988-12-31

    This document deals with the reliability of application of inspection procedures. A method is described to ensure that the inspection of defects assessed by fracture mechanics is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector in order to estimate the possibility of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC). 3 refs.

  10. Reliability of application of inspection procedures

    International Nuclear Information System (INIS)

    Murgatroyd, R.A.

    1988-01-01

    This document deals with the reliability of application of inspection procedures. A method is described to ensure that the inspection of defects assessed by fracture mechanics is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector in order to estimate the possibility of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC)

  11. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  12. Interim Reliability Evaluation Program procedures guide

    International Nuclear Information System (INIS)

    Carlson, D.D.; Gallup, D.R.; Kolaczkowski, A.M.; Kolb, G.J.; Stack, D.W.; Lofgren, E.; Horton, W.H.; Lobner, P.R.

    1983-01-01

    This document presents procedures for conducting analyses of a scope similar to those performed in Phase II of the Interim Reliability Evaluation Program (IREP). It documents the current state of the art in performing the plant systems analysis portion of a probabilistic risk assessment. Insights gained into managing such an analysis are discussed. Step-by-step procedures and methodological guidance constitute the major portion of the document. While not to be viewed as a cookbook, the procedures set forth the principal steps in performing an IREP analysis. Guidance for resolving the problems encountered in previous analyses is offered. Numerous examples and representative products from previous analyses clarify the discussion

  13. The use of analytical procedures in the internal audit of the restaurant business expenses

    Directory of Open Access Journals (Sweden)

    T.Yu. Kopotienko

    2015-06-01

    Full Text Available An important task in carrying out an internal audit of expenses is to obtain sufficient and reliable audit evidence. This can be achieved by using analytical procedures in the audit process. Identifying analytical procedures with the financial analysis of business activities prevents their efficient use in the internal audit of restaurant business expenses, and internal auditors' knowledge of the techniques of analytical procedures and of their tasks at each verification stage is insufficient. The purpose of the article is to develop methods for the internal audit of restaurant business expenses based on an integrated application of analytical procedures. The nature and purpose of analytical procedures are investigated in the article. The factors influencing the auditor's choice of a complex of analytical procedures are identified; it is recommended to consider among them the purpose of the analytical procedures, the type and structure of the enterprise, the sources of available information, the existence of financial and non-financial information, and the reliability and comparability of the available information. The tasks of analytical procedures are identified for each verification stage. A complex of analytical procedures is offered as part of the internal audit of restaurant business expenses. This complex contains a list of the analytical procedures, the analysis techniques used in each procedure, and a brief overview of the content of each procedure.

  14. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M. L.

    2016-01-13

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.

  15. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with regard to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol built from five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, with a colour scale from green through yellow to red depicting low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
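
    The abstract describes GAPI as a five-pentagram pictogram whose fields are coloured green, yellow, or red according to the environmental impact of each stage of a method. A minimal sketch of that kind of step-wise colour coding is given below; the stage names, scores, and mapping are illustrative assumptions, not the published GAPI criteria.

```python
# Toy illustration of a GAPI-style assessment: each stage of an analytical
# methodology gets a qualitative impact rating that is mapped to a colour.
# Stage names and ratings are invented; the real GAPI tool uses a fixed
# pictogram with specific published evaluation criteria.

IMPACT_COLOUR = {1: "green (low impact)", 2: "yellow (medium impact)", 3: "red (high impact)"}

method_profile = {
    "sample collection":     1,
    "sample preservation":   2,
    "sample preparation":    3,
    "reagents and solvents": 2,
    "instrumentation":       1,
}

for stage, score in method_profile.items():
    print(f"{stage:<22s} -> {IMPACT_COLOUR[score]}")

# A crude overall "greenness" summary (not part of GAPI itself)
print("mean impact score:", sum(method_profile.values()) / len(method_profile))
```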

  16. Reliability of procedures used for scaling loudness

    DEFF Research Database (Denmark)

    Jesteadt, Walt; Joshi, Suyash Narendra

    2013-01-01

    In this study, 16 normally-hearing listeners judged the loudness of 1000-Hz sinusoids using magnitude estimation (ME), magnitude production (MP), and categorical loudness scaling (CLS). Listeners in each of four groups completed the loudness scaling tasks in a different sequence on the first visit...... (ME, MP, CLS; MP, ME, CLS; CLS, ME, MP; CLS, MP, ME), and the order was reversed on the second visit. This design made it possible to compare the reliability of estimates of the slope of the loudness function across procedures in the same listeners. The ME data were well fitted by an inflected...... results were the most reproducible, they do not provide direct information about the slope of the loudness function because the numbers assigned to CLS categories are arbitrary. This problem can be corrected by using data from the other procedures to assign numbers that are proportional to loudness...

  17. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of analytical performance of test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices. Copyright © 2017 The Canadian
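
    The abstract argues that, because plasma glucose is under tight homeostatic control, its biological variation can be used to set analytical performance specifications. One widely used way of doing that is sketched below; the within- and between-subject CVs are placeholder values, not data from the paper.

```python
# Hedged sketch: deriving "desirable" analytical performance specifications
# from biological variation, a common model in laboratory medicine.
# CV_I (within-subject) and CV_G (between-subject) are illustrative
# placeholders; consult a biological-variation database for real figures.

from math import sqrt

CV_I = 5.0   # within-subject biological variation, % (assumed)
CV_G = 7.5   # between-subject biological variation, % (assumed)

desirable_imprecision = 0.5 * CV_I                      # CV_A <= 0.5 * CV_I
desirable_bias = 0.25 * sqrt(CV_I**2 + CV_G**2)         # |bias| <= 0.25 * sqrt(CV_I^2 + CV_G^2)
desirable_total_error = 1.65 * desirable_imprecision + desirable_bias

print(f"imprecision <= {desirable_imprecision:.1f} %")
print(f"bias        <= {desirable_bias:.1f} %")
print(f"total error <= {desirable_total_error:.1f} %")
```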

  18. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute is conducting a development study of the GO-FLOW method, a system reliability analysis method with various advanced functions that occupies a central part of PSA (probabilistic safety assessment). The work aimed to upgrade the functionality of the GO-FLOW method, to develop an analysis capability integrating dynamic behavior analysis with the physical behavior of the system and stochastic state changes, and to prepare a function for extracting the main accident sequences. In fiscal year 1997, an analytical function was developed for the dynamic event-tree analysis system by adding dependency between headings. With the accident sequence simulation function, it became possible to cover completely the main accident sequences of the MRX improved marine propulsion reactor. In addition, a function was prepared that allows the analyst to set up the input data for analysis easily. (G.K.)

  19. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control, ALO data report/package preparation review and control, single-shell tank (SST) project sample tracking system, sample receiving, analytical balances, duties and responsibilities of the sample custodian, sample refrigerator temperature monitoring, security, assignment of staff responsibilities, sample storage, data reporting, and general requirements for glassware

  20. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

    Radioimmunoassay is an analytical method which combines the sensitivity of radioactive measurement with the specificity of the antigen-antibody reaction. Substances down to a concentration of a few picograms per ml of serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test-tube chemistry). The main field of application at the moment is endocrinology; further possibilities of application lie in pharmaceutical research, environmental protection, forensic medicine, and general analytical work. Radioactive sources are used only in vitro and in the nanocurie range, i.e. radiation exposure is negligible.

  1. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1993-01-01

    This volume contains the interim change notice for the safety operation procedure for hot cell. It covers the master-slave manipulators, dry waste removal, cell transfers, hoists, cask handling, liquid waste system, and physical characterization of fluids

  2. Benchmark of systematic human action reliability procedure

    International Nuclear Information System (INIS)

    Spurgin, A.J.; Hannaman, G.W.; Moieni, P.

    1986-01-01

    Probabilistic risk assessment (PRA) methodology has emerged as one of the most promising tools for assessing the impact of human interactions on plant safety and understanding the importance of the man/machine interface. Human interactions were considered to be one of the key elements in the quantification of accident sequences in a PRA. The approach to quantification of human interactions in past PRAs has not been very systematic. The Electric Power Research Institute sponsored the development of SHARP to aid analysts in developing a systematic approach for the evaluation and quantification of human interactions in a PRA. The SHARP process has been extensively peer reviewed and has been adopted by the Institute of Electrical and Electronics Engineers as the basis of a draft guide for the industry. By carrying out a benchmark process, in which SHARP is an essential ingredient, however, it appears possible to assess the strengths and weaknesses of SHARP to aid human reliability analysts in carrying out human reliability analysis as part of a PRA

  3. Analytical approach for confirming the achievement of LMFBR reliability goals

    International Nuclear Information System (INIS)

    Ingram, G.E.; Elerath, J.G.; Wood, A.P.

    1981-01-01

    The approach, recommended by GE-ARSD, for confirming the achievement of LMFBR reliability goals relies upon a comprehensive understanding of the physical and operational characteristics of the system and the environments to which the system will be subjected during its operational life. This kind of understanding is required for an approach based on system hardware testing or analyses, as recommended in this report. However, for a system as complex and expensive as the LMFBR, an approach which relies primarily on system hardware testing would be prohibitive both in cost and time to obtain the required system reliability test information. By using an analytical approach, results of tests (reliability and functional) at a low level within the specific system of interest, as well as results from other similar systems can be used to form the data base for confirming the achievement of the system reliability goals. This data, along with information relating to the design characteristics and operating environments of the specific system, will be used in the assessment of the system's reliability

  4. Argon analytical procedures for potassium-argon dating

    International Nuclear Information System (INIS)

    Gabites, J.E.; Adams, C.J.

    1981-01-01

    A manual for the argon analytical methods involved in potassium-argon geochronology, including: i) operating procedures for the ultra-high vacuum argon extraction/purification equipment for the analysis of nanolitre quantities of radiogenic argon in rocks, minerals and gases; ii) operating procedures for the AEI-MS10 gas source mass spectrometer

  5. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1993-01-01

    This volume contains the interim change notice for sample preparation methods. Covered are: acid digestion for metals analysis, fusion of Hanford tank waste solids, water leach of sludges/soils/other solids, extraction procedure toxicity (simulate leach in landfill), sample preparation for gamma spectroscopy, acid digestion for radiochemical analysis, leach preparation of solids for free cyanide analysis, aqueous leach of solids for anion analysis, microwave digestion of glasses and slurries for ICP/MS, toxicity characteristic leaching extraction for inorganics, leach/dissolution of activated metal for radiochemical analysis, extraction of single-shell tank (SST) samples for semi-VOC analysis, preparation and cleanup of hydrocarbon- containing samples for VOC and semi-VOC analysis, receiving of waste tank samples in onsite transfer cask, receipt and inspection of SST samples, receipt and extrusion of core samples at 325A shielded facility, cleaning and shipping of waste tank samplers, homogenization of solutions/slurries/sludges, and test sample preparation for bioassay quality control program

  6. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  7. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs

  8. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare some methods for creating 'precision profiles' (PP) and to clarify their possibilities for determining the analytical reliability of RIA. Only methods without complicated mathematical calculations have been used. The reproducibility was studied in serums whose concentrations of the hormone to be determined cover the whole range of the calibration curve. The radioimmunoassay was performed with a TSH-RIA set (ex East Germany), and comparative evaluations were made with commercial sets from HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (C.V., %) are used and their corresponding profiles are compared: a preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory, while the method of its construction does not influence the course of the relationship. The PP allows determination of the concentration range giving stable results, which improves the efficiency of the analytical work. 16 refs., 4 figs
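
    A precision profile simply plots the coefficient of variation of replicate results against analyte concentration and reads off the range over which the assay is acceptably reproducible. A minimal sketch follows; the replicate data and the 10% CV acceptance limit are invented for illustration.

```python
# Minimal precision-profile sketch: compute CV% at each calibrator
# concentration from replicate measurements and report where the CV stays
# below an acceptance limit. All numbers are invented.

from statistics import mean, stdev

replicates = {            # concentration (IU/l) -> replicate results
    0.5:  [0.42, 0.55, 0.61, 0.47],
    2.0:  [1.92, 2.10, 2.05, 1.98],
    10.0: [9.8, 10.3, 10.1, 9.9],
    50.0: [48.0, 52.5, 50.9, 49.1],
}
CV_LIMIT = 10.0  # acceptance limit in %, assumed

profile = {c: 100.0 * stdev(vals) / mean(vals) for c, vals in replicates.items()}

for conc, cv in sorted(profile.items()):
    flag = "ok" if cv <= CV_LIMIT else "outside working range"
    print(f"{conc:6.1f} IU/l : CV = {cv:4.1f} %  ({flag})")
```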

  9. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection... To improve consistency, it is recommended that an FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating...

  10. Nonspecialist Raters Can Provide Reliable Assessments of Procedural Skills

    DEFF Research Database (Denmark)

    Mahmood, Oria; Dagnæs, Julia; Bube, Sarah

    2018-01-01

    ... was chosen as it is a simple procedural skill that is crucial to master in a resident urology program. RESULTS: The internal consistency of assessments was high, Cronbach's α = 0.93 and 0.95 for nonspecialist and specialist raters, respectively. The interrater reliability was significant, with a Pearson's correlation of 0.77 for the nonspecialists and 0.75 for the specialists. The test-retest reliability showed the biggest difference between the 2 groups, 0.59 and 0.38 for the nonspecialist raters and the specialist raters, respectively...

  11. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  12. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    Science.gov (United States)

    Duffy, Stephen F.

    1997-01-01

    Ceramic matrix composites (CMC) and intermetallic materials (e.g., single crystal nickel aluminide) are high performance materials that exhibit attractive mechanical, thermal and chemical properties. These materials are critically important in advancing certain performance aspects of gas turbine engines. From an aerospace engineer's perspective the new generation of ceramic composites and intermetallics offers a significant potential for raising the thrust/weight ratio and reducing NO(x) emissions of gas turbine engines. These aspects have increased interest in utilizing these materials in the hot sections of turbine engines. However, as these materials evolve and their performance characteristics improve, a persistent need exists for state-of-the-art analytical methods that predict the response of components fabricated from CMC and intermetallic material systems. This need provided the motivation for the technology developed under this research effort. Continuous ceramic fiber composites exhibit an increase in work of fracture, which allows for "graceful" rather than catastrophic failure. When loaded in the fiber direction, these composites retain substantial strength capacity beyond the initiation of transverse matrix cracking despite the fact that neither of its constituents would exhibit such behavior if tested alone. As additional load is applied beyond first matrix cracking, the matrix tends to break in a series of cracks bridged by the ceramic fibers. Any additional load is borne increasingly by the fibers until the ultimate strength of the composite is reached. Thus modeling efforts supported under this research effort have focused on predicting this sort of behavior. For single crystal intermetallics the issues that motivated the technology development involved questions relating to material behavior and component design. Thus the research effort supported by this grant had to determine the statistical nature and source of fracture in a high strength, Ni

  13. Tritium isotope fractionation in biological systems and in analytical procedures

    International Nuclear Information System (INIS)

    Kim, M.A.; Baumgaertner, Franz

    1989-01-01

    The organically bound tritium (OBT) is evaluated in biological systems by determining the tritium distribution ratio (R-value), i.e. the ratio of the tritium concentration in the organic substance to that in the cell water. The determination of the R-value always involves isotope fractionation in the applied analytical procedures, and hence the evaluation of the true OBT value in a given biological system appears more complicated than hitherto known in the literature. The present work concentrates on the tritium isotope fractionation in the cell water separation and on the resulting effects on the R-value. The analytical procedures examined are vacuum freeze drying under equilibrium and non-equilibrium conditions and azeotropic distillation. The vaporization isotope effects are determined separately for the phase transition of solid or liquid to gas in pure tritiated water systems as well as in real biological systems, e.g. the corn plant. The results are systematically analyzed and the influence of isotope effects on the R-value is rigorously quantified

  14. Dispersant testing : a study on analytical test procedures

    International Nuclear Information System (INIS)

    Fingas, M.F.; Fieldhouse, B.; Wang, Z.; Environment Canada, Ottawa, ON

    2004-01-01

    Crude oil is a complex mixture of hydrocarbons, ranging from small, volatile compounds to very large, non-volatile compounds. Analysis of the dispersed oil is crucial. This paper described Environment Canada's ongoing studies on various traits of dispersants. In particular, it describes small studies related to dispersant effectiveness and methods to improve analytical procedures. The study also re-evaluated the analytical procedure for the Swirling Flask Test, which is now part of the ASTM standard procedure. There are new and improved methods for analyzing oil-in-water using gas chromatography (GC). The methods could be further enhanced by integrating the entire chromatogram rather than just peaks. This would result in a decrease in maximum variation from 5 per cent to about 2 per cent. For oil-dispersant studies, the surfactant-dispersed oil hydrocarbons consist of two parts: GC-resolved hydrocarbons and GC-unresolved hydrocarbons. This study also tested a second feature of the Swirling Flask Test in which the side spout was tested and compared with a new vessel with a septum port instead of a side spout. This decreased the variability as well as the energy and mixing in the vessel. Rather than being a variation of the Swirling Flask Test, it was suggested that a spoutless vessel might be considered as a completely separate test. 7 refs., 2 tabs., 4 figs

  15. Practical approach to a procedure for judging the results of analytical verification measurements

    International Nuclear Information System (INIS)

    Beyrich, W.; Spannagel, G.

    1979-01-01

    For practical safeguards a particularly transparent procedure is described to judge analytical differences between declared and verified values based on experimental data relevant to the actual status of the measurement technique concerned. Essentially it consists of two parts: Derivation of distribution curves for the occurrence of interlaboratory differences from the results of analytical intercomparison programmes; and judging of observed differences using criteria established on the basis of these probability curves. By courtesy of the Euratom Safeguards Directorate, Luxembourg, the applicability of this judging procedure has been checked in practical data verification for safeguarding; the experience gained was encouraging and implementation of the method is intended. Its reliability might be improved further by evaluation of additional experimental data. (author)
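
    The procedure rests on turning interlaboratory comparison results into an empirical distribution of declared-versus-verified differences and then judging new differences against limits read from that distribution. A small sketch of that idea follows; the difference data and the 95th-percentile criterion are assumptions, not values from the paper.

```python
# Sketch: derive a judging limit for declared-vs-verified analytical
# differences from historical intercomparison data, then test a new
# observation against it. Data and the 95th-percentile criterion are assumed.

import statistics

# Relative differences (%) between paired declared and verified values
# observed in past intercomparison programmes (invented numbers).
historical_diffs = [-1.2, 0.4, 0.9, -0.3, 1.8, -2.1, 0.1, 1.1, -0.8, 0.6,
                    2.4, -1.5, 0.2, -0.6, 1.3, 0.7, -1.9, 0.5, -0.2, 1.0]

abs_diffs = sorted(abs(d) for d in historical_diffs)
cuts = statistics.quantiles(abs_diffs, n=20)   # cut points in 5% steps
limit_95 = cuts[18]                            # ~95th percentile of |difference|

new_difference = 2.0  # % difference observed in a new verification measurement
verdict = ("consistent with measurement capability"
           if abs(new_difference) <= limit_95 else "flag for follow-up")
print(f"95% judging limit: {limit_95:.2f} %  -> observed {new_difference:.2f} %: {verdict}")
```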

  16. Analytical modeling of nuclear power station operator reliability

    International Nuclear Information System (INIS)

    Sabri, Z.A.; Husseiny, A.A.

    1979-01-01

    The operator-plant interface is a critical component of power stations which requires the formulation of mathematical models to be applied in plant reliability analysis. The human model introduced here is based on cybernetic interactions and allows for use of available data from psychological experiments, hot and cold training and normal operation. The operator model is identified and integrated in the control and protection systems. The availability and reliability are given for different segments of the operator task and for specific periods of the operator life: namely, training, operation and vigilance or near retirement periods. The results can be easily and directly incorporated in system reliability analysis. (author)

  17. Tritium fractionation in biological systems and in analytical procedures

    International Nuclear Information System (INIS)

    Kim, M.A.; Baumgaertner, F.

    1991-01-01

    The organically bound tritium (OBT) is evaluated in biological systems by measuring the tritium distribution ratio (R-value), i.e. the ratio of the tritium concentration in the organic substance to that in the tissue water. The determination of the R-value is found always to involve isotope fractionation in the applied analytical procedures, and hence the evaluation of the true OBT value in a given biological system appears more complicated than hitherto known in the literature. The present work concentrates on the tritium isotope fractionation in the tissue water separation and on the resulting effects on the R-value. The analytical procedures examined are vacuum freeze drying under equilibrium and non-equilibrium conditions and azeotropic distillation. The vaporization isotope effects are determined separately for the phase transition of solid or liquid to gas in pure water systems as well as in real biological systems, e.g. the maize plant. The results are systematically analysed and the influence of isotope effects on the R-value is rigorously quantified. (orig.)

  18. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified the failure modes above the 90th percentile of RPN values as failure modes needing urgent corrective action; failure modes falling between the 75th and 90th percentiles of RPN values were identified as failure modes needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that an FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
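
    The ranking logic described here (RPN = S x O x D, with the 90th and 75th percentiles of the RPN values as action thresholds) is easy to reproduce. The failure modes and scores in the sketch below are invented examples, not the ones from the HPLC-DAD-MS study.

```python
# Sketch of the FMEA ranking scheme described in the abstract:
# RPN = severity x occurrence x detection, with failure modes above the 90th
# percentile of RPN flagged for urgent action and those between the 75th and
# 90th percentiles for necessary action. Failure modes and S/O/D scores are
# invented for illustration.

import numpy as np

failure_modes = {            # name: (S, O, D)
    "wrong mobile phase composition": (7, 3, 4),
    "column degradation":             (6, 5, 3),
    "autosampler carry-over":         (5, 4, 6),
    "wrong integration parameters":   (4, 6, 2),
    "detector wavelength drift":      (8, 2, 5),
    "sample mislabelling":            (9, 2, 7),
}

rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
values = np.array(list(rpn.values()))
p90, p75 = np.percentile(values, 90), np.percentile(values, 75)

for name, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    if score > p90:
        action = "urgent corrective action"
    elif score > p75:
        action = "necessary corrective action"
    else:
        action = "no immediate action"
    print(f"RPN {score:4d}  {name:<32s} {action}")
```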

  19. An off-line two-dimensional analytical procedure for determination of polycyclic aromatic hydrocarbons in smoke aerosol

    NARCIS (Netherlands)

    Claessens, H.A.; Lammerts van Bueren, L.G.D.

    1987-01-01

    Smoke aerosol from stoves consists of a wide variety of chemical substances, a number of which have toxic properties. To study the impact of aerosol emissions on health and the environment, reliable analytical procedures must be available for these samples. An off-line two-dimensional HPLC method is

  20. Genesis of theory and analysis of practice of applying the analytical procedures in auditing

    OpenAIRE

    Сурніна, К. С.

    2012-01-01

    The definition of the concept "analytical procedures" in auditing by different researchers is investigated in the article, and our own view of the necessity of wide use of analytical procedures in auditing is set out. A classification of analytical procedures is presented, taking into account the specificity of the auditing process as a whole

  1. A Novel Analytic Technique for the Service Station Reliability in a Discrete-Time Repairable Queue

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2013-01-01

    Full Text Available This paper presents a decomposition technique for the service station reliability in a discrete-time repairable GeomX/G/1 queueing system, in which the server takes exhaustive service and multiple adaptive delayed vacation discipline. Using such a novel analytic technique, some important reliability indices and reliability relation equations of the service station are derived. Furthermore, the structures of the service station indices are also found. Finally, special cases and numerical examples validate the derived results and show that our analytic technique is applicable to reliability analysis of some complex discrete-time repairable bulk arrival queueing systems.

  2. Integrating analytical procedures into the continuous audit environment

    Directory of Open Access Journals (Sweden)

    Eija Koskivaara

    2006-12-01

    Full Text Available The objective of this article is to show how to embed analytical procedures (APs) into the continuous audit environment. The audit environment is discussed in terms of audit phases, where the role of APs is to obtain evidence for auditors. The article addresses different characteristics of AP techniques. Furthermore, the article compares four different AP techniques for forming expectations of the monthly sales values. Two of these techniques are simple quantitative ones: the previous year's value and the mean of the previous years' values. The advanced quantitative techniques are regression analysis and an artificial neural network (ANN) based model. In a comparison of the prediction results, the regression analysis and the ANN model turn out to be equally good. The development of these kinds of tools is crucial to the continuous audit environment, especially as most data transmission between companies and their stakeholders moves to electronic form.
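
    The expectation models compared in the article are straightforward to express. A compact sketch of the two naive benchmarks plus an ordinary least-squares trend is shown below; the monthly sales figures are invented, and the article's fourth model (the artificial neural network) is omitted for brevity.

```python
# Sketch of simple analytical-procedure expectation models for monthly sales:
# (1) same month previous year, (2) mean of previous years' same month,
# (3) a least-squares trend over past observations. Figures are invented.

import numpy as np

# sales for the same calendar month over consecutive years (t-5 .. t-1)
history = np.array([100.0, 104.0, 110.0, 115.0, 123.0])
actual_this_year = 131.0

expect_prev_year = history[-1]
expect_mean = history.mean()

years = np.arange(len(history))
slope, intercept = np.polyfit(years, history, 1)   # OLS trend line
expect_trend = slope * len(history) + intercept

for name, expected in [("previous year", expect_prev_year),
                       ("mean of previous years", expect_mean),
                       ("regression trend", expect_trend)]:
    deviation = 100.0 * (actual_this_year - expected) / expected
    print(f"{name:<24s} expectation {expected:7.1f}  deviation {deviation:+5.1f} %")
```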

  3. SHARP1: A revised systematic human action reliability procedure

    International Nuclear Information System (INIS)

    Wakefield, D.J.; Parry, G.W.; Hannaman, G.W.; Spurgin, A.J.

    1990-12-01

    Individual plant examinations (IPE) are being performed by utilities to evaluate plant-specific vulnerabilities to severe accidents. A major tool in performing an IPE is a probabilistic risk assessment (PRA). The importance of human interactions in determining the plant response in past PRAs is well documented. The modeling and quantification of the probabilities of human interactions have been the subjects of considerable research by the Electric Power Research Institute (EPRI). A revised framework, SHARP1, for incorporating human interactions into PRA is summarized in this report. SHARP1 emphasizes that the process stages are iterative and directed at specific goals rather than being performed sequentially in a stepwise procedure. This expanded summary provides the reader with a flavor of the full report content. Excerpts from the full report are presented, following the same outline as the full report. In the full report, the interface of the human reliability analysis with the plant logic model development in a PRA is given special attention. In addition to describing a methodology framework, the report also discusses the types of human interactions to be evaluated, and how to formulate a project team to perform the human reliability analysis. A concise description and comparative evaluation of the selected existing methods of quantification of human error are also presented. Four case studies are also provided to illustrate the SHARP1 process

  4. Analytical procedure for the titrimetric determination of uranium in concentrates

    International Nuclear Information System (INIS)

    Florence, T.M.; Pakalns, P.

    1989-01-01

    In 1964 Davis and Gray published a titrimetric method for uranium which does not require column reductors, electronic instruments or inert atmospheres, and which is sufficiently selective to enable uranium to be determined without prior separation. The method involves reduction of uranium(VI) to uranium(IV) by ferrous sulphate in a concentrated phosphoric acid medium. The excess iron(II) is then selectively oxidised by nitric acid using a molybdenum catalyst. After addition of sulphuric acid and dilution with water, the uranium(IV) is titrated with standard potassium dichromate, using barium diphenylamine sulphonate as indicator. This method has been found to be simple, precise and reliable, and applicable to a wide range of uranium-containing materials. The method given here for determining uranium in concentrates is essentially that of Davis and Gray. Its applications, apparatus, reagents, procedures, and accuracy and precision are discussed. 10 refs.
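
    The back-calculation from titrant volume to uranium content follows directly from the redox stoichiometry: each dichromate ion accepts six electrons and each U(IV) gives up two, so one mole of dichromate corresponds to three moles of uranium. A worked sketch with an invented titre, and with no blank or purity corrections, is given below.

```python
# Worked example of the uranium calculation behind a Davis-and-Gray-type
# titration: U(IV) -> U(VI) releases 2 electrons, Cr2O7(2-) accepts 6,
# so n(U) = 3 * n(K2Cr2O7). Titre, titrant concentration and sample mass
# are invented; blank and purity corrections are omitted for clarity.

M_U = 238.03            # g/mol, natural uranium

c_dichromate = 0.02000  # mol/L K2Cr2O7 titrant (assumed)
v_titrant_ml = 21.35    # mL consumed at the end point (assumed)
sample_mass_g = 0.4000  # mass of concentrate taken (assumed)

n_dichromate = c_dichromate * v_titrant_ml / 1000.0
n_uranium = 3.0 * n_dichromate
mass_uranium = n_uranium * M_U

print(f"uranium found: {mass_uranium * 1000:.1f} mg "
      f"({100.0 * mass_uranium / sample_mass_g:.1f} % U in the sample)")
```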

  5. Effects of Analytical and Holistic Scoring Patterns on Scorer Reliability in Biology Essay Tests

    Science.gov (United States)

    Ebuoh, Casmir N.

    2018-01-01

    The literature reveals that the patterns/methods of scoring essay tests have been criticized for not being reliable, and that this unreliability is likely to be greater in internal examinations than in external examinations. The purpose of this study is to find out the effects of analytical and holistic scoring patterns on scorer reliability in…

  6. An analytical framework for reliability growth of one-shot systems

    International Nuclear Information System (INIS)

    Hall, J. Brian; Mosleh, Ali

    2008-01-01

    In this paper, we introduce a new reliability growth methodology for one-shot systems that is applicable to the case where all corrective actions are implemented at the end of the current test phase. The methodology consists of four model equations for assessing: expected reliability, the expected number of failure modes observed in testing, the expected probability of discovering new failure modes, and the expected portion of system unreliability associated with repeat failure modes. These model equations provide an analytical framework for which reliability practitioners can estimate reliability improvement, address goodness-of-fit concerns, quantify programmatic risk, and assess reliability maturity of one-shot systems. A numerical example is given to illustrate the value and utility of the presented approach. This methodology is useful to program managers and reliability practitioners interested in applying the techniques above in their reliability growth program
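
    The four quantities listed in the abstract can at least be illustrated with simple empirical estimators on grouped one-shot test data: observed reliability, the count of distinct failure modes seen, a Good-Turing-style estimate of the chance that the next trial reveals a new mode, and the share of failures caused by already-observed (repeat) modes. The sketch below uses those generic estimators with invented data; it is not the model developed by Hall and Mosleh.

```python
# Illustrative (not the paper's) estimators for the four quantities the
# abstract mentions, computed from one-shot test results in which each
# failure is attributed to a failure mode. Data are invented.

from collections import Counter

n_trials = 60
# failure-mode label observed on each failed trial (successes not listed)
failures = ["A", "B", "A", "C", "A", "B", "D", "A", "C", "B", "E", "A"]

mode_counts = Counter(failures)
n_failures = len(failures)

reliability = 1.0 - n_failures / n_trials                # observed reliability
n_modes_observed = len(mode_counts)                      # distinct modes seen
singletons = sum(1 for c in mode_counts.values() if c == 1)
p_new_mode = singletons / n_trials                       # Good-Turing-style estimate
repeat_failures = n_failures - n_modes_observed          # failures beyond each mode's first
repeat_share = repeat_failures / n_failures              # share of failures from repeat modes

print(f"observed reliability           : {reliability:.3f}")
print(f"failure modes observed         : {n_modes_observed}")
print(f"P(next trial shows a new mode) : {p_new_mode:.3f}")
print(f"share of failures from repeats : {repeat_share:.3f}")
```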

  7. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to argue that an integration of probabilistic and psychological approaches to human reliability should be attempted. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity and, secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context promote the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints on the activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis

  8. Analytical Chemistry Laboratory (ACL) procedure compendium. Volume 1, Administrative

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control, ALO data report/package preparation review and control, single-shell tank (SST) project sample tracking system, sample receiving, analytical balances, duties and responsibilities of the sample custodian, sample refrigerator temperature monitoring, security, assignment of staff responsibilities, sample storage, data reporting, and general requirements for glassware.

  9. Processes and Procedures for Estimating Score Reliability and Precision

    Science.gov (United States)

    Bardhoshi, Gerta; Erford, Bradley T.

    2017-01-01

    Precision is a key facet of test development, with score reliability determined primarily according to the types of error one wants to approximate and demonstrate. This article identifies and discusses several primary forms of reliability estimation: internal consistency (i.e., split-half, KR-20, coefficient α), test-retest, alternate forms, interscorer, and…
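
    Two of the reliability estimates named here, coefficient alpha (internal consistency) and test-retest correlation, are simple to compute from an item-score matrix and paired administrations. A short sketch with invented scores follows.

```python
# Sketch: two of the reliability estimates discussed in the article,
# Cronbach's alpha (internal consistency) and test-retest correlation.
# The item scores and retest totals below are invented.

import numpy as np

# rows = examinees, columns = items
scores = np.array([
    [3, 4, 4, 5],
    [2, 2, 3, 3],
    [5, 4, 5, 5],
    [1, 2, 2, 1],
    [4, 3, 4, 4],
], dtype=float)

k = scores.shape[1]
item_vars = scores.var(axis=0, ddof=1)
total_var = scores.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)   # Cronbach's alpha

total_t1 = scores.sum(axis=1)
total_t2 = total_t1 + np.array([1.0, -1.0, 0.0, 2.0, -1.0])   # retest totals (invented)
test_retest_r = np.corrcoef(total_t1, total_t2)[0, 1]

print(f"Cronbach's alpha : {alpha:.2f}")
print(f"test-retest r    : {test_retest_r:.2f}")
```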

  10. Procedures for treating common cause failures in safety and reliability studies: Procedural framework and examples

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1988-01-01

    This report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework comprises four major stages: (1) System Logic Model Development; (2) Identification of Common Cause Component Groups; (3) Common Cause Modeling and Data Analysis; and (4) System Quantification and Interpretation of Results. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. 22 figs., 34 tabs
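
    To make the effect of common cause failures concrete, one simple parametric model often used within this kind of framework is the beta-factor model, in which a fraction beta of a component's unavailability is assumed to fail all redundant trains together. The sketch below contrasts a two-train system with and without that common cause contribution; the numbers are illustrative, and the beta-factor model is only one of the parametric options treated in such procedures.

```python
# Beta-factor illustration for a two-train redundant system: a fraction
# beta of each train's unavailability Q is attributed to a common cause
# that fails both trains at once. Numbers are illustrative only.

Q = 1.0e-3      # total unavailability of one train (assumed)
beta = 0.1      # common cause fraction (assumed)

q_independent = (1.0 - beta) * Q
q_common = beta * Q

q_system_with_ccf = q_independent**2 + q_common   # both fail independently OR via CCF
q_system_no_ccf = Q**2                            # naive independence assumption

print(f"system unavailability with CCF       : {q_system_with_ccf:.2e}")
print(f"system unavailability, CCF ignored   : {q_system_no_ccf:.2e}")
print(f"underestimation factor if CCF ignored: {q_system_with_ccf / q_system_no_ccf:.0f}x")
```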

  11. Role of the IAEA's ALMERA network in harmonization of analytical procedures applicable worldwide for radiological emergencies

    International Nuclear Information System (INIS)

    Pitois, A.; Osvath, I.; Tarjan, S.; Groening, M.; Osborn, D.; )

    2016-01-01

    The International Atomic Energy Agency (IAEA) coordinates and provides analytical support to the worldwide network of Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA), consisting at the end of 2015 of 154 laboratories in 85 countries. This network, established by the IAEA in 1995, aims to provide timely and reliable measurement results of environmental radioactivity in routine monitoring and emergency situations. The IAEA supports the ALMERA laboratories in their routine and emergency response environmental monitoring activities by organizing proficiency tests and inter-laboratory comparison exercises, developing validated analytical procedures for environmental radioactivity measurement, and organizing training courses and workshops. The network also acts as a forum for sharing knowledge and expertise. The aim of this paper is to describe the current status of ALMERA analytical method development activities for radiological emergencies and the plans for further development in the field

  12. Procedure for hazards analysis of plutonium gloveboxes used in analytical chemistry operations

    International Nuclear Information System (INIS)

    Delvin, W.L.

    1977-06-01

    A procedure is presented to identify and assess hazards associated with gloveboxes used for analytical chemistry operations involving plutonium. This procedure is based upon analytic tree methodology and it has been adapted from the US Energy Research and Development Administration's safety program, the Management Oversight and Risk Tree

  13. Procedures for controlling the risks of reliability, safety, and availability of technical systems

    International Nuclear Information System (INIS)

    1987-01-01

    The reference book comprises four sections. Besides the fundamental aspects of the reliability problem, of risk and safety, and of the relevant reliability criteria, the material presented explains reliability in terms of maintenance, logistics and availability, and presents procedures for reliability assessment and for determining the factors that influence reliability, together with suggestions for their integration at the system level. The reliability assessment consists of diagnostic and prognostic analyses. The section on influencing factors discusses aspects of organisational structures, programme planning and control, and critical activities. (DG) [de

  14. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software at nuclear power plants (NPPs) increases, so does the need to include software reliability and/or safety in the NPP Probabilistic Safety Assessment (PSA). This work proposes a procedure for applying software reliability growth models (RGMs), which are the most widely used means of quantifying software reliability, to NPP PSA. Through the proposed procedure, it can be determined whether a software reliability growth model can be applied to the NPP PSA before its actual application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA

  15. Reliability and Validity of 10 Different Standard Setting Procedures.

    Science.gov (United States)

    Halpin, Glennelle; Halpin, Gerald

    Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…

  16. The application of analytical procedures in the audit process: A ...

    African Journals Online (AJOL)

    collected through interviews with senior audit managers at large audit ... providing a perspective of why and how South African auditors apply analytical ... and includes the objectives of each study, the data collection method used, and a ... It is recommended that scholars use the findings of this study to perform further research.

  17. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective
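
    The record names Dempster-Shafer evidence theory as the fusion engine; the following sketch shows Dempster's rule of combination for two hypothetical expert judgements about the dependence level between consecutive operator actions. The frame of discernment and the mass values are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of Dempster's rule of combination, the evidence-fusion step named
# in the abstract. The frame of discernment and mass values are invented for illustration.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dictionaries."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

if __name__ == "__main__":
    # Hypothetical dependence levels between two consecutive operator actions.
    ZD, LD, MD = frozenset({"zero"}), frozenset({"low"}), frozenset({"moderate"})
    theta = ZD | LD | MD
    m_expert1 = {ZD: 0.5, LD: 0.3, theta: 0.2}
    m_expert2 = {LD: 0.6, MD: 0.2, theta: 0.2}
    for focal, mass in sorted(combine(m_expert1, m_expert2).items(), key=lambda kv: -kv[1]):
        print(set(focal), round(mass, 3))
```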

  18. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  19. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  20. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads, biological reagents (e.g., antibodies, blocking reagents and buffers and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  1. Recent trends in analytical procedures in forensic toxicology.

    Science.gov (United States)

    Van Bocxlaer, Jan F

    2005-12-01

    Forensic toxicology is a very demanding discipline, heavily dependent on good analytical techniques, which is why new trends appear continuously. In the past years, LC-MS has revolutionized target compound analysis and has become the trend, also in toxicology. In LC-MS screening analysis, things are less straightforward and several approaches exist. One promising approach, based on accurate LC-TOF-MS mass measurements and elemental formula based library searches, is discussed. This way of screening has already proven its applicability, but at the same time it became obvious that a single accurate mass measurement lacks some specificity when using large compound libraries. CE, too, is a re-emerging approach. The increasingly polar and ionic molecules encountered make it a worthwhile addition to e.g. LC, as illustrated for the analysis of GHB. A third recent trend is the use of MALDI mass spectrometry for small molecules. It is promising for its ease of use and high throughput. Unfortunately, reports of disappointment but also of accomplishment, e.g. the quantitative analysis of LSD as discussed here, alternate, and it remains to be seen whether MALDI will really establish itself. Indeed, not all new trends will prove themselves, but the mere fact that many appear in the world of analytical toxicology nowadays is, in itself, encouraging for the future of (forensic) toxicology.

  2. Analytical procedures for the determination of disperse azo dyes

    Energy Technology Data Exchange (ETDEWEB)

    Betowski, L.D.; Jones, T.L. (Environmental Protection Agency, Las Vegas, NV (USA)); Munslow, W.; Nunn, N.J. (Lockheed Engineering and Management Services Co., Las Vegas, NV (USA))

    1988-09-01

    Disperse Blue 79 is the most widely used azo dye in the US. Its economic importance for the dye and textile industries is very great. Because of its use and potential for degradation to aromatic amines, this compound has been chosen for testing by the Interagency Testing Committee. The authors' laboratory has been developing methods for the analytical determination of Disperse Blue 79 and any possible degradation products in wastewater. This work has been taking place in conjunction with the study of the fate of azo dyes in wastewater treatment processes by the Water Engineering Research Laboratory of the US EPA in Cincinnati. There were several phases to this analytical development. The first step involved purifying the commercial material or presscake to obtain a standard for quantitative determination. A combination of HPLC, TLC and mass spectrometric methods was used to determine purity after extraction and column cleanup. Phase two involved the extraction of the dye from the matrices involved. The third phase was the actual testing of Disperse Blue 79 in the waste activated sludge system and anaerobic digester. Recovery of the dye and any degradation products at each sampling point (e.g., secondary effluent, waste activated sludge) was the goal of this phase.

  3. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1983-05-01

    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study

  4. Comparative evaluation of analytical procedures for the recovery of Enterobacteriaceae in coastal marine waters; Valutazione comparativa di procedure analitiche per il rilevamento di Enterobacteriaceae in acque marine costiere

    Energy Technology Data Exchange (ETDEWEB)

    Bonadonna, Lucia; Chiaretti, Gianluca; Coccia, Anna Maria; Semproni, Maurizio [Istituto Superiore di Sanita`, Rome (Italy). Lab. di Igiene Ambientale

    1997-03-01

    The use of quick and reliable procedures is fundamental for water quality control. In order to improve the analytical methods for the microbiological examination of bathing waters, a comparison of different substrates for the recovery of Enterobacteriaceae from coastal marine waters was carried out. The medium indicated in the Italian technical regulations showed good selectivity when the red colonies with a green metallic surface sheen were counted, as stated in the Standard Methods. On the other hand, the chromogenic substrate used in this study proved easy to read, and was selective and specific for both Escherichia coli and total coliforms.

  5. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
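
    CARE's equation repository is not reproduced in the record; as an indication of the kind of closed-form redundancy expressions such a tool evaluates, the sketch below compares a simplex unit, a 1-out-of-2 parallel pair and a TMR arrangement under an assumed constant failure rate.

```python
# Minimal sketch of the kind of closed-form redundancy equations a tool such as CARE
# evaluates. Component failure rates and mission time are illustrative only.
import math

def r_simplex(lmbda, t):
    """Reliability of a single component with constant failure rate lmbda."""
    return math.exp(-lmbda * t)

def r_parallel(lmbda, t, n):
    """n identical components in parallel (1-out-of-n, perfect switching assumed)."""
    return 1.0 - (1.0 - r_simplex(lmbda, t)) ** n

def r_tmr(lmbda, t):
    """Triple modular redundancy with a perfect voter: 3R^2 - 2R^3."""
    r = r_simplex(lmbda, t)
    return 3.0 * r**2 - 2.0 * r**3

if __name__ == "__main__":
    lmbda = 1e-4       # failures per hour (illustrative)
    mission = 1000.0   # hours
    for name, rel in [("simplex", r_simplex(lmbda, mission)),
                      ("1-of-2 parallel", r_parallel(lmbda, mission, 2)),
                      ("TMR", r_tmr(lmbda, mission))]:
        print(f"{name:16s} R = {rel:.6f}")
```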

  6. Big data analytics for the Future Circular Collider reliability and availability studies

    Science.gov (United States)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  7. Analytical procedures for identifying anthocyanins in natural extracts

    International Nuclear Information System (INIS)

    Marco, Paulo Henrique; Poppi, Ronei Jesus; Scarminio, Ieda Spacino

    2008-01-01

    Anthocyanins are among the most important plant pigments. Due to their potential benefits for human health, there is considerable interest in these natural pigments. Nonetheless, there is great difficulty in finding a technique that could provide the identification of structurally similar compounds and estimate the number and concentration of the species present. A lot of techniques have been tried to find the best methodology to extract information from these systems. In this paper, a review of the most important procedures is given, from the extraction to the identification of anthocyanins in natural extracts. (author)

  8. Methods for qualification of highly reliable software - international procedure

    International Nuclear Information System (INIS)

    Kersken, M.

    1997-01-01

    Despite the advantages of computer-assisted safety technology, there is still some uneasiness to be observed with respect to the novel processes, resulting from the absence of a body of generally accepted and uncontentious qualification guides (regulatory provisions, standards) for the safety evaluation of the computer codes applied. Warranty of adequate protection of the population, operators or plant components is an essential aspect in this context, too - as it is in general with reliability and risk assessment of novel technology - so that, with appropriate legislation still missing, there currently is a licensing risk involved in the introduction of digital safety systems. Nevertheless, there is some extent of agreement within the international community and among utility operators about which standards and measures should be applied for the qualification of software of relevance to plant safety. The standard IEC 880 /IEC 86/ in particular, in its original version, or national documents based on this standard, are applied in all countries using or planning to install such systems. A novel supplement to this standard, document /IEC 96/, is in the process of finalization and defines the requirements to be met by modern methods of software engineering. (orig./DG) [de

  9. Measurement of Actinides in Molybdenum-99 Solution Analytical Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weaver, Jamie L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    This document is a companion report to a previous report, PNNL 24519, Measurement of Actinides in Molybdenum-99 Solution, A Brief Review of the Literature, August 2015. In this companion report, we report a fast, accurate, newly developed analytical method for measurement of trace alpha-emitting actinide elements in commercial high-activity molybdenum-99 solution. Molybdenum-99 is widely used to produce 99mTc for medical imaging. Because it is used as a radiopharmaceutical, its purity must be proven to be extremely high, particularly for the alpha emitting actinides. The sample of 99Mo solution is measured into a vessel (such as a polyethylene centrifuge tube) and acidified with dilute nitric acid. A gadolinium carrier is added (50 µg). Tracers and spikes are added as necessary. Then the solution is made strongly basic with ammonium hydroxide, which causes the gadolinium carrier to precipitate as hydrous Gd(OH)3. The precipitate of Gd(OH)3 carries all of the actinide elements. The suspension of gadolinium hydroxide is then passed through a membrane filter to make a counting mount suitable for direct alpha spectrometry. The high-activity 99Mo and 99mTc pass through the membrane filter and are separated from the alpha emitters. The gadolinium hydroxide, carrying any trace actinide elements that might be present in the sample, forms a thin, uniform cake on the surface of the membrane filter. The filter cake is first washed with dilute ammonium hydroxide to push the last traces of molybdate through, then with water. The filter is then mounted on a stainless steel counting disk. Finally, the alpha emitting actinide elements are measured by alpha spectrometry.

  10. Measurement of Actinides in Molybdenum-99 Solution Analytical Procedure

    International Nuclear Information System (INIS)

    Soderquist, Chuck Z.; Weaver, Jamie L.

    2015-01-01

    This document is a companion report to a previous report, PNNL 24519, Measurement of Actinides in Molybdenum-99 Solution, A Brief Review of the Literature, August 2015. In this companion report, we report a fast, accurate, newly developed analytical method for measurement of trace alpha-emitting actinide elements in commercial high-activity molybdenum-99 solution. Molybdenum-99 is widely used to produce 99mTc for medical imaging. Because it is used as a radiopharmaceutical, its purity must be proven to be extremely high, particularly for the alpha emitting actinides. The sample of 99Mo solution is measured into a vessel (such as a polyethylene centrifuge tube) and acidified with dilute nitric acid. A gadolinium carrier is added (50 µg). Tracers and spikes are added as necessary. Then the solution is made strongly basic with ammonium hydroxide, which causes the gadolinium carrier to precipitate as hydrous Gd(OH)3. The precipitate of Gd(OH)3 carries all of the actinide elements. The suspension of gadolinium hydroxide is then passed through a membrane filter to make a counting mount suitable for direct alpha spectrometry. The high-activity 99Mo and 99mTc pass through the membrane filter and are separated from the alpha emitters. The gadolinium hydroxide, carrying any trace actinide elements that might be present in the sample, forms a thin, uniform cake on the surface of the membrane filter. The filter cake is first washed with dilute ammonium hydroxide to push the last traces of molybdate through, then with water. The filter is then mounted on a stainless steel counting disk. Finally, the alpha emitting actinide elements are measured by alpha spectrometry.

  11. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    Science.gov (United States)

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, several other saccharide materials can actually be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database presents data also from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples.

  12. The Analytical Pragmatic Structure of Procedural Due Process: A Framework for Inquiry in Administrative Decision Making.

    Science.gov (United States)

    Fisher, James E.; Sealey, Ronald W.

    The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…

  13. Analytical procedures for bulk frozen-hydrated biological tissues

    International Nuclear Information System (INIS)

    Echlin, P.; Hayes, T.L.; McKoon, M.

    1983-01-01

    The main advantage of using solid frozen samples for elemental x-ray microanalysis is the ease with which they may be prepared and maintained in the frozen-hydrated state. Within the limits imposed by the reduced spatial resolution of the method, the morphological identification of the tissue components is comparatively easy. Bearing in mind these limitations, the authors have carried out an analysis for several elements in the developing root tips of Lemna minor L (Duckweed). Fresh root tips of Lemna minor L, briefly encapsulated in a polymeric cryoprotectant, are quench frozen in melting nitrogen at ca. 70 K and transferred to the pre-cooled cold stage of an AMray Biochamber. The analysis was carried out by means of a Kevex energy-dispersive detector by use of the peak-to-background ratio method. These procedures allow the authors to obtain flat fracture faces in which they have been able to measure the relative concentrations of various elements at the various stages of differentiation in the root tissue

  14. External exposure from radionuclides in soil: analytical vs. simulation procedures

    International Nuclear Information System (INIS)

    Velasco, Hugo; Rizzotto, Marcos

    2008-01-01

    Full text: The external gamma irradiation resulting from radionuclides deposited on the ground surface can be an important source of radiation exposure. The assessment of this irradiation is extremely complex due to the large number of environmental factors which affect the gamma photon flux in air originating from the ground. The source energy affects the interaction between the radiation and the medium, and the characteristics and properties of the soil are the most relevant factors determining the energy and the angular distribution of gamma radiation in air 1 m above the ground surface. From an analytical point of view the calculations are based on the point-kernel integration method and assume that the source concentration at any depth in soil is uniform over an infinite surface parallel to the ground plane. The dose-rate factor is applied to environmental dose assessments by means of the general equation H(t) = χ(t) × DRF, where H is the external dose rate at time t, χ is the source concentration at the location of the exposed individual, and DRF is the dose-rate factor. Dose-rate factors in air at a height of 1 m above ground are tabulated for discrete photon energies between 0.01 and 10 MeV and for source depths in soil between 0 and 300 cm. These factors were determined for sources distributed in a slab of finite thickness and for sources which are exponentially distributed with depth. A Monte Carlo algorithm was developed to simulate the gamma photon transport calculation for the soil/air configuration. In this case the soil constituents were assumed to be similar to those of the earth's crust. The model considers the gamma photon source distributed uniformly in the soil profile, from the ground surface to a depth beyond which the soil is considered uncontaminated. Source gamma photons were randomly selected from the contaminated soil zone and their subsequent interactions determined by the probability of occurrence via the photoelectric effect, Compton scattering or pair production
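
    The general relation H(t) = χ(t) × DRF can be turned into a one-line calculation once dose-rate factors are tabulated; the sketch below sums χ × DRF over nuclides, with placeholder concentrations and factors that are not values from the paper.

```python
# Minimal sketch of applying tabulated dose-rate factors: H(t) = chi(t) * DRF, summed
# over nuclides. Concentrations and DRF values are placeholders, not data from the paper.

# soil concentration at time t (Bq per m^2) and dose-rate factor ((nSv/h) per (Bq/m^2))
inventory = {
    "Cs-137": {"chi": 5.0e3, "drf": 2.1e-3},   # hypothetical values
    "Co-60":  {"chi": 1.0e3, "drf": 8.0e-3},
}

def external_dose_rate(inv):
    """External dose rate 1 m above ground as the sum of chi * DRF over nuclides."""
    return sum(v["chi"] * v["drf"] for v in inv.values())

if __name__ == "__main__":
    print(f"H = {external_dose_rate(inventory):.1f} nSv/h")
```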

  15. Validity, reliability and support for implementation of independence-scaled procedural assessment in laparoscopic surgery.

    Science.gov (United States)

    Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N

    2016-06-01

    There is no widely used method to evaluate procedure-specific laparoscopic skills. The first aim of this study was to develop a procedure-based assessment method. The second aim was to compare its validity, reliability and feasibility with currently available global rating scales (GRSs). An independence-scaled procedural assessment was created by linking the procedural key steps of the laparoscopic cholecystectomy to an independence scale. Subtitled and blinded videos of a novice, an intermediate and an almost competent trainee, were evaluated with GRSs (OSATS and GOALS) and the independence-scaled procedural assessment by seven surgeons, three senior trainees and six scrub nurses. Participants received a short introduction to the GRSs and independence-scaled procedural assessment before assessment. The validity was estimated with the Friedman and Wilcoxon test and the reliability with the intra-class correlation coefficient (ICC). A questionnaire was used to evaluate user opinion. Independence-scaled procedural assessment and GRS scores improved significantly with surgical experience (OSATS p = 0.001, GOALS p < 0.001, independence-scaled procedural assessment p < 0.001). The ICCs of the OSATS, GOALS and independence-scaled procedural assessment were 0.78, 0.74 and 0.84, respectively, among surgeons. The ICCs increased when the ratings of scrub nurses were added to those of the surgeons. The independence-scaled procedural assessment was not considered more of an administrative burden than the GRSs (p = 0.692). A procedural assessment created by combining procedural key steps to an independence scale is a valid, reliable and acceptable assessment instrument in surgery. In contrast to the GRSs, the reliability of the independence-scaled procedural assessment exceeded the threshold of 0.8, indicating that it can also be used for summative assessment. It furthermore seems that scrub nurses can assess the operative competence of surgical trainees.

  16. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  17. Analytical procedures for determining Pb and Sr isotopic compositions in water samples by ID-TIMS

    Directory of Open Access Journals (Sweden)

    Veridiana Martins

    2008-01-01

    Full Text Available Few articles deal with lead and strontium isotopic analysis of water samples. The aim of this study was to define the chemical procedures for Pb and Sr isotopic analyses of groundwater samples from an urban sedimentary aquifer. Thirty lead and fourteen strontium isotopic analyses were performed to test different analytical procedures. Pb and Sr isotopic ratios as well as Sr concentration did not vary using different chemical procedures. However, the Pb concentrations were very dependent on the different procedures. Therefore, the choice of the best analytical procedure was based on the Pb results, which indicated a higher reproducibility from samples that had been filtered and acidified before the evaporation, had their residues totally dissolved, and were purified by ion chromatography using the Biorad® column. Our results showed no changes in Pb ratios with the storage time.

  18. Assessment of passive drag in swimming by numerical simulation and analytical procedure.

    Science.gov (United States)

    Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A

    2018-03-01

    The aim was to compare the passive drag-gliding underwater by a numerical simulation and an analytical procedure. An Olympic swimmer was scanned by computer tomography and modelled gliding at a 0.75-m depth in the streamlined position. Steady-state computer fluid dynamics (CFD) analyses were performed on Fluent. A set of analytical procedures was selected concurrently. Friction drag (Df), pressure drag (Dpr), total passive drag force (Df+pr) and drag coefficient (CD) were computed between 1.3 and 2.5 m·s⁻¹ by both techniques. Df+pr ranged from 45.44 to 144.06 N with CFD, and from 46.03 to 167.06 N with the analytical procedure (differences: from 1.28% to 13.77%). CD ranged between 0.698 and 0.622 by CFD, and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for Df+pr plotted in absolute values (R² = 0.98) and after log-log transformation (R² = 0.99). The CD also obtained a very high adjustment for both absolute (R² = 0.97) and log-log plots (R² = 0.97). The bias for Df+pr was 8.37 N, and 0.076 N after logarithmic transformation. Df represented between 15.97% and 18.82% of Df+pr by the CFD, and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gathering insight on one's hydrodynamic characteristics.
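
    The analytical procedure itself is not given in the record; as a generic illustration of how the reported drag coefficients relate to drag force, the sketch below applies the standard relation D = 0.5·ρ·v²·CD·A with an assumed coefficient and reference area.

```python
# Generic sketch relating drag coefficient and drag force, D = 0.5 * rho * v^2 * Cd * A.
# The reference area and coefficient below are assumptions for illustration, not the
# swimmer-specific values used in the study.
RHO_WATER = 997.0   # kg/m^3

def drag_force(v, cd, area):
    """Passive drag (N) at gliding speed v (m/s) for drag coefficient cd and area (m^2)."""
    return 0.5 * RHO_WATER * v**2 * cd * area

if __name__ == "__main__":
    cd, area = 0.65, 0.1   # illustrative drag coefficient and reference area
    for v in (1.3, 1.9, 2.5):
        print(f"v = {v:.1f} m/s -> D ~ {drag_force(v, cd, area):.1f} N")
```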

  19. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    Science.gov (United States)

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists - the decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups - metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires that were sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools - NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
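
    The record does not reproduce the PROMETHEE calculations; the sketch below shows the outranking-flow mechanics on three mock procedures and three mock criteria, using the simplest ("usual") preference function. Alternatives, criteria, weights and scores are invented for illustration.

```python
# Minimal PROMETHEE II sketch with the "usual" preference function. The three mock
# procedures, criteria, directions and weights are invented for illustration; the study
# used 25 procedures and nine criteria with expert-derived weights.
alternatives = ["SPME-based", "LLE-based", "SPE-based"]
criteria = ["LOD (ng/L)", "cost", "solvent use (mL)"]   # all to be minimized here
weights = [0.4, 0.2, 0.4]
minimize = [True, True, True]
scores = [
    [5.0, 3.0,  1.0],    # SPME-based
    [2.0, 2.0, 30.0],    # LLE-based
    [3.0, 4.0, 10.0],    # SPE-based
]

def preference(a, b, j):
    """Usual criterion: full preference for any strictly better value."""
    d = scores[a][j] - scores[b][j]
    if minimize[j]:
        d = -d
    return 1.0 if d > 0 else 0.0

n = len(alternatives)
pi = [[sum(w * preference(a, b, j) for j, w in enumerate(weights)) if a != b else 0.0
       for b in range(n)] for a in range(n)]
phi_plus = [sum(pi[a]) / (n - 1) for a in range(n)]
phi_minus = [sum(pi[b][a] for b in range(n)) / (n - 1) for a in range(n)]
net = [p - m for p, m in zip(phi_plus, phi_minus)]

for name, flow in sorted(zip(alternatives, net), key=lambda kv: -kv[1]):
    print(f"{name:12s} net flow = {flow:+.2f}")
```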

  20. A Simplified Procedure for Reliability Estimation of Underground Concrete Barriers against Normal Missile Impact

    Directory of Open Access Journals (Sweden)

    N. A. Siddiqui

    2011-06-01

    Full Text Available Underground concrete barriers are frequently used to protect strategic structures like nuclear power plants (NPP), deep under the soil, against any possible high-velocity missile impact. For a given range and type of missile (or projectile) it is of paramount importance to examine the reliability of underground concrete barriers under the expected uncertainties involved in the missile, concrete, and soil parameters. In this paper, a simple procedure for the reliability assessment of underground concrete barriers against normal missile impact has been presented using the First Order Reliability Method (FORM). The presented procedure is illustrated by applying it to a concrete barrier that lies at a certain depth in the soil. Some parametric studies are also conducted to obtain the design values which make the barrier as reliable as desired.
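
    FORM details are not given in the record; as a simplified illustration of the underlying idea, the sketch below computes a reliability index and failure probability for a linear capacity-minus-demand limit state with independent normal variables. The statistics are invented, and the paper's actual limit state for missile penetration is nonlinear and solved iteratively.

```python
# Illustrative first-order reliability calculation for a linear limit state g = R - S
# with independent normal capacity R and demand S. The full FORM used in the paper
# handles a nonlinear penetration limit state; the statistics below are invented.
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def reliability_index(mu_r, sig_r, mu_s, sig_s):
    return (mu_r - mu_s) / math.sqrt(sig_r**2 + sig_s**2)

if __name__ == "__main__":
    # hypothetical barrier capacity vs missile demand (e.g., penetration depth margin)
    beta = reliability_index(mu_r=1.2, sig_r=0.15, mu_s=0.8, sig_s=0.20)
    print(f"beta = {beta:.2f}, Pf = {phi(-beta):.2e}")
```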

  1. Adjoint sensitivity analysis procedure of Markov chains with applications on reliability of IFMIF accelerator-system facilities

    Energy Technology Data Exchange (ETDEWEB)

    Balan, I.

    2005-05-01

    This work presents the implementation of the Adjoint Sensitivity Analysis Procedure (ASAP) for Continuous Time, Discrete Space Markov chains (CTMC), as an alternative to other computationally expensive methods. In order to develop this procedure as an end product in reliability studies, the reliability of the physical systems is analyzed using a coupled Fault-Tree - Markov chain technique, i.e. the abstraction of the physical system is performed using the Fault-Tree as the high-level interface, which is then automatically converted into a Markov chain. The resulting differential equations based on the Markov chain model are solved in order to evaluate the system reliability. Further sensitivity analyses using ASAP applied to the CTMC equations are performed to study the influence of uncertainties in input data on the reliability measures and to gain confidence in the final reliability results. The methods to generate the Markov chain and the ASAP for the Markov chain equations have been implemented into the new computer code system QUEFT/MARKOMAGS/MCADJSEN for reliability and sensitivity analysis of physical systems. The validation of this code system has been carried out by using simple problems for which analytical solutions can be obtained. Typical sensitivity results show that the numerical solution using ASAP is robust, stable and accurate. The method and the code system developed during this work can be used further as an efficient and flexible tool to evaluate the sensitivities of reliability measures for any physical system analyzed using the Markov chain. Reliability and sensitivity analyses using these methods have been performed during this work for the IFMIF Accelerator System Facilities. The reliability studies using the Markov chain have been concentrated on the availability of the main subsystems of this complex physical system for a typical mission time. The sensitivity studies for two typical responses using ASAP have also been performed
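
    The coupled fault-tree/Markov-chain machinery of the code system is not reproduced in the record; the following sketch shows only the elementary Markov building block, the transient availability of a single repairable component, with illustrative failure and repair rates.

```python
# Minimal sketch of the Markov-chain reliability step: a single repairable component
# modelled as a two-state CTMC with failure rate lam and repair rate mu. Rates are
# illustrative; the code system described in the record handles full fault-tree-derived
# chains and their adjoint sensitivities.
import math

def availability(lam, mu, t):
    """Point availability A(t) of a two-state Markov model, with A(0) = 1."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

if __name__ == "__main__":
    lam, mu = 1e-3, 1e-1     # per hour (illustrative)
    for t in (10.0, 100.0, 1000.0):
        print(f"A({t:g} h) = {availability(lam, mu, t):.5f}")
    print(f"steady-state availability = {mu / (lam + mu):.5f}")
```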

  2. Reliability of stellar inclination estimated from asteroseismology: analytical criteria, mock simulations and Kepler data analysis

    Science.gov (United States)

    Kamiaka, Shoya; Benomar, Othman; Suto, Yasushi

    2018-05-01

    Advances in asteroseismology of solar-like stars now provide a unique method to estimate the stellar inclination i⋆. This enables the evaluation of the spin-orbit angle of transiting planetary systems, in a complementary fashion to the Rossiter-McLaughlin effect, a well-established method to estimate the projected spin-orbit angle λ. Although the asteroseismic method has been broadly applied to the Kepler data, its reliability has yet to be assessed intensively. In this work, we evaluate the accuracy of i⋆ from asteroseismology of solar-like stars using 3000 simulated power spectra. We find that the low signal-to-noise ratio of the power spectra induces a systematic under-estimate (over-estimate) bias for stars with high (low) inclinations. We derive analytical criteria for a reliable asteroseismic estimate, which indicate that reliable measurements are possible in the range of 20° ≲ i⋆ ≲ 80° only for stars with a high signal-to-noise ratio. We also analyse and measure the stellar inclination of 94 Kepler main-sequence solar-like stars, among which 33 are planetary hosts. According to our reliability criteria, a third of them (9 with planets, 22 without) have accurate stellar inclinations. Comparison of our asteroseismic estimate of v sin i⋆ against spectroscopic measurements indicates that the latter suffers from a large uncertainty, possibly due to the modelling of macro-turbulence, especially for stars with projected rotation speed v sin i⋆ ≲ 5 km/s. This reinforces earlier claims, and stellar inclinations estimated from the combination of measurements from spectroscopy and photometric variation for slowly rotating stars need to be interpreted with caution.

  3. Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom

    Science.gov (United States)

    Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy

    2016-01-01

    The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…

  4. Measurement of HDO Products Using GC-TCD: Towards Obtaining Reliable Analytical Data

    Directory of Open Access Journals (Sweden)

    Zuas Oman

    2018-03-01

    Full Text Available This paper reports the method development and validation of a gas chromatography with thermal conductivity detector (GC-TCD) method for the measurement of the gaseous products of hydrodeoxygenation (HDO). The method validation parameters include selectivity, precision (repeatability and reproducibility), accuracy, linearity, limit of detection (LoD), limit of quantitation (LoQ), and robustness. The results showed that the developed method was able to separate the target components (H2, CO2, CH4 and CO) from their mixtures without any special sample treatment. The validated method was selective, precise, accurate, and robust. Application of the developed and validated GC-TCD method to the measurement of by-product components of HDO of bio-oil revealed a good performance with relative standard deviation (RSD) less than 1.0% for all target components, implying that the process of method development and validation provides a trustworthy way of obtaining reliable analytical data.

  5. Analytical quality assurance procedures developed for the IAEA's Reference Asian Man Project (Phase 2)

    International Nuclear Information System (INIS)

    Kawamura, H.; Parr, R.M.; Dang, H.S.; Tian, W.; Barnes, R.M.; Iyengar, G.V.

    2000-01-01

    Analytical quality assurance procedures adopted for use in the IAEA Co-ordinated Research Project on Ingestion and Organ Content of Trace Elements of Importance in Radiological Protection are designed to ensure comparability of the analytical results for Cs, I, Sr, Th, U and other elements in human tissues and diets collected and analysed in nine participating countries. The main analytical techniques are NAA and ICP-MS. For sample preparation, all participants are using identical food blenders which have been centrally supplied after testing for contamination. For quality control of the analyses, six NIST SRMs covering a range of matrices with certified and reference values for the elements of interest have been distributed. A new Japanese reference diet material has also been developed. These quality assurance procedures are summarized here and new data are presented for Cs, I, Sr, Th and U in the NIST SRMs. (author)

  6. Reliability

    African Journals Online (AJOL)

    RELIABILITY ... given by the code of practice. However, checks must ... an optimization procedure over the failure domain F corresponding ... of Concrete Members based on Utility Theory, Technical ...

  7. Systems reliability analyses and risk analyses for the licencing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licensing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant as a whole needs to be assessed, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement is the consequence of the safety criteria for nuclear power plants issued by the Home Department (BMI). Systems reliability studies and risk analyses used in licensing procedures under atomic law are identified. The stress is on licensing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licensing decisions are shown by means of examples. (orig./HP) [de

  8. A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network

    International Nuclear Information System (INIS)

    Zubair, Muhammad

    2014-01-01

    By using the analytic hierarchy process (AHP) and Bayesian networks (BN), the present research examines the technical and non-technical issues of nuclear accidents. The study revealed that technical faults were one major reason for these accidents. From another point of view, it becomes clear that human behavior such as dishonesty, insufficient training, and selfishness also plays a key role in causing these accidents. In this study, a hybrid approach for reliability analysis based on AHP and BN to increase nuclear power plant (NPP) safety has been developed. By using AHP, the best alternatives to improve safety, design and operation, and to allocate budget for all technical and non-technical factors related to nuclear safety, have been investigated. We use a special structure of BN based on the AHP method. The graphs of the BN and the probabilities associated with nodes are designed to translate the knowledge of experts on the selection of the best alternative. The results show that improvement in regulatory authorities will decrease failure probabilities and increase safety and reliability in the industrial area.
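
    The record does not give the pairwise comparison matrices used; the sketch below illustrates the AHP step alone, deriving priority weights and a consistency ratio from a made-up 3x3 judgement matrix by power iteration.

```python
# Minimal AHP sketch: derive priority weights from a pairwise comparison matrix and
# check its consistency ratio. The 3x3 judgement matrix below is a made-up example
# (alternatives could be, e.g., training, design upgrades, regulatory oversight).
import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

def ahp_weights(a, iters=100):
    """Principal-eigenvector weights by power iteration, plus the consistency ratio."""
    a = np.asarray(a, dtype=float)
    n = a.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = a @ w
        w /= w.sum()
    lam_max = float((a @ w / w).mean())
    ci = (lam_max - n) / (n - 1)
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return w, cr

if __name__ == "__main__":
    pairwise = [[1.0, 3.0, 5.0],
                [1/3, 1.0, 2.0],
                [1/5, 1/2, 1.0]]
    weights, cr = ahp_weights(pairwise)
    print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```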

  9. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    Science.gov (United States)

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in a close counting geometry for gamma spectroscopy. It included the MCNP-CP code in order to calculate the coincidence summing correction factor (CSF). The CSF results were validated by a deterministic method using the ETNA code for both p-type HPGe detectors, and good agreement was found between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The results of the radioactivity measurement with both detectors using the advanced analytical procedure received "Accepted" status following the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. CONSIDERATIONS FOR THE TREATMENT OF COMPUTERIZED PROCEDURES IN HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-07-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  11. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  12. Development of an Analytical Procedure for the Determination of Multiclass Compounds for Forensic Veterinary Toxicology.

    Science.gov (United States)

    Sell, Bartosz; Sniegocki, Tomasz; Zmudzki, Jan; Posyniak, Andrzej

    2018-04-01

    Reported here is a new analytical multiclass method based on QuEChERS technique, which has proven to be effective in diagnosing fatal poisoning cases in animals. This method has been developed for the determination of analytes in liver samples comprising rodenticides, carbamate and organophosphorus pesticides, coccidiostats and mycotoxins. The procedure entails addition of acetonitrile and sodium acetate to 2 g of homogenized liver sample. The mixture was shaken intensively and centrifuged for phase separation, which was followed by an organic phase transfer into a tube containing sorbents (PSA and C18) and magnesium sulfate, then it was centrifuged, the supernatant was filtered and analyzed by liquid chromatography tandem mass spectrometry. A validation of the procedure was performed. Repeatability variation coefficients forensic toxicology cases.

  13. Definition, development, and demonstration of analytical procedures for the structured assessment approach. Final report

    International Nuclear Information System (INIS)

    1979-01-01

    Analytical procedures were refined for the Structured Assessment Approach for assessing the Material Control and Accounting systems at facilities that contain special nuclear material. Requirements were established for an efficient, feasible algorithm to be used in evaluating system performance measures that involve the probability of detection. Algorithm requirements to calculate the probability of detection for a given type of adversary and target set are described

  14. A review of simple multiple criteria decision making analytic procedures which are implementable on spreadsheet packages

    Directory of Open Access Journals (Sweden)

    T.J. Stewart

    2003-12-01

    Full Text Available A number of modern multi-criteria decision making aids for the discrete choice problem, are reviewed, with particular emphasis on those which can be implemented on standard commercial spreadsheet packages. Three broad classes of procedures are discussed, namely the analytic hierarchy process, reference point methods, and outranking methods. The broad principles are summarised in a consistent framework, and on a spreadsheet. LOTUS spreadsheets implementing these are available from the author.

  15. Methodological procedures and analytical instruments to evaluate an indicators integrated archive for urban management

    International Nuclear Information System (INIS)

    Del Ciello, R.; Napoleoni, S.

    1998-01-01

    This guide provides the results of research carried out at the ENEA (National Agency for New Technology, Energy and the Environment) Casaccia center (Rome, Italy) aimed at defining the methodological procedures and analytical instruments needed to build an integrated archive of indicators for urban management. The guide also defines the scheme of a negotiation process aimed at reaching and exchanging data and information among governmental and local administrations, non-governmental organizations and scientific bodies [it

  16. Simple and reliable procedure for the evaluation of short-term dynamic processes in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D P

    1986-10-01

    An efficient approach is presented to the solution of the short-term dynamics model in power systems. It consists of an adequate algebraic treatment of the original system of nonlinear differential equations, using linearization, decomposition and Cauchy's formula. The simple difference equations obtained in this way are incorporated into a model of the electrical network, which is of a low order compared to the ones usually used. Newton's method is applied to the model formed in this way, which leads to a simple and reliable iterative procedure. The characteristics of the procedure developed are demonstrated on examples of transient stability analysis of real power systems. 12 refs.

  17. Validity, reliability, feasibility, acceptability and educational impact of direct observation of procedural skills (DOPS).

    Science.gov (United States)

    Naeem, Naghma

    2013-01-01

    Direct observation of procedural skills (DOPS) is a new workplace-based assessment tool. The aim of this narrative review of the literature is to summarize the available evidence about the validity, reliability, feasibility, acceptability and educational impact of DOPS. A search of the PubMed database and Google for literature on DOPS published from January 2000 to January 2012 was conducted, which yielded 30 articles. Thirteen articles were selected for full-text reading and review. In the reviewed literature, DOPS was found to be a useful tool for the assessment of procedural skills, but further research is required to prove its utility as a workplace-based assessment instrument.

  18. The Usefulness of Analytical Procedures - An Empirical Approach in the Auditing Sector in Portugal

    Directory of Open Access Journals (Sweden)

    Carlos Pinho

    2014-08-01

    Full Text Available The conceptual conflict between the efficiency and efficacy on financial auditing arises from the fact that resources are scarce, both in terms of the time available to carry out the audit and the quality and timeliness of the information available to the external auditor. Audits tend to be more efficient, the lower the combination of inherent risk and control risk is assessed to be, allowing the auditor to carry out less extensive and less timely auditing tests, meaning that in some cases analytical audit procedures are a good tool to support the opinions formed by the auditor. This research, by means of an empirical study of financial auditing in Portugal, aims to evaluate the extent to which analytical procedures are used during a financial audit engagement in Portugal, throughout the different phases involved in auditing. The conclusions point to the fact that, in general terms and regardless of the size of the audit company and the way in which professionals work, Portuguese auditors use analytical procedures more frequently during the planning phase rather than during the phase of evidence gathering and the phase of opinion formation.

  19. General analytical procedure for determination of acidity parameters of weak acids and bases.

    Science.gov (United States)

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for components of mixtures of weak organic acids and bases of diverse chemical classes in water solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement has been obtained between the determined pKa values and the reference literature data for the compounds studied.
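
    The paper's numerical processing of full multi-component titration curves is not reproduced here; the sketch below shows only the simplest single-point estimate for a monoprotic weak acid via the Henderson-Hasselbalch relation, with invented titration data.

```python
# Sketch of the simplest single-point pKa estimate from a titration of a monoprotic
# weak acid with strong base, via Henderson-Hasselbalch: pKa = pH - log10([A-]/[HA]).
# The paper's method fits whole multi-component titration curves; the numbers here
# are invented for illustration.
import math

def pka_from_point(ph, acid_mmol, base_added_mmol):
    """pKa estimate at one titration point (before the equivalence point)."""
    a_minus = base_added_mmol                 # mmol of acid converted to conjugate base
    ha = acid_mmol - base_added_mmol          # mmol of acid remaining
    if ha <= 0 or a_minus <= 0:
        raise ValueError("point must lie strictly between start and equivalence")
    return ph - math.log10(a_minus / ha)

if __name__ == "__main__":
    # e.g. pH read 4.95 after adding 0.60 mmol NaOH to 1.00 mmol of a weak acid
    print(f"pKa ~ {pka_from_point(4.95, 1.00, 0.60):.2f}")
```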

  20. Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

    Full Text Available Separation procedures in drug Distribution Centers (DC) are manual-based activities prone to failures such as shipping exchanged, expired or broken drugs to the customer. Two interventions seem promising in improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of potential failure modes incurred by the selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) aimed at reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable generating an index to identify the operators recommended to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. It also splits the traditional FMEA severity index into two sub-indexes related to financial issues and damage to the company's image in order to characterize the severity of failures. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
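
    Neither the learning-curve model nor the FMEA worksheet is reproduced in the record; the sketch below illustrates the two building blocks with mock data: fitting a power-law (Wright-type) learning curve to an operator's separation times and computing a classical risk priority number.

```python
# Sketch of the two building blocks named in the abstract: fitting a power-law (Wright)
# learning curve t = a * x**b to an operator's separation times, and computing an FMEA
# risk priority number RPN = severity * occurrence * detection. All data are invented.
import numpy as np

def fit_learning_curve(units, times):
    """Least-squares fit of t = a * x**b in log-log space; returns (a, b)."""
    b, log_a = np.polyfit(np.log(units), np.log(times), 1)
    return float(np.exp(log_a)), float(b)

def rpn(severity, occurrence, detection):
    """Classical FMEA risk priority number (each index usually rated 1-10)."""
    return severity * occurrence * detection

if __name__ == "__main__":
    units = np.array([1, 2, 4, 8, 16, 32])
    times = np.array([10.0, 8.3, 7.1, 6.0, 5.2, 4.4])   # minutes per order (mock data)
    a, b = fit_learning_curve(units, times)
    print(f"learning curve: t = {a:.2f} * x^{b:.3f}")
    print("RPN for 'expired drug shipped':", rpn(severity=9, occurrence=3, detection=4))
```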

  1. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
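
    As a concrete illustration of the kind of reliability estimate the authors discuss (they point to open-source software, such as the R psych package; the snippet below is an independent Python sketch, not their code), Cronbach's alpha can be computed directly from an item-score matrix:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                              # number of items
        item_variances = items.var(axis=0, ddof=1)      # variance of each item
        total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
        return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

    # Five respondents answering a four-item scale (made-up data)
    scores = [[3, 4, 3, 4],
              [2, 2, 3, 2],
              [5, 4, 5, 5],
              [1, 2, 1, 2],
              [4, 4, 4, 3]]
    print(round(cronbach_alpha(scores), 3))
    ```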

  2. Pilot testing of SHRP 2 reliability data and analytical products: Washington.

    Science.gov (United States)

    2014-07-30

    The second Strategic Highway Research Program (SHRP 2) addresses the challenges of moving people and goods efficiently and safely on the nation's highways. In its Reliability focus area, the research emphasizes improving the reliability of highway ...

  3. Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.

    Science.gov (United States)

    Blake, Christopher J

    2007-09-01

    Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly determine water-soluble vitamin content in fortified foods, both for compliance monitoring and to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for the analysis of B vitamins. However, they are no longer considered the gold standard in vitamin analysis, as many studies have revealed their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular, it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview of multivitamin extractions and analyses for foods and supplements is also given.

  4. A procedure to obtain reliable pair distribution functions of non-crystalline materials from diffraction data

    International Nuclear Information System (INIS)

    Hansen, F.Y.; Carneiro, K.

    1977-01-01

    A simple numerical method, which unifies the calculation of structure factors from X-ray or neutron diffraction data with the calculation of reliable pair distribution functions, is described. The objective of the method is to eliminate systematic errors in the normalizations and corrections of the intensity data, and to provide measures for elimination of truncation errors without losing information about the structure. This is done through an iterative procedure, which is easy to program for computers. The applications to amorphous selenium and diatomic liquids are briefly reviewed. (Auth.)

  5. Analytical procedures for the determination of strontium radionuclides in environmental materials

    International Nuclear Information System (INIS)

    Harvey, B.R.; Ibbett, R.D.; Lovett, M.B.; Williams, K.J.

    1989-01-01

    As part of its statutory role in the authorisation, monitoring and research relating to radioactive wastes discharged into the aquatic environment, the Aquatic Environment Protection Division of the Directorate of Fisheries Research (DFR), Lowestoft routinely carries out analyses for a substantial number of radionuclides in a wide range of environmental materials. The Ministry of Agriculture, Fisheries and Food has for many years required information about the concentrations of strontium radionuclides in waters, sediments and biological materials. There are no absolute standard methods for such radiochemical analysis; indeed, none are required, because the methodology is continually developing. A very considerable amount of expertise has been developed in the analysis of radiostrontium at the Laboratory since the late 1950s, when detailed analysis first commenced, and the procedures described in this report have been developed and tested over a long period of time with a view to achieving the highest analytical quality. Full details of the practical, analytical and computational procedures, as currently used, are given in the Appendix. (author)

  6. An analytical inductor design procedure for three-phase PWM converters in power factor correction applications

    DEFF Research Database (Denmark)

    Kouchaki, Alireza; Niroumand, Farideh Javidi; Haase, Frerk

    2015-01-01

    This paper presents an analytical method for designing the inductor of three-phase power factor correction converters (PFCs). The complex behavior of the inductor current complicates the inductor design procedure as well as the core loss and copper loss calculations. Therefore, this paper analyzes ... to calculate the core loss in the PFC application. A ... circuit is used to provide the inductor current harmonic spectrum; using this harmonic spectrum, the low and high frequency copper losses are calculated. The high frequency minor B-H loops in one switching cycle are also analyzed. Then, the loss map provided by the measurement setup is used ... To investigate the impact of the dc link voltage level, two inductors for different dc voltage levels are designed and the results are compared.

  7. Metrological reliability of the calibration procedure in terms of air kerma using the ionization chamber NE2575

    International Nuclear Information System (INIS)

    Guimaraes, Margarete Cristina; Silva, Teogenes Augusto da; Rosado, Paulo H.G.

    2016-01-01

    Metrology laboratories are expected to provide X radiation beams that were established by international standardization organizations to perform calibration and testing of dosimeters. Reliable and traceable standard dosimeters should be used in the calibration procedure. The aim of this work was to study the reliability of the NE 2575 ionization chamber used as standard dosimeter for the air kerma calibration procedure adopted in the CDTN Calibration Laboratory. (author)

  8. The validity and reliability of value-added and target-setting procedures with special reference to Key Stage 3

    OpenAIRE

    Moody, Ian Robin

    2003-01-01

    The validity of value-added systems of measurement is crucially dependent upon there being a demonstrably unambiguous relationship between the so-called baseline, or intake measures, and any subsequent measure of performance at a later stage. The reliability of such procedures is dependent on the relationships between these two measures being relatively stable over time. A number of questions arise with regard to both the validity and reliability of value-added procedures at any level in educ...

  9. Establishing Reliable Cognitive Change in Children with Epilepsy: The Procedures and Results for a Sample with Epilepsy

    Science.gov (United States)

    van Iterson, Loretta; Augustijn, Paul B.; de Jong, Peter F.; van der Leij, Aryan

    2013-01-01

    The goal of this study was to investigate reliable cognitive change in epilepsy by developing computational procedures to determine reliable change index scores (RCIs) for the Dutch Wechsler Intelligence Scales for Children. First, RCIs were calculated based on stability coefficients from a reference sample. Then, these RCIs were applied to a…
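
    The abstract does not give the exact formula used for the Dutch WISC RCIs, so the sketch below only illustrates the widely used Jacobson-Truax style computation of a reliable change index from a test's standard deviation and a stability (test-retest) coefficient; the scores and coefficients are invented.

    ```python
    import math

    def reliable_change_index(x1, x2, sd, r_xx):
        """Change score divided by the standard error of the difference,
        derived from the test SD and a stability coefficient."""
        sem = sd * math.sqrt(1.0 - r_xx)        # standard error of measurement
        se_diff = math.sqrt(2.0) * sem          # standard error of the difference
        return (x2 - x1) / se_diff

    # Hypothetical IQ retest: SD = 15, stability coefficient 0.85
    rci = reliable_change_index(x1=95, x2=104, sd=15, r_xx=0.85)
    print(round(rci, 2), "reliable at the 95% level" if abs(rci) > 1.96 else "not reliable")
    ```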

  10. Establishing reliable cognitive change in children with epilepsy: The procedures and results for a sample with epilepsy

    NARCIS (Netherlands)

    van Iterson, L.; Augustijn, P.B.; de Jong, P.F.; van der Leij, A.

    2013-01-01

    The goal of this study was to investigate reliable cognitive change in epilepsy by developing computational procedures to determine reliable change index scores (RCIs) for the Dutch Wechsler Intelligence Scales for Children. First, RCIs were calculated based on stability coefficients from a reference sample.

  11. Differences in metabolite profiles caused by pre-analytical blood processing procedures.

    Science.gov (United States)

    Nishiumi, Shin; Suzuki, Makoto; Kobayashi, Takashi; Yoshida, Masaru

    2018-05-01

    Recently, the use of metabolomic analysis of human serum and plasma for biomarker discovery and disease diagnosis in clinical studies has been increasing. The feasibility of using a metabolite biomarker for disease diagnosis is strongly dependent on the metabolite's stability during pre-analytical blood processing procedures, such as serum or plasma sampling and sample storage prior to centrifugation. However, the influence of blood processing procedures on the stability of metabolites has not been fully characterized. In the present study, we compared the levels of metabolites in matched human serum and plasma samples using gas chromatography coupled with mass spectrometry and liquid chromatography coupled with mass spectrometry. In addition, we evaluated the changes in plasma metabolite levels induced by storage at room temperature or at a cold temperature prior to centrifugation. As a result, it was found that 76 metabolites exhibited significant differences between their serum and plasma levels. Furthermore, the pre-centrifugation storage conditions significantly affected the plasma levels of 45 metabolites. These results highlight the importance of blood processing procedures during metabolome analysis, which should be considered during biomarker discovery and the subsequent use of biomarkers for disease diagnosis. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  12. Pilot testing of SHRP 2 reliability data and analytical products: Southern California.

    Science.gov (United States)

    2015-01-01

    The second Strategic Highway Research Program (SHRP 2) has been investigating the critical subject of travel time reliability for several years. As part of this research, SHRP 2 supported multiple efforts to develop products to evaluate travel time r...

  13. Pilot testing of SHRP 2 reliability data and analytical products: Florida.

    Science.gov (United States)

    2015-01-01

    Transportation agencies have realized the importance of performance estimation, measurement, and management. The Moving Ahead for Progress in the 21st Century Act legislation identifies travel time reliability as one of the goals of the federal highw...

  14. Pilot testing of SHRP 2 reliability data and analytical products: Southern California. [supporting datasets

    Science.gov (United States)

    2014-01-01

    The objective of this project was to develop system designs for programs to monitor travel time reliability and to prepare a guidebook that practitioners and others can use to design, build, operate, and maintain such systems. Generally, such travel ...

  15. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

    High reliability is required of waste management facilities within the fuel cycle of nuclear power stations; this requirement can be met by providing intermediate storage facilities and reserve capacities. In this report a model based on the theory of Markov processes is described which allows the computation of reliability characteristics of waste management facilities containing intermediate storage facilities. The application of the model is demonstrated by an example. (orig.) [de
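
    The report's Markov model is not reproduced in the abstract; as a generic illustration of the approach, the sketch below solves a small continuous-time Markov chain for its steady-state probabilities and an availability figure. The states, transition rates and availability definition are all invented for the example.

    ```python
    import numpy as np

    # Hypothetical 3-state model of a processing line with an intermediate store:
    # 0 = both units up, 1 = processing unit down (store buffering), 2 = store full / plant blocked.
    lam, mu, nu = 0.02, 0.5, 0.1    # illustrative failure, repair, and buffer-saturation rates (1/h)

    Q = np.array([
        [-lam,         lam,  0.0],
        [  mu,  -(mu + nu),   nu],
        [ 0.0,          mu,  -mu],
    ])

    # Steady-state distribution: solve pi @ Q = 0 subject to sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    availability = 1.0 - pi[2]      # plant counted as available unless blocked
    print("steady-state probabilities:", np.round(pi, 4))
    print("availability:", round(availability, 4))
    ```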

  16. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    International Nuclear Information System (INIS)

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L.

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed and found a statistically significant factor-of-two bias on the average.
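
    The first bias test described above (average predicted HEP versus the observed failure fraction) can be framed, in simplified form, as a binomial comparison; in the sketch below the task counts come from the abstract, while the average ASEP HEP is an invented placeholder, not a value from the report.

    ```python
    from scipy.stats import binomtest

    n_tasks, n_failed = 4071, 45            # critical tasks and observed failures (from the abstract)
    mean_asep_hep = 0.022                   # hypothetical average ASEP-predicted HEP, for illustration only

    # Is the observed failure count consistent with the predicted mean HEP?
    result = binomtest(n_failed, n_tasks, p=mean_asep_hep, alternative="two-sided")
    print(f"observed rate {n_failed / n_tasks:.4f} vs. predicted {mean_asep_hep:.4f}: p = {result.pvalue:.2e}")
    ```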

  17. Procedure prediction from symbolic Electronic Health Records via time intervals analytics.

    Science.gov (United States)

    Moskovitch, Robert; Polubriaginof, Fernanda; Weiss, Aviram; Ryan, Patrick; Tatonetti, Nicholas

    2017-11-01

    Prediction of medical events, such as clinical procedures, is essential for preventing disease, understanding disease mechanisms, and increasing patient quality of care. Although longitudinal clinical data from Electronic Health Records provide opportunities to develop predictive models, the use of these data faces significant challenges. Primarily, while the data are longitudinal and represent thousands of conceptual events having duration, they are also sparse, complicating the application of traditional analysis approaches. Furthermore, the framework presented here takes advantage of the events' durations and the gaps between them. International standards for electronic healthcare data represent data elements, such as procedures, conditions, and drug exposures, using eras, or time intervals. Such eras contain both an event and a duration and enable the application of time intervals mining - a relatively new subfield of data mining. In this study, we present Maitreya, a framework for time intervals analytics in longitudinal clinical data. Maitreya discovers frequent time intervals related patterns (TIRPs), which we use as prognostic markers for modelling clinical events. We introduce three novel TIRP metrics that are normalized versions of the horizontal support, which represents the number of TIRP instances per patient. We evaluate Maitreya on 28 frequent and clinically important procedures, using the three novel TIRP representation metrics in comparison to no temporal representation and previous TIRP metrics. We also evaluate the epsilon value that makes Allen's relations more flexible, with settings of 30, 60, 90 and 180 days in comparison to the default of zero. For twenty-two of these procedures, the use of temporal patterns as predictors was superior to non-temporal features, and the use of the vertically normalized horizontal support metric to represent TIRPs as features was most effective. The use of an epsilon value of thirty days was slightly better than the default of zero.

  18. An analytical procedure to evaluate electronic integrals for molecular quantum mechanical calculations

    International Nuclear Information System (INIS)

    Mundim, Kleber C.

    2004-01-01

    Full text: We propose an alternative methodology for the calculation of electronic integrals, through an analytical function based on the generalized Gaussian function (q-Gaussian), where a single q-Gaussian replaces the usual linear combination of Gaussian functions for different basis sets. Moreover, the integrals become analytical functions of the interatomic distances. Therefore, when estimating certain quantities such as the molecular energy, q-Gaussians avoid new calculations of the integrals: they are simply another value of the corresponding function. The procedure proposed here is particularly advantageous, when compared with the usual one, because it drastically reduces the number of two-electron integrals used in the construction of the Fock matrix, enabling the use of quantum mechanics in the description of macromolecular systems. This advantage increases as the molecular systems become larger and more complex. While in the usual approach CPU time increases with n^4, in the one proposed here the CPU time scales linearly with n. This catastrophic dependence of the Hamiltonian or Fock matrix on n^4 two-electron integrals is a severe bottleneck for petaFLOPS computing. It is important to emphasize that this methodology is equally applicable to systems of any size, including biomolecules, solid materials and solutions, within the HF, post-HF and DFT theories. (author)
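
    The abstract does not give the exact parameterization of the generalized Gaussian, so the sketch below uses the common Tsallis form of a q-Gaussian, which reduces to the ordinary Gaussian as q approaches 1; the parameter values are arbitrary.

    ```python
    import numpy as np

    def q_gaussian(x, beta=1.0, q=1.2):
        """Tsallis-type generalized Gaussian; reduces to exp(-beta * x**2) as q -> 1.
        (This is the common form, not necessarily the paper's exact parameterization.)"""
        if abs(q - 1.0) < 1e-12:
            return np.exp(-beta * x ** 2)
        base = np.maximum(1.0 - (1.0 - q) * beta * x ** 2, 0.0)
        return base ** (1.0 / (1.0 - q))

    x = np.linspace(-3.0, 3.0, 7)
    print(np.round(q_gaussian(x, beta=0.5, q=1.3), 4))   # heavier tails than a Gaussian
    print(np.round(q_gaussian(x, beta=0.5, q=1.0), 4))   # ordinary Gaussian limit
    ```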

  19. Reliability and validity of procedure-based assessments in otolaryngology training.

    Science.gov (United States)

    Awad, Zaid; Hayden, Lindsay; Robson, Andrew K; Muthuswamy, Keerthini; Tolley, Neil S

    2015-06-01

    To investigate the reliability and construct validity of procedure-based assessment (PBA) in assessing performance and progress in otolaryngology training. Retrospective database analysis using a national electronic database. We analyzed PBAs of otolaryngology trainees in North London from core trainees (CTs) to specialty trainees (STs). The tool contains six multi-item domains: consent, planning, preparation, exposure/closure, technique, and postoperative care, rated as "satisfactory" or "development required," in addition to an overall performance rating (pS) of 1 to 4. The individual domain scores, overall calculated score (cS), and number of "development-required" items were calculated for each PBA. Receiver operating characteristic analysis helped determine sensitivity and specificity. There were 3,152 otolaryngology PBAs from 46 otolaryngology trainees analyzed. PBA reliability was high (Cronbach's α 0.899), and sensitivity approached 99%. cS correlated positively with pS and level in training (rs : +0.681 and +0.324, respectively). STs had higher cS and pS than CTs (93% ± 0.6 and 3.2 ± 0.03 vs. 71% ± 3.1 and 2.3 ± 0.08, respectively; the differences were statistically significant). PBA is reliable and valid for assessing otolaryngology trainees' performance and progress at all levels. It is highly sensitive in identifying competent trainees. The tool is used in a formative and feedback capacity. The technical domain is the best predictor and should be given close attention. NA. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  20. A Comparison of Result Reliability for Investigation of Milk Composition by Alternative Analytical Methods in Czech Republic

    Directory of Open Access Journals (Sweden)

    Oto Hanuš

    2014-01-01

    Full Text Available The reliability of milk analysis results is important for quality assurance along the foodstuff chain. There are several direct and indirect methods for measuring milk composition (fat (F), protein (P), lactose (L) and solids non fat (SNF) content). The goal was to evaluate some reference and routine milk analytical procedures on the basis of their results. The direct reference analyses were: F, fat content (Röse–Gottlieb method); P, crude protein content (Kjeldahl method); L, lactose (monohydrate, polarimetric method); SNF, solids non fat (gravimetric method). F, P, L and SNF were also determined by various indirect methods: MIR (infrared (IR) technology with optical filters; 7 instruments in 4 labs); MIR–FT (IR spectroscopy with Fourier transformation; 10 in 6); the ultrasonic method (UM; 3 in 1); and analysis by the blue and red box (BRB; 1 in 1). Ten reference milk samples were used. The coefficient of determination (R2), correlation coefficient (r) and standard deviation of the mean of individual differences (MDsd, for n) were evaluated. All correlations (r), for all indirect and alternative methods and all milk components, were significant (P ≤ 0.001). MIR and MIR–FT (conventional methods) explained a considerably higher proportion of the variability in the reference results than the UM and BRB (alternative) methods. All r average values (x minus 1.64 × sd, for a 95% confidence interval) can be used as standards for calibration quality evaluation (MIR, MIR–FT, UM and BRB): for F 0.997, 0.997, 0.99 and 0.995; for P 0.986, 0.981, 0.828 and 0.864; for L 0.968, 0.871, 0.705 and 0.761; for SNF 0.992, 0.993, 0.911 and 0.872. Similarly, MDsd (x plus 1.64 × sd): for F 0.071, 0.068, 0.132 and 0.101%; for P 0.051, 0.054, 0.202 and 0.14%; for L 0.037, 0.074, 0.113 and 0.11%; for SNF 0.052, 0.068, 0.141 and 0.204.
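
    The calibration-quality standards quoted above are one-sided 95% limits of the form mean minus (or plus) 1.64 times the standard deviation; a minimal Python sketch of that computation, using invented instrument correlation coefficients rather than the study's data:

    ```python
    import numpy as np

    def one_sided_limits(values, k=1.64):
        """Lower (mean - k*sd) and upper (mean + k*sd) one-sided 95% limits of the
        kind used as calibration-quality standards."""
        v = np.asarray(values, dtype=float)
        return v.mean() - k * v.std(ddof=1), v.mean() + k * v.std(ddof=1)

    # Hypothetical correlation coefficients of several MIR instruments for fat
    r_fat = [0.998, 0.997, 0.999, 0.996, 0.998, 0.997, 0.998]
    lower_r, _ = one_sided_limits(r_fat)
    print("minimum acceptable r for calibration checks:", round(lower_r, 3))
    ```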

  1. In situ impulse test: an experimental and analytical evaluation of data interpretation procedures

    International Nuclear Information System (INIS)

    1975-08-01

    Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study ''close-in'' wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test is different from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10^-1 and 10^-3 percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed ''close-in'' data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results

  2. Towards reliable multi-hop broadcast in VANETs : An analytical approach

    NARCIS (Netherlands)

    Gholibeigi, M.; Baratchi, M.; Berg, J.L. van den; Heijenk, G.

    2017-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  3. Towards Reliable Multi-Hop Broadcast in VANETs: An Analytical Approach

    NARCIS (Netherlands)

    Gholibeigi, Mozhdeh; Baratchi, Mitra; van den Berg, Hans Leo; Heijenk, Geert

    2016-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  4. An analytical procedure for computing smooth transitions between two specified cross sections with applications to blended wing body configuration

    Science.gov (United States)

    Barger, R. L.

    1982-01-01

    An analytical procedure is described for designing smooth transition surfaces for blended wing-body configurations. Starting from two specified cross-section shapes, the procedure generates a gradual transition from one cross-section shape to the other as an analytic blend of the two shapes. The method utilizes a conformal mapping, with subsequent translation and scaling, to transform the specified end shapes into curves that can be combined more smoothly. A sample calculation is applied to a blended wing-body missile-type configuration with a top-mounted inlet.
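
    The conformal-mapping blend itself is not detailed in the abstract; as a much simpler stand-in that conveys the idea of a gradual analytic transition between two cross sections, the sketch below linearly blends two sections described as radius-versus-angle samples (the shapes and sampling are invented).

    ```python
    import numpy as np

    def blend_sections(r_a, r_b, t):
        """Blend two cross sections given as radius-vs-angle samples; t = 0 gives
        section A, t = 1 gives section B (a simplified stand-in for the paper's
        conformal-mapping procedure)."""
        return (1.0 - t) * np.asarray(r_a) + t * np.asarray(r_b)

    theta = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
    circle = np.ones_like(theta)                                              # circular section
    ellipse = 1.0 / np.sqrt(np.cos(theta) ** 2 + (np.sin(theta) / 0.5) ** 2)  # 2:1 elliptical section

    for t in (0.0, 0.5, 1.0):
        print(t, np.round(blend_sections(circle, ellipse, t), 3))
    ```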

  5. A new modular procedure for industrial plant simulations and its reliable implementation

    International Nuclear Information System (INIS)

    Carcasci, C.; Marini, L.; Morini, B.; Porcelli, M.

    2016-01-01

    Modeling of industrial plants, and especially energy systems, has become increasingly important in industrial engineering, and the need for accurate information on their behavior has grown along with the complexity of industrial processes. Consequently, accurate and flexible simulation tools have become essential, leading to the development of modular codes. The aim of this work is to propose a new modular mathematical model for industrial plant simulation and its reliable numerical implementation. Regardless of their layout, a large class of plant configurations is modeled by a library of elementary parts; the physical properties and compositions of the working fluid and the plant's performance are then estimated. Each plant component is represented by equations modeling fundamental mechanical and thermodynamic laws, giving rise to a system of algebraic nonlinear equations; remarkably, suitable restrictions on the variables of such nonlinear equations are imposed to guarantee physically meaningful solutions. The proposed numerical procedure combines an outer iterative process, which refines the plant's characteristic parameters, with an inner one, which solves the arising nonlinear systems using a trust-region solver for bound-constrained nonlinear equalities. The new procedure has been validated by performing simulations against an existing modular tool on two compression train arrangements with both series- and parallel-mounted compressors. - Highlights: • A numerical modular tool for industrial plant simulation is presented. • The mathematical modeling is thoroughly described. • Solution of the nonlinear system is performed by a trust-region Gauss–Newton solver. • A detailed explanation of the optimization solver named TRESNEI is provided. • Code flexibility and robustness are investigated through numerical simulations.

  6. [The analytical reliability of clinical laboratory information and role of the standards in its support].

    Science.gov (United States)

    Men'shikov, V V

    2012-12-01

    The article deals with the factors affecting the reliability of clinical laboratory information. Differences in the quality of laboratory analysis tools produced by various manufacturers are discussed; these differences are a cause of discrepancies between the results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented that regulates the requirements for standards and calibrators used in the analysis of qualitative and non-metrical characteristics of biomaterial components.

  7. Reliable and Efficient Procedure for Steady-State Analysis of Nonautonomous and Autonomous Systems

    Directory of Open Access Journals (Sweden)

    J. Dobes

    2012-04-01

    Full Text Available The majority of contemporary design tools still do not contain steady-state algorithms, especially for autonomous systems. This is mainly caused by insufficient accuracy of the algorithms for numerical integration, but also by the unreliability of the steady-state algorithms themselves. Therefore, in the paper, a very stable and efficient procedure for the numerical integration of nonlinear differential-algebraic systems is defined first. Afterwards, two improved methods are defined for finding the steady state, which use this integration algorithm in their iteration loops. The first is based on the idea of extrapolation, and the second utilizes nonstandard time-domain sensitivity analysis. The two steady-state algorithms are compared by analyses of a rectifier and a C-class amplifier, and the extrapolation algorithm is selected as the more reliable alternative. Finally, the method based on extrapolation, naturally cooperating with the algorithm for solving the differential-algebraic systems, is thoroughly tested on various electronic circuits: Van der Pol and Colpitts oscillators, a fragment of a large bipolar logic circuit, feedback and distributed microwave oscillators, and a power amplifier. The results confirm that the extrapolation method is faster than classical plain numerical integration, especially for larger circuits with complicated transients.
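
    The paper's extrapolation algorithm is not reproduced in the abstract; the toy sketch below only illustrates the underlying idea for a nonautonomous case: iterate the one-period map of a driven circuit by numerical integration and accelerate convergence to the periodic steady state with Aitken's delta-squared extrapolation. The circuit, time constant and drive are invented.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    tau, T = 0.3, 1.0          # time constant and drive period of a toy RC circuit

    def one_period(x0):
        """Integrate dx/dt = (-x + sin(2*pi*t)) / tau over one drive period."""
        sol = solve_ivp(lambda t, x: (-x + np.sin(2.0 * np.pi * t)) / tau,
                        (0.0, T), [x0], rtol=1e-9, atol=1e-12)
        return sol.y[0, -1]

    x = 0.0
    for _ in range(5):
        x1 = one_period(x)
        x2 = one_period(x1)
        denom = x2 - 2.0 * x1 + x
        x = x2 if abs(denom) < 1e-12 else x - (x1 - x) ** 2 / denom   # Aitken acceleration
    print("periodic steady-state initial condition:", round(x, 6))
    ```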

  8. High-reliability microcontroller nerve stimulator for assistance in regional anaesthesia procedures.

    Science.gov (United States)

    Ferri, Carlos A; Quevedo, Antonio A F

    2017-07-01

    In the last decades, the use of nerve stimulators to aid in regional anaesthesia has been shown to benefit the patient, since it allows better localization of the nerve plexus, leading to correct positioning of the needle through which the anaesthetic is applied. However, most of the nerve stimulators available on the market for this purpose do not have the minimum recommended features for a good stimulator, and this can lead to risks for the patient. Thus, this study aims to develop a device, using embedded electronics, that meets all the characteristics required for a successful blockade. The system is made up of modules for the generation and overall control of the current pulse and for the patient and user interfaces. The results show that the designed system fits the required specifications for a good and reliable nerve stimulator. Linearity proved satisfactory, ensuring accuracy in electrical current amplitude for a wide range of body impedances. Field tests have proven very successful. The anaesthesiologist who used the system reported that, in all cases, plexus blocking was achieved with higher quality and faster anaesthetic diffusion, and without the need for an additional dose, when compared with the same procedure performed without the device.

  9. Gearbox Reliability Collaborative Analytic Formulation for the Evaluation of Spline Couplings

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yi [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keller, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Errichello, Robert [GEARTECH, Houston, TX (United States); Halse, Chris [Romax Technology, Nottingham (United Kingdom)

    2013-12-01

    Gearboxes in wind turbines have not been achieving their expected design life; however, they commonly meet and exceed the design criteria specified in current standards in the gear, bearing, and wind turbine industry as well as third-party certification criteria. The cost of gearbox replacements and rebuilds, as well as the down time associated with these failures, has elevated the cost of wind energy. The National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) was established by the U.S. Department of Energy in 2006; its key goal is to understand the root causes of premature gearbox failures and improve their reliability using a combined approach of dynamometer testing, field testing, and modeling. As part of the GRC program, this paper investigates the design of the spline coupling often used in modern wind turbine gearboxes to connect the planetary and helical gear stages. Aside from transmitting the driving torque, another common function of the spline coupling is to allow the sun to float between the planets. The amount the sun can float is determined by the spline design and the sun shaft flexibility subject to the operational loads. Current standards address spline coupling design requirements in varying detail. This report provides additional insight beyond these current standards to quickly evaluate spline coupling designs.

  10. Atmospheric Deposition: Sampling Procedures, Analytical Methods, and Main Recent Findings from the Scientific Literature

    Directory of Open Access Journals (Sweden)

    M. Amodio

    2014-01-01

    Full Text Available The atmosphere is a carrier on which some natural and anthropogenic organic and inorganic chemicals are transported, and wet and dry deposition events are the most important processes that remove those chemicals, depositing them on soil and water. A wide variety of different collectors have been tested to evaluate the site-specificity, seasonality and daily variability of settleable particle concentrations. Deposition fluxes of POPs showed spatial and seasonal variations, and diagnostic ratios of PAHs on deposited particles allowed discrimination between pyrolytic and petrogenic sources. Congener pattern analysis and bulk deposition fluxes at rural sites confirmed long-range atmospheric transport of PCDDs/Fs. Increasingly sophisticated and newly designed deposition samplers have been used for the characterization of deposited mercury, demonstrating the importance of rain scavenging and the relatively higher magnitude of Hg deposition from Chinese anthropogenic sources. Recently, biological monitors demonstrated that PAH concentrations in lichens were comparable with concentrations measured with a conventional active sampler in an outdoor environment. In this review the authors explore the methodological approaches used for the assessment of atmospheric deposition, covering the sampling methods, the analytical procedures for the chemical characterization of pollutants, and the main results from the scientific literature.

  11. A Modified GC-MS Analytical Procedure for Separation and Detection of Multiple Classes of Carbohydrates

    Directory of Open Access Journals (Sweden)

    Yong-Gang Xia

    2018-05-01

    Full Text Available A modified GC-MS analytical procedure based on trimethylsilyl-dithioacetal (TMSD) derivatization has been established for the simultaneous determination of thirteen carbohydrates. Unlike previous approaches, the current GC-MS method offers practical simultaneous detection of aldoses, uronic acids, ketoses, and amino sugars; it simplifies the GC-MS chromatograms, producing a single peak for each derivatized sugar, and provides high resolution, sensitivity, and repeatability. An additional liquid-liquid extraction from the derivatization mixtures was performed not only to increase the detection sensitivity of amino sugars but also to decrease the by-products of derivatization; otherwise, the three amino sugars were detected at very low intensity or not detected at all. The effect of time on the monosaccharide mercaptalation reaction was systematically investigated, and the trimethylsilylation step leading to TMSD formation was also optimized. The established GC-MS method based on TMSD derivatization is suitable for complex carbohydrate analysis and has been successfully applied to the detection of free carbohydrates in water extracts of Anemarrhena asphodeloides roots and the determination of monosaccharides in Glossy ganoderma polysaccharides.

  12. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    Science.gov (United States)

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years old) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent were produced, including hostile and neutral; along with 3 emotion measures, focused on negative emotional states; 8 response evaluation measures; and 4 response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys seemed to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, and assertiveness and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems a valuable alternative for evaluating social information processing, even if it is essential to continue investigation into its internal and external validity. (c) 2016 APA, all rights reserved.

  13. Establishing the analytical procedure for acetate in water by ion chromatography method

    International Nuclear Information System (INIS)

    Nguyen Thi Hong Thinh; Ha Lan Anh; Vo Thi Anh

    2015-01-01

    In recent studies of the sources of arsenic, ammonium, iron and organic carbon contamination in groundwater, acetate is frequently measured because it is the main decomposition product of organic compounds passing from sediment into groundwater. In order to better support the study of the origin and mobilization mechanism of these pollutants, an analytical method for acetate was developed in the Isotopes Hydrology Laboratory using the ion chromatography technique. The researchers used an Ion Chromatography system (DX-600) with an IonPac ICE-AS1 column to separate acetate and a CD 25 conductivity detector to quantify acetate in water samples. The results showed that the team successfully developed an analytical procedure for acetate in water, with an acetate retention time of 12 minutes and a limit of detection (LOD) of 0.01 ppm. The accuracy of the method was established by calculating the precision and bias of 10 replicate analyses of a standard sample at concentration levels of 1 ppm and 8 ppm. The results of the 10 measurements were satisfactory with respect to precision and bias, with repeatability relative standard deviations (CVR) of 1.3% and 0.2% and recoveries (R) of 99.92% and 101.72%. (author)
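
    The validation figures quoted above (CVR and recovery R) follow from elementary statistics on replicate measurements of a standard; a small Python sketch with invented replicate values for the 1 ppm level:

    ```python
    import numpy as np

    def precision_and_recovery(measured, nominal):
        """Repeatability (CV%) and recovery (%) from replicate measurements
        of a standard at a known nominal concentration."""
        measured = np.asarray(measured, dtype=float)
        cv = 100.0 * measured.std(ddof=1) / measured.mean()
        recovery = 100.0 * measured.mean() / nominal
        return cv, recovery

    # Hypothetical replicates of a 1 ppm acetate standard
    reps_1ppm = [0.99, 1.01, 1.00, 0.98, 1.02, 1.00, 0.99, 1.01, 1.00, 0.99]
    cv, rec = precision_and_recovery(reps_1ppm, nominal=1.0)
    print(f"CV = {cv:.2f} %, recovery = {rec:.2f} %")
    ```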

  14. Analytic tools for investigating the structure of network reliability measures with regard to observation correlations

    Science.gov (United States)

    Prószyński, W.; Kwaśniak, M.

    2018-03-01

    A global measure of observation correlations in a network is proposed, together with auxiliary indices related to the non-diagonal elements of the correlation matrix. Based on this global measure, a specific representation of the correlation matrix is presented, which is the result of a rigorously proven theorem formulated within the present research. According to the theorem, each positive definite correlation matrix can be expressed by a scale factor and a so-called internal weight matrix. Such a representation made it possible to investigate the structure of the basic reliability measures with regard to observation correlations. Numerical examples carried out for two test networks illustrate the structure of those measures, which proved to be dependent on the global correlation index. Levels of global correlation are also proposed. It is shown that one can readily find an approximate value of the global correlation index, and hence the correlation level, when the expected values of the auxiliary indices are the only knowledge available about the correlation matrix of interest. The paper is an extended continuation of the authors' previous study, which was confined to the elementary case termed uniform correlation. The extension covers arbitrary correlation matrices and the structure of the correlation effect.

  15. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  16. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1991-01-01

    The Spent Fuel Identification for Flask Loading (SFIFL) procedure designed by COGEMA is analysed and its reliability calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system. A non-coherent system is the one in which two successive failures could result in a success, from the system mission point of view. A technique that describes the system with the help of its maximal cuts (states) is used for calculations. A maximal cut contains more than one failure which can split into two cuts (sub-states). Cuts splitting will enable us to analyse, in a systematic way, non-coherent systems with independent basic components. (author)
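
    With the simplifying assumption of independent, identically protected shipments (an assumption made here for illustration, not stated in the record), the reliability defined above reduces to a simple power law; the per-shipment probability and campaign size below are invented.

    ```python
    def mission_reliability(p_unapproved_per_shipment, n_shipments):
        """Probability that no unapproved fuel element is shipped over a campaign
        of n shipments, assuming independent shipments (illustrative numbers only)."""
        return (1.0 - p_unapproved_per_shipment) ** n_shipments

    print(round(mission_reliability(1.0e-5, 500), 5))
    ```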

  17. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1990-01-01

    The Spent Fuel Identification for Flask Loading (SFIFL) procedure designed by COGEMA is analysed and its reliability is calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system. A non-coherent system is one in which two successive failures could result in a success, from the system mission point of view. A technique that describes the system with the help of its maximal cuts (states) is used for the calculations. A maximal cut contains more than one failure and can split into two cuts (sub-states). Cut splitting will enable us to analyse, in a systematic way, non-coherent systems with independent basic components. (author)

  18. AN ANALYTICAL FRAMEWORK FOR ASSESSING RELIABLE NUCLEAR FUEL SERVICE APPROACHES: ECONOMIC AND NON-PROLIFERATION MERITS OF NUCLEAR FUEL LEASING

    International Nuclear Information System (INIS)

    Kreyling, Sean J.; Brothers, Alan J.; Short, Steven M.; Phillips, Jon R.; Weimar, Mark R.

    2010-01-01

    The goal of international nuclear policy since the dawn of nuclear power has been the peaceful expansion of nuclear energy while controlling the spread of enrichment and reprocessing technology. Numerous initiatives undertaken in the intervening decades to develop international agreements on providing nuclear fuel supply assurances, or reliable nuclear fuel services (RNFS) attempted to control the spread of sensitive nuclear materials and technology. In order to inform the international debate and the development of government policy, PNNL has been developing an analytical framework to holistically evaluate the economics and non-proliferation merits of alternative approaches to managing the nuclear fuel cycle (i.e., cradle-to-grave). This paper provides an overview of the analytical framework and discusses preliminary results of an economic assessment of one RNFS approach: full-service nuclear fuel leasing. The specific focus of this paper is the metrics under development to systematically evaluate the non-proliferation merits of fuel-cycle management alternatives. Also discussed is the utility of an integrated assessment of the economics and non-proliferation merits of nuclear fuel leasing.

  19. The effect on reliability and sensitivity to level of training of combining analytic and holistic rating scales for assessing communication skills in an internal medicine resident OSCE.

    Science.gov (United States)

    Daniels, Vijay John; Harley, Dwight

    2017-07-01

    Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs OSCEs for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. Authors analyzed reliability of individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For analytic, holistic, and combined scales, 12, 12, and 11 stations respectively yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given increased validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. A competency based selection procedure for Dutch postgraduate GP training: a pilot study on validity and reliability.

    Science.gov (United States)

    Vermeulen, Margit I; Tromp, Fred; Zuithoff, Nicolaas P A; Pieters, Ron H M; Damoiseaux, Roger A M J; Kuyvenhoven, Marijke M

    2014-12-01

    Background: Historically, semi-structured interviews (SSI) have been the core of the Dutch selection for postgraduate general practice (GP) training. This paper describes a pilot study on a newly designed competency-based selection procedure that assesses whether candidates have the competencies required to complete GP training. The objective was to explore reliability and validity aspects of the instruments developed. The new selection procedure, comprising the National GP Knowledge Test (LHK), a situational judgement test (SJT), a patterned behaviour descriptive interview (PBDI) and a simulated encounter (SIM), was piloted alongside the current procedure. Forty-seven candidates volunteered in both procedures. Admission decisions were based on the results of the current procedure. Study participants hardly differed from the other candidates. The mean scores of the candidates on the LHK and SJT were 21.9% (SD 8.7) and 83.8% (SD 3.1), respectively. The mean self-reported competency scores (PBDI) were higher than the observed competencies (SIM): 3.7 (SD 0.5) and 2.9 (SD 0.6), respectively. Content-related competencies showed low correlations with one another when measured with different instruments, whereas more diverse competencies measured by a single instrument showed strong to moderate correlations. Moreover, a moderate correlation between the LHK and SJT was found. The internal consistencies (intraclass correlation, ICC) of the LHK and SJT were poor, while the ICCs of the PBDI and SIM showed acceptable levels of reliability. Findings on the content validity and reliability of these new instruments are promising for realizing a competency-based selection procedure. Further development of the instruments and research on predictive validity should be pursued.

  1. Portfolio assessment during medical internships: How to obtain a reliable and feasible assessment procedure?

    Science.gov (United States)

    Michels, Nele R M; Driessen, Erik W; Muijtjens, Arno M M; Van Gaal, Luc F; Bossaert, Leo L; De Winter, Benedicte Y

    2009-12-01

    A portfolio is used to mentor and assess students' clinical performance at the workplace. However, students and raters often perceive the portfolio as a time-consuming instrument. In this study, we investigated whether assessment during a medical internship by portfolio can combine reliability and feasibility. The domain-oriented reliability of 61 double-rated portfolios was measured, using a generalisability analysis with portfolio tasks and raters as sources of variation in measuring the performance of a student. We obtained a reliability (Phi coefficient) of 0.87 with this internship portfolio containing 15 double-rated tasks. The generalisability analysis showed that an acceptable level of reliability (Phi = 0.80) was maintained when the number of portfolio tasks was decreased to 13 or 9 using one and two raters, respectively. Our study shows that a portfolio can be a reliable method for the assessment of workplace learning. The possibility of reducing the number of tasks or raters while maintaining a sufficient level of reliability suggests an increase in the feasibility of portfolio use for both students and raters.
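
    The decision-study logic behind reducing the number of tasks while keeping Phi at or above 0.80 can be sketched with a one-facet generalizability formula, Phi = vp / (vp + (vt + vpt_e) / n_tasks); the variance components below are illustrative only, not the study's estimates (the actual design also treats raters as a facet).

    ```python
    def phi_one_facet(var_person, var_task, var_interaction_error, n_tasks):
        """Projected absolute (Phi) coefficient for a one-facet person-by-task design."""
        return var_person / (var_person + (var_task + var_interaction_error) / n_tasks)

    vp, vt, vpt_e = 0.40, 0.10, 0.80          # illustrative variance components
    for n in (15, 13, 9):
        print(n, "tasks -> Phi =", round(phi_one_facet(vp, vt, vpt_e, n), 2))
    ```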

  2. ASSESSING GOING CONCERN ASSUMPTION BY USING RATING VALUATION MODELS BASED UPON ANALYTICAL PROCEDURES IN CASE OF FINANCIAL INVESTMENT COMPANIES

    OpenAIRE

    Tatiana Danescu; Ovidiu Spatacean; Paula Nistor; Andrea Cristina Danescu

    2010-01-01

    Designing and performing analytical procedures aimed at assessing the rating of the Financial Investment Companies are essential activities, both in the phase of planning a financial audit mission and in the phase of issuing conclusions regarding the suitability of the use, by management and other persons responsible for governance, of the going concern assumption as the basis for the preparation and disclosure of financial statements. The paper aims to examine the usefulness of recognized models used in the practice o...

  3. General Procedure for the Easy Calculation of pH in an Introductory Course of General or Analytical Chemistry

    Science.gov (United States)

    Cepriá, Gemma; Salvatella, Luis

    2014-01-01

    All pH calculations for simple acid-base systems used in introductory courses on general or analytical chemistry can be carried out by using a general procedure requiring the use of predominance diagrams. In particular, the pH is calculated as the sum of an independent term equaling the average pKa values of the acids involved in the…
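
    The predominance-diagram procedure itself is not reproduced in the abstract; as a numerical reference against which such simplified pH formulas can be checked, the exact pH of a monoprotic weak acid solution follows from the charge balance, for example:

    ```python
    from scipy.optimize import brentq

    def ph_weak_acid(c_a, pka, pkw=14.0):
        """Exact pH of a monoprotic weak acid solution from the charge balance
        [H+] = [A-] + [OH-], solved numerically."""
        ka, kw = 10.0 ** (-pka), 10.0 ** (-pkw)

        def balance(ph):
            h = 10.0 ** (-ph)
            return h - ka * c_a / (h + ka) - kw / h

        return brentq(balance, 0.0, 14.0)

    print(round(ph_weak_acid(c_a=0.10, pka=4.76), 2))   # ~2.88 for 0.1 M acetic acid
    ```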

  4. Basic Concepts in Classical Test Theory: Tests Aren't Reliable, the Nature of Alpha, and Reliability Generalization as a Meta-analytic Method.

    Science.gov (United States)

    Helms, LuAnn Sherbeck

    This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…

  5. Design-related influencing factors of the computerized procedure system for inclusion into human reliability analysis of the advanced control room

    International Nuclear Information System (INIS)

    Kim, Jaewhan; Lee, Seung Jun; Jang, Seung Cheol; Ahn, Kwang-Il; Shin, Yeong Cheol

    2013-01-01

    This paper presents the major design factors of the computerized procedure system (CPS) by task characteristics/requirements, with individual relative weights evaluated by the analytic hierarchy process (AHP) technique, for inclusion into human reliability analysis (HRA) of advanced control rooms. Task characteristics/requirements of an individual procedural step are classified into four categories according to the dynamic characteristics of an emergency situation: (1) a single-static step, (2) a single-dynamic and single-checking step, (3) a single-dynamic and continuous-monitoring step, and (4) a multiple-dynamic and continuous-monitoring step. According to the importance ranking evaluation by the AHP technique, ‘clearness of the instruction for taking action’, ‘clearness of the instruction and its structure for rule interpretation’, and ‘adequate provision of requisite information’ were rated as being of higher importance for all the task classifications. The importance of ‘adequacy of the monitoring function’ and ‘adequacy of representation of the dynamic link or relationship between procedural steps’ is dependent upon task characteristics. The results of the present study give valuable insight into which design factors of the CPS should be incorporated into HRA of advanced control rooms, and with what relative weights. (author)
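
    The AHP weighting step mentioned above can be illustrated generically: priority weights are taken from the principal eigenvector of a pairwise-comparison matrix, with Saaty's consistency index as a sanity check. The 3x3 comparison matrix below is invented and does not correspond to the paper's design factors.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Priority weights from an AHP pairwise-comparison matrix via the
        principal eigenvector, plus Saaty's consistency index."""
        A = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)               # Perron (largest) eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        ci = (eigvals[k].real - len(A)) / (len(A) - 1)   # consistency index
        return w, ci

    # Hypothetical comparison of three design factors
    A = [[1,     3,   5],
         [1 / 3, 1,   2],
         [1 / 5, 1 / 2, 1]]
    w, ci = ahp_weights(A)
    print("weights:", np.round(w, 3), "consistency index:", round(ci, 3))
    ```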

  6. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools, namely sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluating the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of the grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanol content were investigated. The juices and wines produced using the different protocols were examined, as were wines aged in tanks for 1, 2 and 3 months. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by the winemaking and pressing conditions, which required fine tuning of the pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, their evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of the cysteine and glutathione conjugates was carried out and the juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  7. Determination of Total Solids and Ash in Algal Biomass: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    2016-01-13

    This procedure describes the methods used to determine the amount of moisture or total solids present in a freeze-dried algal biomass sample, as well as the ash content. A traditional convection oven drying procedure is covered for total solids content, and a dry oxidation method at 575 deg. C is covered for ash content.
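
    As a numerical illustration of how such gravimetric results are usually reduced, the sketch below computes percent total solids and percent ash (dry basis) from hypothetical tared masses; the formulas are the generic gravimetric ones and are not quoted from the LAP itself.

```python
def total_solids_pct(wet_g: float, oven_dry_g: float) -> float:
    """Percent total solids after convection-oven drying."""
    return 100.0 * oven_dry_g / wet_g

def ash_pct_dry_basis(oven_dry_g: float, ash_g: float) -> float:
    """Percent ash (residue after oxidation at 575 C) on a dry-weight basis."""
    return 100.0 * ash_g / oven_dry_g

# Hypothetical masses (g), already corrected for the tare of the crucible.
sample_wet, sample_dry, residue = 1.000, 0.932, 0.061
print(total_solids_pct(sample_wet, sample_dry))   # ~93.2 % total solids
print(ash_pct_dry_basis(sample_dry, residue))     # ~6.5 % ash, dry basis
```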

  8. Do strict rules and moving images increase the reliability of sequential identification procedures?

    OpenAIRE

    Valentine, Tim; Darling, Stephen; Memon, Amina

    2007-01-01

    Live identification procedures in England and Wales have been replaced by use of video, which provides a sequential presentation of facial images. Sequential presentation of photographs provides some protection to innocent suspects from mistaken identification when used with strict instructions designed to prevent relative judgements (Lindsay, Lea & Fulford, 1991). However, the current procedure in England and Wales is incompatible with these strict instructions. The reported research investi...

  9. Dynamic control of the lumbopelvic complex; lack of reliability of established test procedures

    DEFF Research Database (Denmark)

    Henriksen, Marius; Lund, Hans; Bliddal, Henning

    2007-01-01

    used in order to account for learning effects. Intraclass correlation coefficients were low for the sitting (0.54) and supported standing positions (0.36). In the standing position, a significant difference between test and retest was observed (P = 0.003) and further reliability analysis was therefore...

  10. A Procedure to Obtain Reliable Pair Distribution Functions of Non-Crystalline Materials from Diffraction Data

    DEFF Research Database (Denmark)

    Hansen, Flemming Yssing; Carneiro, K.

    1977-01-01

    A simple numerical method, which unifies the calculation of structure factors from X-ray or neutron diffraction data with the calculation of reliable pair distribution functions, is described. The objective of the method is to eliminate systematic errors in the normalizations and corrections of t...
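
    The central numerical step that such a procedure unifies is the Fourier sine transform from the structure factor S(Q) to the pair distribution function g(r). The sketch below implements that standard transform for a monatomic sample of number density rho0; the Q-grid, density and flat S(Q) are placeholders, and nothing is implied about the authors' specific normalisation or error-elimination scheme.

```python
import numpy as np

def pair_distribution(q, s_of_q, r, rho0):
    """g(r) from S(Q) via the standard Fourier sine transform:
    g(r) = 1 + 1/(2 pi^2 rho0 r) * integral of Q [S(Q)-1] sin(Qr) dQ."""
    integrand = q * (s_of_q - 1.0) * np.sin(np.outer(r, q))
    integral = np.trapz(integrand, q, axis=1)
    return 1.0 + integral / (2.0 * np.pi**2 * rho0 * r)

# Placeholder inputs: a Q-grid, a flat S(Q) (ideal-gas-like) and a density.
q = np.linspace(0.05, 25.0, 2000)        # 1/angstrom
s_of_q = np.ones_like(q)                 # replace with measured data
r = np.linspace(0.5, 15.0, 300)          # angstrom
print(pair_distribution(q, s_of_q, r, rho0=0.033)[:5])  # ~1 everywhere
```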

  11. Orthogonal Higher Order Structure of the WISC-IV Spanish Using Hierarchical Exploratory Factor Analytic Procedures

    Science.gov (United States)

    McGill, Ryan J.; Canivez, Gary L.

    2016-01-01

    As recommended by Carroll, the present study examined the factor structure of the Wechsler Intelligence Scale for Children-Fourth Edition Spanish (WISC-IV Spanish) normative sample using higher order exploratory factor analytic techniques not included in the WISC-IV Spanish Technical Manual. Results indicated that the WISC-IV Spanish subtests were…

  12. Fused Deposition Modeling 3D Printing for (Bio)analytical Device Fabrication : Procedures, Materials, and Applications

    NARCIS (Netherlands)

    Salentijn, Gert Ij; Oomen, Pieter E; Grajewski, Maciej; Verpoorte, Elisabeth

    2017-01-01

    In this work, the use of fused deposition modeling (FDM) in a (bio)analytical/lab-on-a-chip research laboratory is described. First, the specifications of this 3D printing method that are important for the fabrication of (micro)devices were characterized for a benchtop FDM 3D printer. These include

  13. Determination of Total Carbohydrates in Algal Biomass: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    2016-01-13

    This procedure uses two-step sulfuric acid hydrolysis to hydrolyze the polymeric forms of carbohydrates in algal biomass into monomeric subunits. The monomers are then quantified by either HPLC or a suitable spectrophotometric method.

  14. A comparison of analytic procedures for measurement of fractional dextran clearances

    NARCIS (Netherlands)

    Hemmelder, MH; de Jong, PE; de Zeeuw, D

    Fractional dextran clearances have been extensively used to study glomerular size selectivity. We report on an analysis of different laboratory procedures involved in measuring fractional dextran clearances. The deproteinization of plasma samples by 20% trichloroacetic acid (TCA) revealed a protein

  15. Ficolin-2 reveals different analytical and biological properties dependent on different sample handling procedures

    DEFF Research Database (Denmark)

    Hein, Estrid; Bay, Jakob T; Munthe-Fog, Lea

    2013-01-01

    Ficolin-2 (L-ficolin) is a germ line encoded pattern recognition molecule circulating in the blood, and functions as a recognition molecule in the lectin complement pathway. However, consistent and reliable measurements of Ficolin-2 concentration and activity have been difficult to achieve. After...

  16. HALO EFFECT IN ANALYTICAL PROCEDURE: THE IMPACT OF CLIENT PROFILE AND INFORMATION SCOPE

    OpenAIRE

    Intiyas Utami; Indra Wijaya Kusuma; Gudono; Supriyadi

    2014-01-01

    Many auditors use risk-based audit as a methodology that emphasizes assessing audit risk. A holistic perspective during strategic assessment encourages the auditor to focus on the big picture. They understand the industry and client business and determine the risk of material misstatement as an initial hypothesis about the client. Previous research found that a holistic perspective in strategic assessment causes a halo effect. This study focuses on the phenomena of a halo effect in analytical ...

  17. An overview of the IAEA Safety Series on procedures for evaluating the reliability of predictions made by environmental transfer models

    International Nuclear Information System (INIS)

    Hoffman, F.W.; Hofer, E.

    1987-10-01

    The International Atomic Energy Agency is preparing a Safety Series publication on practical approaches for evaluating the reliability of the predictions made by environmental radiological assessment models. This publication identifies factors that affect the reliability of these predictions and discusses methods for quantifying uncertainty. Emphasis is placed on understanding the quantity of interest specified by the assessment question and distinguishing between stochastic variability and lack of knowledge about either the true value or the true distribution of values for the quantity of interest. Among the many approaches discussed, model testing using independent data sets (model validation) is considered the best method for evaluating the accuracy of model predictions. Analytical and numerical methods for propagating the uncertainties in model parameters are presented, and the strengths and weaknesses of model intercomparison exercises are also discussed. It is recognized that subjective judgment is employed throughout the entire modelling process, and quantitative reliability statements must be subjectively obtained when models are applied to situations different from those under which they have been tested. (6 refs.)
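
    One of the numerical approaches referred to, propagation of parameter uncertainty through a transfer model, can be illustrated with a plain Monte Carlo loop. The toy food-chain model and the lognormal parameter distributions below are invented for illustration and carry no connection to the Safety Series publication itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def transfer_model(deposition, transfer_factor, intake_rate):
    """Toy food-chain model: deposition x transfer factor x intake (illustrative)."""
    return deposition * transfer_factor * intake_rate

n = 100_000
# Hypothetical lognormal parameter distributions (geometric mean, GSD).
deposition = rng.lognormal(mean=np.log(1000.0), sigma=np.log(1.5), size=n)  # Bq/m2
tf = rng.lognormal(mean=np.log(2e-3), sigma=np.log(2.0), size=n)            # m2/kg
intake = rng.lognormal(mean=np.log(0.3), sigma=np.log(1.3), size=n)         # kg/d

result = transfer_model(deposition, tf, intake)
print(np.percentile(result, [5, 50, 95]))   # subjective 90% interval of the prediction
```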

  18. New analysis procedure for fast and reliable size measurement of nanoparticles from atomic force microscopy images

    International Nuclear Information System (INIS)

    Boyd, Robert D.; Cuenat, Alexandre

    2011-01-01

    Accurate size measurement during nanoparticle production is essential for the continuing innovation, quality and safety of nano-enabled products. Size measurement by analysing a number of separate particles individually has particular advantages over ensemble methods. In the latter case nanoparticles have to be well dispersed in a fluid, and changes that may occur during analysis, such as agglomeration and degradation, will not be detected, which could lead to misleading results. Atomic force microscopy (AFM) allows imaging of particles both in air and in liquid; however, the strong interactions between the probe and the particle cause broadening of the lateral dimension in the final image. In this paper a new procedure to measure the size of spherical nanoparticles from AFM images via vertical height measurement is described. This procedure quickly analyses hundreds of particles simultaneously and reproduces the measurements obtained from electron microscopy (EM). Nanoparticle samples that were difficult, if not impossible, to analyse with EM were successfully measured using this method. The combination of this procedure with the use of a metrological AFM moves closer to true traceable measurements of nanoparticle dispersions.
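
    A minimal sketch of the height-based sizing idea, assuming a simple threshold segmentation: label connected regions in the AFM height image and take each particle's maximum height above the background as its diameter. The synthetic image, threshold and units are placeholders rather than the authors' published algorithm.

```python
import numpy as np
from scipy import ndimage

def particle_heights(height_img, background=0.0, min_height=2.0):
    """Return the maximum height (nm) of each detected particle in an AFM image."""
    mask = (height_img - background) > min_height      # crude segmentation
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.array([])
    return np.array(ndimage.maximum(height_img - background,
                                    labels, index=range(1, n + 1)))

# Synthetic image: flat background with two Gaussian "particles".
y, x = np.mgrid[0:128, 0:128]
img = (20 * np.exp(-((x - 40)**2 + (y - 40)**2) / 30.0)
       + 12 * np.exp(-((x - 90)**2 + (y - 80)**2) / 20.0))
print(particle_heights(img))   # ~[20, 12] nm, taken as particle diameters
```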

  19. Analytical procedure in aseismic design of eccentric structure using response spectrum

    International Nuclear Information System (INIS)

    Takemori, T.; Kuwabara, Y.; Suwabe, A.; Mitsunobu, S.

    1977-01-01

    In this paper, the responses are evaluated by the following two methods, using typical torsional analytical models in which the masses, rigidities, eccentricities between their centers, and several actual earthquake waves are taken as parameters: (1) the root mean square of responses using the response spectra derived from the earthquake waves, (2) time history analysis using the earthquake waves. The earthquake waves used are chosen to present different frequency content and magnitude of the response spectra. The typical results derived from the study are as follows: (a) the response accelerations of the mass center in the input earthquake direction by the (1) method coincide comparatively well with those by the (2) method, (b) the response accelerations perpendicular to the input earthquake direction by the (1) method are 2 to 3 times as large as those by the (2) method, (c) the amplification of the response accelerations at arbitrary points distributed over the spread mass relative to those of the center of the lumped mass by the (1) method is remarkably large compared with that by the (2) method in both directions. These problems with the response spectrum analysis of the above-mentioned eccentric structure are discussed, and an improved analytical method, applying the amplification coefficients of responses derived from this parametric time history analysis, is proposed for actual seismic design using the given design ground response spectrum with the root-mean-square technique.
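
    The 'root mean square of responses' of method (1) is the familiar SRSS combination of peak modal responses read off a response spectrum; a minimal sketch is given below with hypothetical modal accelerations, not the authors' eccentric-structure model.

```python
import numpy as np

def srss_combination(modal_responses):
    """Square-root-of-sum-of-squares combination of peak modal responses."""
    r = np.asarray(modal_responses, dtype=float)
    return np.sqrt(np.sum(r**2))

# Hypothetical peak modal accelerations (m/s^2) of a torsionally coupled model,
# each obtained as participation factor times spectral acceleration.
modal_acc = [1.8, 0.9, 0.35]
print(srss_combination(modal_acc))   # combined response estimate
```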

  20. Thermogravimetric analytical procedures for determining reactivities of chars from New Zealand coals

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, K.J.; Beamish, B.B.; Rodgers, K.A. [University of Auckland, Auckland (New Zealand). Dept. of Geology

    1997-10-22

    This paper describes how tightly constrained thermogravimetric experimental procedures (particle size < 212 μm, sample mass 15.5 mg, CO2 reactant gas, near-isothermal conditions) allow the reactivity of chars from high volatile New Zealand coals to be determined to a repeatability of ±0.07 h-1 at 900°C and ±0.5 h-1 at 1100°C. The procedure also provides proximate analysis information and affords a quick (< 90 min) comparison between different coal types as well as indicating likely operating conditions and problems associated with a particular coal or blend. A clear difference is evident between the reactivities of differing New Zealand coal ranks. Between 900 and 1100°C, bituminous coals increase thirtyfold in reactivity compared with fourfold for subbituminous coals, with the latter being three to five times greater in reactivity at the higher temperature.
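
    Char reactivity in h-1 is commonly derived from the thermogravimetric mass-loss curve as the rate of conversion normalised by the remaining combustible mass. The sketch below shows that generic calculation on synthetic data; the exact definition and baseline handling used by the authors are not reproduced here.

```python
import numpy as np

def char_reactivity(time_h, mass_mg, ash_mg=0.0):
    """Apparent reactivity R = -(1/(m - ash)) dm/dt, returned in 1/h."""
    m = np.asarray(mass_mg, dtype=float) - ash_mg   # combustible mass only
    dmdt = np.gradient(m, time_h)
    return -dmdt / m

# Synthetic isothermal burn-off: exponential mass loss with R = 0.5 1/h.
t = np.linspace(0.0, 4.0, 200)          # h
m = 15.5 * np.exp(-0.5 * t)             # mg
print(char_reactivity(t, m).mean())     # ~0.5 1/h
```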

  1. Research And Establishment Of The Analytical Procedure For/Of Sr-90 In Milk Samples

    International Nuclear Information System (INIS)

    Tran Thi Tuyet Mai; Duong Duc Thang; Nguyen Thi Linh; Bui Thi Anh Duong

    2014-01-01

    Sr-90 is an indicator for the transfer of radionuclides from the environment to humans. This work was set up to build a procedure for Sr-90 determination in common foodstuffs, with a focus on fresh milk. The goals were to establish the Sr-90 procedure, assess the chemical yield, and test samples of Vietnamese fresh milk; QA and QC for the procedure were carried out using an IAEA standard sample. A procedure for the determination of Sr-90 in milk was completed. The chemical recovery yields for Y-90 and Sr-90 were 46.76% ± 1.25% and 0.78 ± 0.086, respectively. The QA & QC program was carried out using the reference material IAEA-373. The results agree well with the certified value. Three reference samples were analysed with 15 measurements. After statistical processing, the Sr-90 concentration was 3.69 Bq/kg with an uncertainty of 0.23 Bq/kg. The certified value of IAEA-154 for Sr-90 (half-life 28.8 years) is 6.9 Bq/kg, with a 95% confidence interval of 6.0-8.0 Bq/kg as of 31 August 1987. After correcting for decay, the activity at the time of measurement is 3.67 Bq/kg, so the result of this work closely matches the IAEA certified value. Five Vietnamese fresh milk samples were analysed for Sr-90; the specific activity of Sr-90 in the milk ranged from 0.032 to 0.041 Bq/l. (author)
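
    The decay correction quoted above (6.9 Bq/kg certified in August 1987 decaying to about 3.67 Bq/kg at the time of measurement) follows directly from the Sr-90 half-life. The sketch below reproduces that arithmetic; the elapsed time of roughly 26 years is an assumption consistent with the reported value.

```python
import math

def decay_correct(a0_bq_per_kg, half_life_y, elapsed_y):
    """Activity after elapsed_y years: A = A0 * exp(-ln(2) * t / T_half)."""
    return a0_bq_per_kg * math.exp(-math.log(2) * elapsed_y / half_life_y)

# IAEA-154 certified Sr-90 value (1987) corrected to the assumed measurement date.
a0, t_half = 6.9, 28.8               # Bq/kg, years
elapsed = 26.3                       # years (assumed, ~2013/2014 measurement)
print(decay_correct(a0, t_half, elapsed))   # ~3.66 Bq/kg, matching the abstract
```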

  2. Rapid analytical procedure for determination of mineral oils in edible oil by GC-FID.

    Science.gov (United States)

    Wrona, Magdalena; Pezo, Davinson; Nerin, Cristina

    2013-12-15

    A procedure for the determination of mineral oils in edible oil has been fully developed. The procedure consists of using a sulphuric acid-impregnated silica gel (SAISG) glass column to eliminate the fat matter. Chemical combustion of the fatty acids takes place, while the mineral oils are not affected by the sulphuric acid. The column is eluted with hexane using a vacuum pump and the final extract is concentrated and analysed by gas chromatography (GC) with flame ionisation detector (FID). The detection limit (LOD) and the quantification limit (LOQ) in hexane were 0.07 and 0.21 μg g(-1) respectively, and the LOQ in vegetable oil was 1 μg g(-1). Only a few minutes of sample treatment were necessary to obtain a clean extract. The efficiency of the process, measured through the recoveries from spiked samples of edible oil, was higher than 95%. The procedure has been applied to determine mineral oil in olive oil from the retail market. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Synthesis of [123I]IBZM: a reliable procedure for routine clinical studies

    Energy Technology Data Exchange (ETDEWEB)

    Zea-Ponce, Yolanda E-mail: yolanda@neuron.cpmc.columbia.edu; Laruelle, Marc

    1999-08-01

    The single photon emission computed tomography (SPECT) D2/D3 receptor radiotracer [123I]IBZM is prepared by electrophilic radioiodination of the precursor BZM with high-purity sodium [123I]iodide in the presence of diluted peracetic acid. However, in our hands, the most commonly used procedure for this radiosynthesis produced variable and inconsistent labeling yields, to such an extent that it became inappropriate for routine clinical studies. Our goal was to modify the labeling procedure to obtain consistently better labeling and radiochemical yields. The best conditions found for the radioiodination were as follows: 50 μg precursor in 50 μL EtOH mixed with buffer pH 2; Na[123I]I in 0.1 M NaOH (<180 μL), 50 μL diluted peracetic acid solution, heating at 65°C for 14 min. Purification was achieved by solid phase extraction (SPE) and reverse-phase high performance liquid chromatography (HPLC). Under these conditions, the average labeling yield was 76±4% (n=31); the radiochemical yield was 69±4% and the radiochemical purity was 98±1%. With larger volumes of the Na[123I]I solution the yields were consistent but lower. For example, for volumes between 417 and 523 μL the labeling yield was 61±5% (n=21), the radiochemical yield was 56±5% and the radiochemical purity was 98±1%.

  4. Synthesis of [123I]IBZM: a reliable procedure for routine clinical studies

    International Nuclear Information System (INIS)

    Zea-Ponce, Yolanda; Laruelle, Marc

    1999-01-01

    The single photon emission computed tomography (SPECT) D2/D3 receptor radiotracer [123I]IBZM is prepared by electrophilic radioiodination of the precursor BZM with high-purity sodium [123I]iodide in the presence of diluted peracetic acid. However, in our hands, the most commonly used procedure for this radiosynthesis produced variable and inconsistent labeling yields, to such an extent that it became inappropriate for routine clinical studies. Our goal was to modify the labeling procedure to obtain consistently better labeling and radiochemical yields. The best conditions found for the radioiodination were as follows: 50 μg precursor in 50 μL EtOH mixed with buffer pH 2; Na[123I]I in 0.1 M NaOH (<180 μL), 50 μL diluted peracetic acid solution, heating at 65°C for 14 min. Purification was achieved by solid phase extraction (SPE) and reverse-phase high performance liquid chromatography (HPLC). Under these conditions, the average labeling yield was 76±4% (n=31); the radiochemical yield was 69±4% and the radiochemical purity was 98±1%. With larger volumes of the Na[123I]I solution the yields were consistent but lower. For example, for volumes between 417 and 523 μL the labeling yield was 61±5% (n=21), the radiochemical yield was 56±5% and the radiochemical purity was 98±1%.

  5. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks.

    Science.gov (United States)

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-04-01

    (1)H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  6. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    International Nuclear Information System (INIS)

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-01-01

    1 H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0−4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  7. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    Energy Technology Data Exchange (ETDEWEB)

    Bernini, Patrizia; Bertini, Ivano, E-mail: bertini@cerm.unifi.it; Luchinat, Claudio [University of Florence, Magnetic Resonance Center (CERM) (Italy); Nincheri, Paola; Staderini, Samuele [FiorGen Foundation (Italy); Turano, Paola [University of Florence, Magnetic Resonance Center (CERM) (Italy)

    2011-04-15

    1H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  8. AN ANALYTICAL STUDY OF EFFICACY OF CORNEAL COLLAGEN CROSSLINKING C3R PROCEDURE IN PROGRESSIVE KERATOCONUS PATIENTS

    Directory of Open Access Journals (Sweden)

    Rajasekar K

    2017-10-01

    BACKGROUND Keratoconus affects a significant number of the general population, with a conical, weakened, protruded area of the cornea caused by weakening of the corneal stroma through a genetically determined predisposition. Keratoconus is seen as a standalone disease or accompanying other syndrome manifestations in patients. Mainly the inferotemporal cornea is affected, and the conical protrusion causes profound, highly irregular myopic astigmatism as a refractive error, which is very difficult to correct in advanced stages. Especially in patients of economically productive age groups, the resulting poor vision becomes very difficult to live with. The corneal collagen crosslinking procedure is a novel tool in the armamentarium of treatment procedures against this malady. MATERIALS AND METHODS This analytical study was conducted at the cornea services of the Regional Institute of Ophthalmology and Government Ophthalmic Hospital, Chennai, over a period of 14 months. Forty-five eyes of forty patients with early progressive keratoconus who presented to the cornea services underwent riboflavin-UVA collagen crosslinking procedures using a standard protocol after informed consent was obtained. Response to treatment was assessed during the follow-up period. RESULTS Of the 40 patients in our series, 23 were male and 17 were female. Most patients were in the age group of 10 to 25 years. The epi-off procedure was performed in 31 eyes and the epi-on procedure in 14 eyes. Patients with pachymetry of 400-450 microns underwent the epi-on procedure and those with more than 450 microns underwent the epi-off C3R procedure. The K values in our series ranged from 49 D to a maximum of 63 D. Topographic flattening was seen in 52% of the epi-on and epi-off procedures. Vision improvement was 57% following epi-on and 65% following epi-off procedures. CONCLUSION C3R is a very promising therapeutic modality that may halt the progression of the ectatic process. It is a less invasive mode

  9. Th-U-PbT dating by electron probe microanalysis, Part I. Monazite: analytical procedures and data treatment

    Energy Technology Data Exchange (ETDEWEB)

    Vlach, Silvio Roberto Farias [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Inst. de Geociencias. Dept. de Mineralogia e Geotectonica], e-mail: srfvlach@usp.br

    2010-03-15

    Dating methodology by the electron probe microanalyser (EPMA) of (Th, U)-bearing minerals, highlighting monazite, acquired greater than ever importance in literature, particularly due to its superior spatial resolution, as well as versatility, which allow correlating petrological processes at times registered only in micro-scales in minerals and rocks with absolute ages. Although the accuracy is inferior to the one achieved with conventional isotopic methods in up to an order of magnitude, EPMA is the instrument that allows the best spatial resolution, reaching a few μm3 in some conditions. Quantification of minor and trace elements with suitable precision and accuracy involves the own instrumental and analytical set-ups and data treatment strategies, significantly more rigorous when compared with those applied in conventional analyses. Th-U-PbT dating is an example of these cases. Each EPMA is a unique machine as for its instrumental characteristics and respective automation system. In such a way, analytical procedures ought to be adjusted for laboratory specificities. The analytical strategies and data treatment adopted in the Electronic Microprobe Laboratory from Instituto de Geociencias of Universidade de Sao Paulo, Brazil, with a JEOL JXA8600S EPMA, and a ThermoNoran-Voyager 4.3 automation system, are presented and compared with the ones used in other laboratories. The influence of instrumental factors and spectral overlaps on Th, U, and Pb quantification is discussed. Applied procedures to interference correction, error propagation, data treatment, and final chemical age presentation as well as to sampling and analyses are emphasized. Some typical applications are discussed, drawing attention to the most relevant aspects of electron microprobe dating. (author)

  10. Th-U-PbT dating by Electron Probe Microanalysis, Part I. Monazite: analytical procedures and data treatment

    International Nuclear Information System (INIS)

    Vlach, Silvio Roberto Farias

    2010-01-01

    Dating methodology by the electron probe microanalyser (EPMA) of (Th, U)-bearing minerals, highlighting monazite, acquired greater than ever importance in literature, particularly due to its superior spatial resolution, as well as versatility, which allow correlating petrological processes at times registered only in micro-scales in minerals and rocks with absolute ages. Although the accuracy is inferior to the one achieved with conventional isotopic methods in up to an order of magnitude, EPMA is the instrument that allows the best spatial resolution, reaching a few μm3 in some conditions. Quantification of minor and trace elements with suitable precision and accuracy involves the own instrumental and analytical set-ups and data treatment strategies, significantly more rigorous when compared with those applied in conventional analyses. Th-U-PbT dating is an example of these cases. Each EPMA is a unique machine as for its instrumental characteristics and respective automation system. In such a way, analytical procedures ought to be adjusted for laboratory specificities. The analytical strategies and data treatment adopted in the Electronic Microprobe Laboratory from Instituto de Geociencias of Universidade de Sao Paulo, Brazil, with a JEOL JXA8600S EPMA, and a ThermoNoran-Voyager 4.3 automation system, are presented and compared with the ones used in other laboratories. The influence of instrumental factors and spectral overlaps on Th, U, and Pb quantification is discussed. Applied procedures to interference correction, error propagation, data treatment, and final chemical age presentation as well as to sampling and analyses are emphasized. Some typical applications are discussed, drawing attention to the most relevant aspects of electron microprobe dating. (author)

  11. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Directory of Open Access Journals (Sweden)

    Tanaka Ken-ichi

    2017-01-01

    Reliable information on the radioactivity inventory resulting from radiological characterization is important for decommissioning planning and is also crucial for promoting effective and safe decommissioning. This information is referred to when planning the decommissioning strategy and in applications to the regulator. Reliable information on the radioactivity inventory can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for the evaluation of neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with computer codes, and their validity should be verified with measurements. The evaluation of neutron-activated materials can be divided into two processes. One is the calculation of the neutron-flux distribution; the other is the activation calculation of materials. The neutron-flux distribution is calculated with neutron transport codes and an appropriate cross-section library to simulate neutron transport phenomena well. Using the neutron-flux distribution, we perform calculations of the radioactivity concentration distribution. We also estimate a time-dependent distribution of the radioactivity classification and a radioactive-waste classification. The information obtained from the evaluation is utilized in other preparatory tasks to make the decommissioning plan and the activity safe and rational.

  12. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Science.gov (United States)

    Tanaka, Ken-ichi; Ueno, Jun

    2017-09-01

    Reliable information on the radioactivity inventory resulting from radiological characterization is important for decommissioning planning and is also crucial for promoting effective and safe decommissioning. This information is referred to when planning the decommissioning strategy and in applications to the regulator. Reliable information on the radioactivity inventory can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for the evaluation of neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with computer codes, and their validity should be verified with measurements. The evaluation of neutron-activated materials can be divided into two processes. One is the calculation of the neutron-flux distribution; the other is the activation calculation of materials. The neutron-flux distribution is calculated with neutron transport codes and an appropriate cross-section library to simulate neutron transport phenomena well. Using the neutron-flux distribution, we perform calculations of the radioactivity concentration distribution. We also estimate a time-dependent distribution of the radioactivity classification and a radioactive-waste classification. The information obtained from the evaluation is utilized in other preparatory tasks to make the decommissioning plan and the activity safe and rational.
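
    The activation-calculation step can be illustrated with the simplest single-reaction activation equation; production codes solve full coupled decay chains, so the one-group, one-nuclide sketch below is only a didactic stand-in with an assumed flux and cross section.

```python
import math

def activation_bq(n_target, sigma_cm2, flux, t_irr_s, t_cool_s, half_life_s):
    """Single-reaction activation: A = N sigma phi (1 - e^(-lambda t_irr)) e^(-lambda t_cool)."""
    lam = math.log(2) / half_life_s
    return (n_target * sigma_cm2 * flux
            * (1 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_cool_s))

# Hypothetical Co-59(n,gamma)Co-60 activation of 1 g of cobalt in a thermal flux.
n_atoms = 1.0 / 58.93 * 6.022e23        # atoms of Co-59 in 1 g
sigma = 37.2e-24                        # cm^2 (thermal capture, approximate)
flux = 1.0e13                           # n/cm^2/s (assumed)
t_irr = 5 * 365.25 * 86400              # 5 years of irradiation
t_cool = 2 * 365.25 * 86400             # 2 years of cooling
t_half = 5.27 * 365.25 * 86400          # Co-60 half-life
print(activation_bq(n_atoms, sigma, flux, t_irr, t_cool, t_half))  # Bq of Co-60
```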

  13. Procedures and methods that increase reliability and reproducibility of the transplanted kidney perfusion index

    International Nuclear Information System (INIS)

    Smokvina, A.

    1994-01-01

    At different times following surgery and during various complications, 119 studies were performed on 57 patients. In many patients the studies were repeated several times. Twenty-three studies were performed in as many patients in whom normal function of the transplanted kidney was established by other diagnostic methods and retrospective analysis. The perfusion index results obtained by the method of Hilson et al. from 1978 were compared with those obtained by the author's modified method, which for calculating the index also takes into account: the time difference in the appearance of the initial portions of the artery and kidney curves; positioning of the region of interest over the distal part of the aorta; bolus injection into the arteriovenous shunt of the forearm with high specific activity of small volumes of Tc-99m labelled agents; a fast 0.5-second data collection; and a standard for normalization of numerical data. The reliability of the two methods, tested by a simulated time shift of the peak of the arterial curves, shows that the percentage deviation from the mean index value in the unmodified method is 2-5 times greater than in the modified method. The normal value of the perfusion index with the modified method is 91-171. (author)
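
    For orientation, a ratio-of-areas perfusion index can be sketched as below. The definition used here (area under the arterial curve divided by area under the kidney curve, both integrated up to the arterial peak, times 100) is the commonly cited Hilson-type formulation and only approximates the modified method described above; the time-activity curves are synthetic.

```python
import numpy as np

def perfusion_index(t, arterial, kidney):
    """Hilson-type index: ratio of areas under the arterial and kidney
    time-activity curves, integrated up to the arterial peak, times 100.
    (Illustrative approximation; the modified method adds further terms.)"""
    i_peak = int(np.argmax(arterial))
    a_art = np.trapz(arterial[: i_peak + 1], t[: i_peak + 1])
    a_kid = np.trapz(kidney[: i_peak + 1], t[: i_peak + 1])
    return 100.0 * a_art / a_kid

# Synthetic first-pass curves sampled every 0.5 s.
t = np.arange(0, 30, 0.5)
arterial = np.exp(-((t - 8.0) / 2.0) ** 2)            # bolus through the aorta
kidney = 0.8 * np.exp(-((t - 10.0) / 4.0) ** 2)       # delayed, broader uptake
print(perfusion_index(t, arterial, kidney))           # ~130 for these synthetic curves
```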

  14. Management of thyroid cytological material, pre-analytical procedures and bio-banking.

    Science.gov (United States)

    Bode-Lesniewska, Beata; Cochand-Priollet, Beatrix; Straccia, Patrizia; Fadda, Guido; Bongiovanni, Massimo

    2018-06-09

    Thyroid nodules are common and increasingly detected due to recent advances in imaging techniques. However, clinically relevant thyroid cancer is rare and the mortality from aggressive thyroid cancer remains constant. FNAC (Fine Needle Aspiration Cytology) is a standard method for diagnosing thyroid malignancy and the discrimination of malignant nodules from goiter. As the examined nodules on thyroid FNAC are often small incidental findings, it is important to maintain a low rate of undetermined diagnoses requiring further clinical work up or surgery. The most important factors determining the accuracy of the cytological diagnosis and suitability for biobanking of thyroid FNACs are the quality of the sample and availability of adequate tissue for auxiliary studies. This article analyses technical aspects (pre-analytics) of performing thyroid FNACs, including image guidance and rapid on slide evaluation (ROSE), sample collection methods (conventional slides, liquid based methods (LBC), cell blocks) and storage (bio-banking). The spectrum of the special studies (immunocytochemistry on direct slides or LBC, immunohistochemistry on cell blocks and molecular methods) required for improving the precision of the cytological diagnosis of the thyroid nodules is discussed. This article is protected by copyright. All rights reserved.

  15. Validation of DWI pre-processing procedures for reliable differentiation between human brain gliomas.

    Science.gov (United States)

    Vellmer, Sebastian; Tonoyan, Aram S; Suter, Dieter; Pronin, Igor N; Maximov, Ivan I

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) is a powerful tool in clinical applications, in particular, in oncology screening. dMRI demonstrated its benefit and efficiency in the localisation and detection of different types of human brain tumours. Clinical dMRI data suffer from multiple artefacts such as motion and eddy-current distortions, contamination by noise, outliers etc. In order to increase the image quality of the derived diffusion scalar metrics and the accuracy of the subsequent data analysis, various pre-processing approaches are actively developed and used. In the present work we assess the effect of different pre-processing procedures such as a noise correction, different smoothing algorithms and spatial interpolation of raw diffusion data, with respect to the accuracy of brain glioma differentiation. As a set of sensitive biomarkers of the glioma malignancy grades we chose the derived scalar metrics from diffusion and kurtosis tensor imaging as well as the neurite orientation dispersion and density imaging (NODDI) biophysical model. Our results show that the application of noise correction, anisotropic diffusion filtering, and cubic-order spline interpolation resulted in the highest sensitivity and specificity for glioma malignancy grading. Thus, these pre-processing steps are recommended for the statistical analysis in brain tumour studies. Copyright © 2017. Published by Elsevier GmbH.
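
    Of the pre-processing steps compared, anisotropic diffusion filtering is the easiest to illustrate compactly. The sketch below is a plain 2D Perona-Malik filter applied to a single noisy slice, offered as a generic stand-in rather than the specific implementation evaluated in the study.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=30.0, gamma=0.15):
    """2D Perona-Malik anisotropic diffusion with exponential conductance."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences to the four neighbours (periodic wrap at the borders
        # for brevity; a production filter would use reflecting boundaries).
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping conductance: small across strong gradients.
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

# Noisy synthetic slice: a bright square on a dark background.
rng = np.random.default_rng(0)
slice_ = np.zeros((64, 64))
slice_[20:44, 20:44] = 100.0
noisy = slice_ + rng.normal(0, 10, slice_.shape)
print(np.std(noisy - slice_), np.std(perona_malik(noisy) - slice_))  # residual noise drops
```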

  16. Validation of a new analytical procedure for determination of residual solvents in [18F]FDG by gas chromatography

    International Nuclear Information System (INIS)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D.

    2017-01-01

    Fludeoxyglucose F 18 ([18F]FDG) is the most used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [18F]FDG; however, they might not be completely removed during the purification steps. The determination of residual solvents in [18F]FDG is required in the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work a simple and fast (4-minute) analytical procedure was developed and validated for determination of residual solvents in [18F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53-mm x 30 m fused-silica column. Validation included the evaluation of various parameters, such as: specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for routine quality control of [18F]FDG. (author)

  17. Validation of a new analytical procedure for determination of residual solvents in [18F]FDG by gas chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D., E-mail: flaviabiomedica@yahoo.com.br [Centro de Desenvolvimento da Tecnologia Nuclear (UPPR/CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Unidade de Pesquisa e Produção de Radiofármacos

    2017-07-01

    Fludeoxyglucose F 18 ([18F]FDG) is the most used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [18F]FDG; however, they might not be completely removed during the purification steps. The determination of residual solvents in [18F]FDG is required in the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work a simple and fast (4-minute) analytical procedure was developed and validated for determination of residual solvents in [18F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53-mm x 30 m fused-silica column. Validation included the evaluation of various parameters, such as: specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for routine quality control of [18F]FDG. (author)

  18. "INTRODUCING A FULL VALIDATED ANALYTICAL PROCEDURE AS AN OFFICIAL COMPENDIAL METHOD FOR FENTANYL TRANSDERMAL PATCHES"

    Directory of Open Access Journals (Sweden)

    Amir Mehdizadeh

    2005-04-01

    A simple, sensitive and specific HPLC method and a simple, fast extraction procedure were developed for the quantitative analysis of fentanyl transdermal patches. Chloroform, methanol and ethanol were used as extracting solvents, with recoveries of 92.1, 94.3 and 99.4%, respectively. Fentanyl was extracted with ethanol and the fentanyl eluted through the C18 column was monitored by UV detection at 230 nm. The linearity was in the range of 0.5-10 µg/mL with a correlation coefficient (r2) of 0.9992. Both intra- and inter-day accuracy and precision were within acceptable limits. The detection limit (DL) and quantitation limit (QL) were 0.15 and 0.5 µg/mL, respectively. Other validation characteristics such as selectivity, robustness and ruggedness were evaluated. Following method validation, a system suitability test (SST) including capacity factor (k´), plate number (N), tailing factor (T), and RSD was defined for routine testing.

  19. Analytical procedure for experimental quantification of carrier concentration in semiconductor devices by using electric scanning probe microscopy

    International Nuclear Information System (INIS)

    Fujita, Takaya; Matsumura, Koji; Itoh, Hiroshi; Fujita, Daisuke

    2014-01-01

    Scanning capacitance microscopy (SCM) is based on a contact-mode variant of atomic force microscopy, which is used for imaging two-dimensional carrier (electrons and holes) distributions in semiconductor devices. We introduced a method of quantification of the carrier concentration by experimentally deduced calibration curves, which were prepared for semiconductor materials such as silicon and silicon carbide. The analytical procedure was circulated to research organizations in a round-robin test. The effectiveness of the method was confirmed for practical analysis and for what is expected for industrial pre-standardization from the viewpoint of comparability among users. It was also applied to other electric scanning probe microscopy techniques such as scanning spreading resistance microscopy and scanning nonlinear dielectric microscopy. Their depth profiles of carrier concentration were found to be in good agreement with those characterized by SCM. These results suggest that our proposed method will be compatible with future next-generation microscopy. (paper)

  20. Analytical validation of an ultraviolet-visible procedure for determining lutein concentration and application to lutein-loaded nanoparticles.

    Science.gov (United States)

    Silva, Jéssica Thaís do Prado; Silva, Anderson Clayton da; Geiss, Julia Maria Tonin; de Araújo, Pedro Henrique Hermes; Becker, Daniela; Bracht, Lívia; Leimann, Fernanda Vitória; Bona, Evandro; Guerra, Gustavo Petri; Gonçalves, Odinei Hess

    2017-09-01

    Lutein is a carotenoid with known anti-inflammatory and antioxidant properties. Lutein-rich diets have been associated with neurological improvement as well as reduction of the risk of vision loss due to Age-Related Macular Degeneration (AMD). Micro- and nanoencapsulation have been demonstrated to be effective techniques for protecting lutein against degradation and also for improving its bioavailability. However, the actual lutein concentration inside the capsules and the encapsulation efficiency are key parameters that must be precisely known when designing in vitro and in vivo tests. In this work an analytical procedure was validated for the determination of the actual lutein content in zein nanoparticles using ultraviolet-visible spectroscopy. Method validation followed the International Conference on Harmonisation (ICH) guidelines, which evaluate linearity, detection limit, quantification limit, accuracy and precision. The validated methodology was applied to characterize lutein-loaded nanoparticles. Copyright © 2017 Elsevier Ltd. All rights reserved.
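
    The quantification behind such a UV-visible procedure is a straight calibration-curve fit followed by interpolation of the sample absorbance; the sketch below, with invented absorbance readings and a hypothetical nominal loading, also shows how an encapsulation efficiency would then follow. None of the numbers correspond to the validated method.

```python
import numpy as np

# Hypothetical calibration: lutein standards (ug/mL) vs. absorbance near 445 nm.
conc_std = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
abs_std = np.array([0.052, 0.101, 0.205, 0.398, 0.601, 0.795, 0.998])

slope, intercept = np.polyfit(conc_std, abs_std, 1)
r2 = np.corrcoef(conc_std, abs_std)[0, 1] ** 2     # linearity check

def lutein_conc(absorbance, dilution=1.0):
    """Lutein concentration (ug/mL) in a diluted nanoparticle digest."""
    return (absorbance - intercept) / slope * dilution

measured = lutein_conc(0.487, dilution=10.0)       # ug/mL found in the digest
nominal = 55.0                                     # ug/mL theoretically loaded (assumed)
print(r2, measured, 100.0 * measured / nominal)    # encapsulation efficiency, %
```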

  1. Development of analytical procedures for the determination of hexavalent chromium in corrosion prevention coatings used in the automotive industry.

    Science.gov (United States)

    Séby, F; Castetbon, A; Ortega, R; Guimon, C; Niveau, F; Barrois-Oudin, N; Garraud, H; Donard, O F X

    2008-05-01

    The European directive 2000/53/EC limits the use of Cr(VI) in vehicle manufacturing. Although a maximum of 2 g of Cr(VI) was authorised per vehicle for corrosion prevention coatings of key components, since July 2007 its use has been prohibited except for some particular applications. Therefore, the objective of this work was to develop direct analytical procedures for Cr(VI) determination in the different steel coatings used for screws. Instead of working directly with screws, the optimisation of the procedures was carried out with homogeneously coated metallic plates to improve data comparability. Extraction of Cr(VI) from the metallic parts was performed by sonication. Two extraction solutions were tested: a direct water extraction solution used in standard protocols and an ammonium/ammonia buffer solution at pH 8.9. The extracts were further analysed for Cr speciation by high-performance liquid chromatography (HPLC) inductively coupled plasma (ICP) atomic emission spectrometry or HPLC ICP mass spectrometry, depending on the concentration level. When possible, the coatings were also directly analysed by solid speciation techniques (X-ray photoelectron spectroscopy, XPS, and X-ray absorption near-edge structure, XANES) for validation of the results. Very good agreement between the different analytical approaches was obtained for the coating sample made up of a heated paint containing Zn, Al and Cr when using the extracting buffer solution at pH 8.9. After a repeated four-step extraction procedure on the same test portion, taking into account the depth of the surface layer reached, good agreement with XPS and XANES results was obtained. In contrast, for the coatings composed of an alkaline Zn layer on which Cr(VI) and Cr(III) are deposited, only the extraction procedure using water allowed the detection of Cr(VI). To elucidate the Cr(VI) reduction during extraction at pH 8.9, the reactivity of Cr(VI) towards different species of Zn generally present in the

  2. Analytical procedures used by the uranium - radon - radium geochemistry group; Methodes d'analyses utilisees par la section de geochimie uranium, radon, radium

    Energy Technology Data Exchange (ETDEWEB)

    Berthollet, P [Commissariat a l' Energie Atomique, Fontenay-aux-Roses (France). Centre d' Etudes Nucleaires

    1968-07-01

    The analytical methods described are applied to the geochemical prospecting of uranium. The nature of the material under investigation, which may be soil, alluvium, rock, plant or water, and the particular requirements of geochemical exploration, have prompted us to adjust the widely used conventional methods to the demands of large scale operation, without lowering their standards of accuracy and reliability. These procedures are explained in great detail. Though most of this technical information may appear superfluous to the chemical engineer well versed in trace element determination, it will, however, serve a useful purpose both with the operator in charge of routine testing and with the chemist called upon to interpret results. (author)

  3. Reliability of the fuel identification procedure used by COGEMA during cask loading for shipment to LA HAGUE

    International Nuclear Information System (INIS)

    Pretesacque, P.; Eid, M.; Zachar, M.

    1993-01-01

    This study was carried out to demonstrate the reliability of the spent fuel identification system used by COGEMA and NTL prior to shipment to the reprocessing plant of La Hague. This was a prerequisite for the French competent authority to accept the 'burnup credit' assumption in the criticality assessment of spent fuel packages. The probability of loading a non-irradiated and non-specified fuel assembly was considered acceptable if our identification and irradiation status measurement procedures were used. Furthermore, the task analysis enabled us to improve the working conditions at reactor sites and the quality of the working documentation, and consequently to improve the reliability of the system. The NTL experience of transporting to La Hague, as consignor, more than 10,000 fuel assemblies since the implementation of our system in 1984, without any non-conformance in fuel identification, validated the formalism of this study as well as our assumptions on basic event probabilities. (J.P.N.)

  4. An analytical procedure for determination of sulphur species and isotopes in boreal acid sulphate soils and sediments

    Directory of Open Access Journals (Sweden)

    K. BACKLUND

    2008-12-01

    An analytical scheme suitable for boreal acid sulphate (AS) soils and sediments was developed on the basis of existing methods. The presented procedure can be used to quantify and discriminate among acid volatile sulphide, cold chromium reducible sulphur, hot chromium reducible sulphur, elemental sulphur, sulphate sulphur, organic sulphur, total reducible sulphur and total sulphur. The sulphur fractions are recovered as either Ag2S or BaSO4 precipitates and can further be used for isotope analysis. Overlaps between sulphur species are common during speciation and must be minimized. Some of these overlaps are caused by poor sampling and storage, inappropriate conditions during the distillation, or natural variations in the sample (e.g. Fe3+ interference and grain size). The procedural impact was determined by conducting tests on both artificial and natural samples containing one or several sulphur species. The method is applied to reduced sediment from an AS soil locality (Överpurmo) and a brackish lake (Larsmo Lake) in western Finland, and the results, including S-isotopes, are discussed.

  5. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

    NREL develops laboratory analytical procedures (LAPs) for the analysis of raw and upgraded pyrolysis bio-oils. These standard procedures have been validated and allow for reliable bio-oil analysis.

  6. Development of quantitative analytical procedures on two-phase flow in tight-lattice fuel bundles for reduced-moderation light-water reactors

    International Nuclear Information System (INIS)

    Ohnuki, A.; Kureta, M.; Takae, K.; Tamai, H.; Akimoto, H.; Yoshida, H.

    2004-01-01

    The research project to investigate thermal-hydraulic performance in tight-lattice rod bundles for the Reduced-Moderation Water Reactor (RMWR) started at the Japan Atomic Energy Research Institute (JAERI) in 2002. The RMWR is a light water reactor for which a conversion ratio higher than one can be expected. In order to attain this higher conversion ratio, triangular tight-lattice fuel bundles with a gap spacing of around 1 mm between fuel rods are required. For the thermal design of the RMWR core, conventional analytical methods are not adequate because the conventional constitutive equations cannot predict the RMWR core with high accuracy. Development of new quantitative analytical procedures was therefore carried out. These analytical procedures are constructed from model experiments and advanced two-phase flow analysis codes. This paper describes the results of the model experiments and the analytical results obtained with the developed analysis codes. (authors)

  7. Risk and reliability allocation to risk control

    International Nuclear Information System (INIS)

    Vojnovic, D.; Kozuh, M.

    1992-01-01

    The risk allocation procedure is used as an analytical model to support optimal decision making for reliability/availability improvement planning. Both levels of decision criteria, the plant risk measures and the plant performance indices, are used in the risk allocation procedure. The decision support system uses the multi-objective decision-making concept. (author)

  8. Prepared for the thirtieth annual conference on bioassay analytical and environmental chemistry. Reliable analysis of high resolution gamma spectra

    International Nuclear Information System (INIS)

    Spitz, H.B.; Buschbom, R.; Rieksts, G.A.; Palmer, H.E.

    1985-01-01

    A new method has been developed to reliably analyze pulse height-energy spectra obtained from measurements employing high resolution germanium detectors. The method employs a simple data transformation and smoothing function to calculate the background, identify photopeaks, and support isotopic analysis. This technique is elegant in its simplicity because it avoids dependence upon complex spectrum deconvolution, stripping, or other least-squares-fitting techniques which complicate the assessment of measurement reliability. A moving median was chosen for data smoothing because, unlike moving averages, medians are not dominated by extreme data points. Finally, peaks are identified whenever the difference between the background spectrum and the transformed spectrum exceeds a pre-determined number of standard deviations.
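
    A compact sketch of the idea described, a moving-median background estimate followed by flagging channels that exceed it by a preset number of standard deviations, is given below; the window width, threshold and Poisson-based sigma are assumptions, not the authors' exact parameters.

```python
import numpy as np

def find_peaks_median(counts, window=51, n_sigma=4.0):
    """Flag channels where counts exceed a moving-median background
    by n_sigma * sqrt(background) (Poisson counting statistics)."""
    counts = np.asarray(counts, dtype=float)
    half = window // 2
    padded = np.pad(counts, half, mode="edge")
    background = np.array([np.median(padded[i:i + window])
                           for i in range(counts.size)])
    excess = counts - background
    return np.where(excess > n_sigma * np.sqrt(np.maximum(background, 1.0)))[0]

# Synthetic spectrum: smooth continuum plus a photopeak near channel 800.
rng = np.random.default_rng(1)
ch = np.arange(2048)
spectrum = rng.poisson(200 * np.exp(-ch / 900.0) + 50)
spectrum[795:806] += rng.poisson(300, 11)
print(find_peaks_median(spectrum))   # channels near 800 are flagged
```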

  9. A sensitive analytical procedure for monitoring acrylamide in environmental water samples by offline SPE-UPLC/MS/MS.

    Science.gov (United States)

    Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène

    2015-05-01

    The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L(-1)) compatible with toxicity threshold values. The solid phase extraction (SPE), optimized with respect to the nature of the extraction phases, sampling volumes, and elution solvent, was validated according to ISO Standard ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces high variability during the SPE step, therefore requiring the use of C(13)-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty about 25% (k = 2) at the limit of quantification level). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.

  10. 40Ar/39Ar geochronology at the Instituto de Geociências, USP: instrumentation, analytical procedures, and calibration

    Directory of Open Access Journals (Sweden)

    PAULO M. VASCONCELOS

    2002-06-01

    Laser-heating 40Ar/39Ar geochronology provides high analytical precision and accuracy, μm-scale spatial resolution, and statistically significant data sets for the study of geological and planetary processes. A newly commissioned 40Ar/39Ar laboratory at CPGeo/USP, São Paulo, Brazil, equips the Brazilian scientific community with a powerful new tool applicable to the study of geological and cosmochemical processes. Detailed information about laboratory layout, environmental conditions, and instrumentation provides the necessary parameters for evaluating the suitability of the CPGeo/USP 40Ar/39Ar facility for a diverse range of applications. Details about analytical procedures, including mineral separation, irradiation at the IPEN/CNEN reactor at USP, and mass spectrometric analysis, enable potential researchers to design the sampling and sample preparation program suitable to the objectives of their study. Finally, the results of calibration tests using Ca and K salts and glasses, international mineral standards, and in-house mineral standards show that the accuracy and precision obtained at the 40Ar/39Ar laboratory at CPGeo/USP are comparable to results obtained in the most respected laboratories internationally. The extensive calibration and standardization procedures undertaken ensure that the results of analytical studies carried out in our laboratories will gain immediate international credibility, enabling Brazilian students and scientists to conduct forefront research in earth and planetary sciences.

  11. Optimisation (sampling strategies and analytical procedures) for site specific environment monitoring at the areas of uranium production legacy sites in Ukraine - 59045

    International Nuclear Information System (INIS)

    Voitsekhovych, Oleg V.; Lavrova, Tatiana V.; Kostezh, Alexander B.

    2012-01-01

    There are many sites in the world where the environment is still affected by contamination from past uranium production. The authors' experience shows that a lack of site characterization data and incomplete or unreliable environmental monitoring studies can significantly limit the quality of the safety assessment procedures and priority-action analyses needed for remediation planning. During recent decades the analytical laboratories of the many enterprises currently responsible for establishing site-specific environmental monitoring programs have significantly improved their sampling and analytical capabilities. However, limited experience in planning an optimal site-specific sampling strategy, and in applying the required analytical techniques, such as modern alpha-beta radiometers, gamma and alpha spectrometry, and liquid-scintillation methods for the determination of U-Th series radionuclides in the environment, does not allow these laboratories to develop and conduct efficient monitoring programs as a basis for further safety assessment in decision-making procedures. This paper presents conclusions gained from the experience of establishing monitoring programs in Ukraine and proposes practical steps to optimize the sampling strategy planning and analytical procedures to be applied to areas requiring safety assessment and justification of their potential remediation and safe management. (authors)

  12. Background Contamination by Coplanar Polychlorinated Biphenyls (PCBS) in Trace Level High Resolution Gas Chromatography/High Resolution Mass Spectrometry (HRGC/HRMS) Analytical Procedures

    Science.gov (United States)

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for t...

  13. Usefulness of analytical parameters in the management of paediatric patients with suspicion of acute pyelonephritis. Is procalcitonin reliable?

    Science.gov (United States)

    Bañuelos-Andrío, L; Espino-Hernández, M; Ruperez-Lucas, M; Villar-Del Campo, M C; Romero-Carrasco, C I; Rodríguez-Caravaca, G

    To investigate the usefulness of procalcitonin (PCT) and other analytical parameters (white blood cell count [WBC], C-reactive protein [CRP]) as markers of acute renal damage in children after a first febrile or afebrile urinary tract infection (UTI). A retrospective study was conducted on children with a first episode of UTI admitted between January 2009 and December 2011, in whom serum PCT, CRP, and WBC were measured and acute renal damage was assessed by renal scintigraphy with 99mTc-DMSA (DMSA) within the first 72 h after referral. A descriptive study was performed and ROC curves were plotted, with optimal cut-off points calculated for each parameter. The 101 enrolled patients were divided into two groups according to the DMSA scintigraphy results: 64 patients were classified with acute pyelonephritis (APN) and 37 with UTI. The mean WBC, CRP and PCT values were significantly higher in patients with APN than in those with a normal acute DMSA. The area under the ROC curve was 0.862 for CRP, 0.774 for WBC, and 0.731 for PCT. The optimum statistical cut-off value for PCT was 0.285 ng/ml (sensitivity 71.4% and specificity 75%). Although the mean levels of fever, WBC, CRP, and PCT were significantly higher in patients with APN than in those who had UTI, the sensitivity and specificity of these analytical parameters are insufficient to predict the existence of acute renal damage, making the contribution of renal DMSA scintigraphy essential. Copyright © 2016 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
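
    A hedged sketch of how an optimal marker cut-off of this kind can be derived from a ROC curve; the Youden-index criterion and the scikit-learn helpers are assumptions chosen for illustration, since the record does not state which rule was used.

        import numpy as np
        from sklearn.metrics import roc_curve, auc

        def optimal_cutoff(y_true, marker_values):
            """Return (area under the ROC curve, cut-off) for a continuous marker.

            y_true        : 1 for APN (abnormal acute DMSA), 0 otherwise
            marker_values : serum concentrations, e.g. PCT in ng/ml
            The cut-off maximises Youden's J = sensitivity + specificity - 1,
            an assumed criterion used here only for illustration.
            """
            fpr, tpr, thresholds = roc_curve(y_true, marker_values)
            area = auc(fpr, tpr)
            j = tpr - fpr                      # Youden's J at each threshold
            return area, thresholds[np.argmax(j)]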

  14. Accuracy of a Classical Test Theory-Based Procedure for Estimating the Reliability of a Multistage Test. Research Report. ETS RR-17-02

    Science.gov (United States)

    Kim, Sooyeon; Livingston, Samuel A.

    2017-01-01

    The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…

  15. Reliable screening of various foodstuffs with respect to their irradiation status: A comparative study of different analytical techniques

    International Nuclear Information System (INIS)

    Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho

    2013-01-01

    Cost-effective and time-efficient analytical techniques are required to screen large food lots according to their irradiation status. Gamma-irradiated (0–10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red peppers. Black pepper had intermediate results (700–5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods can be used considering their potential applications for the screening analysis of irradiated foods. - Highlights: • Detection of irradiated food is important to enforce the applied regulations. • Gamma-irradiated spices were investigated to confirm their irradiation status. • Screening techniques such as PSL, DEFT/APC, and E-nose were tested. • Specificity and potential applications of the screening techniques were evaluated. • The screening results were confirmed by the thermoluminescence technique.

  16. Evaluation of the Most Reliable Procedure of Determining Jump Height During the Loaded Countermovement Jump Exercise: Take-Off Velocity vs. Flight Time.

    Science.gov (United States)

    Pérez-Castilla, Alejandro; García-Ramos, Amador

    2018-07-01

    Pérez-Castilla, A and García-Ramos, A. Evaluation of the most reliable procedure of determining jump height during the loaded countermovement jump exercise: Take-off velocity vs. flight time. J Strength Cond Res 32(7): 2025-2030, 2018. This study aimed to compare the reliability of jump height between the 2 standard procedures of analyzing force-time data (take-off velocity [TOV] and flight time [FT]) during the loaded countermovement jump (CMJ) exercise performed with a free-weight barbell and in a Smith machine. The jump height of 17 men (age: 22.2 ± 2.2 years, body mass: 75.2 ± 7.1 kg, and height: 177.0 ± 6.0 cm) was tested in 4 sessions (twice for each CMJ type) against external loads of 17, 30, 45, 60, and 75 kg. Jump height reliability was comparable between the TOV (coefficient of variation [CV]: 6.42 ± 2.41%) and FT (CV: 6.53 ± 2.17%) during the free-weight CMJ, but it was higher for the FT when the CMJ was performed in a Smith machine (CV: 11.34 ± 3.73% for TOV and 5.95 ± 1.12% for FT). Bland-Altman plots revealed trivial differences (≤0.27 cm) and no heteroscedasticity of the errors (R ≤ 0.09) for the jump height obtained by the TOV and FT procedures, whereas the random error between both procedures was higher for the CMJ performed in the Smith machine (2.02 cm) compared with the free-weight barbell (1.26 cm). Based on these results, we recommend the FT procedure to determine jump height during the loaded CMJ performed in a Smith machine, whereas the TOV and FT procedures provide similar reliability during the free-weight CMJ.
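
    The two procedures compared above rest on standard projectile kinematics: jump height can be computed from the take-off velocity as h = v²/(2g) or from the flight time as h = g·t²/8. A short sketch, with an illustrative numerical example (the values are not from the study):

        G = 9.81  # gravitational acceleration, m/s^2

        def jump_height_from_tov(takeoff_velocity_m_s):
            """Jump height from take-off velocity: h = v^2 / (2 g)."""
            return takeoff_velocity_m_s ** 2 / (2.0 * G)

        def jump_height_from_ft(flight_time_s):
            """Jump height from flight time: h = g t^2 / 8."""
            return G * flight_time_s ** 2 / 8.0

        # Illustrative values: a 0.55 s flight time and a 2.70 m/s take-off
        # velocity both give a jump height of about 0.37 m.
        print(jump_height_from_ft(0.55), jump_height_from_tov(2.70))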

  17. Reliable computation of roots in analytical waveguide modeling using an interval-Newton approach and algorithmic differentiation.

    Science.gov (United States)

    Bause, Fabian; Walther, Andrea; Rautenberg, Jens; Henning, Bernd

    2013-12-01

    For the modeling and simulation of wave propagation in geometrically simple waveguides such as plates or rods, one may employ the analytical global matrix method. That is, a certain (global) matrix depending on the two parameters wavenumber and frequency is built. Subsequently, one must calculate all parameter pairs within the domain of interest where the global matrix becomes singular. For this purpose, one could compute all roots of the determinant of the global matrix when the two parameters vary in the given intervals. This requirement to calculate all roots is actually the method's most concerning restriction. Previous approaches are based on so-called mode-tracers, which use the physical phenomenon that solutions, i.e., roots of the determinant of the global matrix, appear in a certain pattern, the waveguide modes, to limit the root-finding algorithm's search space with respect to consecutive solutions. In some cases, these reductions of the search space yield only an incomplete set of solutions, because some roots may be missed as a result of uncertain predictions. Therefore, we propose replacement of the mode-tracer approach with a suitable version of an interval-Newton method. To apply this interval-based method, we extended the interval and derivative computation provided by a numerical computing environment such that corresponding information is also available for Bessel functions used in circular models of acoustic waveguides. We present numerical results for two different scenarios. First, a polymeric cylindrical waveguide is simulated, and second, we show simulation results of a one-sided fluid-loaded plate. For both scenarios, we compare results obtained with the proposed interval-Newton algorithm and commercial software.
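
    A simplified one-dimensional sketch of the interval-Newton idea described above (the paper works on the determinant of the global matrix over two parameters; the function, derivative enclosure, and tolerance below are illustrative assumptions). The routine contracts or bisects sub-intervals until all root enclosures on the search interval are found; the final sign-change check discards empty boxes but would also miss roots of even multiplicity.

        def interval_newton(f, dbounds, lo, hi, tol=1e-10):
            """Enclose the roots of f on [lo, hi] with a 1-D interval-Newton search.

            f       : real-valued function
            dbounds : dbounds(a, b) -> (dlo, dhi), interval enclosure of f' on [a, b]
            Returns a list of narrow intervals, each bracketing a root.
            """
            roots, work = [], [(lo, hi)]
            while work:
                a, b = work.pop()
                if b - a < tol:
                    if f(a) * f(b) <= 0.0:       # keep only verified enclosures
                        roots.append((a, b))
                    continue
                m = 0.5 * (a + b)
                fm = f(m)
                dlo, dhi = dbounds(a, b)
                if dlo <= 0.0 <= dhi:
                    # Derivative enclosure contains zero: bisect instead of dividing.
                    work.extend([(a, m), (m, b)])
                    continue
                q1, q2 = fm / dlo, fm / dhi
                nlo, nhi = m - max(q1, q2), m - min(q1, q2)
                a2, b2 = max(a, nlo), min(b, nhi)  # intersect Newton step with box
                if a2 > b2:
                    continue                       # empty intersection: no root here
                if (b2 - a2) > 0.8 * (b - a):
                    mid = 0.5 * (a2 + b2)          # slow contraction: bisect
                    work.extend([(a2, mid), (mid, b2)])
                else:
                    work.append((a2, b2))
            return roots

        # Example: enclose both roots of x**2 - 2 on [-3, 3]; f'(x) = 2x, so an
        # enclosure of the derivative on [a, b] is simply (2a, 2b).
        enclosures = interval_newton(lambda x: x * x - 2.0,
                                     lambda a, b: (2.0 * a, 2.0 * b),
                                     -3.0, 3.0)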

  18. Optimization of instrumental neutron activation analysis method by means of 2k experimental design technique aiming the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2k experimental design was applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted in the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)

  19. Optimization of instrumental neutron activation analysis method by means of 2k experimental design technique aiming the validation of analytical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: rpetroni@ipen.br, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this study optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2k experimental design was applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted in the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
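
    A brief sketch of how main and interaction effects are estimated from a full 2k (here 2³) design such as the one described in the two records above; the coded factors follow the abstract (decay time, counting time, sample-to-detector distance), while the response values are purely hypothetical.

        import itertools
        import numpy as np

        # Coded levels (-1, +1) for the three varied factors: sample decay time,
        # counting time and sample-to-detector distance.
        runs = np.array(list(itertools.product([-1, 1], repeat=3)))   # 8 runs
        # Hypothetical responses (e.g. measured mass fractions), for illustration only.
        y = np.array([10.2, 10.8, 10.1, 10.9, 10.3, 11.0, 10.2, 10.8])

        def effect(factor_columns):
            """Main effect or interaction: mean response at +1 minus mean at -1."""
            contrast = np.prod(runs[:, factor_columns], axis=1)
            return 2.0 * np.mean(contrast * y)

        decay_effect = effect([0])          # main effect of decay time
        counting_effect = effect([1])       # main effect of counting time
        decay_x_counting = effect([0, 1])   # two-factor interaction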

  20. The analytic hierarchy process as a systematic approach to the identification of important parameters for the reliability assessment of passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Cantarella, M.; Cammi, A.

    2003-01-01

    Passive systems play a crucial role in the development of future solutions for nuclear plant technology. A fundamental issue still to be resolved is the quantification of the reliability of such systems. In this paper, we first illustrate a systematic methodology to guide the definition of the failure criteria of a passive system and the evaluation of its probability of occurrence, through the identification of the relevant system parameters and the propagation of their associated uncertainties. Within this methodology, we propose the use of the analytic hierarchy process as a structured and reproducible tool for the decomposition of the problem and the identification of the dominant system parameters. An example of its application to a real passive system is illustrated in detail
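
    A compact sketch of the core analytic hierarchy process computation implied above: the priority weights of the compared parameters are taken as the principal eigenvector of a pairwise comparison matrix, and a consistency ratio checks the judgments. The 3x3 comparison matrix below is hypothetical, not taken from the paper.

        import numpy as np

        # Saaty's random-index values used for the consistency ratio (n >= 3).
        RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

        def ahp_weights(pairwise):
            """Priority weights and consistency ratio from a pairwise comparison matrix."""
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)                   # principal eigenvalue
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                                  # normalised priority vector
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)          # consistency index
            return w, ci / RI[n]                          # weights, consistency ratio

        # Hypothetical pairwise comparison of three system parameters:
        weights, cr = ahp_weights([[1.0, 3.0, 5.0],
                                   [1/3, 1.0, 2.0],
                                   [1/5, 1/2, 1.0]])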

  1. A column exchange chromatographic procedure for the automated purification of analytical samples in nuclear spent fuel reprocessing and plutonium fuel fabrication

    International Nuclear Information System (INIS)

    Zahradnik, P.; Swietly, H.; Doubek, N.; Bagliano, G.

    1992-11-01

    A column exchange chromatographic procedure using Tri-n-Octyl-Phosphine-Oxide (TOPO) as the stationary phase has been in routine use at SAL since 1984 on nuclear spent fuel reprocessing and Pu product samples, prior to alpha and mass spectrometric analysis. This standard procedure was subsequently modified with a view to its automation in a glove box; the resulting new procedure is described in this paper. Laboratory Robot Compatible (LRC) disposable columns were selected because their dimensions are particularly favorable and reproducible. A less corrosive HNO3-HI mixture substituted the former HCl-HI plutonium eluant. The inorganic support of the stationary phase used to test the above mentioned changes was unexpectedly withdrawn from the market, so another support had to be selected and the procedure reoptimized accordingly. The resulting procedure was tested with the robot and validated against the manual procedure taken as reference: the comparison showed that the modified procedure meets the analytical requirements and has the same performance as the original procedure. (author). Refs, figs and tabs

  2. Measurement methods to assess diastasis of the rectus abdominis muscle (DRAM): A systematic review of their measurement properties and meta-analytic reliability generalisation.

    Science.gov (United States)

    van de Water, A T M; Benjamin, D R

    2016-02-01

    Systematic literature review. Diastasis of the rectus abdominis muscle (DRAM) has been linked with low back pain, abdominal and pelvic dysfunction. Measurement is used either to screen for or to monitor DRAM width. Determining which methods are suitable for screening and monitoring DRAM is of clinical value. To identify the best methods to screen for DRAM presence and monitor DRAM width. AMED, Embase, Medline, PubMed and CINAHL databases were searched for measurement property studies of DRAM measurement methods. Population characteristics, measurement methods/procedures and measurement information were extracted from included studies. Quality of all studies was evaluated using 'quality rating criteria'. When possible, reliability generalisation was conducted to provide combined reliability estimations. Thirteen studies evaluated measurement properties of the 'finger width'-method, tape measure, calipers, ultrasound, CT and MRI. Ultrasound was most evaluated. Methodological quality of these studies varied widely. Pearson's correlations of r = 0.66-0.79 were found between calipers and ultrasound measurements. Calipers and ultrasound had Intraclass Correlation Coefficients (ICC) of 0.78-0.97 for test-retest, inter- and intra-rater reliability. The 'finger width'-method had weighted Kappas of 0.73-0.77 for test-retest reliability, but moderate agreement (63%; weighted Kappa = 0.53) between raters. Comparing calipers and ultrasound, low measurement error was found (above the umbilicus), and the methods had good agreement (83%; weighted Kappa = 0.66) for discriminative purposes. The available information supports ultrasound and calipers as adequate methods to assess DRAM. For other methods limited measurement information of low to moderate quality is available and further evaluation of their measurement properties is required. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Review of cause-based decision tree approach for the development of domestic standard human reliability analysis procedure in low power/shutdown operation probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2003-01-01

    We review the Cause-Based Decision Tree (CBDT) approach to decide whether to incorporate it into the development of a domestic standard Human Reliability Analysis (HRA) procedure for low power/shutdown operation Probabilistic Safety Assessment (PSA). In this paper, we introduce the cause-based decision tree approach, quantify human errors using it, and identify its merits and demerits in comparison with the previously used THERP. The review results show that it is difficult to incorporate the CBDT method into the development of the domestic standard HRA procedure for low power/shutdown PSA because the CBDT method requires the subjective judgment of the HRA analyst, as does THERP. However, it is expected that incorporating the CBDT method into the development of the domestic standard HRA procedure, only for the comparison of quantitative HRA results, will relieve the burden of developing a detailed HRA procedure and will help maintain consistent quantitative HRA results

  4. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    Science.gov (United States)

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects that are analytical procedures applied to furan determination in food samples. They are described by 10 variables related to their analytical performance and environmental and economic aspects. Multivariate statistical analysis allows the amount of input data for the ranking analysis to be limited. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other. The applicability of the combination of grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
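
    A minimal sketch of the TOPSIS ranking step that such a multicriteria assessment typically relies on; the criterion weights and the benefit/cost directions in the example are assumptions, since the record does not list them.

        import numpy as np

        def topsis(matrix, weights, benefit):
            """Rank alternatives with TOPSIS; returns closeness to the ideal solution.

            matrix  : alternatives x criteria performance table
            weights : criterion weights summing to 1
            benefit : True where larger values are better, False for cost criteria
            """
            X = np.asarray(matrix, dtype=float)
            w = np.asarray(weights, dtype=float)
            V = w * X / np.linalg.norm(X, axis=0)           # weighted, normalised
            ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
            anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
            d_best = np.linalg.norm(V - ideal, axis=1)
            d_worst = np.linalg.norm(V - anti, axis=1)
            return d_worst / (d_best + d_worst)              # larger = better rank

        # Hypothetical example: 3 procedures scored on recovery (%), solvent use (mL)
        # and cost per sample, with assumed weights and criterion directions.
        scores = topsis([[95.0, 5.0, 12.0],
                         [88.0, 2.0,  8.0],
                         [92.0, 9.0, 15.0]],
                        weights=[0.5, 0.3, 0.2],
                        benefit=[True, False, False])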

  5. Different Analytical Procedures for the Study of Organic Residues in Archeological Ceramic Samples with the Use of Gas Chromatography-mass Spectrometry.

    Science.gov (United States)

    Kałużna-Czaplińska, Joanna; Rosiak, Angelina; Kwapińska, Marzena; Kwapiński, Witold

    2016-01-01

    The analysis of the composition of organic residues present in pottery is an important source of information for historians and archeologists. Chemical characterization of the materials provides information on diets, habits, technologies, and original use of the vessels. This review presents the problem of analytical studies of archeological materials with a special emphasis on organic residues. Current methods used in the determination of different organic compounds in archeological ceramics are presented. Particular attention is paid to the procedures of analysis of archeological ceramic samples used before gas chromatography-mass spectrometry. Advantages and disadvantages of different extraction methods and application of proper quality assurance/quality control procedures are discussed.

  6. Preliminary study for the reliability Assurance on results and procedure of the out-pile mechanical characterization test for a fuel assembly; Lateral Vibration Test (I)

    International Nuclear Information System (INIS)

    Lee, Kang Hee; Yoon, Kyung Hee; Kim, Hyung Kyu

    2007-01-01

    The reliability assurance with respect to the test procedure and results of the out-pile mechanical performance test for a nuclear fuel assembly is an essential task to assure the test quality and to obtain permission for fuel loading into a commercial reactor core. For the case of a vibration test, proper management and appropriate calibration of the instruments and devices used in the test, and various efforts to minimize possible errors during the test and the signal acquisition process, are needed. Additionally, a deep understanding both of the theoretical assumptions and simplifications behind the signal processing/modal analysis and of the functions of the devices used in the test is required. In this study, the overall procedure and results of the lateral vibration test for a fuel assembly's mechanical characterization are briefly introduced. A series of measures to assure and improve the reliability of the vibration test is discussed

  7. Preliminary study for the reliability Assurance on results and procedure of the out-pile mechanical characterization test for a fuel assembly; Lateral Vibration Test (I)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kang Hee; Yoon, Kyung Hee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)

    2007-07-01

    The reliability assurance with respect to the test procedure and results of the out-pile mechanical performance test for a nuclear fuel assembly is an essential task to assure the test quality and to obtain permission for fuel loading into a commercial reactor core. For the case of a vibration test, proper management and appropriate calibration of the instruments and devices used in the test, and various efforts to minimize possible errors during the test and the signal acquisition process, are needed. Additionally, a deep understanding both of the theoretical assumptions and simplifications behind the signal processing/modal analysis and of the functions of the devices used in the test is required. In this study, the overall procedure and results of the lateral vibration test for a fuel assembly's mechanical characterization are briefly introduced. A series of measures to assure and improve the reliability of the vibration test is discussed.

  8. Soviet-designed pressurized water reactor symptomatic emergency operating instruction analytical procedure: approach, methodology, development and application

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1999-01-01

    A symptom approach to the analytical validation of symptom-based EOPs includes: (1) Identification of critical safety functions to the maintenance of fission product barrier integrity; (2) Identification of the symptoms which manifest an impending challenge to critical safety function maintenance; (3) Development of a symptomatic methodology to delineate bounding plant transient response modes; (4) Specification of bounding scenarios; (5) Development of a systematic calculational approach consistent with the objectives of the methodology; (6) Performance of thermal-hydraulic computer code calculations implementing the analytical methodology; (7) Interpretation of the analytical results on the basis of information available to the operator; (8) Application of the results to the validation of the proposed operator actions; (9) Production of a technical basis document justifying the proposed operator actions. (author)

  9. EML procedures manual

    International Nuclear Information System (INIS)

    Volchok, H.L.; de Planque, G.

    1982-01-01

    This manual contains the procedures that are used currently by the Environmental Measurements Laboratory of the US Department of Energy. In addition a number of analytical methods from other laboratories have been included. These were tested for reliability at the Battelle, Pacific Northwest Laboratory under contract with the Division of Biomedical and Environmental Research of the AEC. These methods are clearly distinguished. The manual is prepared in loose leaf form to facilitate revision of the procedures and inclusion of additional procedures or data sheets. Anyone receiving the manual through EML should receive this additional material automatically. The contents are as follows: (1) general; (2) sampling; (3) field measurements; (4) general analytical chemistry; (5) chemical procedures; (6) data section; (7) specifications

  10. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. In this context, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  11. Reliability Based Optimization of Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    1987-01-01

    The optimization problem of designing structural systems such that the reliability is satisfactory during the whole lifetime of the structure is considered in this paper. Some of the quantities modelling the loads and the strength of the structure are modelled as random variables. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem of minimizing a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. For these optimization problems it is described how a sensitivity analysis can be performed. Next, new optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability based optimization problem sequentially using quasi-analytical derivatives. Finally…

  12. Determination of lycopene in food by on-line SFE-LC eliminating its degradation during the analytical procedure

    Czech Academy of Sciences Publication Activity Database

    Pól, Jaroslav; Hyötyläinen, T.; Ranta-Aho, O.; Riekkola, M. L.

    2005-01-01

    Vol. 99, S (2005), p. 251. ISSN 0009-2770. [Meeting on Chemistry and Life /3./. 20.09.2005-22.09.2005, Brno] R&D Projects: GA AV ČR KJB4031405 Keywords: liquid chromatography * supercritical fluid extraction * antioxidant Subject RIV: CB - Analytical Chemistry, Separation

  13. Analytical procedure for characterization of medieval wall-paintings by X-ray fluorescence spectrometry, laser ablation inductively coupled plasma mass spectrometry and Raman spectroscopy

    International Nuclear Information System (INIS)

    Syta, Olga; Rozum, Karol; Choińska, Marta; Zielińska, Dobrochna; Żukowska, Grażyna Zofia; Kijowska, Agnieszka; Wagner, Barbara

    2014-01-01

    An analytical procedure for the comprehensive chemical characterization of samples from medieval Nubian wall-paintings by means of portable X-ray fluorescence (pXRF), laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) and Raman spectroscopy (RS) was proposed in this work. The procedure was used for elemental and molecular investigations of samples from archeological excavations in Nubia (modern southern Egypt and northern Sudan). Numerous remains of churches with painted decorations dating back to the 7th–14th centuries were excavated in the region of the medieval kingdoms of Nubia, but many aspects of this art and its technology are still unknown. Samples from the selected archeological sites (Faras, Old Dongola and Banganarti) were analyzed in the form of transfers (n = 26), small fragments collected during the excavations (n = 35) and cross sections (n = 15). XRF was used to collect data about elemental composition, LA-ICPMS allowed mapping of selected elements, while RS was used to get the molecular information about the samples. The preliminary results indicated the usefulness of the proposed analytical procedure for distinguishing the substances from both the surface and sub-surface domains of the wall-paintings. The possibility of identifying raw materials from the wall-paintings will be used in further systematic archeometric studies devoted to the detailed comparison of various historic Nubian centers. - Highlights: • The analytical procedure for examination of unique wall paintings was proposed. • Identification of pigments and supporting layers of wall-paintings was obtained. • Heterogeneous samples were mapped with the use of LA-ICPMS. • Anatase in the sub-surface regions of samples was detected by Raman spectroscopy

  14. Analytical procedure for characterization of medieval wall-paintings by X-ray fluorescence spectrometry, laser ablation inductively coupled plasma mass spectrometry and Raman spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Syta, Olga; Rozum, Karol; Choińska, Marta [Faculty of Chemistry, University of Warsaw, Pasteura 1, 02-093 Warsaw (Poland); Zielińska, Dobrochna [Institute of Archaeology, University of Warsaw, Krakowskie Przedmieście 26/28, 00-927 Warsaw (Poland); Żukowska, Grażyna Zofia [Chemical Faculty, Warsaw University of Technology, Noakowskiego 3, 00-664 Warsaw (Poland); Kijowska, Agnieszka [National Museum in Warsaw, Aleje Jerozolimskie 3, 00-495 Warsaw (Poland); Wagner, Barbara, E-mail: barbog@chem.uw.edu.pl [Faculty of Chemistry, University of Warsaw, Pasteura 1, 02-093 Warsaw (Poland)

    2014-11-01

    An analytical procedure for the comprehensive chemical characterization of samples from medieval Nubian wall-paintings by means of portable X-ray fluorescence (pXRF), laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) and Raman spectroscopy (RS) was proposed in this work. The procedure was used for elemental and molecular investigations of samples from archeological excavations in Nubia (modern southern Egypt and northern Sudan). Numerous remains of churches with painted decorations dating back to the 7th–14th centuries were excavated in the region of the medieval kingdoms of Nubia, but many aspects of this art and its technology are still unknown. Samples from the selected archeological sites (Faras, Old Dongola and Banganarti) were analyzed in the form of transfers (n = 26), small fragments collected during the excavations (n = 35) and cross sections (n = 15). XRF was used to collect data about elemental composition, LA-ICPMS allowed mapping of selected elements, while RS was used to get the molecular information about the samples. The preliminary results indicated the usefulness of the proposed analytical procedure for distinguishing the substances from both the surface and sub-surface domains of the wall-paintings. The possibility of identifying raw materials from the wall-paintings will be used in further systematic archeometric studies devoted to the detailed comparison of various historic Nubian centers. - Highlights: • The analytical procedure for examination of unique wall paintings was proposed. • Identification of pigments and supporting layers of wall-paintings was obtained. • Heterogeneous samples were mapped with the use of LA-ICPMS. • Anatase in the sub-surface regions of samples was detected by Raman spectroscopy.

  15. Study of systematic errors in the determination of total Hg levels in the range -5% in inorganic and organic matrices with two reliable spectrometrical determination procedures

    International Nuclear Information System (INIS)

    Kaiser, G.; Goetz, D.; Toelg, G.; Max-Planck-Institut fuer Metallforschung, Stuttgart; Knapp, G.; Maichin, B.; Spitzy, H.

    1978-01-01

    In the determination of Hg at ng/g and pg/g levels, systematic errors are due to faults in the analytical methods such as intake, preparation and decomposition of a sample. The sources of these errors have been studied both with 203Hg radiotracer techniques and with two multi-stage procedures developed for the determination of trace levels. The emission spectrometric (OES-MIP) procedure includes incineration of the sample in a microwave induced oxygen plasma (MIP), isolation and enrichment on a gold absorbent, and excitation in an argon plasma (MIP). The emitted Hg radiation (253.7 nm) is evaluated photometrically with a semiconductor element. The detection limit of the OES-MIP procedure was found to be 0.01 ng, the coefficient of variation 5% for 1 ng Hg. The second procedure combines a semi-automated wet digestion method (HClO3/HNO3) with a reduction-aeration step (ascorbic acid/SnCl2) and the flameless atomic absorption technique (253.7 nm). The detection limit of this procedure was found to be 0.5 ng, the coefficient of variation 5% for 5 ng Hg. (orig.) [de

  16. Assessing the reliability of the borderline regression method as a standard setting procedure for objective structured clinical examination

    Directory of Open Access Journals (Sweden)

    Sara Mortaz Hejri

    2013-01-01

    Background: One of the methods used for standard setting is the borderline regression method (BRM). This study aims to assess the reliability of the BRM when the pass-fail standard in an objective structured clinical examination (OSCE) is calculated by averaging the BRM standards obtained for each station separately. Materials and Methods: In nine stations of the OSCE with direct observation, the examiners gave each student a checklist score and a global score. Using a linear regression model for each station, we calculated the checklist score cut-off from the regression equation at the global scale cut-off set at 2. The OSCE pass-fail standard was defined as the average of all stations' standards. To determine the reliability, the root mean square error (RMSE) was calculated. The R2 coefficient and the inter-grade discrimination were calculated to assess the quality of the OSCE. Results: The mean total test score was 60.78. The OSCE pass-fail standard and its RMSE were 47.37 and 0.55, respectively. The R2 coefficients ranged from 0.44 to 0.79. The inter-grade discrimination score varied greatly among stations. Conclusion: The RMSE of the standard was very small, indicating that the BRM is a reliable method of setting the standard for an OSCE, which has the advantage of providing data for quality assurance.
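
    A short sketch of the per-station borderline regression computation described above: the checklist scores are regressed on the global ratings, the cut-off is read off at the borderline global grade (2 on the global scale), and the exam standard is the average across stations. Variable names and data structures are illustrative.

        import numpy as np

        def station_cutoff(checklist_scores, global_ratings, borderline_grade=2.0):
            """Borderline regression cut-off for a single OSCE station."""
            slope, intercept = np.polyfit(global_ratings, checklist_scores, 1)
            return slope * borderline_grade + intercept

        def osce_standard(stations):
            """Exam pass-fail standard: average of the per-station cut-offs.

            stations : list of (checklist_scores, global_ratings) pairs
            """
            cutoffs = [station_cutoff(cl, gl) for cl, gl in stations]
            return float(np.mean(cutoffs))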

  17. A competency based selection procedure for Dutch postgraduate GP training: a pilot study on validity and reliability

    NARCIS (Netherlands)

    Vermeulen, M.I.; Tromp, F.; Zuithoff, N.P.; Pieters, R.H.; Damoiseaux, R.A.; Kuyvenhoven, M.M.

    2014-01-01

    Background: Historically, semi-structured interviews (SSI) have been the core of the Dutch selection for postgraduate general practice (GP) training. This paper describes a pilot study on a newly designed competency-based selection procedure that assesses whether candidates have the

  18. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  19. Determination of Total Lipids as Fatty Acid Methyl Esters (FAME) by in situ Transesterification: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Van Wychen, Stefanie; Ramirez, Kelsey; Laurens, Lieve M. L.

    2016-01-13

    This procedure is based on a whole biomass transesterification of lipids to fatty acid methyl esters to represent an accurate reflection of the potential of microalgal biofuels. Lipids are present in many forms and play various roles within an algal cell, from cell membrane phospholipids to energy stored as triacylglycerols.

  20. Proposal of a new analytical procedure for the measurement of water absorption by stone. Preliminary study for an alternative to the Italian technical normative NORMAL 07-81

    Directory of Open Access Journals (Sweden)

    Plattner Susanne

    2012-06-01

    Background: Italian technical standards in the field of cultural heritage are often considered insufficient or unsuitable in practice; efforts are therefore necessary to design new standards and/or improve existing ones. Results: In this paper an alternative analytical procedure for the determination of water absorption (by full immersion) by stone material, described in the NORMAL 07-81 document, is proposed. Improvements concern the method's accuracy and a reduction of sample size; density data are also obtained. Conclusions: The new procedure was applied to three different marble samples and the outcomes are encouraging, but further testing is under way to better understand to what extent the sample size can be reduced without worsening the accuracy of the results, taking into account that stone is a very heterogeneous material.

  1. Trace element partitioning between plagioclase and melt: An investigation of the impact of experimental and analytical procedures

    Science.gov (United States)

    Nielsen, Roger L.; Ustunisik, Gokce; Weinsteiger, Allison B.; Tepley, Frank J.; Johnston, A. Dana; Kent, Adam J. R.

    2017-09-01

    Quantitative models of petrologic processes require accurate partition coefficients. Our ability to obtain accurate partition coefficients is constrained by their dependence on pressure, temperature and composition, and on the experimental and analytical techniques we apply. The source and magnitude of error in experimental studies of trace element partitioning may go unrecognized if one examines only the processed published data. The most important sources of error are relict crystals and analyses of more than one phase within the analytical volume. Because we have typically published averaged data, identification of compromised data is difficult if not impossible. We addressed this problem by examining unprocessed data from plagioclase/melt partitioning experiments, by comparing models based on those data with existing partitioning models, and by evaluating the degree to which the partitioning models are dependent on the calibration data. We found that partitioning models are dependent on the calibration data in ways that result in erroneous model values, and that the error will be systematic and dependent on the value of the partition coefficient. In effect, use of different calibration datasets will result in partitioning models whose results are systematically biased, and one can arrive at different and conflicting conclusions depending on how a model is calibrated, defeating the purpose of applying the models. Ultimately this is an experimental data problem, which can be solved if we publish individual analyses (not averages) or use a projection method wherein an independent compositional constraint is used to identify and estimate the uncontaminated composition of each phase.

  2. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  3. Control Chart on Semi Analytical Weighting

    Science.gov (United States)

    Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Semi-analytical balance verification intends to assess the balance performance using graphs that illustrate measurement dispersion through time, and to demonstrate that measurements were performed in a reliable manner. This study presents the internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, 2 weight standards were monitored before any balance operation. This work intended to evaluate whether any significant difference or bias was present in the weighing procedure over time, in order to check the reliability of the generated data. This work also exemplifies how control intervals are established.
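
    A minimal sketch of the kind of Shewhart-style control limits such a balance check can use; the 3-sigma rule and the check-weight readings below are assumptions for illustration, not limits or data reported in the record.

        import numpy as np

        def control_limits(reference_readings):
            """Centre line and 3-sigma control limits from a reference period."""
            x = np.asarray(reference_readings, dtype=float)
            centre = x.mean()
            sigma = x.std(ddof=1)
            return centre, centre - 3.0 * sigma, centre + 3.0 * sigma

        def out_of_control(new_readings, limits):
            """Indices of check-weight readings falling outside the control limits."""
            centre, lcl, ucl = limits
            x = np.asarray(new_readings, dtype=float)
            return np.where((x < lcl) | (x > ucl))[0]

        # Hypothetical daily readings (g) of a 100 g check weight:
        limits = control_limits([100.0002, 100.0001, 99.9999, 100.0003, 100.0000])
        flagged = out_of_control([100.0002, 99.9990, 100.0001], limits)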

  4. Analytical solution to the 1D Lemaitre's isotropic damage model and plane stress projected implicit integration procedure

    DEFF Research Database (Denmark)

    Andriollo, Tito; Thorborg, Jesper; Hattel, Jesper Henri

    2016-01-01

    obtaining an integral relationship between total strain and effective stress. By means of the generalized binomial theorem, an expression in terms of infinite series is subsequently derived. The solution is found to simplify considerably existing techniques for material parameters identification based on optimization, as all issues associated with classical numerical solution procedures of the constitutive equations are eliminated. In addition, an implicit implementation of the plane stress projected version of Lemaitre's model is discussed, showing that the resulting algebraic system can be reduced…

  5. Analytical Procedure Development to Determine Polycyclic Aromatic Compounds in the PM2.5-PM10 Fraction of Atmospheric Aerosols

    International Nuclear Information System (INIS)

    Barrado, A. I.; Garcia, S.; Perez, R. M.

    2013-01-01

    This paper presents an optimized and validated analytical methodology for the determination of various polycyclic aromatic compounds in ambient air using liquid chromatography with fluorescence detection. The method was applied to samples obtained over more than one year in an area of Madrid. The selected compounds include thirteen polycyclic aromatic hydrocarbons considered priority pollutants by the EPA, as well as hydroxylated derivatives, which have been less investigated in air samples by liquid chromatography with fluorescence detection. We characterized and compared the concentration ranges of the identified compounds and studied their seasonal and monthly variations. In addition, multivariate correlation, factor analysis and cluster analysis techniques were applied to extract as much information as possible for the interpretation and a more complete and accurate characterization of the results and their relationship with meteorological and physicochemical parameters. (Author)

  6. Development of an analytical procedure for plutonium in the concentration range of femtogram/gram and its application to environmental samples

    International Nuclear Information System (INIS)

    Schuettelkopf, H.

    1981-09-01

    To study the behaviour of plutonium in the environment and to measure plutonium in the vicinity of nuclear facilities, a quick, sensitive analytical method is required which can be applied to all sample materials found in the environment. For a sediment contaminated with plutonium, a boiling-out method using first HNO3/HF and subsequently HNO3/Al(NO3)3 was found to be successful. The leaching solution was then extracted by TOPO and the plutonium back-extracted by ascorbic acid/HCl. Several purification steps and finally electroplating using ammonium oxalate led to an optimum sample for the α-spectroscopic determination of plutonium. An analytical method was worked out for plutonium which can be applied to all materials found in the environment. The sample size is 100 g but it might also be much greater. The average chemical yield is between 70 and 80%. The detection limit for soil samples is 0.1 fCi/g and for plant samples 0.5 fCi/g. One technician can perform eight analyses per working day. The analytical procedure was applied to a large number of environmental samples and the results of these analyses are indicated. (orig./RB) [de

  7. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    Science.gov (United States)

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite, an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.

  8. Analytical procedures for identifying anthocyanins in natural extracts; Procedimentos analiticos para identificacao de antocianinas presentes em extratos naturais

    Energy Technology Data Exchange (ETDEWEB)

    Marco, Paulo Henrique; Poppi, Ronei Jesus [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Inst. de Quimica]. E-mail: ronei@iqm.unicamp.br; Scarminio, Ieda Spacino [Universidade Estadual de Londrina (UEL), PR (Brazil). Dept. de Quimica

    2008-07-01

    Anthocyanins are among the most important plant pigments. Due to their potential benefits for human health, there is considerable interest in these natural pigments. Nonetheless, there is great difficulty in finding a technique that could provide the identification of structurally similar compounds and estimate the number and concentration of the species present. A lot of techniques have been tried to find the best methodology to extract information from these systems. In this paper, a review of the most important procedures is given, from the extraction to the identification of anthocyanins in natural extracts. (author)

  9. Improved Efficiency and Reliability of NGS Amplicon Sequencing Data Analysis for Genetic Diagnostic Procedures Using AGSA Software

    Directory of Open Access Journals (Sweden)

    Axel Poulet

    2016-01-01

    Screening for BRCA mutations in women with a familial risk of breast or ovarian cancer is an ideal situation for high-throughput sequencing, providing large amounts of low-cost data. However, the 454 (Roche) and Ion Torrent (Thermo Fisher) technologies produce homopolymer-associated indel errors, complicating their use in routine diagnostics. We developed software, named AGSA, which helps to detect false positive mutations in homopolymeric sequences. Seventy-two familial breast cancer cases were analysed in parallel by amplicon 454 pyrosequencing and Sanger dideoxy sequencing for genetic variations of the BRCA genes. All 565 variants detected by dideoxy sequencing were also detected by pyrosequencing. Furthermore, pyrosequencing detected 42 variants that were missed with the Sanger technique. Six amplicons contained homopolymer tracts in the coding sequence that were systematically misread by the software supplied by Roche. Read data plotted as histograms by the AGSA software aided the analysis considerably and allowed validation of the majority of homopolymers. As an optimisation, an additional 250 patients were analysed using microfluidic amplification of regions of interest (Access Array, Fluidigm) of the BRCA genes, followed by 454 sequencing and AGSA analysis. AGSA complements a complete line of high-throughput diagnostic sequence analysis, reducing time and costs while increasing reliability, notably for homopolymer tracts.

  10. Machinery safety of lathe machine using SHARP-systemic human action reliability procedure: a pilot case study in academic laboratory

    Science.gov (United States)

    Suryoputro, M. R.; Sari, A. D.; Sugarindra, M.; Arifin, R.

    2017-12-01

    This research aimed to understand human reliability analysis, to apply the SHARP method to a case study of a lathe machine, and to identify improvements that could be made to the existing safety system. SHARP comprises 7 stages: definition, screening, breakdown, representation, impact assessment, quantification and documentation. These steps were combined and analysed using HIRA, FTA and FMEA. The HIRA analysis of the lathe in the academic laboratory showed the highest risk level, with a score of 9, for activities involving the power transmission parts, and a score of 6 for activities involving the moving parts, meaning that action is required to reduce the level of risk. Accordingly, the highest RPN value, 18, was obtained for the power transmission activities, followed by 12 for the moving-parts activities and 8 for the point-of-operation activities; the power transmission activities therefore carry the highest risk of workplace accidents during operation. In the academic laboratory, the improvement was made first through engineering controls (machine guarding) and completed with the necessary administrative controls (SOP, work permit, training and routine cleaning) and dedicated PPE.

  11. Urban transportation energy conservation: analytic procedures for estimating changes in travel demand and fuel consumption. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Atherton, T.J.; Suhrbier, J.H.

    1979-10-01

    This series of reports provides metropolitan planning organizations with analytical tools that can be used to evaluate the effectiveness of alternative transportation policies in achieving reductions in overall fuel consumption. To ensure a high measure of accuracy, the analysis goes beyond the first-order effects, i.e., the shift from single-occupant autos as the mode chosen for the work trip to more fuel-efficient means of travel. Questions treated include: What will happen with the autos left at home as a result of increased carpooling for work trips? Will certain policies, such as gasoline price increases, directly affect non-work tripmaking? Will a particular transportation policy affect all segments of the population, or will certain groups be affected significantly more than others? The methodology developed links together several disaggregate travel demand models to predict auto ownership, work trip mode choice, and non-work travel demands. This report introduces the theoretical basis for the travel demand models used, describes these models and their linkages both with each other and with the various submodels, and documents the assumptions made in developing the model system and using it to forecast responses to alternative transportation policies. Emphasis is placed on the conceptual framework of the model system and the specification of the individual models and submodels.

  12. Combined analytical-numerical procedure to solve multigroup spherical harmonics equations in two-dimensional r-z geometry

    International Nuclear Information System (INIS)

    Matausek, M.V.; Milosevic, M.

    1986-01-01

    In the present paper a generalization is performed of a procedure to solve the multigroup spherical harmonics equations, which was originally proposed and developed for one-dimensional systems in cylindrical or spherical geometry and later extended to a special case of a two-dimensional system in r-z geometry. Expressions are derived for the axial and the radial dependence of the group values of the neutron flux moments, in the P-3 approximation of the spherical harmonics method, in a cylindrically symmetrical system with an arbitrary number of material regions in both the r- and z-directions. In the special case of an axially homogeneous system, these expressions reduce to the relations derived previously. (author)
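    As a generic reminder of the notation only (the paper's specific separable radial and axial expressions are not reproduced here), the spherical harmonics method expands the group angular flux in spherical harmonics of the direction variable, truncated at l = 3 in the P-3 approximation:

```latex
\phi_g(r, z, \boldsymbol{\Omega}) \;\approx\; \sum_{l=0}^{3} \sum_{m=-l}^{l}
\phi_{g,l}^{\,m}(r, z)\, Y_l^{m}(\boldsymbol{\Omega}),
\qquad
\phi_{g,l}^{\,m}(r, z) \;=\; \int_{4\pi} \phi_g(r, z, \boldsymbol{\Omega})\,
Y_l^{m\,*}(\boldsymbol{\Omega})\, \mathrm{d}\Omega .
```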

  13. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. Reliability concepts are rarely used explicitly in the design of structures, but the problems of structural engineering are better understood through them. The main methods for estimating the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo simulation is used in this paper because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer-aided simulations of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
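    The link between the two quantities mentioned above is p_f = Φ(−β). The sketch below is a generic illustration with a hypothetical linear limit-state function, not the bridge-pier model of the paper; it estimates the probability of failure by crude Monte Carlo simulation and converts it to a reliability index.

```python
import random
from statistics import NormalDist

def monte_carlo_pf(n_samples=200_000, seed=1):
    """Estimate P(failure) for a toy limit state g = R - S with normal R and S."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        resistance = rng.gauss(mu=30.0, sigma=3.0)   # hypothetical capacity
        load = rng.gauss(mu=20.0, sigma=4.0)         # hypothetical demand
        if resistance - load <= 0.0:                 # limit state violated
            failures += 1
    return failures / n_samples

pf = monte_carlo_pf()
beta = -NormalDist().inv_cdf(pf)   # reliability index: pf = Phi(-beta)
print(f"estimated pf = {pf:.2e}, reliability index beta = {beta:.2f}")
```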

  14. Optimization of organic contaminant and toxicity testing analytical procedures for estimating the characteristics and environmental significance of natural gas processing plant waste sludges

    International Nuclear Information System (INIS)

    Novak, N.

    1990-10-01

    The Gas Plant Sludge Characterization Phase IIB program is a continuation of the Canadian Petroleum Association's (CPA) initiatives to characterize sludge generated at gas processing plants. The objectives of the Phase IIB project were to develop an effective procedure for screening waste sludges or centrifugates/leachates generated from sludge samples for volatile, solvent-soluble and water-soluble organics; verify the reproducibility of the three aquatic toxicity tests recommended as the battery of tests for determining the environmental significance of sludge centrifugates or leachates; assess the performance of two terrestrial toxicity tests in determining the environmental significance of whole sludge samples applied to soil; and assess and discuss the reproducibility and cost-effectiveness of the sampling and analytical techniques proposed for the overall sludge characterization procedure. Conclusions and recommendations are provided for sludge collection, preparation and distribution, organic analyses, toxicity testing, project management, and procedure standardization. The three aquatic and two terrestrial toxicity tests proved effective in indicating the toxicity of complex mixtures. 27 refs., 3 figs., 59 tabs

  15. Theoretical, analytical, and statistical interpretation of environmental data

    International Nuclear Information System (INIS)

    Lombard, S.M.

    1974-01-01

    The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)
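    As a minimal illustration of the point that counting statistics are only one contribution to the total error, the sketch below (hypothetical numbers, not data from the report) combines the relative uncertainties of net counting, chemical yield and detector efficiency in quadrature to obtain the combined relative uncertainty of a plutonium activity result.

```python
import math

def combined_relative_uncertainty(*relative_uncertainties):
    """First-order error propagation for a product/quotient model: add in quadrature."""
    return math.sqrt(sum(u ** 2 for u in relative_uncertainties))

# Hypothetical contributions for a Pu-in-soil analysis (fractions, not percent):
u_counting   = 0.05   # net-count statistics (gross and background counts)
u_yield      = 0.03   # chemical recovery determined by a tracer
u_efficiency = 0.02   # detector efficiency calibration

u_total = combined_relative_uncertainty(u_counting, u_yield, u_efficiency)
print(f"combined relative uncertainty: {u_total * 100:.1f} %")
```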

  16. Understanding the contamination of food with mineral oil: the need for a confirmatory analytical and procedural approach.

    Science.gov (United States)

    Spack, Lionel W; Leszczyk, Gabriela; Varela, Jesus; Simian, Hervé; Gude, Thomas; Stadler, Richard H

    2017-06-01

    The contamination of food by mineral oil hydrocarbons (MOHs) found in packaging is a long-running concern. A main source of MOHs in foods is the migration of mineral oil from recycled board into the packed food products. Consequently, the majority of food manufacturers have taken protective measures, e.g., by using virgin board instead of recycled fibres and, where feasible, introducing functional barriers to mitigate migration. Despite these protective measures, MOHs may still be observed in low amounts in certain food products, albeit due to different entry points across the food supply chain. In this study, we successfully apply gas chromatography coupled to mass spectrometry (GC-MS) to demonstrate, through marker compounds and the profile of the hydrocarbon response, the possible source of contamination using mainly chocolate and cereals as food matrices. The conventional liquid chromatography-one-dimensional GC coupled to a flame ionisation detector (LC-GC-FID) is a useful screening method, but in cases of positive samples it must be complemented by a confirmatory method such as, for example, GC-MS, allowing a verification of mineral oil contamination. The procedural approach proposed in this study entails profile analysis, marker identification, and interpretation and final quantification.

  17. Elemental analysis for the determination of soil-to-plant transfer factors

    International Nuclear Information System (INIS)

    Liese, T.

    1985-02-01

    This article describes part of the conventional analytical work that was done to determine soil-to-plant transfer factors. The analytical methods, the experiments to find the best way of sample digestion and the resulting analytical procedures are described. The analytical methods are graphite furnace atomic absorption spectrometry (GFAAS) and inductively coupled plasma atomic emission spectrometry (ICP-AES). For ICP-AES, the necessity of proper background correction and correction of spectral interferences is shown. The reliability of the analytical procedure is demonstrated by measuring different kinds of standard reference materials and by comparison of AAS and AES. (orig./HP)

  18. Centrifugation: an important pre-analytic procedure that influences plasma microRNA quantification during blood processing.

    Science.gov (United States)

    Zheng, Xiao-Hui; Cui, Cui; Zhou, Xin-Xi; Zeng, Yi-Xin; Jia, Wei-Hua

    2013-12-01

    Circulating microRNAs are robustly present in plasma or serum and have become a research focus as biomarkers for tumor diagnosis and prognosis. Centrifugation is a necessary procedure for obtaining high-quality blood supernatant. Herein, we investigated one-step and two-step centrifugations, two centrifugal methods routinely used in microRNA study, to explore their effects on plasma microRNA quantification. The microRNAs obtained from one-step and two-step centrifugations were quantified by microarray and TaqMan-based real-time quantitative polymerase chain reaction (Q-PCR). Dynamic light scattering was performed to explore the difference underlying the two centrifugal methods. The results from the microarray containing 1,347 microRNAs showed that the signal detection rate was greatly decreased in the plasma sample prepared by two-step centrifugation. More importantly, the microRNAs missing in this plasma sample could be recovered and detected in the precipitate generated from the second centrifugation. Consistent with the results from microarray, a marked decrease of three representative microRNAs in two-step centrifugal plasma was validated by Q-PCR. According to the size distribution of all nanoparticles in plasma, there were fewer nanoparticles with size >1,000 nm in two-step centrifugal plasma. Our experiments directly demonstrated that different centrifugation methods produced distinct quantities of plasma microRNAs. Thus, exosomes or protein complexes containing microRNAs may be involved in large nanoparticle formation and may be precipitated after two-step centrifugation. Our results remind us that sample processing methods should be first considered in conducting research.

  19. Near-critical carbon dioxide extraction and liquid chromatography determination of UV filters in solid cosmetic samples: a green analytical procedure.

    Science.gov (United States)

    Salvador, Amparo; Chisvert, Alberto; Jaime, Maria-Angeles

    2005-11-01

    Near-critical carbon dioxide extraction of four UV filters used as sunscreens in lipsticks and makeup formulations is reported. Extraction parameters were optimized. Efficient recoveries were obtained after 15 min of dynamic extraction with an 80:20 CO2/ethanol mixture at 300 atm and 54 degrees C, using a 1.8 mL/min flow rate. Extracts were collected in ethanol, and appropriately diluted with ethanol and 1% acetic acid to obtain a 70:30 v/v ethanol/1% acetic acid solution. The four UV filters were determined by LC with gradient elution using ethanol/1% acetic acid as the mobile phase. The accuracy of the analytical procedure was estimated by comparing the results with those obtained by methods based on classical extraction. The proposed method requires only the use of CO2, ethanol and acetic acid, avoiding the use of more toxic organic solvents; thus it could be considered both operator and environment friendly.

  20. Comparison of different cleanup procedures for oil crops based on the development of a trace analytical method for the determination of pyraclostrobin and epoxiconazole.

    Science.gov (United States)

    Pan, Xinglu; Dong, Fengshou; Xu, Jun; Liu, Xingang; Cheng, Youpu; Chen, Zenglong; Liu, Na; Chen, Xixi; Tao, Yan; Zheng, Yongquan

    2014-12-01

    The effects of different cleanup procedures in removing high-molecular-mass lipids and natural colorants from oil-crop extracts, including dispersive solid-phase extraction, low-temperature precipitation and gel permeation chromatography, were studied. The pigment removal, lipid quantity, and matrix effects of the three cleanup methods were evaluated. Results indicated that gel permeation chromatography was the most effective of the three methods, outperforming dispersive solid-phase extraction and low-temperature precipitation. Pyraclostrobin and epoxiconazole, which are applied extensively in oil-crop production, were selected as representative pesticides, and a trace analytical method was developed based on gel permeation chromatography and ultra-high-performance liquid chromatography with tandem mass spectrometry. Average recoveries of the target pesticides at three levels (10, 50, and 100 μg/kg) were in the range of 74.7-96.8% with relative standard deviation values below 9.2%. The limits of detection did not exceed 0.46 μg/kg, whereas the limits of quantification were below 1.54 μg/kg, much lower than the maximum residue limits in all matrices. This study may provide essential data for optimizing the analytical method for pesticides in oil-crop samples. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
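    The recovery and precision figures quoted above are computed in the usual way; a small sketch with made-up replicate data (not the study's raw results) is shown below.

```python
from statistics import mean, stdev

def recovery_stats(measured, spiked):
    """Mean recovery (%) and relative standard deviation (%) for replicate spikes."""
    recoveries = [100.0 * m / spiked for m in measured]
    return mean(recoveries), 100.0 * stdev(recoveries) / mean(recoveries)

# Hypothetical replicate results for a 50 ug/kg spike of pyraclostrobin:
measured = [44.1, 46.8, 45.2, 47.5, 43.9]   # ug/kg found
avg_rec, rsd = recovery_stats(measured, spiked=50.0)
print(f"mean recovery = {avg_rec:.1f} %, RSD = {rsd:.1f} %")
```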

  1. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Guide to data processing and revision: Part 2, Human error probability data entry and revision procedures

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gertman, D.I.; Gilbert, B.G.; Reece, W.J.

    1988-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability (HEP) and hardware component failure data (HCFD). The NUCLARR system software resides on an IBM (or compatible) personal micro-computer. Users can perform data base searches to furnish HEP estimates and HCFD rates. In this manner, the NUCLARR system can be used to support a variety of risk assessment activities. This volume, Volume 3 of a 5-volume series, presents the procedures used to process HEP and HCFD for entry in NUCLARR and describes how to modify the existing NUCLARR taxonomy in order to add either equipment types or action verbs. Volume 3 also specifies the various roles of the administrative staff on assignment to the NUCLARR Clearinghouse who are tasked with maintaining the data base, dealing with user requests, and processing NUCLARR data. 5 refs., 34 figs., 3 tabs

  2. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Guide to data processing and revision: Part 3, Hardware component failure data entry and revision procedures

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gertman, D.I.; Gilbert, B.G.; Reece, W.J.

    1988-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability (HEP) and hardware component failure data (HCFD). The NUCLARR system software resides on an IBM (or compatible) personal micro-computer. Users can perform data base searches to furnish HEP estimates and HCFD rates. In this manner, the NUCLARR system can be used to support a variety of risk assessment activities. This volume, Volume 3 of a 5-volume series, presents the procedures used to process HEP and HCFD for entry in NUCLARR and describes how to modify the existing NUCLARR taxonomy in order to add either equipment types or action verbs. Volume 3 also specifies the various roles of the administrative staff on assignment to the NUCLARR Clearinghouse who are tasked with maintaining the data base, dealing with user requests, and processing NUCLARR data

  3. Evaluation of the quality of results obtained in institutions participating in interlaboratory experiments and of the reliability characteristics of the analytical methods used on the basis of certification of standard soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Parshin, A.K.; Obol'yaninova, V.G.; Sul'dina, N.P.

    1986-08-20

    Rapid monitoring of the level of pollution of the environment and, especially, of soils necessitates preparation of standard samples (SS) close in properties and material composition to the objects to be analyzed. During 1978-1982 four sets (three types of samples in each) of State Standard Samples of different soils were developed: soddy-podzolic sandy-loamy, typical chernozem, krasnozem, and calcareous sierozem. The certification studies of the SS of the soils were carried out in accordance with the classical scheme of interlab experiment (ILE). More than 100 institutions were involved in the ILE and the total number of independent analytical results was of the order of 10/sup 4/. With such a volume of analytical information at their disposal they were able to find some general characteristics intrinsic to certification studies, to assess the quality of work of the ILE participants with due regard for their specialization, and the reliability characteristics of the analytical methods used.

   4. Methodological procedures and analytical instruments to evaluate an integrated archive of indicators for urban management (Guida metodologica per la costruzione di un archivio integrato di indicatori urbani)

    Energy Technology Data Exchange (ETDEWEB)

    Del Ciello, R; Napoleoni, S [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-07-01

    This guide provides the results of research carried out at the ENEA (National Agency for New Technology, Energy and the Environment) Casaccia centre (Rome, Italy) aimed at defining the methodological procedures and analytical instruments needed to build an integrated archive of indicators for urban management. The guide also defines the scheme of a negotiation process aimed at reaching and exchanging data and information among governmental and local administrations, non-governmental organizations and scientific bodies. [Translated from the Italian abstract] The work presents a synthesis of the results of research conducted at the ENEA Casaccia research centre on the definition of methodological procedures and analysis and processing tools for building an integrated archive of indicators for the management of urban systems. The guide, addressed to those responsible for urban policies, defines a scheme for sharing urban indicators through the organization of appropriate negotiation tables composed of representatives of local administrations, central government, productive and social categories, and the technical structures operating in the territory.

  6. HASL procedures manual

    International Nuclear Information System (INIS)

    Harley, J.H.

    1977-08-01

    Additions and corrections to the following sections of the HASL Procedures Manual are provided: General, Sampling, Field Measurements; General Analytical Chemistry, Chemical Procedures, Data Section, and Specifications

  7. "In situ" extraction of essential oils by use of Dean-Stark glassware and a Vigreux column inside a microwave oven: a procedure for teaching green analytical chemistry.

    Science.gov (United States)

    Chemat, Farid; Perino-Issartier, Sandrine; Petitcolas, Emmanuel; Fernandez, Xavier

    2012-08-01

    One of the principal objectives of sustainable and green processing development remains the dissemination and teaching of green chemistry in colleges, high schools, and academic laboratories. This paper describes simple glassware that illustrates the phenomenon of extraction in a conventional microwave oven as energy source and a process for green analytical chemistry. Simple glassware comprising a Dean-Stark apparatus (for extraction of aromatic plant material and recovery of essential oils and distilled water) and a Vigreux column (as an air-cooled condenser inside the microwave oven) was designed as an in-situ extraction vessel inside a microwave oven. The efficiency of this experiment was validated for extraction of essential oils from 30 g fresh orange peel, a by-product in the production of orange juice. Every laboratory throughout the world can use this equipment. The microwave power is 100 W and the irradiation time 15 min. The method is performed at atmospheric pressure without added solvent or water and furnishes essential oils similar to those obtained by conventional hydro or steam distillation. By use of GC-MS, 22 compounds in orange peel were separated and identified; the main compounds were limonene (72.1%), β-pinene (8.4%), and γ-terpinene (6.9%). This procedure is appropriate for the teaching laboratory, does not require any special microwave equipment, and enables the students to learn the skills of extraction, and chromatographic and spectroscopic analysis. They are also exposed to a dramatic visual example of rapid, sustainable, and green extraction of an essential oil, and are introduced to successful sustainable and green analytical chemistry.

  8. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during operation, with the purpose of improving the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed, based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in the analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for the implementation of importance sampling are suggested. (author)
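    To make the variance-reduction point concrete, the sketch below (a generic textbook example, not one of the programs described in the report) estimates a small exceedance probability for a standard normal variable both by crude Monte Carlo and by importance sampling with a shifted normal proposal; the importance-sampling estimate reaches comparable accuracy with far fewer samples.

```python
import math
import random

THRESHOLD = 4.0                       # P(X > 4) for X ~ N(0,1) is about 3.2e-5
rng = random.Random(7)

def crude_mc(n):
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > THRESHOLD)
    return hits / n

def importance_sampling(n, shift=THRESHOLD):
    """Sample from N(shift, 1) and reweight by the density ratio N(0,1)/N(shift,1)."""
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > THRESHOLD:
            total += math.exp(-shift * x + 0.5 * shift * shift)  # likelihood ratio
    return total / n

print(f"crude MC (100000 samples):     {crude_mc(100_000):.2e}")
print(f"importance sampling (10000):   {importance_sampling(10_000):.2e}")
print(f"exact value:                   {1 - 0.5 * (1 + math.erf(THRESHOLD / math.sqrt(2))):.2e}")
```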

  9. Human reliability analysis of Lingao Nuclear Power Station

    International Nuclear Information System (INIS)

    Zhang Li; Huang Shudong; Yang Hong; He Aiwu; Huang Xiangrui; Zheng Tao; Su Shengbing; Xi Haiying

    2001-01-01

    The necessity of human reliability analysis (HRA) for Lingao Nuclear Power Station is discussed, and the method and operating procedures of HRA are briefly described. One of the human factors events (HFE) is analysed in detail and some questions of HRA are discussed. The authors present the analytical results of 61 HFEs and briefly describe the contribution of HRA to Lingao Nuclear Power Station

  10. Efficacy, Reliability, and Safety of Completely Autologous Fibrin Glue in Neurosurgical Procedures: Single-Center Retrospective Large-Number Case Study.

    Science.gov (United States)

    Nakayama, Noriyuki; Yano, Hirohito; Egashira, Yusuke; Enomoto, Yukiko; Ohe, Naoyuki; Kanemura, Nobuhiro; Kitagawa, Junichi; Iwama, Toru

    2018-01-01

    Commercially available fibrin glue (Com-FG), which is used commonly worldwide, is produced with pooled human plasma from multiple donors. However, it contains added bovine aprotinin, which carries risks of infection, allogenic immunity, and allergic reactions. We evaluate the efficacy, reliability, and safety of completely autologous fibrin glue (CAFG). From August 2014 to February 2016, prospective data were collected and analyzed from 153 patients. CAFG was prepared with the CryoSeal System using autologous blood and was applied during neurosurgical procedures. Using CAFG-soaked oxidized regenerated cellulose and/or polyglycolic acid sheets, we performed pinpoint hemostasis, transposed the offending vessels in a microvascular decompression, and covered the dural incision to prevent cerebrospinal fluid leakage. The CryoSeal System generated a mean of 4.51 mL (range, 3.0-8.4 mL) of CAFG from 400 mL of autologous blood. Com-FG products were not used in our procedures. Only 6 patients required an additional allogeneic blood transfusion. The hemostatic effective rate was 96.1% (147 of 153 patients). Only 1 patient who received transsphenoidal surgery for a pituitary adenoma presented with the complication of delayed postoperative cerebrospinal fluid leakage (0.65%). No patient developed allergic reactions or systemic complications associated with the use of CAFG. CAFG effectively provides hemostatic, adhesive, and safety performance. The timing and three-dimensional shape of solidification of the CAFG-soaked oxidized regenerated cellulose and/or polyglycolic acid sheets can be controlled owing to the slow fibrin formation. The cost of preparing CAFG is similar to that of Com-FG products, and it can therefore be easily used at most institutions. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins with the question of what reliability is, covering the origin of reliability problems, the definition of reliability, and the use of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions used in reliability, estimation of MTBF, processes of probability distributions, down time, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
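    As a worked illustration of the basic quantities listed above (generic formulas, not examples taken from the book), the sketch below evaluates the reliability function for an exponential lifetime with constant failure rate, the corresponding MTBF, and a Weibull lifetime model.

```python
import math

def reliability_exponential(t, failure_rate):
    """R(t) = exp(-lambda * t); MTBF = 1 / lambda for a constant failure rate."""
    return math.exp(-failure_rate * t)

def reliability_weibull(t, shape, scale):
    """R(t) = exp(-(t / eta)^beta) for a Weibull(beta=shape, eta=scale) lifetime."""
    return math.exp(-((t / scale) ** shape))

lam = 1e-4                      # hypothetical failures per hour
print(f"MTBF = {1 / lam:.0f} h")
print(f"exponential R(1000 h) = {reliability_exponential(1000, lam):.3f}")
print(f"Weibull     R(1000 h) = {reliability_weibull(1000, shape=1.5, scale=10_000):.3f}")
```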

  12. Supervisor assessment of clinical and professional competence of medical trainees: a reliability study using workplace data and a focused analytical literature review.

    NARCIS (Netherlands)

    McGill, D.A.; Vleuten, C.P.M. van der; Clarke, M.J.

    2011-01-01

    Even though rater-based judgements of clinical competence are widely used, they are context sensitive and vary between individuals and institutions. To deal adequately with rater-judgement unreliability, evaluating the reliability of workplace rater-based assessments in the local context is essential.

  13. Supervisor Assessment of Clinical and Professional Competence of Medical Trainees: A Reliability Study Using Workplace Data and a Focused Analytical Literature Review

    Science.gov (United States)

    McGill, D. A.; van der Vleuten, C. P. M.; Clarke, M. J.

    2011-01-01

    Even though rater-based judgements of clinical competence are widely used, they are context sensitive and vary between individuals and institutions. To deal adequately with rater-judgement unreliability, evaluating the reliability of workplace rater-based assessments in the local context is essential. Using such an approach, the primary intention…

  14. Optimally Fortifying Logic Reliability through Criticality Ranking

    Directory of Open Access Journals (Sweden)

    Yu Bai

    2015-02-01

    With CMOS technology aggressively scaling towards the 22-nm node, modern FPGA devices face tremendous aging-induced reliability challenges due to bias temperature instability (BTI) and hot carrier injection (HCI). This paper presents a novel anti-aging technique at the logic level that is both scalable and applicable for VLSI digital circuits implemented with FPGA devices. The key idea is to prolong the lifetime of FPGA-mapped designs by strategically elevating the VDD values of some LUTs based on their modular criticality values. Although the idea of scaling VDD in order to improve either energy efficiency or circuit reliability has been explored extensively, our study distinguishes itself by approaching this challenge through an analytical procedure, therefore being able to maximize the overall reliability of the target FPGA design by rigorously modeling the BTI-induced device reliability and optimally solving the VDD assignment problem. Specifically, we first develop a systematic framework to analytically model the reliability of an FPGA LUT (look-up table), which consists of both RAM memory bits and associated switching circuit. We also, for the first time, establish the relationship between signal transition density and a LUT's reliability in an analytical way. This key observation further motivates us to define the modular criticality as the product of signal transition density and the logic observability of each LUT. Finally, we analytically prove, for the first time, that the optimal way to improve the overall reliability of a whole FPGA device is to fortify individual LUTs according to their modular criticality. To the best of our knowledge, this work is the first to draw such a conclusion.
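    A minimal sketch of the ranking idea described above is given below; it is not the authors' tool, and the transition densities, observabilities and the number of fortified LUTs are hypothetical. It computes modular criticality as the product of signal transition density and logic observability and elevates VDD for the top-ranked LUTs.

```python
def modular_criticality(transition_density, observability):
    """Criticality of a LUT = signal transition density x logic observability."""
    return transition_density * observability

# Hypothetical per-LUT profiling data: (transition density, observability).
luts = {
    "lut_a": (0.42, 0.90),
    "lut_b": (0.10, 0.95),
    "lut_c": (0.55, 0.20),
    "lut_d": (0.30, 0.60),
}

VDD_NOMINAL, VDD_ELEVATED, BUDGET = 0.90, 1.00, 2   # volts; fortify the 2 most critical LUTs

ranking = sorted(luts, key=lambda name: modular_criticality(*luts[name]), reverse=True)
assignment = {name: (VDD_ELEVATED if i < BUDGET else VDD_NOMINAL)
              for i, name in enumerate(ranking)}
for name in ranking:
    print(f"{name}: criticality={modular_criticality(*luts[name]):.3f}, VDD={assignment[name]:.2f} V")
```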

  15. Wastewater evaluation by analytical and biological procedures (Valoración de las aguas residuales mediante procedimientos analíticos y biológicos)

    Directory of Open Access Journals (Sweden)

    A. de la Torre

    2002-06-01

    Procedures based on analytical and biological approaches are proving useful in the risk assessment of urban wastewater from treatment plants. These effluents, considered 'complex mixtures' composed of substances of very different nature, origin and toxicological and environmental characteristics, require a realistic evaluation. To contribute to knowledge of part of the situation in our country, we present a study of eleven municipal treatment plants in which a profile of organic compounds was established and a toxicological evaluation was carried out by means of acute and chronic toxicity, estrogenicity, mutagenicity and teratogenicity tests. Chemical analysis (GC/MS) and biological methods were used to identify the most frequent organic compounds and the toxic effluents. The results show that 7 effluents presented acute toxicity, 3 chronic toxicity and 4 estrogenic effects. Notably, the 4 effluents showing estrogenicity contained at least 3 of the estrogenic substances detected in the chromatographic profile. These findings highlight the need to incorporate this type of methodology in order to obtain a more realistic knowledge of such situations.

  16. A solid-phase extraction procedure coupled to 1H NMR, with chemometric analysis, to seek reliable markers of the botanical origin of honey

    International Nuclear Information System (INIS)

    Beretta, Giangiacomo; Caneva, Enrico; Regazzoni, Luca; Bakhtyari, Nazanin Golbamaki; Maffei Facino, Roberto

    2008-01-01

    The aim of this work was to establish an analytical method for identifying the botanical origin of honey, as an alternative to conventional melissopalynological, organoleptic and instrumental methods (gas-chromatography coupled to mass spectrometry (GC-MS), high-performance liquid chromatography HPLC). The procedure is based on the 1H nuclear magnetic resonance (NMR) profile coupled, when necessary, with electrospray ionisation-mass spectrometry (ESI-MS) and two-dimensional NMR analyses of solid-phase extraction (SPE)-purified honey samples, followed by chemometric analyses. Extracts of 44 commercial Italian honeys from 20 different botanical sources were analyzed. Honeydew, chestnut and linden honeys showed constant, specific, well-resolved resonances, suitable for use as markers of origin. Honeydew honey contained the typical resonances of an aliphatic component, very likely deriving from the plant phloem sap or excreted into it by sap-sucking aphids. Chestnut honey contained the typical signals of kynurenic acid and some structurally related metabolite. In linden honey the 1H NMR profile gave strong signals attributable to the mono-terpene derivative cyclohexa-1,3-diene-1-carboxylic acid (CDCA) and to its 1-O-β-gentiobiosyl ester (CDCA-GBE). These markers were not detectable in the other honeys, except for the less common nectar honey from rosa mosqueta. We compared and analyzed the data by multivariate techniques. Principal component analysis found different clusters of honeys based on the presence of these specific markers. The results, although obviously only preliminary, suggest that the 1H NMR profile (with HPLC-MS analysis when necessary) can be used as a reference framework for identifying the botanical origin of honey
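    The chemometric step referred to above is a standard principal component analysis of bucketed NMR intensities. The sketch below is a generic PCA workflow on a made-up feature matrix (not the authors' data or code), illustrating how samples dominated by different marker signals would separate in the score plot.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical bucketed 1H NMR intensities: rows = honey extracts, columns = spectral buckets.
rng = np.random.default_rng(0)
honeydew = rng.normal([5, 1, 1, 0.5], 0.3, size=(6, 4))   # strong "aliphatic" buckets
chestnut = rng.normal([1, 5, 1, 0.5], 0.3, size=(6, 4))   # strong "kynurenic acid" buckets
linden   = rng.normal([1, 1, 5, 0.5], 0.3, size=(6, 4))   # strong "CDCA" buckets
X = np.vstack([honeydew, chestnut, linden])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for label, block in zip(["honeydew", "chestnut", "linden"], np.split(scores, 3)):
    centroid = block.mean(axis=0)
    print(f"{label:9s} score centroid: PC1={centroid[0]:+.2f}, PC2={centroid[1]:+.2f}")
```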

  17. A solid-phase extraction procedure coupled to {sup 1}H NMR, with chemometric analysis, to seek reliable markers of the botanical origin of honey

    Energy Technology Data Exchange (ETDEWEB)

    Beretta, Giangiacomo [Istituto di Chimica Farmaceutica e Tossicologica ' Pietro Pratesi' , Faculty of Pharmacy, University of Milan, via Mangiagalli 25, 20133 Milan (Italy)], E-mail: giangiacomo.beretta@unimi.it; Caneva, Enrico [Ciga - Centro Interdipartimentale Grandi Apparecchiature, University of Milan, via Golgi 19, 20133 Milan (Italy); Regazzoni, Luca; Bakhtyari, Nazanin Golbamaki; Maffei Facino, Roberto [Istituto di Chimica Farmaceutica e Tossicologica ' Pietro Pratesi' , Faculty of Pharmacy, University of Milan, via Mangiagalli 25, 20133 Milan (Italy)

    2008-07-14

    The aim of this work was to establish an analytical method for identifying the botanical origin of honey, as an alternative to conventional melissopalynological, organoleptic and instrumental methods (gas-chromatography coupled to mass spectrometry (GC-MS), high-performance liquid chromatography HPLC). The procedure is based on the 1H nuclear magnetic resonance (NMR) profile coupled, when necessary, with electrospray ionisation-mass spectrometry (ESI-MS) and two-dimensional NMR analyses of solid-phase extraction (SPE)-purified honey samples, followed by chemometric analyses. Extracts of 44 commercial Italian honeys from 20 different botanical sources were analyzed. Honeydew, chestnut and linden honeys showed constant, specific, well-resolved resonances, suitable for use as markers of origin. Honeydew honey contained the typical resonances of an aliphatic component, very likely deriving from the plant phloem sap or excreted into it by sap-sucking aphids. Chestnut honey contained the typical signals of kynurenic acid and some structurally related metabolite. In linden honey the 1H NMR profile gave strong signals attributable to the mono-terpene derivative cyclohexa-1,3-diene-1-carboxylic acid (CDCA) and to its 1-O-β-gentiobiosyl ester (CDCA-GBE). These markers were not detectable in the other honeys, except for the less common nectar honey from rosa mosqueta. We compared and analyzed the data by multivariate techniques. Principal component analysis found different clusters of honeys based on the presence of these specific markers. The results, although obviously only preliminary, suggest that the 1H NMR profile (with HPLC-MS analysis when necessary) can be used as a reference framework for identifying the botanical origin of honey.

  18. Proposal of a resolution to create an inquiry commission on the reliability of French nuclear power plants in case of earthquakes and on the safety, information and warning procedures in case of incidents

    International Nuclear Information System (INIS)

    2003-01-01

    This short paper presents the reasons for the creation of a parliamentary inquiry commission of 30 members on the reliability of nuclear power plants in France in the event of earthquakes and on the safety, information and warning procedures in the event of accidents. (A.L.B.)

  19. NASA reliability preferred practices for design and test

    Science.gov (United States)

    1991-01-01

    Given here is a manual that was produced to communicate within the aerospace community design practices that have contributed to NASA mission success. The information represents the best technical advice that NASA has to offer on reliability design and test practices. Topics covered include reliability practices, including design criteria, test procedures, and analytical techniques that have been applied to previous space flight programs; and reliability guidelines, including techniques currently applied to space flight projects, where sufficient information exists to certify that the technique will contribute to mission success.

  20. Further HTGR core support structure reliability studies. Interim report No. 1

    International Nuclear Information System (INIS)

    Platus, D.L.

    1976-01-01

    Results of a continuing effort to investigate high temperature gas cooled reactor (HTGR) core support structure reliability are described. Graphite material and core support structure component physical, mechanical and strength properties required for the reliability analysis are identified. Also described are experimental and associated analytical techniques for determining the required properties, a procedure for determining number of tests required, properties that might be monitored by special surveillance of the core support structure to improve reliability predictions, and recommendations for further studies. Emphasis in the study is directed towards developing a basic understanding of graphite failure and strength degradation mechanisms; and validating analytical methods for predicting strength and strength degradation from basic material properties

  1. A reliable bioassay procedure to evaluate per os toxicity of Bacillus thuringiensis strains against the rice delphacid, Tagosodes orizicolus (Homoptera: Delphacidae

    Directory of Open Access Journals (Sweden)

    Rebeca Mora

    2007-06-01

    A reliable bioassay procedure was developed to test ingested Bacillus thuringiensis (Bt) toxins on the rice delphacid Tagosodes orizicolus, a rice pest and vector of rice hoja blanca virus. Initially, several colonies were established under greenhouse conditions, using rice plants to nurture the insect. For the bioassay, an in vitro feeding system was developed for third to fourth instar nymphs. Insects were fed through Parafilm membranes on sugar (10% sucrose) and honey bee (1:48 vol/vol) solutions, with observed natural mortalities of 10-15% and 0-5%, respectively. Results were reproducible under controlled conditions during the assay (18±0.1 °C at night and 28±0.1 °C during the day, 80% RH and a 12:12 photoperiod). In addition, natural mortality was quantified in insect colonies collected from three different geographic areas of Costa Rica, with no significant differences between colonies under controlled conditions. Finally, bioassays were performed to evaluate the toxicity of a Bt collection on T. orizicolus. A preliminary sample of twenty-seven Bt strains was evaluated in coarse bioassays using three loops of sporulated colonies in 9 ml of liquid diet; the strains that exhibited higher percentages of T. orizicolus mortality were further analysed in bioassays using lyophilized spores and crystals (1 mg/ml). As a result, strains 26-O-to, 40-X-m, 43S-d and 23-O-to, isolated from homopteran insects, showed mortalities of 74, 96, 44 and 82% respectively, while HD-137, HD-1 and Bti showed 19, 83 and 95% mortalities. Controls showed mortalities between 0 and 10% in all bioassays. This is the first report of a reliable bioassay procedure to evaluate per os toxicity for a homopteran species using Bacillus thuringiensis strains. Rev. Biol. Trop. 55 (2): 373-383. Epub 2007 June 29.

   2. I. Forensic data analysis by pattern recognition. Categorization of white bond papers by elemental composition. II. Source identification of oil spills by pattern recognition analysis of natural elemental composition. III. Improving the reliability of factor analysis of chemically measured analytical data by utilizing the measured analytical uncertainty. IV. Elucidating the structure of some clinical data

    International Nuclear Information System (INIS)

    Duewer, D.L.

    1977-01-01

    Pattern recognition techniques are applied to the analysis of white bond papers and the problem of determining the source of an oil spill. In each case, an elemental analysis by neutron activation is employed. For the determination of source of oil spills, the field sample was weathered prior to activation analysis. A procedure for including measured analytical uncertainty into data analysis methodology is discussed, with particular reference to factor analysis. The suitability of various dispersion matrices and matrix rank determination criteria for data having analytical uncertainty is investigated. A criterion useful for judging the number of factors insensitive to analytical uncertainty is presented. A model data structure for investigating the behavior of factor analysis techniques in a known, controlled manner is described and analyzed. A chemically interesting test data base having analytical uncertainty is analyzed and compared with the model data. The data structure of 22 blood constituents in three categories of liver disease (viral or toxic hepatitis, alcoholic liver diseases and obstructive processes) is studied using various statistical and pattern recognition techniques. Comparison of classification results on the original data, in combination with principal component analysis, suggests a possible underlying structure for the data. This model structure is tested by the application of two simple data transformations. Analysis of the transformed data appears to confirm that some basic understanding of the studied data has been achieved

   3. Validation of an analytical procedure for the quantitative determination of 5,7-bis(meta-nitrophenylamino)-4,6-dinitrobenzofuroxan by potentiometric titration

    Directory of Open Access Journals (Sweden)

    R. Sh. Markhabullina

    2014-01-01

    A new method for the quantitative determination of the substance 5,7-bis(meta-nitrophenylamino)-4,6-dinitrobenzofuroxan by potentiometric titration is developed. The method has high precision, reliability and sensitivity.

  4. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It describes the definition and importance of reliability, the development of reliability engineering, failure rate and the failure probability density function and their types, CFR and the exponential distribution, IFR and the normal and Weibull distributions, maintainability and availability, reliability testing and reliability estimation for the exponential, normal and Weibull distribution types, reliability sampling tests, system reliability, design for reliability, and functional failure analysis by FTA.

  5. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  6. Analytical quality, performance indices and laboratory service

    DEFF Research Database (Denmark)

    Hilden, Jørgen; Magid, Erik

    1999-01-01

    analytical error, bias, cost effectiveness, decision-making, laboratory techniques and procedures, mass screening, statistical models, quality control

  7. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  8. Establishment of an analytical procedure for the determination of niobium and tantalum in minerals containing these elements using X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Nguyen Xuan Chien

    2003-01-01

    A study of the determination of niobium and tantalum in minerals and tin slags using X-ray fluorescence spectrometry was carried out. Analytical samples were prepared as powders and pellets. The interference of the major accompanying elements in the sample with the determination of niobium and tantalum was also studied. The analysis of niobium and tantalum in mineral and tin slag samples is reported. (author)

   9. Multiresidue analytical procedures for pesticide residues in vegetable products (Metodi multiresiduo per l'analisi di residui di antiparassitari in prodotti vegetali)

    Energy Technology Data Exchange (ETDEWEB)

    Gruppo di lavoro per i residui di antiparassitari della Commissione permanente di coordinamento interregionale per i problemi relativi al controllo ufficiale dei prodotti alimentari

    1997-09-01

    The multiresidue methods for pesticide residues in vegetable products most frequently used by the laboratories of the Italian national health service, by the regional and provincial agencies for environmental protection and by the National Health Institute are presented. The analytical behaviour of 249 pesticides is described through the different steps of extraction and cleanup, along with data for gas chromatography (GC), gas chromatography coupled to mass spectrometry (GC/MS) and high-performance liquid chromatography coupled to a spectrophotometric detector (HPLC/UV).

  10. Procedures of amino acid sequencing of peptides in natural proteins collection of knowledge and intelligence for construction of reliable chemical inference system

    OpenAIRE

    Kudo, Yoshihiro; Kanaya, Shigehiko

    1994-01-01

    In order to establish a reliable chemical inference system for the amino acid sequencing of natural peptides, as many kinds of relevant knowledge and intelligence as possible are collected. Topics include didemnins, dolastatin 3, TL-119 and/or A-3302-B, mycosubtilin, patellamide A, duramycin (and cinnamycin), bottoromycin A 2, A19009, galantin I, vancomycin, stenothricin, calf spleen profilin, neocarzinostatin, pancreatic spasmolytic polypeptide, cerebratulus toxin B-IV, RNAase U 2, ferredoxin ...

  11. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    Science.gov (United States)

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has since governed the transfers of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced in our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. This is conducted with the aim of controlling the most important consumer's risks involved at two levels in analytical decisions within transfer studies: the risk, for the receiving laboratory, of taking poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite its insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for a better integration of the transfer study into the method life cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigour and selectivity of the original approach, these improvements tend towards an increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
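    The equivalence-based comparison mentioned above is commonly implemented as a two one-sided tests (TOST) procedure, or equivalently by checking that the 90% confidence interval of the inter-site mean difference lies within a pre-set acceptance limit. The sketch below is a generic illustration with fabricated assay results and an arbitrary ±2% acceptance limit, not the Sanofi-aventis criteria.

```python
from statistics import mean, stdev
from math import sqrt
from scipy.stats import t

def equivalence_by_ci(sending, receiving, limit, alpha=0.05):
    """Declare equivalence if the 90% CI of the mean difference lies within +/- limit."""
    diff = mean(receiving) - mean(sending)
    n1, n2 = len(sending), len(receiving)
    sp = sqrt(((n1 - 1) * stdev(sending) ** 2 + (n2 - 1) * stdev(receiving) ** 2) / (n1 + n2 - 2))
    se = sp * sqrt(1 / n1 + 1 / n2)
    half_width = t.ppf(1 - alpha, n1 + n2 - 2) * se   # 90% CI <-> two one-sided 5% tests
    low, high = diff - half_width, diff + half_width
    return (low > -limit) and (high < limit), (low, high)

# Fabricated % label claim results from the sending and receiving laboratories:
sending   = [99.8, 100.2, 99.5, 100.0, 99.9, 100.3]
receiving = [100.4, 100.1, 100.6, 99.9, 100.5, 100.2]
ok, ci = equivalence_by_ci(sending, receiving, limit=2.0)
print(f"90% CI of difference: ({ci[0]:+.2f}, {ci[1]:+.2f}) -> equivalent: {ok}")
```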

  12. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
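    One of the NHPP models referred to above is the Goel-Okumoto model, whose mean value function is m(t) = a(1 − e^{−bt}). The sketch below fits that curve to a made-up cumulative fault-count series (illustrative data, not taken from the book) using SciPy's nonlinear least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """NHPP mean value function: expected cumulative faults detected by time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical cumulative fault counts observed weekly for an OSS project.
weeks  = np.arange(1, 13, dtype=float)
faults = np.array([8, 15, 21, 26, 30, 33, 36, 38, 40, 41, 42, 43], dtype=float)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, faults, p0=(50.0, 0.1))
print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
print(f"expected remaining faults = {a_hat - faults[-1]:.1f}")
```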

  13. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
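    A minimal sketch of the monitoring idea follows; the results and the biological-variation figures are hypothetical, not the authors' data. It computes monthly medians of patient results and flags months whose percentage deviation from the overall median exceeds a desirable bias specification derived from within- and between-subject biological variation (the usual 0.25·sqrt(CVw² + CVb²) convention is assumed here).

```python
from statistics import median

def desirable_bias(cv_within, cv_between):
    """Desirable analytical bias (%) from the biological variation model."""
    return 0.25 * (cv_within ** 2 + cv_between ** 2) ** 0.5

# Hypothetical monthly patient results for one analyte (mmol/L) and its CVs in %.
monthly_results = {
    "Jan": [4.9, 5.1, 5.0, 5.2, 4.8, 5.0],
    "Feb": [5.0, 5.1, 4.9, 5.0, 5.2, 5.1],
    "Mar": [5.4, 5.6, 5.5, 5.3, 5.7, 5.5],   # a drifted month
}
limit = desirable_bias(cv_within=5.0, cv_between=7.0)
baseline = median(x for results in monthly_results.values() for x in results)

for month, results in monthly_results.items():
    deviation = 100.0 * (median(results) - baseline) / baseline
    flag = "DRIFT" if abs(deviation) > limit else "ok"
    print(f"{month}: median={median(results):.2f}, deviation={deviation:+.1f} % ({flag})")
```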

  14. A reliable procedure for decontamination before thawing of human specimens cryostored in liquid nitrogen: three washes with sterile liquid nitrogen (SLN2).

    Science.gov (United States)

    Parmegiani, Lodovico; Accorsi, Antonio; Bernardi, Silvia; Arnone, Alessandra; Cognigni, Graciela Estela; Filicori, Marco

    2012-10-01

    To report a washing procedure, to be performed as frozen specimens are taken out of cryobanks, to minimize the risk of hypothetical culture contamination during thawing. Basic research. Private assisted reproduction center. Two batches of liquid nitrogen (LN(2)) were experimentally contaminated, one with bacteria (Pseudomonas aeruginosa, Escherichia coli, Stenotrophomonas maltophilia) and the other with fungi (Aspergillus niger). Two hundred thirty-two of the most common human gamete/embryo vitrification carriers (Cryotop, Cryoleaf, Cryopette) were immersed in the contaminated LN(2) (117 in the bacteria and 25 in the fungi-contaminated LN(2)). The carriers were tested microbiologically, one group without washing (control) and the other after three subsequent washings in certified ultraviolet sterile liquid nitrogen (SLN(2)). The carriers were randomly allocated to the "three-wash procedure" (three-wash group, 142 carriers) or "no-wash" (control group, 90 carriers) using a specific software tool. Assessment of microorganism growth. In the no-wash control group, 78.6% of the carriers were contaminated by the bacteria and 100% by the fungi. No carriers were found to be contaminated, either by bacteria or fungi, after the three-wash procedure. The three-wash procedure with SLN(2) produced an efficient decontamination of carriers in extreme experimental conditions. For this reason, this procedure could be routinely performed in IVF laboratories for safe thawing of human specimens that are cryostored in nonhermetical cryocontainers, particularly in the case of open or single-straw closed vitrification systems. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  15. Orders- Versus Encounters-Based Image Capture: Implications Pre- and Post-Procedure Workflow, Technical and Build Capabilities, Resulting, Analytics and Revenue Capture: HIMSS-SIIM Collaborative White Paper.

    Science.gov (United States)

    Cram, Dawn; Roth, Christopher J; Towbin, Alexander J

    2016-10-01

    The decision to implement an orders-based versus an encounters-based imaging workflow has various implications for image capture and storage. The impacts include workflows before and after an imaging procedure, electronic health record build, technical infrastructure, analytics, resulting, and revenue. Orders-based workflows tend to favor some imaging specialties while others require an encounters-based approach. The intent of this HIMSS-SIIM white paper is to offer lessons learned from early-adopting institutions to physician champions and informatics leadership developing strategic planning and operational rollouts for specialties capturing clinical multimedia.

  16. A review of analytical procedures for the simultaneous determination of medically important veterinary antibiotics in environmental water: Sample preparation, liquid chromatography, and mass spectrometry.

    Science.gov (United States)

    Kim, Chansik; Ryu, Hong-Duck; Chung, Eu Gene; Kim, Yongseok; Lee, Jae-Kwan

    2018-07-01

    Medically important (MI) antibiotics are defined by the United States Food and Drug Administration as drugs containing certain active antimicrobial ingredients that are used for the treatment of human diseases or enteric pathogens causing food-borne diseases. The presence of MI antibiotic residues in environmental water is a major concern for both aquatic ecosystems and public health, particularly because of their potential to contribute to the development of antimicrobial-resistant microorganisms. In this article, we present a review of global trends in the sales of veterinary MI antibiotics and the analytical methodologies used for the simultaneous determination of antibiotic residues in environmental water. According to recently published government reports, sales volumes have increased steadily, despite many countries having adopted strategies for reducing the consumption of antibiotics. Global attention needs to be directed urgently at establishing new management strategies for reducing the use of MI antimicrobial products in the livestock industry. The development of standardized analytical methods for the detection of multiple residues is required to monitor and understand the fate of antibiotics in the environment. Simultaneous analyses of antibiotics have mostly been conducted using high-performance liquid chromatography-tandem mass spectrometry with a solid-phase extraction (SPE) pretreatment step. Currently, on-line SPE protocols are used for the rapid and sensitive detection of antibiotics in water samples. On-line detection protocols must be established for the monitoring and screening of unknown metabolites and transformation products of antibiotics in environmental water. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Development of rapid analytical methods for Sr-89/90, Pu-239/40 and Pu-238 activity concentrations in fallout, surface water, plants and aerosol filters based on modified routine used analytical procedures

    International Nuclear Information System (INIS)

    Siebert, H.-U.; Thiele, J.; Loennig, M.; Kunert, M.; Kranl, H.

    1995-01-01

    In accordance with the tasks of the National Board for Atomic Safety and Radiation Protection in the system of nuclear environmental surveillance, a traditional spectrum of methods for the determination of radionuclides in environmental media has existed for many years. Due to the existing environmental monitoring programmes (surveillance of GDR territory with respect to the impact of global radioactive fallout; surveillance of the environment of nuclear facilities and nuclear power plants; surveillance of the environment of mining facilities) and the associated necessity of analyzing a great number of samples, the following demands were made on the radionuclide determination methods: as few, as simple and as safe steps of analysis as possible; use of effective nuclide-selective activity measuring methods; parallel processing of several samples; determination of several individual nuclides by one analytical approach where possible; selective separation methods to produce pure element-specific measuring samples, owing to the necessary use of gross activity measurements; and use of the same principal schemes of analysis for different sample media, excluding methods of decomposition

  18. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of isotopes creates and stimulates continuing interest in the development of new methods for determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed

  19. Procedure to derive analytical models for microwave noise performances of Si/SiGe:C and InP/InGaAs heterojunction bipolar transistors

    International Nuclear Information System (INIS)

    Ramirez-Garcia, E; Enciso-Aguilar, M A; Aniel, F P; Zerounian, N

    2013-01-01

    We present a useful procedure to derive simplified expressions to model the minimum noise factor and the equivalent noise resistance of Si/SiGe:C and InP/InGaAs heterojunction bipolar transistors (HBTs). An acceptable agreement between models and measurements at operation frequencies up to 18 GHz and at several bias points is demonstrated. The development procedure includes all the significant microwave noise sources of the HBTs. These relations should be useful to model F_min and R_n for state-of-the-art IV-IV and III–V HBTs. The method is the first step to derive noise analysis formulas valid for operation frequencies near the unitary current gain frequency (f_T); however, to achieve this goal a necessary condition is to have access to HFN measurements up to this frequency regime. (paper)
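
    The F_min and R_n discussed above are two of the four standard two-port noise parameters. As a reminder of how they enter a noise calculation, the hedged sketch below evaluates the generic textbook relation F = F_min + (R_n/G_s)*|Y_s - Y_opt|^2 for an arbitrary source admittance; it is not the simplified HBT-specific expressions derived in the paper, and all parameter values are illustrative assumptions.

```python
# Generic two-port noise-parameter relation (textbook form), not the paper's model.
from math import log10

F_min_dB, R_n = 1.2, 15.0          # assumed minimum noise figure (dB) and noise resistance (ohm)
Y_opt = 0.012 + 0.004j             # assumed optimum source admittance (S)
Y_s = 0.020 + 0.0j                 # actual source admittance (S), e.g. a 50-ohm source

F_min = 10 ** (F_min_dB / 10)      # dB -> linear noise factor
F = F_min + (R_n / Y_s.real) * abs(Y_s - Y_opt) ** 2
print(f"noise figure with this source: {10 * log10(F):.2f} dB")
```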

  20. Th-U-Pb_T dating by Electron Probe Microanalysis, Part I. Monazite: analytical procedures and data treatment

    Energy Technology Data Exchange (ETDEWEB)

    Vlach, Silvio Roberto Farias, E-mail: srfvlach@usp.b [Universidade de Sao Paulo (IG/USP), SP (Brazil). Inst. de Geociencias. Dept. de Mineralogia e Geotectonica

    2010-03-15

    Dating methodology by the electron probe microanalyser (EPMA) of (Th, U)-bearing minerals, most notably monazite, has acquired ever greater importance in the literature, particularly due to its superior spatial resolution and versatility, which allow petrological processes, at times registered only at micro-scales in minerals and rocks, to be correlated with absolute ages. Although the accuracy is inferior to that achieved with conventional isotopic methods by up to an order of magnitude, EPMA is the instrument that allows the best spatial resolution, reaching a few μm³ under some conditions. Quantification of minor and trace elements with suitable precision and accuracy involves instrumental and analytical set-ups and data treatment strategies that are significantly more rigorous than those applied in conventional analyses; Th-U-Pb_T dating is an example of such a case. Each EPMA is a unique machine in terms of its instrumental characteristics and automation system, so analytical procedures ought to be adjusted to laboratory specificities. The analytical strategies and data treatment adopted in the Electron Microprobe Laboratory of the Instituto de Geociencias, Universidade de Sao Paulo, Brazil, with a JEOL JXA8600S EPMA and a ThermoNoran-Voyager 4.3 automation system, are presented and compared with those used in other laboratories. The influence of instrumental factors and spectral overlaps on Th, U, and Pb quantification is discussed. Procedures applied to interference correction, error propagation, data treatment, and final chemical age presentation, as well as to sampling and analyses, are emphasized. Some typical applications are discussed, drawing attention to the most relevant aspects of electron microprobe dating. (author)
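
    Since the abstract centers on obtaining a chemical age from measured Th, U and total Pb, a hedged numerical sketch may help: it solves the widely used single-spot age equation (in the form popularized by Montel and co-workers) for t, given concentrations in ppm. The decay constants and atomic weights are standard values; the spot composition and the choice of a root-bracketing solver are illustrative assumptions, not the laboratory's own code.

```python
import math
from scipy.optimize import brentq

# Decay constants (1/yr) and atomic weights (g/mol)
L238, L235, L232 = 1.55125e-10, 9.8485e-10, 4.9475e-11
W_U, W_TH = 238.03, 232.04
W_PB206, W_PB207, W_PB208 = 205.97, 206.98, 207.97
F238, F235 = 0.9928, 0.0072          # present-day 238U/235U abundance fractions

def predicted_pb(t, th_ppm, u_ppm):
    """Radiogenic Pb (ppm) accumulated after t years from the measured Th and U."""
    pb_th = th_ppm / W_TH * (math.exp(L232 * t) - 1.0) * W_PB208
    pb_u = u_ppm / W_U * (F238 * (math.exp(L238 * t) - 1.0) * W_PB206
                          + F235 * (math.exp(L235 * t) - 1.0) * W_PB207)
    return pb_th + pb_u

def chemical_age(th_ppm, u_ppm, pb_ppm):
    """Solve predicted_pb(t) = measured Pb for t, bracketed between 1 yr and 4.6 Gyr."""
    return brentq(lambda t: predicted_pb(t, th_ppm, u_ppm) - pb_ppm, 1.0, 4.6e9)

# Hypothetical monazite spot: 5 wt% Th, 0.3 wt% U, 1500 ppm Pb
print(f"chemical age ~ {chemical_age(50_000, 3_000, 1_500) / 1e6:.0f} Ma")
```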

  1. Reliability Of A Novel Intracardiac Electrogram Method For AV And VV Delay Optimization And Comparability To Echocardiography Procedure For Determining Optimal Conduction Delays In CRT Patients

    Directory of Open Access Journals (Sweden)

    N Reinsch

    2009-03-01

    Full Text Available Background: Echocardiography is widely used to optimize CRT programming. A novel intracardiac electrogram method (IEGM) was recently developed as an automated programmer-based method, designed to calculate optimal atrioventricular (AV) and interventricular (VV) delays and provide optimized delay values as an alternative to standard echocardiographic assessment. Objective: This study was aimed at determining the reliability of this new method. Furthermore, the comparability of IEGM to existing echocardiographic parameters for determining optimal conduction delays was verified. Methods: Eleven patients (age 62.9 ± 8.7; 81% male; 73% ischemic), previously implanted with a cardiac resynchronisation therapy defibrillator (CRT-D), underwent both echocardiographic and IEGM-based delay optimization. Results: Applying the IEGM method, concordance of three consecutively performed measurements was found in 3 (27%) patients for AV delay and in 5 (45%) patients for VV delay. Intra-individual variation between three measurements as assessed by the IEGM technique was up to 20 ms (AV: n = 6; VV: n = 4). E-wave, diastolic filling time and septal-to-lateral wall motion delay emerged as significantly different between the echo and IEGM optimization techniques (p < 0.05). The final AV delay setting was significantly different between both methods (echo: 126.4 ± 29.4 ms, IEGM: 183.6 ± 16.3 ms; p < 0.001; correlation: R = 0.573, p = 0.066). VV delay showed significant differences for optimized delays (echo: 46.4 ± 23.8 ms, IEGM: 10.9 ± 7.0 ms; p < 0.01; correlation: R = -0.278, p = 0.407). Conclusion: The automated programmer-based IEGM method provides a simple and safe way to perform CRT optimization. However, the reliability of this method appears to be limited. Thus, it remains difficult for the examiner to determine the optimal hemodynamic settings. Additionally, as there was no correlation between the optimal AV- and VV-delays calculated by the IEGM method and the echo

  2. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  3. Rare extragonadal teratomas in children: complete tumor excision as a reliable and essential procedure for significant survival. Clinical experience and review of the literature.

    Science.gov (United States)

    Paradies, Guglielmo; Zullino, Francesca; Orofino, Antonio; Leggio, Samuele

    2014-01-01

    Extragonadal teratomas are rare tumors in neonates and infants and can sometimes show unusual, distinctive features such as an uncommon location, a sometimes acute clinical presentation, and a "fetiform" histotype of the lesion. We have extrapolated, from our entire experience with teratomas, 4 unusual cases, mostly operated on as emergencies; 2 of them were treated just after birth. The aim of this paper is to report the clinical and pathological findings, to evaluate the surgical approach, and to assess the long-term biological behaviour in these cases, in the light of survival and current insights reported in the literature. The authors reviewed the most significant (Tables I and II) clinical, laboratory, radiologic, and pathologic findings, surgical procedures, and early and long-term results in 4 children, 1 male and 3 females (M/F ratio: 1/3), suffering from extragonadal teratomas located in the temporo-zygomatic region of the head (Case n. 1, Fig. 1), the retroperitoneal space (Case n. 2, Fig. 2), the liver (Case n. 3, Figs. 3-5), and the kidney (Case n. 4, Figs. 6, 7), respectively. Of the 4 patients, 2 were treated neonatally (1 teratoma of the head, 1 retroperitoneal teratoma). A prenatal diagnosis had already been made in 2 of the 4 patients, between the 2nd and 3rd trimester of pregnancy. All the infants were born by scheduled caesarean section in a tertiary care hospital and were then immediately referred to the N.I.C.U.s. Because of a mostly acute clinical presentation, the 4 patients were then referred to the surgical unit at different ages: 7 days, 28 days, 7 months, and 4 years, respectively. The initial clinical presentation (Table II) was consistent with the site of the mass and/or its side effects. The 2 newborns (Cases 1 and 2), with a prenatally diagnosed mass located in the temporo-zygomatic region and in the abdominal cavity respectively, already displayed at birth a mass with a tendency to further growth. The symptoms and signs described to the primary care physician by the parents of the 2

  4. Development of analytical procedures for determination of total chromium by quadrupole ICP-MS and high-resolution ICP-MS, and hexavalent chromium by HPLC-ICP-MS, in different materials used in the automotive industry.

    Science.gov (United States)

    Séby, F; Gagean, M; Garraud, H; Castetbon, A; Donard, O F X

    2003-10-01

    A European directive was recently adopted limiting the use of hazardous substances such as Pb, Hg, Cd, and Cr(VI) in vehicle manufacturing. From July 2003 a maximum of 2 g Cr(VI) will be authorised per vehicle in corrosion-preventing coatings of key components. As no standardised procedures are available to check whether produced vehicles are in agreement with this directive, the objective of this work was to develop analytical procedures for total chromium and Cr(VI) determination in these materials. The first step of this study was to optimise digestion procedures for total chromium determination in plastic and metallic materials by inductively coupled plasma mass spectrometry (ICP-MS). High-resolution (HR) ICP-MS was used to examine the influence of polyatomic interferences on the detection of the (52)Cr(+) and (53)Cr(+) isotopes. If there were strong interferences at m/z 52 for plastic materials, it was nevertheless possible to use quadrupole ICP-MS at m/z 53 if digestions were performed with HNO(3)+H(2)O(2). This mixture was also necessary for digestion of chromium from metallic materials. Extraction procedures in alkaline medium (NH(4)(+)/NH(3) buffer solution at pH 8.9) assisted by sonication were developed for determining Cr(VI) in four different corrosion-preventing coatings by HPLC-ICP-MS. After optimisation and validation with the only solid reference material certified for its Cr(VI) content (BCR 545, welding dusts), the efficiency of this extraction procedure for screw coatings was compared with that described in the EN ISO 3613 standard generally used in routine laboratories. For coatings comprising zinc and aluminium passivated in depth with chromium oxides, the extraction procedure developed herein enabled determination of higher Cr(VI) concentrations. This was also observed for the screw covered with a chromium passivant layer on zinc-nickel. For the coating comprising a chromium passivant layer on alkaline zinc, the standardized extraction procedure was more efficient.

  5. Evaluation of analytical procedures for the determination of cadmium, boron and lithium in UALx samples by inductively coupled plasma optical emission spectrometry (ICP OES)

    International Nuclear Information System (INIS)

    Guilhen, Sabine Neusatz; Kakazu, Mauricio H.; Cotrim, Marycel Elena Barboza; Pires, Maria Aparecida Faustino; Souza, Alexandre Luiz de

    2013-01-01

    Used in over 80% of worldwide diagnostic procedures, technetium-99m (99mTc), which is obtained from the decay of molybdenum-99 (99Mo), is the most important radioisotope in nuclear medicine. IPEN/CNEN-SP has been developing technologies to produce 99Mo by the irradiation of low-enriched uranium (LEU < 20% 235U) targets in its research reactor IEA-R1 (IPEN, Sao Paulo/Brazil). These targets consist of low-enriched uranium dispersed in a matrix of aluminum (UAlx-Al). Several impurities may be incorporated during the target's production process, such as boron, cadmium and lithium, which have a high capture cross section and may reduce the irradiation's efficiency. This study describes a simple and rapid inductively coupled plasma optical emission spectrometric method for the determination of cadmium, boron and lithium in uranium-aluminum (UAlx) dispersion targets. The method involves a previous separation step, in which uranium is removed from the matrix by chromatographic extraction with a divinylbenzene resin, Amberlite XAD-4, doped with tri-n-butyl phosphate (TBP). TBP selectively separates the uranium, leaving behind the impurities in an aqueous medium for further quantification by ICP OES. Possible spectroscopic interferences are also discussed in this article, because of the high amount of aluminum in the remaining solution. Experimental and instrumental conditions, such as initial mass, acid solution ratio and amount, resin mass, emission lines and interfering concentrations, are carefully established. This method is to be applied to the determination of several other impurities in UAlx in the future, providing means to verify the UAlx targets' compliance with the current established specifications through routine laboratory analysis. (author)

  6. Reliability analysis using network simulation

    International Nuclear Information System (INIS)

    Engi, D.

    1985-01-01

    The models that can be used to provide estimates of the reliability of nuclear power systems operate at many different levels of sophistication. The least sophisticated models treat failure processes that entail only time-independent phenomena (such as demand failure). More advanced models treat processes that also include time-dependent phenomena such as run failure and possibly repair. However, many of these dynamic models are deficient in some respects, either because they disregard the time-dependent phenomena that cannot be expressed in closed-form analytic terms or because they treat these phenomena in quasi-static terms. The next level of modeling requires a dynamic approach that incorporates not only procedures for treating all significant time-dependent phenomena but also procedures for treating these phenomena when they are conditionally linked or characterized by arbitrarily selected probability distributions. The level of sophistication that is required is provided by a dynamic Monte Carlo modeling approach. A computer code that uses a dynamic Monte Carlo modeling approach is Q-GERT (Graphical Evaluation and Review Technique - with Queueing), and the present study has demonstrated the feasibility of using Q-GERT for modeling time-dependent, unconditionally and conditionally linked phenomena that are characterized by arbitrarily selected probability distributions
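
    To make the contrast with closed-form models concrete, the hedged sketch below runs a dynamic Monte Carlo simulation of a single repairable component subject to demand failure, time-dependent run failure and repair, and estimates its availability at the end of a mission. It illustrates the general simulation idea only; it is not Q-GERT, and the rates and mission time are illustrative assumptions.

```python
import random

LAMBDA_FAIL = 1e-3    # run-failure rate per hour (assumed)
MU_REPAIR = 5e-2      # repair rate per hour (assumed)
P_DEMAND_FAIL = 0.01  # probability of failure on demand at t = 0 (assumed)
MISSION = 1000.0      # mission time in hours (assumed)

def one_history(rng):
    """Simulate one up/down history; return True if the component is up at MISSION."""
    t, up = 0.0, rng.random() >= P_DEMAND_FAIL
    while t < MISSION:
        rate = LAMBDA_FAIL if up else MU_REPAIR
        t += rng.expovariate(rate)      # time to the next failure or repair
        if t < MISSION:
            up = not up
    return up

def availability(n=100_000, seed=1):
    rng = random.Random(seed)
    return sum(one_history(rng) for _ in range(n)) / n

print(f"estimated availability at {MISSION:.0f} h: {availability():.3f}")
```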

  7. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  8. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very...

  9. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  10. Biomass Compositional Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

    NREL develops laboratory analytical procedures (LAPs) for standard biomass analysis. These procedures help scientists and analysts understand more about the chemical composition of raw biomass

  11. A study on the value of computer-assisted assessment for SPECT/CT-scans in sentinel lymph node diagnostics of penile cancer as well as clinical reliability and morbidity of this procedure.

    Science.gov (United States)

    Lützen, Ulf; Naumann, Carsten Maik; Marx, Marlies; Zhao, Yi; Jüptner, Michael; Baumann, René; Papp, László; Zsótér, Norbert; Aksenov, Alexey; Jünemann, Klaus-Peter; Zuhayra, Maaz

    2016-09-07

    Because of the increasing importance of computer-assisted post-processing of image data in modern medical diagnostics, we studied the value of an algorithm for the assessment of single photon emission computed tomography/computed tomography (SPECT/CT) data, which has been used for the first time for lymph node staging in penile cancer with non-palpable inguinal lymph nodes. In the guidelines of the relevant international expert societies, sentinel lymph node biopsy (SLNB) is recommended as the diagnostic method of choice. The aim of this study is to evaluate the value of the afore-mentioned algorithm and, in the clinical context, the reliability and the associated morbidity of this procedure. Between 2008 and 2015, 25 patients with invasive penile cancer and inconspicuous inguinal lymph node status underwent SLNB after application of the radiotracer Tc-99m labelled nanocolloid. We recorded, in a prospective approach, the reliability and the complication rate of the procedure. In addition, we evaluated the results of an algorithm for SPECT/CT data assessment in these patients. SLNB was carried out in 44 groins of 25 patients. In three patients, inguinal lymph node metastases were detected via SLNB. In one patient, bilateral lymph node recurrence of the groins occurred after negative SLNB. There was a false-negative rate of 4% in relation to the number of patients (1/25) and 4.5% in relation to the number of groins (2/44). Morbidity was 4% in relation to the number of patients (1/25) and 2.3% in relation to the number of groins (1/44). The results of computer-assisted assessment of SPECT/CT data for sentinel lymph node (SLN) diagnostics demonstrated a high sensitivity of 88.8% and a specificity of 86.7%. SLNB is a very reliable method, associated with low morbidity. Computer-assisted assessment of SPECT/CT data for SLN diagnostics shows high sensitivity and specificity. While it cannot replace the assessment by medical experts, it can still provide substantial

  12. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - I: Theory

    International Nuclear Information System (INIS)

    Cacuci, D. G.; Cacuci, D. G.; Ionescu-Bujor, M.

    2008-01-01

    The development of the adjoint sensitivity analysis procedure (ASAP) for generic dynamic reliability models based on Markov chains is presented, together with applications of this procedure to the analysis of several systems of increasing complexity. The general theory is presented in Part I of this work and is accompanied by a paradigm application to the dynamic reliability analysis of a simple binary component, namely a pump functioning on an 'up/down' cycle until it fails irreparably. This paradigm example admits a closed form analytical solution, which permits a clear illustration of the main characteristics of the ASAP for Markov chains. In particular, it is shown that the ASAP for Markov chains presents outstanding computational advantages over other procedures currently in use for sensitivity and uncertainty analysis of the dynamic reliability of large-scale systems. This conclusion is further underscored by the large-scale applications presented in Part II. (authors)

  13. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - I: Theory

    Energy Technology Data Exchange (ETDEWEB)

    Cacuci, D. G. [Commiss Energy Atom, Direct Energy Nucl, Saclay, (France); Cacuci, D. G. [Univ Karlsruhe, Inst Nucl Technol and Reactor Safety, D-76021 Karlsruhe, (Germany); Ionescu-Bujor, M. [Forschungszentrum Karlsruhe, Fus Program, D-76021 Karlsruhe, (Germany)

    2008-07-01

    The development of the adjoint sensitivity analysis procedure (ASAP) for generic dynamic reliability models based on Markov chains is presented, together with applications of this procedure to the analysis of several systems of increasing complexity. The general theory is presented in Part I of this work and is accompanied by a paradigm application to the dynamic reliability analysis of a simple binary component, namely a pump functioning on an 'up/down' cycle until it fails irreparably. This paradigm example admits a closed form analytical solution, which permits a clear illustration of the main characteristics of the ASAP for Markov chains. In particular, it is shown that the ASAP for Markov chains presents outstanding computational advantages over other procedures currently in use for sensitivity and uncertainty analysis of the dynamic reliability of large-scale systems. This conclusion is further underscored by the large-scale applications presented in Part II. (authors)
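
    As a concrete illustration of the kind of Markov-chain dynamic reliability model the ASAP is applied to, the hedged sketch below solves a single repairable up/down component with the matrix exponential and estimates the sensitivity of its point unavailability to the failure rate by finite differences. It is only a toy stand-in for the paper's examples: the adjoint procedure itself is not implemented, and the rates are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

def unavailability(lam, mu, t):
    # Generator matrix over states [up, down]; rows sum to zero.
    Q = np.array([[-lam, lam],
                  [mu, -mu]])
    p = np.array([1.0, 0.0]) @ expm(Q * t)   # start in the 'up' state
    return p[1]

lam, mu, t = 1e-3, 1e-1, 100.0               # per-hour rates, mission time in hours (assumed)
u = unavailability(lam, mu, t)
du_dlam = (unavailability(lam * 1.001, mu, t) - u) / (lam * 0.001)
print(f"unavailability at t={t:.0f} h: {u:.4e}, d(unavailability)/d(lambda): {du_dlam:.3e}")
```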

  14. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometric st

  15. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, presents the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  16. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  17. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1993-01-01

    This interim notice covers the following: extractable organic halides in solids, total organic halides, analysis by gas chromatography/Fourier transform-infrared spectroscopy, hexadecane extracts for volatile organic compounds, GC/MS analysis of VOCs, GC/MS analysis of methanol extracts of cryogenic vapor samples, screening of semivolatile organic extracts, GPC cleanup for semivolatiles, sample preparation for GC/MS for semi-VOCs, analysis for pesticides/PCBs by GC with electron capture detection, sample preparation for pesticides/PCBs in water and soil sediment, report preparation, Florisil column cleanup for pesticide/PCBs, silica gel and acid-base partition cleanup of samples for semi-VOCs, concentrate acid wash cleanup, carbon determination in solids using Coulometrics' CO2 coulometer, determination of total carbon/total organic carbon/total inorganic carbon in radioactive liquids/soils/sludges by the hot persulfate method, analysis of solids for carbonates using Coulometrics' Model 5011 coulometer, and Soxhlet extraction

  18. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1993-01-01

    The methods cover: C in solutions, F (electrode), elements by atomic emission spectrometry, inorganic anions by ion chromatography, Hg in water/solids/sludges, As, Se, Bi, Pb, data calculations for SST (single shell tank?) samples, Sb, Tl, Ag, Pu, O/M ratio, ignition weight loss, pH value, ammonia (N), Cr(VI), alkalinity, U, C sepn. from soil/sediment/sludge, Pu purif., total N, water, C and S, surface Cl/F, leachable Cl/F, outgassing of Ge detector dewars, gas mixing, gas isotopic analysis, XRF of metals/alloys/compounds, H in Zircaloy, H/O in metals, impurity extraction, reduced/total Fe in glass, free acid in U/Pu solns, density of solns, Kr/Xe isotopes in FFTF cover gas, H by combustion, MS of Li and Cs isotopes, MS of lanthanide isotopes, GC operation, total Na on filters, XRF spectroscopy QC, multichannel analyzer operation, total cyanide in water/solid/sludge, free cyanide in water/leachate, hydrazine conc., ICP-MS, 99Tc, U conc./isotopes, microprobe analysis of solids, gas analysis, total cyanide, H/N2O in air, and pH in soil

  19. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1993-01-01

    This volume contains the interim change notice for physical testing. Covered are: properties of solutions, slurries, and sludges; rheological measurement with cone/plate viscometer; % solids determination; particle size distribution by laser scanning; penetration resistance of radioactive waste; operation of differential scanning calorimeter, thermogravimetric analyzer, and high temperature DTA and DSC; sodium rod for sodium bonded fuel; filling SP-100 fuel capsules; sodium filling of BEATRIX-II type capsules; removal of alkali metals with ammonia; specific gravity of highly radioactive solutions; bulk density of radioactive granular solids; purification of Li by hot gettering/filtration; and Li filling of MOTA capsules

  20. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book gives explanations of analytical chemistry in ten chapters, which cover the development of analytical chemistry; the theory of error, with definition and classification; sampling and treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction and chromatography; and experiments, with basic operations for chemical experiments.

  1. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book gives explanations of analytical chemistry in ten chapters, which cover the development of analytical chemistry; the theory of error, with definition and classification; sampling and treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction and chromatography; and experiments, with basic operations for chemical experiments.

  2. On-line solid-phase enrichment coupled to packed reactor flow injection analysis in a green analytical procedure to determine low levels of folic acid using fluorescence detection

    Directory of Open Access Journals (Sweden)

    Emara Samy

    2012-12-01

    Full Text Available Abstract Background Analysis of folic acid (FA) is not an easy task because of its presence at low concentrations, its lower stability under acidic conditions, and its sensitivity to light and high temperature. The present study is concerned with the development and validation of an automated, environmentally friendly pre-column derivatization combined with solid-phase enrichment (SPEn) to determine low levels of FA. Results Cerium(IV) trihydroxyhydroperoxide (CTH) as a packed oxidant reactor has been used for oxidative cleavage of FA into a highly fluorescent product, 2-amino-4-hydroxypteridine-6-carboxylic acid. FA was injected into a carrier stream of 0.04 M phosphate buffer, pH 3.4, at a flow rate of 0.25 mL/min. The sample zone containing the analyte was passed through the CTH reactor thermostated at 40°C, and the fluorescent product was trapped and enriched on the head of a small ODS column (10 mm x 4.6 mm i.d., 5 μm particle size). The enriched product was then back-flush eluted by column-switching from the small ODS column to the detector with a greener mobile phase consisting of ethanol and phosphate buffer (0.04 M, pH 3.4) in the ratio of 5:95 (v/v). The eluent was monitored fluorimetrically at emission and excitation wavelengths of 463 and 367 nm, respectively. The calibration graph was linear over FA concentrations in the range of 1.25-50 ng/mL, with a detection limit of 0.49 ng/mL. Conclusion A new, simple and sensitive green analytical procedure including on-line pre-column derivatization combined with SPEn has been developed for the routine quality control and dosage form assay of FA at very low concentration levels. The method was a powerful analytical technique that had excellent sensitivity and sufficient accuracy and required relatively simple and inexpensive instrumentation.
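
    The linear range and detection limit quoted above come from an ordinary linear calibration of fluorescence signal against standard concentration. The hedged sketch below shows one common way such figures are derived (a least-squares fit plus an ICH-style LOD of 3.3 times the residual standard deviation over the slope); the standard signals are made-up illustration data, not the paper's measurements.

```python
import numpy as np

conc = np.array([1.25, 2.5, 5.0, 10.0, 25.0, 50.0])      # ng/mL standards (assumed)
signal = np.array([3.1, 6.0, 11.8, 23.5, 59.2, 118.0])   # fluorescence, arbitrary units (assumed)

slope, intercept = np.polyfit(conc, signal, 1)            # least-squares calibration line
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)
lod = 3.3 * residual_sd / slope                            # ICH-style detection limit
r = np.corrcoef(conc, signal)[0, 1]

print(f"slope={slope:.3f}, intercept={intercept:.3f}, r={r:.4f}, LOD~{lod:.2f} ng/mL")

# Back-calculate an unknown sample from its measured signal
unknown_signal = 40.0
print(f"unknown ~ {(unknown_signal - intercept) / slope:.1f} ng/mL")
```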

  3. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  4. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  5. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    Figure captions (figures not reproduced): inverters connected in a chain; typical graph showing frequency versus square root of (...). Text fragments: developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and (...) or FIT of the device; in other words, an accurate estimate of the device lifetime was found, and thus the reliability that can be conveniently (...)

  6. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  7. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  8. Analytic chemistry of molybdenum

    International Nuclear Information System (INIS)

    Parker, G.A.

    1983-01-01

    Electrochemical, colorimetric, gravimetric, spectroscopic, and radiochemical methods for the determination of molybdenum are summarized in this book. Some laboratory procedures are described in detail, while literature citations are given for others. The reader is also referred to older comprehensive reviews of the analytical chemistry of molybdenum. Contents, abridged: Gravimetric methods. Titrimetric methods. Colorimetric methods. X-ray fluorescence. Voltammetry. Catalytic methods. Molybdenum in non-ferrous alloys. Molybdenum compounds

  9. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through homogeneous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on the paraboloid, including polar properties, the center of a section, axes of plane sections, and generators of the hyperbolic paraboloid. The book also touches on homogeneous coordi

  10. Preparation of methodology for reliability analysis of selected digital segments of the instrumentation and control systems of NPPs. Pt. 1

    International Nuclear Information System (INIS)

    Hustak, S.; Patrik, M.; Babic, P.

    2000-12-01

    The report is structured as follows: (i) Introduction; (ii) Important notions relating to the safety and dependability of software systems for nuclear power plants (selected notions from IAEA Technical Report No. 397; safety aspects of software application; reliability/dependability aspects of digital systems); (iii) Peculiarities of digital systems and ways to a dependable performance of the required function (failures in the system and principles of defence against them; ensuring resistance of digital systems against failures at various hardware and software levels); (iv) The issue of analytical procedures to assess the safety and reliability of safety-related digital systems (safety and reliability assessment at an early stage of the project; general framework of reliability analysis of complex systems; choice of an appropriate quantitative measure of software reliability); (v) Selected qualitative and quantitative information about the reliability of digital systems; the use of relations between the incidence of various types of faults); and (vi) Conclusions and recommendations. (P.A.)

  11. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Fragments of this proceedings record: (...) Warfare, Naval Sea Systems Command; "Acquisition Cycle Time: Defining the Problem", David Tate, Institute for Defense Analyses; "Schedule Analytics", Jennifer (...). The research comprised the following high-level steps: identify and review primary data sources (...). However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated; program start date and program end date

  12. Emergency procedures

    International Nuclear Information System (INIS)

    Abd Nasir Ibrahim; Azali Muhammad; Ab Razak Hamzah; Abd Aziz Mohamed; Mohammad Pauzi Ismail

    2004-01-01

    The following subjects are discussed - Emergency Procedures: emergency equipment, emergency procedures; emergency procedure involving X-Ray equipment; emergency procedure involving radioactive sources

  13. Time-variant reliability assessment through equivalent stochastic process transformation

    International Nuclear Information System (INIS)

    Wang, Zequn; Chen, Wei

    2016-01-01

    Time-variant reliability measures the probability that an engineering system successfully performs intended functions over a certain period of time under various sources of uncertainty. In practice, it is computationally prohibitive to propagate uncertainty in time-variant reliability assessment based on expensive or complex numerical models. This paper presents an equivalent stochastic process transformation approach for cost-effective prediction of reliability deterioration over the life cycle of an engineering system. To reduce the high dimensionality, a time-independent reliability model is developed by translating random processes and time parameters into random parameters in order to equivalently cover all potential failures that may occur during the time interval of interest. With the time-independent reliability model, an instantaneous failure surface is attained by using a Kriging-based surrogate model to identify all potential failure events. To enhance the efficacy of failure surface identification, a maximum confidence enhancement method is utilized to update the Kriging model sequentially. Then, the time-variant reliability is approximated using Monte Carlo simulations of the Kriging model where system failures over a time interval are predicted by the instantaneous failure surface. The results of two case studies demonstrate that the proposed approach is able to accurately predict the time evolution of system reliability while requiring much less computational efforts compared with the existing analytical approach. - Highlights: • Developed a new approach for time-variant reliability analysis. • Proposed a novel stochastic process transformation procedure to reduce the dimensionality. • Employed Kriging models with confidence-based adaptive sampling scheme to enhance computational efficiency. • The approach is effective for handling random process in time-variant reliability analysis. • Two case studies are used to demonstrate the efficacy
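
    To make the notion of time-variant reliability concrete, the hedged sketch below estimates the probability that a toy limit state (degrading resistance minus a fluctuating load) drops below zero anywhere in a 20-year interval, using crude Monte Carlo over a discretized time grid. The limit state, its statistics and the grid are illustrative assumptions, and the paper's Kriging surrogate and adaptive sampling are deliberately not reproduced in this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 20_000
t = np.linspace(0.0, 20.0, 201)            # years

R0 = rng.normal(12.0, 1.0, n_samples)      # initial resistance (assumed)
S0 = rng.normal(5.0, 0.8, n_samples)       # mean load (assumed)
b = rng.normal(1.0, 0.2, n_samples)        # load fluctuation amplitude (assumed)
a, w = 0.02, 2.0 * np.pi                   # degradation rate, load frequency (assumed)

# Limit state g(t) for every sample and every time step; failure if g <= 0 anywhere.
g = R0[:, None] * np.exp(-a * t) - (S0[:, None] + b[:, None] * np.sin(w * t))
fails_in_interval = g.min(axis=1) <= 0.0   # first-passage failure over [0, 20] years
print(f"time-variant probability of failure: {fails_in_interval.mean():.4f}")
```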

  14. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  15. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas.

  16. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae Seong

    1993-02-15

    This book comprises nineteen chapters, which cover an introduction to analytical chemistry, experimental error and statistics, chemical equilibrium and solubility, gravimetric analysis with the mechanism of precipitation, range and calculation of results, the general principles of volume analysis, sedimentation methods and titration curves, acid-base balance, acid-base titration curves, complexes and firing reactions, an introduction to electroanalytical chemistry, electrodes and potentiometry, electrolysis and conductometry, voltammetry and polarographic spectrophotometry, atomic spectrometry, solvent extraction, chromatography, and experiments.

  17. Analytical chemistry

    International Nuclear Information System (INIS)

    Choi, Jae Seong

    1993-02-01

    This book comprises nineteen chapters, which cover an introduction to analytical chemistry, experimental error and statistics, chemical equilibrium and solubility, gravimetric analysis with the mechanism of precipitation, range and calculation of results, the general principles of volume analysis, sedimentation methods and titration curves, acid-base balance, acid-base titration curves, complexes and firing reactions, an introduction to electroanalytical chemistry, electrodes and potentiometry, electrolysis and conductometry, voltammetry and polarographic spectrophotometry, atomic spectrometry, solvent extraction, chromatography, and experiments.

  18. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Division for Analytical Chemistry continued its efforts to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs.

  19. Reliability engineering

    International Nuclear Information System (INIS)

    Nieuwhof, G.W.E.

    1976-01-01

    Failure of systems is undesirable, but also inevitable. The consequences of failure can be reduced if the failure mode can be anticipated and repair procedures planned in advance. The fault tree analysis is one method of identifying the most probable failure modes and determining system failure rates. From these rates, repair frequency can be estimated. (author)
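
    Since the abstract points to fault tree analysis as the way failure modes are combined into a system failure estimate, the hedged sketch below shows the basic gate arithmetic for independent events on a made-up two-pump example; the system structure and probabilities are illustrative assumptions, not taken from the article.

```python
def and_gate(*p):
    """All inputs must fail (independent events)."""
    out = 1.0
    for x in p:
        out *= x
    return out

def or_gate(*p):
    """At least one input fails: 1 - product of survival probabilities."""
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Basic-event failure probabilities over one mission (assumed values)
P_PUMP_A, P_PUMP_B, P_VALVE, P_POWER = 1e-2, 1e-2, 5e-3, 1e-4

# Top event: "no coolant flow" = (both pumps fail) OR valve stuck OR power lost
top = or_gate(and_gate(P_PUMP_A, P_PUMP_B), P_VALVE, P_POWER)
print(f"top-event probability per mission: {top:.3e}")
```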

  20. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, a basic rule for solving simultaneous Boolean equations is presented. Next, the analysis procedures for a three-component system with external supports are shown. Third, more detailed discussions are given on the establishment of the logical loop relation. Finally, two typical structures which include more than one logical loop are taken up; their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.
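
    A hedged sketch of the underlying idea: when two subsystems support each other, their success equations reference one another, and one exact way to resolve the loop is to take the least fixed point of the simultaneous Boolean equations for every combination of basic events and sum the corresponding probabilities. The two-subsystem example and its probabilities below are illustrative assumptions; this is not the paper's GO-FLOW formulation.

```python
from itertools import product

P = {"a": 0.95, "b": 0.90, "c": 0.99, "d": 0.99}   # basic-event success probabilities (assumed)

def solve_loop(a, b, c, d):
    """Least fixed point of the loop equations A = a or (c and B); B = b or (d and A)."""
    A = B = False
    while True:
        A_new = a or (c and B)
        B_new = b or (d and A)
        if (A_new, B_new) == (A, B):
            return A, B
        A, B = A_new, B_new

p_system = 0.0
for bits in product([True, False], repeat=4):
    a, b, c, d = bits
    prob = 1.0
    for name, state in zip("abcd", bits):
        prob *= P[name] if state else 1.0 - P[name]
    A, B = solve_loop(a, b, c, d)
    if A and B:                      # system success requires both outputs
        p_system += prob

print(f"exact system success probability: {p_system:.6f}")
```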

  1. The DYLAM approach to systems safety and reliability assessment

    International Nuclear Information System (INIS)

    Amendola, A.

    1988-01-01

    A survey of the principal features and applications of DYLAM (Dynamic Logical Analytical Methodology) is presented; its basic principles can be summarized as follows: after a particular modelling of the component states, computerized heuristic procedures generate stochastic configurations of the system, while the resulting physical processes are simultaneously simulated, both to account for the possible interactions between the physics and the states and to search for dangerous system configurations and their related probabilities. The association of probabilistic techniques for describing the states with physical equations for describing the process results in a very powerful tool for safety and reliability assessment of systems potentially subjected to dangerous incidental transients. A comprehensive picture of DYLAM's capability for manifold applications can be obtained from a review of the study cases analyzed (LMFBR core accident, systems reliability assessment, accident simulation, man-machine interaction analysis, chemical reactor safety, etc.)

  2. Toxicologic evaluation of analytes from Tank 241-C-103

    International Nuclear Information System (INIS)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives would be to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propane nitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found

  3. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate (overview, reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement), followed by reliability in engineering design, reliability testing under failure-rate assumptions, plotting of reliability data, prediction of system reliability, conservation of systems, and failure, covering failure relays and the analysis of system safety.

  4. Analytical mechanics

    CERN Document Server

    Helrich, Carl S

    2017-01-01

    This advanced undergraduate textbook begins with the Lagrangian formulation of Analytical Mechanics and then passes directly to the Hamiltonian formulation and the canonical equations, with constraints incorporated through Lagrange multipliers. Hamilton's Principle and the canonical equations remain the basis of the remainder of the text. Topics considered for applications include small oscillations, motion in electric and magnetic fields, and rigid body dynamics. The Hamilton-Jacobi approach is developed with special attention to the canonical transformation in order to provide a smooth and logical transition into the study of complex and chaotic systems. Finally the text has a careful treatment of relativistic mechanics and the requirement of Lorentz invariance. The text is enriched with an outline of the history of mechanics, which particularly outlines the importance of the work of Euler, Lagrange, Hamilton and Jacobi. Numerous exercises with solutions support the exceptionally clear and concise treatment...

  5. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.

  6. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs
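
    As a hedged illustration of the Bayesian viewpoint described in these two records, the sketch below updates a Beta prior on a failure-on-demand probability with new evidence, the prior standing in for knowledge gained from similar components operated elsewhere. The prior parameters and demand counts are illustrative assumptions, not the thesis' model, which additionally treats dependence through a shared operational environment.

```python
from scipy import stats

# Prior informed by data on similar components in non-identical environments (assumed),
# down-weighted to reflect the environment mismatch: prior mean failure probability 0.01.
alpha0, beta0 = 1.0, 99.0

# New plant-specific evidence: 1 failure observed in 150 demands (assumed)
failures, demands = 1, 150
alpha, beta = alpha0 + failures, beta0 + (demands - failures)

posterior = stats.beta(alpha, beta)
print(f"posterior mean failure probability: {posterior.mean():.4f}")
print(f"90% credible interval: {posterior.ppf([0.05, 0.95]).round(5)}")
print(f"reliability on demand (posterior mean): {1 - posterior.mean():.4f}")
```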

  7. Analytical aids in land management planning

    Science.gov (United States)

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  8. [How Reliable is Neuronavigation?].

    Science.gov (United States)

    Stieglitz, Lennart Henning

    2016-02-17

    Neuronavigation plays a central role in modern neurosurgery. It allows instruments and three-dimensional image data to be visualized intraoperatively and supports spatial orientation, thereby helping to reduce surgical risks and to speed up complex surgical procedures. The growing availability and importance of neuronavigation make it clear how relevant it is to know about its reliability and accuracy. Different factors may influence the accuracy during surgery unnoticed, misleading the surgeon. Besides the best possible optimization of the systems themselves, a good knowledge of their weaknesses is mandatory for every neurosurgeon.

  9. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values....... Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate...... and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes....

  10. Corrections for criterion reliability in validity generalization: The consistency of Hermes, the utility of Midas

    Directory of Open Access Journals (Sweden)

    Jesús F. Salgado

    2016-04-01

    There is criticism in the literature about the use of interrater coefficients to correct for criterion reliability in validity generalization (VG) studies, disputing whether .52 is an accurate and non-dubious estimate of the interrater reliability of overall job performance (OJP) ratings. We present a second-order meta-analysis of three independent meta-analytic studies of the interrater reliability of job performance ratings and make a number of comments and reflections on LeBreton et al.'s paper. The results of our meta-analysis indicate that the interrater reliability for a single rater is .52 (k = 66, N = 18,582, SD = .105). Our main conclusions are: (a) the value of .52 is an accurate estimate of the interrater reliability of overall job performance for a single rater; (b) it is not reasonable to conclude that past VG studies that used .52 as the criterion reliability value have a less than secure statistical foundation; (c) based on interrater reliability, test-retest reliability, and coefficient alpha, supervisor ratings are a useful and appropriate measure of job performance and can be confidently used as a criterion; (d) validity correction for criterion unreliability has been unanimously recommended by "classical" psychometricians and I/O psychologists as the proper way to estimate predictor validity, and is still recommended at present; (e) the substantive contribution of VG procedures to inform HRM practices in organizations should not be lost in these technical points of debate.
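
    The validity correction for criterion unreliability referred to in point (d) is the classical correction for attenuation; a minimal sketch, with a hypothetical observed validity and the meta-analytic single-rater reliability of .52:

```python
import math

def correct_for_criterion_unreliability(observed_validity: float,
                                        criterion_reliability: float) -> float:
    """Classical correction for attenuation: rho = r_xy / sqrt(r_yy)."""
    return observed_validity / math.sqrt(criterion_reliability)

# Hypothetical observed validity of a predictor against supervisor ratings of OJP.
rho = correct_for_criterion_unreliability(observed_validity=0.30,
                                          criterion_reliability=0.52)
print(f"operational validity estimate: {rho:.2f}")   # ~0.42
```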

  11. Reliability Analysis of Money Habitudes

    Science.gov (United States)

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
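
    The Cronbach's alpha procedure named above can be computed directly from item scores; a minimal sketch with hypothetical data (not the Money Habitudes items):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = respondents, columns = the items of one domain."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# Hypothetical responses of six young adults to a five-item domain (1-5 scale).
scores = np.array([[4, 5, 4, 4, 5],
                   [2, 2, 3, 2, 2],
                   [5, 4, 5, 5, 4],
                   [3, 3, 3, 2, 3],
                   [4, 4, 5, 4, 4],
                   [1, 2, 1, 2, 2]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```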

  12. Procedures For Microbial-Ecology Laboratory

    Science.gov (United States)

    Huff, Timothy L.

    1993-01-01

    Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in microbiological laboratory to ensure safety, analytical control, and validity of results.

  13. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers, and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  14. Multielement trace determination in SiC powders: assessment of interlaboratory comparisons aimed at the validation and standardization of analytical procedures with direct solid sampling based on ETV ICP OES and DC arc OES.

    Science.gov (United States)

    Matschat, Ralf; Hassler, Jürgen; Traub, Heike; Dette, Angelika

    2005-12-01

    The members of the committee NMP 264 "Chemical analysis of non-oxidic raw and basic materials" of the German Standards Institute (DIN) have organized two interlaboratory comparisons for multielement determination of trace elements in silicon carbide (SiC) powders via direct solid sampling methods. One of the interlaboratory comparisons was based on the application of inductively coupled plasma optical emission spectrometry with electrothermal vaporization (ETV ICP OES), and the other on the application of optical emission spectrometry with direct current arc (DC arc OES). The interlaboratory comparisons were organized and performed in the framework of the development of two standards related to "the determination of mass fractions of metallic impurities in powders and grain sizes of ceramic raw and basic materials" by both methods. SiC powders were used as typical examples of this category of material. The aim of the interlaboratory comparisons was to determine the repeatability and reproducibility of both analytical methods to be standardized. This was an important contribution to the practical applicability of both draft standards. Eight laboratories participated in the interlaboratory comparison with ETV ICP OES and nine in the interlaboratory comparison with DC arc OES. Ten analytes were investigated by ETV ICP OES and eleven by DC arc OES. Six different SiC powders were used for the calibration. The mass fractions of their relevant trace elements were determined after wet chemical digestion. All participants followed the analytical requirements described in the draft standards. In the calculation process, three of the calibration materials were used successively as analytical samples. This was managed in the following manner: the material that had just been used as the analytical sample was excluded from the calibration, so the five other materials were used to establish the calibration plot. The results from the interlaboratory comparisons were summarized and

  15. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL), Interpersonal Adjective Scales (revised; IAS-R), Inventory of Interpersonal Problems (IIP), Impact Messages Inventory (IMI), Circumplex Scales of Interpersonal Values (CSIV), Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), Team Role Circle (TRC), Competing Values Leadership Instrument (CV-LI), Love Styles, Organizational Culture Assessment Instrument (OCAI), Customer Orientation Circle (COC), and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG)—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments). The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%, and scale-specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large scale-specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.
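
    A minimal numerical sketch of the reliability logic described above, under the simplifying (and possibly inaccurate) reading that the axes reliability is the share of total item variance carried by the axes component; all component values are hypothetical:

```python
# Hypothetical variance components from a tau-equivalent CFA decomposition.
components = {"general": 0.20, "axes": 0.15, "scale": 0.10,
              "block": 0.05, "item": 0.50}

axes_reliability = components["axes"] / sum(components.values())
print(f"axes reliability = {axes_reliability:.2f}")   # 0.15 with these values
```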

  16. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    Science.gov (United States)

    Ventor, Gerharad; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
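
    A hedged sketch of why exploiting proof test results helps: components that fail the proof load never enter service, so the in-service failure probability is conditional on having passed. The distributions and load levels below are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Illustrative lognormal strength and normal service load (arbitrary units).
strength = rng.lognormal(mean=np.log(100.0), sigma=0.15, size=n)
load = rng.normal(loc=60.0, scale=10.0, size=n)
proof_load = 85.0

fails_in_service = load > strength
passed_proof = strength > proof_load

print(f"P(fail proof test)        = {1.0 - passed_proof.mean():.3f}")
print(f"P(failure), no proof test = {fails_in_service.mean():.2e}")
print(f"P(failure | passed proof) = {fails_in_service[passed_proof].mean():.2e}")
```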

  17. Play vs. Procedures

    DEFF Research Database (Denmark)

    Hammar, Emil

    Through the theories of play by Gadamer (2004) and Henricks (2006), I will show how the relationship between play and game can be understood as dialectic and disruptive, thus challenging understandings of how the procedures of games determine player activity and vice versa. As such, I posit some...... analytical consequences for understandings of digital games as procedurally fixed (Boghost, 2006; Flannagan, 2009; Bathwaite & Sharp, 2010). That is, if digital games are argued to be procedurally fixed and if play is an appropriative and dialectic activity, then it could be argued that the latter affects...... and alters the former, and vice versa. Consequently, if the appointed procedures of a game are no longer fixed and rigid in their conveyance of meaning, qua the appropriative and dissolving nature of play, then understandings of games as conveying a fixed meaning through their procedures are inadequate...

  18. Experiments with Cloze Procedure

    Science.gov (United States)

    Evans, Gordon; Haastrup, Kirsten

    1976-01-01

    The Nordic Test Development Group prepared proficiency tests of English designed to provide reliable information on which to base decisions as to whether a candidate would be able to function in a job as described or whether he could be trained to do so. Two subtests used a modified cloze procedure. (Author/CFM)

  19. Mars Analytical Microimager

    Science.gov (United States)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

    Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, and often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed

  20. Fundamentals and applications of systems reliability analysis

    International Nuclear Information System (INIS)

    Boesebeck, K.; Heuser, F.W.; Kotthoff, K.

    1976-01-01

    The lecture gives a survey on the application of methods of reliability analysis to assess the safety of nuclear power plants. Possible statements of reliability analysis in connection with specifications of the atomic licensing procedure are especially dealt with. Existing specifications of safety criteria are additionally discussed with the help of reliability analysis by the example of the reliability analysis of a reactor protection system. Beyond the limited application to single safety systems, the significance of reliability analysis for a closed risk concept is explained in the last part of the lecture. (orig./LH) [de

  1. Analytical method comparisons for the accurate determination of PCBs in sediments

    Energy Technology Data Exchange (ETDEWEB)

    Numata, M.; Yarita, T.; Aoyagi, Y.; Yamazaki, M.; Takatsu, A. [National Metrology Institute of Japan, Tsukuba (Japan)

    2004-09-15

    National Metrology Institute of Japan in National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) has been developing several matrix reference materials, for example, sediments, water and biological tissues, for the determination of heavy metals and organometallic compounds. The matrix compositions of those certified reference materials (CRMs) are similar to the compositions of actual samples, and they are useful for validating analytical procedures. "Primary methods of measurement" are essential to obtain accurate and SI-traceable certified values in the reference materials, because these methods have the highest quality of measurement. However, inappropriate analytical operations, such as incomplete extraction of analytes or cross-contamination during analytical procedures, will cause errors in analytical results, even if one of the primary methods, isotope dilution, is utilized. To avoid possible procedural bias in the certification of reference materials, we employ more than two analytical methods which have been optimized beforehand. Because the accurate determination of trace POPs in the environment is important to evaluate their risk, reliable CRMs are required by environmental chemists. Therefore, we have also been preparing matrix CRMs for the determination of POPs. To establish accurate analytical procedures for the certification of POPs, extraction is one of the critical steps, as described above. In general, conventional extraction techniques for the determination of POPs, such as Soxhlet extraction (SOX) and saponification (SAP), have been characterized well, and introduced as official methods for environmental analysis. On the other hand, emerging techniques, such as microwave-assisted extraction (MAE), pressurized fluid extraction (PFE) and supercritical fluid extraction (SFE), give higher recovery yields of analytes with relatively short extraction time and small amount of solvent, by reasons of the high

  2. Design-reliability assurance program application to ACP600

    International Nuclear Information System (INIS)

    Zhichao, Huang; Bo, Zhao

    2012-01-01

    ACP600 is a new nuclear power plant technology developed by CNNC in China, based on Generation III NPP design experience and general safety goals. The ACP600 Design Reliability Assurance Program (D-RAP) is implemented as an integral part of the ACP600 design process. A RAP is a formal management system which assures the collection of important characteristic information about plant performance throughout each phase of its life and directs the use of this information in the implementation of analytical and management processes which are specifically designed to meet two objectives: confirming the plant goals and identifying cost-effective improvements. In general, a typical reliability assurance program has 4 broad functional elements: 1) Goals and performance criteria; 2) Management system and implementing procedures; 3) Analytical tools and investigative methods; and 4) Information management. In this paper we use the D-RAP technical and risk-informed requirements, and establish the RAM and PSA models to optimize the ACP600 design. Compared with the previous design process, the D-RAP is better suited to the higher design targets and requirements, allowing more creativity through an easier implementation of technical breakthroughs. By using D-RAP, the plant goals, system goals, performance criteria and safety criteria can be realized more easily, and the design can be optimized and made more rational

  3. Comparison of Three Analytical Methods for Separation of Mineral and Chelated Fraction from an Adulterated Zn-EDTA Fertilizer

    International Nuclear Information System (INIS)

    Khan, M.S.; Qazi, M.A.; Khan, N.A.; Mian, S.M.; Ahmed, N.; Ahmed, N.

    2013-01-01

    Summary: Different analytical procedures are employed around the world to quantify the chelated portion in a Zn-EDTA fertilizer. The Agriculture Department, Government of the Punjab, follows Shahid's analytical method in this regard. This method is based on ion chromatography (IC), which separates the mineral zinc (Zn) from an adulterated Zn-EDTA fertilizer sample, i.e. a mixture of mineral and chelated Zn fractions. To determine its effectiveness and suitability, this comparative study was carried out by analyzing, in triplicate, adulterated and non-adulterated Zn-EDTA standards and Zn-EDTA samples taken from the market, following three methods: Shahid's (IC) analytical method; an Atomic Absorption Spectrophotometric (AAS) method based on precipitating the mineral Zn fraction at high pH with an alkali solution of suitable concentration and analyzing the filtrate containing only the chelated fraction; and the Association of Official Analytical Chemists (AOAC) method FM-841. Adulterated Zn-EDTA samples were prepared by mixing a known quantity of mineral Zn with the chelated Zn-EDTA standard. The results showed that Shahid's analytical method and the AAS method both successfully estimated the chelated fraction. The AOAC FM-841 method could not discriminate the mineral fraction and hence did not furnish reliable results. Shahid's analytical method was selected, being equally effective in producing reliable results for both solid and liquid Zn-EDTA samples. The AAS method was comparable only for liquid samples. (author)

  4. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation for the exponential distribution, reliability of systems, availability, preventive maintenance (replacement policies, minimal repair policy, shock models, spares, group maintenance and periodic inspection), analysis of common cause failures, and analysis models of the repair effect.

  5. Reliability of structural systems subject to fatigue

    International Nuclear Information System (INIS)

    Rackwitz, R.

    1984-01-01

    Concepts and computational procedures for the reliability calculation of structural systems subject to fatigue are outlined. Systems are dealt with by approximately computing componential times to first failure. So-called first-order reliability methods are then used to formulate dependencies between componential failures and to evaluate the system failure probability. (Author) [pt

  6. Lifetime Reliability Assessment of Concrete Slab Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    A procedure for lifetime assessment of the reliability of short concrete slab bridges is presented in the paper. Corrosion of the reinforcement is the deterioration mechanism used for estimating the reliability profiles for such bridges. The importance of using sensitivity measures is stressed.... Finally the procedure is illustrated on 6 existing UK bridges....

  7. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest, M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
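
    A minimal sketch of the crossover idea described above (not the CROSSER code itself): for a k-out-of-n system of identical, independent components, find the component reliability at which system reliability equals component reliability.

```python
from math import comb

def kofn_reliability(p: float, k: int, n: int) -> float:
    """Reliability of a system that works if at least k of its n identical,
    independent components (each with reliability p) work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossover(k: int, n: int, tol: float = 1e-10) -> float:
    """Bisect for the p where system reliability equals component reliability."""
    lo, hi = 1e-9, 1.0 - 1e-9
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kofn_reliability(mid, k, n) < mid:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"2-out-of-3 crossover: p* = {crossover(2, 3):.4f}")   # 0.5000
```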

  8. Scale Reliability Evaluation with Heterogeneous Populations

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  9. A procedure for the rapid determination of Pu isotopes and Am-241 in soil and sediment samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004 the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In this report, a rapid procedure for the determination of Pu and Am radionuclides in soil and sediment samples is described that can be used in emergency situations. The method provides accurate and reliable results for the activity concentrations of elevated levels of 239,240 Pu, 238 Pu and 241 Am in soil and sediment samples over the course of 24 hours. The procedure has been validated in accordance with ISO guidelines

  10. Seamless Digital Environment – Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-08-01

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and the design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases for use in developing improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that will display the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in a collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  11. Reliability based code calibration of fatigue design criteria of nuclear Class-1 piping

    International Nuclear Information System (INIS)

    Mishra, J.; Balasubramaniyan, V.; Chellapandi, P.

    2016-01-01

    Fatigue design of Class-1 piping of NPPs is carried out using Section III of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code. The fatigue design criteria of ASME are based on the concept of a safety factor, which does not provide a means for managing uncertainties to achieve consistently reliable and economical designs. In this regard, work was taken up to estimate the implicit reliability level associated with the fatigue design criteria of Class-1 piping specified by ASME Section III, NB-3650. As the ASME fatigue curve is not available as an analytical expression, the reliability level of pipeline fittings and joints is evaluated using the mean fatigue curve developed by Argonne National Laboratory (ANL). The methodologies employed for reliability evaluation are FORM, HORSM and MCS. The limit state function for fatigue damage is found to be sensitive to eight parameters, which are systematically modelled as stochastic variables during reliability estimation. In conclusion, a number of important aspects related to the reliability of various piping products and joints are discussed. A computational example illustrates the developed procedure for a typical pipeline. (author)
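
    A hedged sketch of how a failure probability can be estimated by Monte Carlo simulation from a fatigue limit state of the general form g = (allowable damage) - (cumulative usage factor); the two lognormal variables below are illustrative stand-ins, not the eight-parameter model of the paper:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 2_000_000

# Illustrative scatter in allowable fatigue damage (Miner sum at failure)
# and in the computed cumulative usage factor.
allowable = rng.lognormal(mean=0.0, sigma=0.3, size=n)
usage = rng.lognormal(mean=np.log(0.2), sigma=0.4, size=n)

pf = np.mean(allowable - usage < 0.0)         # P(g < 0)
print(f"P_f  ~ {pf:.2e}")
print(f"beta ~ {-norm.ppf(pf):.2f}")           # equivalent reliability index
```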

  12. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Presented are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  13. Quantization Procedures

    International Nuclear Information System (INIS)

    Cabrera, J. A.; Martin, R.

    1976-01-01

    We present in this work a review of the conventional quantization procedure, the one proposed by I.E. Segal, and a new quantization procedure similar to the latter for use in nonlinear problems. We apply these quantization procedures to different potentials and we obtain the appropriate equations of motion. It is shown that for the linear case the three procedures presented are equivalent, but for the nonlinear cases we obtain different equations of motion and different energy spectra. (Author) 16 refs.

  14. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS – APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    Carvedilol is a nonselective beta blocker/alpha-1 blocker which is used for the treatment of essential hypertension, chronic stable angina, unstable angina and ischemic left ventricular dysfunction. The aim of this study was to describe a carvedilol population pharmacokinetic (PK) analysis as well as the validation of the analytical procedure, which is an important step in this approach. In contemporary clinical practice, population PK analysis is often more important than the standard PK approach in setting a mathematical model that describes the PK parameters. Also, it includes the variables that have particular importance in the drug's pharmacokinetics, such as sex, body mass, dosage, pharmaceutical form, pathophysiological state, disease associated with the organism or the presence of a specific polymorphism in the isoenzyme important for biotransformation of the drug. One of the most frequently used approaches in population PK analysis is nonlinear mixed effects modeling - NONMEM. The analytical methods used in the data collection period are of great importance for the implementation of a population PK analysis of carvedilol in order to obtain reliable data that can be useful in clinical practice. High performance liquid chromatography (HPLC) analysis of carvedilol is used to confirm the identity of the drug, provide quantitative results and also monitor the efficacy of the therapy. Analytical procedures used in other studies could not be fully implemented in our research, as it was necessary to perform certain modifications and validation of the method with the aim of using the obtained results for the purpose of a population pharmacokinetic analysis. The validation process is a logical terminal phase of analytical procedure development that ensures the applicability of the procedure itself. The goal of validation is to ensure consistency of the method and accuracy of results, or to confirm the selection of the analytical method for a given sample

  15. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.

  16. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    Science.gov (United States)

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process and is a major component of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or un-interpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placing, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of unsuitable samples for quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests as these are often considered as "diagnostic". Controlling pre-analytical variables is critical since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  17. System analysis procedures for conducting PSA of nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho.

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide guidelines for analyzing target systems consistently and technically in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques for fault tree analysis (FTA) used in system analysis. Normally the main objective of system analysis is to assess the reliability of systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, FTA is the method used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to the mechanical failures of the components, Common Cause Failures (CCFs), human errors and outages for testing and maintenance. After the construction of the fault tree is completed, system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analysis is performed through the process stated above. As mentioned above, the procedures for system analysis are based on PSA procedures and methods which have been applied to the safety assessments of NPPs under construction in the country. Accordingly, the method of FTA stated in this procedures guide will be applicable to PSA for the NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs

  18. System analysis procedures for conducting PSA of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide guidelines for analyzing target systems consistently and technically in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques for fault tree analysis (FTA) used in system analysis. Normally the main objective of system analysis is to assess the reliability of systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, FTA is the method used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to the mechanical failures of the components, Common Cause Failures (CCFs), human errors and outages for testing and maintenance. After the construction of the fault tree is completed, system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analysis is performed through the process stated above. As mentioned above, the procedures for system analysis are based on PSA procedures and methods which have been applied to the safety assessments of NPPs under construction in the country. Accordingly, the method of FTA stated in this procedures guide will be applicable to PSA for the NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs.
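
    A minimal sketch of the AND/OR gate arithmetic described in this guide, for independent basic events (illustrative probabilities, not a plant model; common cause failures and human errors would enter as additional basic events):

```python
from functools import reduce

def and_gate(*p: float) -> float:
    """All independent input events occur."""
    return reduce(lambda acc, x: acc * x, p, 1.0)

def or_gate(*p: float) -> float:
    """At least one independent input event occurs."""
    return 1.0 - reduce(lambda acc, x: acc * (1.0 - x), p, 1.0)

# Hypothetical basic events for two redundant trains plus an operator error.
pump_fails, valve_stuck, operator_error = 3e-3, 1e-3, 5e-3
train_a = or_gate(pump_fails, valve_stuck)
train_b = or_gate(pump_fails, valve_stuck)      # identical, independent train
top_event = or_gate(and_gate(train_a, train_b), operator_error)
print(f"system unavailability = {top_event:.2e}")
```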

  19. Structural Optimization with Reliability Constraints

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1986-01-01

    During the last 25 years considerable progress has been made in the fields of structural optimization and structural reliability theory. In classical deterministic structural optimization all variables are assumed to be deterministic. Due to the unpredictability of loads and strengths of actual.... In this paper we consider only structures which can be modelled as systems of elasto-plastic elements, e.g. frame and truss structures. In section 2 a method to evaluate the reliability of such structural systems is presented. Based on a probabilistic point of view a modern structural optimization problem...... is formulated in section 3. The formulation is a natural extension of the commonly used formulations in deterministic structural optimization. The mathematical form of the optimization problem is briefly discussed. In section 4 two new optimization procedures especially designed for the reliability...

  20. Component reliability for electronic systems

    CERN Document Server

    Bajenescu, Titu-Marius I

    2010-01-01

    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  1. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  2. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  3. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover 1) analytical chemistry and the environment 2) environmental radiochemistry 3) automated instrumentation 4) advances in analytical mass spectrometry 5) fourier transform spectroscopy 6) analytical chemistry of plutonium 7) nuclear analytical chemistry 8) chemometrics and 9) nuclear fuel technology

  4. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  5. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  6. Support to NPP operation and maintenance technology risk management. A concept for establishing criteria and procedure for the selection of components with respect to their importance. Stage 3.1. NPP equipment reliability management

    International Nuclear Information System (INIS)

    Stvan, F.

    2003-12-01

    A proposal was developed for a procedure using the deterministic approach to the assessment of components from the operational point of view and other aspects that cannot be directly and readily quantified and of the probabilistic approach for the assessment of component importance with respect to nuclear safety. A specific PSA study performed for the Dukovany NPP was employed. The structure of the report is as follows: (1) Aspects of component selection; (2) Introductory procedure; (3) Criteria for the selection of components with respect to their importance (4) Assessing the priority of use of the assets - effect on production, safety, and profit; (5) Assessment of the risk aspect of the assets - effect on major processes; (6) Assessment of the level of use of the assets; (7) Assessment of the structure of the assets - optimal structure for maintenance in relation to the major processes; (8) Assessment of the criteria for estimating the importance of the components; (9) Probabilistic assessment of importance from the safety aspect by means of PSA; and (10) Deterministic assessment of importance from the safety aspect. (P.A.)

  7. Multielement analytical procedure coupling INAA, ICP-MS and ICP-AES: Application to the determination of major and trace elements in sediment samples of the Bouregreg river (Morocco)

    International Nuclear Information System (INIS)

    Bounouira, H.; CEA - CNRS/UMR, Centre de Saclay, 91 - Gif sur Yvette; Choukri, A.; Hakam, O.K.; Cherkaoui, R.; Gaudry, A.; Delmas, R.; Mariet, C.; Chakiri, S.

    2008-01-01

    Instrumental neutron activation analysis (INAA), inductively coupled plasma-mass spectrometry (ICP-MS) and inductively coupled plasma-atomic emission spectrometry (ICP-AES) were used for the determination of major and trace elements in sediment samples of the Bouregreg river (Morocco). The reliability of the results was checked by using the IAEA Soil-7 certified reference material. Results obtained by the three techniques were compared in order to check digestion efficiencies. Generally good agreement was found between INAA and both ICP-MS and ICP-AES after alkaline fusion (ICPf). The ICP-MS technique used after acid attack (ICPa) was satisfactory for a few elements. A principal component analysis (PCA) was used for analyzing the variability of concentrations and for defining the most influential sites with respect to the general variation trends. Three groups of elements could be distinguished. For these groups, a normalization of concentrations to a central element concentration (i.e. Mn, Si or Al) is proposed. (author)
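
    A minimal sketch of the PCA step described above, applied to a hypothetical matrix of element concentrations normalized to a reference element (Al is used here purely for illustration; the data are not the Bouregreg results):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical concentrations: rows = sampling sites, columns = elements (wt%).
elements = ["Fe", "Mn", "Cr", "Zn", "Al"]
conc = np.array([[3.1, 0.06, 0.010, 0.012, 7.9],
                 [2.8, 0.05, 0.009, 0.010, 7.2],
                 [4.0, 0.09, 0.015, 0.030, 8.1],
                 [3.5, 0.07, 0.012, 0.020, 7.6],
                 [2.9, 0.05, 0.008, 0.011, 7.0]])

# Normalize each element to the Al column, standardize, then project on two PCs.
ratios = conc[:, :-1] / conc[:, [-1]]
pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(ratios))
print("explained variance ratio:", pca.explained_variance_ratio_)
print(scores)
```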

  8. Reliability demonstration test planning using bayesian analysis

    International Nuclear Information System (INIS)

    Chandran, Senthil Kumar; Arul, John A.

    2003-01-01

    In nuclear power plants, the reliability of all the safety systems is critical from the safety viewpoint, and it is essential that the reliability requirements be met while satisfying the design constraints. From practical experience, it is found that the reliability of complex systems such as the Safety Rod Drive Mechanism is of the order of 10^-4 with an uncertainty factor of 10. Demonstrating the reliability of such systems is prohibitive in terms of cost and time, as the number of tests needed is very large. The purpose of this paper is to develop a Bayesian reliability demonstration testing procedure for exponentially distributed failure times with a gamma prior distribution on the failure rate, which can be easily and effectively used to demonstrate component/subsystem/system reliability conformance to stated requirements. The important questions addressed in this paper are: with zero failures, how long should one perform the tests, and how many components are required, to conclude with a given degree of confidence that the component under test meets the reliability requirement. The procedure is explained with an example, and it can also be extended to demonstrations that allow a larger number of failures. The approach presented is applicable for deriving test plans for demonstrating component failure rates of nuclear power plants, as the failure data for similar components are becoming available in existing plants elsewhere. The advantages of this procedure are that the criterion upon which the procedure is based is simple and pertinent, that the fitting of the prior distribution is an integral part of the procedure and is based on the use of information regarding two percentiles of this distribution, and finally, that the procedure is straightforward and easy to apply in practice. (author)
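
    A hedged sketch of the zero-failure question posed above: with exponential lifetimes and a Gamma(a, b) prior on the failure rate, observing zero failures in a cumulative test time T gives a Gamma(a, b + T) posterior, and T can be chosen so that the posterior probability of meeting the required failure rate reaches the desired confidence. The prior parameters and requirement below are hypothetical:

```python
from scipy import stats
from scipy.optimize import brentq

def required_test_time(lambda_req: float, confidence: float,
                       a: float, b: float) -> float:
    """Smallest cumulative test time T (zero failures observed) such that
    P(lambda <= lambda_req | data) >= confidence, with a Gamma(a, b) prior."""
    def shortfall(T: float) -> float:
        posterior = stats.gamma(a, scale=1.0 / (b + T))   # posterior rate = b + T
        return posterior.cdf(lambda_req) - confidence
    return brentq(shortfall, 0.0, 1e10)

# Hypothetical prior (a = 1, b = 100 h) and a requirement of lambda <= 1e-4 /h
# to be demonstrated at 90% posterior probability; T may be split over n units.
T = required_test_time(lambda_req=1e-4, confidence=0.90, a=1.0, b=100.0)
print(f"required cumulative test time ~ {T:,.0f} component-hours")
```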

  9. On the NPP structural reliability

    International Nuclear Information System (INIS)

    Klemin, A.I.; Polyakov, E.F.

    1980-01-01

    Reviewed are the main statements, peculiarities and possibilities of the first branch guiding technical material (GTM), "The methods of calculation of structural reliability of NPPs and their systems at the design stage". It is stated that the GTM presents recommendations on the calculation of the reliability of such specific systems as the reactor control and protection system, the instrumentation and automation system, and the safety systems. The GTM is based on the analytical methods of modern reliability theory, using the methodology of minimal cut sets of complex systems. It is stressed that calculations by the proposed methods permit a wide set of reliability parameters to be calculated, reflecting, separately or together, the dependability and maintainability properties of an NPP. For NPPs operating on a variable load schedule, parameters are additionally considered that characterize reliability with account taken of the proposed regime of power change, i.e. taking into account failures caused by the delivered power falling below the required power or the required power rising above the delivered power

  10. Application of CWC analytical procedures for safeguards; Analysis of phosphorus-containing organic chemical signatures from environmental samples; Final report on task FIN A844 on the Finnish support programme to IAEA safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Rautio, M; Bjoerk, H; Haekkinen, V; Kostiainen, O; Kuitunen, M L; Lehtonen, P; Mesilaakso, M; Soederstroem, M [Finnish Inst. for Verification of the Chemical Weapons Convention, Helsinki (Finland)

    1995-03-01

    Solvent extraction can be used for the recovery of U and Pu from irradiated fuel. The most potential organic chemical signatures are extractants and solvents used in reprocessing plants. The PUREX process is widely used in reprocessing. It uses tri-n-butyl phosphate (TBP) as extractant in an organic solvent for U and Pu from irradiated fuel and U from its ores. TBP is a strong extractant for tetra and hexavalent actinides from nitric acid media. Stable complexes are formed between actinide nitrate and TBP which are soluble in the organic phase. Sample containing TBP and some radiolysis products can indicate that TBP is used for reprocessing nuclear fuel. The TBP will decompose in the PUREX process to mono-and dibutyl phosphates (MBP and DBP). TBP, DBP and MBP have been analysed from air, water, soil, and sediment samples according to slightly modified procedures presented in Recommended Operating Procedures for Sampling and Analysis in the Verification of Chemical Disarmament. The limits of detection for the phosphates have been determined for air, water and soil samples. (orig.) (12 refs., 8 figs., 4 tabs.).

  11. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways. For example, an error may be made when subtracting two numbers that are very close to each other, or in the process of summing many very different numbers, etc. The basic objective of this paper is to find a procedure which eliminates errors made by the computer when calculations close to an error limit are executed. The highly reliable system is represented by a directed acyclic graph composed of terminal nodes, i.e. highly reliable input elements, internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is based on the merits of MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires the summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for exact summation of such numbers, is designed in the paper. The summation procedure exploits a special number system with base 2^32. The computational efficiency of the new computing methodology is compared with advanced simulation software. Various calculations on systems from the references are performed to emphasize the merits of the methodology.
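
    A hedged illustration of the accuracy problem addressed above (not the author's base-2^32 algorithm): naive floating-point summation loses many tiny terms next to one large term, whereas an exactly rounded or exact-rational sum keeps them:

```python
import math
from fractions import Fraction

# One large term plus many tiny ones, as in the combinatorial unavailability sum.
terms = [1.0] + [1e-18] * 100_000

naive = sum(terms)                                # tiny terms are lost one by one
compensated = math.fsum(terms)                    # exactly rounded float summation
exact = float(sum(Fraction(t) for t in terms))    # exact rational arithmetic

print(f"naive sum       : {naive!r}")
print(f"math.fsum       : {compensated!r}")
print(f"exact (Fraction): {exact!r}")
```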

  12. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data...

  13. Investigating Reliabilities of Intraindividual Variability Indicators

    Science.gov (United States)

    Wang, Lijuan; Grimm, Kevin J.

    2012-01-01

    Reliabilities of the two most widely used intraindividual variability indicators, "ISD²" and "ISD", are derived analytically. Both are functions of the sizes of the first and second moments of true intraindividual variability, the size of the measurement error variance, and the number of assessments within a burst. For comparison,…
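
    For reference, a minimal sketch of the two indicators themselves, computed from hypothetical burst data (ISD² is the within-person variance across the assessments of one burst, and ISD is its square root):

```python
import numpy as np

# Hypothetical burst: rows = persons, columns = repeated assessments.
scores = np.array([[3, 4, 2, 5, 4],
                   [2, 2, 3, 2, 2],
                   [5, 1, 4, 2, 5]])

isd2 = scores.var(axis=1, ddof=1)   # ISD^2: within-person variance
isd = np.sqrt(isd2)                 # ISD  : within-person standard deviation
print("ISD^2:", np.round(isd2, 2))
print("ISD  :", np.round(isd, 2))
```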

  14. Reliability of power system with open access

    International Nuclear Information System (INIS)

    Ehsani, A.; Ranjbar, A. M.; Fotuhi Firuzabad, M.; Ehsani, M.

    2003-01-01

    Recently, in many countries, the electric utility industry has been undergoing considerable changes in its structure and regulation. It can be clearly seen that the thrust towards privatization and deregulation or re-regulation of the electric utility industry will introduce numerous reliability problems that will require new criteria and analytical tools that recognize the residual uncertainties in the new environment. In this paper, different risks and uncertainties in competitive electricity markets are briefly introduced; the approach of customers, operators, planners, generation bodies and network providers to the reliability of the deregulated system is studied; the impact of dispersed generation on system reliability is evaluated; and finally, the reliability cost/reliability worth issues in the new competitive environment are considered

  15. Environmental Measurements Laboratory (EML) procedures manual

    International Nuclear Information System (INIS)

    Chieco, N.A.; Bogen, D.C.; Knutson, E.O.

    1990-11-01

    Volume 1 of this manual documents the procedures and existing technology that are currently used by the Environmental Measurements Laboratory. A section devoted to quality assurance has been included. These procedures have been updated and revised and new procedures have been added. They include: sampling; radiation measurements; analytical chemistry; radionuclide data; special facilities; and specifications. 228 refs., 62 figs., 37 tabs. (FL)

  16. Structural Reliability Analysis of Wind Turbines: A Review

    Directory of Open Access Journals (Sweden)

    Zhiyu Jiang

    2017-12-01

    The paper presents a detailed review of the state-of-the-art research activities on structural reliability analysis of wind turbines between the 1990s and 2017. We describe the reliability methods, including the first- and second-order reliability methods and the simulation reliability methods, and show the procedure for, and application areas of, structural reliability analysis of wind turbines. Further, we critically review the various structural reliability studies on rotor blades, bottom-fixed support structures, floating systems and mechanical and electrical components. Finally, future applications of structural reliability methods to wind turbine designs are discussed.
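
    A minimal first-order sketch of the kind of calculation these methods perform, for a linear limit state g = R - S with independent normal resistance and load effect (illustrative values, not turbine data; for this linear normal case the first-order result is exact):

```python
from scipy.stats import norm

# Illustrative normal resistance R and load effect S.
mu_R, sigma_R = 600.0, 60.0
mu_S, sigma_S = 400.0, 50.0

beta = (mu_R - mu_S) / (sigma_R**2 + sigma_S**2) ** 0.5   # reliability index
pf = norm.cdf(-beta)                                       # failure probability
print(f"beta = {beta:.2f}, P_f = {pf:.2e}")
```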

  17. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high... A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis is presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study... on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical IGBT components. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed...

  18. Stress analysis of R2 pressure vessel. Structural reliability benchmark exercise

    International Nuclear Information System (INIS)

    Vestergaard, N.

    1987-05-01

    The Structural Reliability Benchmark Exercise (SRBE) is sponsored by the EEC as part of the Reactor Safety Programme. The objectives of the SRBE are to evaluate and improve 1) inspection procedures, which use non-destructive methods to locate defects in pressure (reactor) vessels, as well as 2) analytical damage accumulation models, which predict the time to failure of vessels containing defects. In order to focus attention, an experimental pressure vessel has been inspected, subjected to fatigue loadings and subsequently analysed by several teams using methods of their choice. The present report contains the first part of the analytical damage accumulation analysis. The stress distributions in the welds of the experimental pressure vessel were determined. These stress distributions will be used to determine the driving forces of the damage accumulation models, which will be addressed in a future report. (author)

  19. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, and how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the following chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  20. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  1. Sampling and analytical procedures for the determination of VOCs released into air from natural and anthropogenic sources: A comparison between SPME (Solid Phase Micro Extraction) and ST (Solid Trap) methods

    International Nuclear Information System (INIS)

    Tassi, F.; Capecchiacci, F.; Buccianti, A.; Vaselli, O.

    2012-01-01

    In the present study, two sampling and analytical methods for VOC determination in fumarolic exhalations related to hydrothermal-magmatic reservoirs in volcanic and geothermal areas and biogas released from waste landfills were compared: (a) Solid Traps (STs), consisting of three-phase (Carboxen B, Carboxen C and Carbosieve S111) absorbent stainless steel tubes and (b) Solid Phase Micro Extraction (SPME) fibers, composed of DiVinylBenzene (DVB), Carboxen and PolyDimethylSiloxane. These techniques were applied to pre-concentrate VOCs discharged from: (i) low-to-high temperature fumaroles collected at Vulcano Island, Phlegrean Fields (Italy), and Nisyros Island (Greece), (ii) recovery wells in a solid waste disposal site located near Florence (Italy). A glass condensing system cooled with water was used to collect the dry fraction of the fumarolic gases, in order to allow more efficient VOC absorption avoiding any interference by water vapor and acidic gases, such as SO₂, H₂S, HF and HCl, typically present at relatively high concentrations in these fluids. Up to 37 organic species, in the range of 40–400 m/z, were determined by coupling gas chromatography to mass spectrometry (GC–MS). This study shows that the VOC compositions of fumaroles and biogas determined via SPME and ST are largely consistent and can be applied to the analysis of VOCs in gases released from different natural and anthropogenic environments. The SPME method is rapid and simple and more appropriate for volcanic and geothermal emissions, where VOCs are present at relatively high concentrations and prolonged gas sampling may be hazardous for the operator. The ST method, allowing the collection of large quantities of sample, is to be preferred to analyze the VOC composition of fluids from diffuse emissions and air, where these compounds are present at relatively low concentrations.

  2. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de
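
    To make the idea concrete, the following hedged Python sketch computes system reliability from minimal path sets by inclusion-exclusion, which is the general principle behind path-based methods; it is not the paper's algorithm, and the block diagram and component reliabilities are invented.

```python
# System reliability from minimal path sets via inclusion-exclusion.
from itertools import combinations

rel = {"A": 0.95, "B": 0.90, "C": 0.85, "D": 0.99}   # component reliabilities (assumed)
min_paths = [{"A", "B"}, {"C", "D"}, {"A", "D"}]      # minimal path sets (assumed)

def all_work(components):
    """Probability that every component in the set works (independence assumed)."""
    p = 1.0
    for c in components:
        p *= rel[c]
    return p

def system_reliability(paths):
    """P(at least one path works), exact by inclusion-exclusion over the paths."""
    total = 0.0
    for k in range(1, len(paths) + 1):
        for subset in combinations(paths, k):
            union = set().union(*subset)   # intersection of path events = all components in the union work
            total += (-1) ** (k + 1) * all_work(union)
    return total

print(f"System reliability: {system_reliability(min_paths):.4f}")
```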

  3. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with a high reliability. In order to optimize the overall reliability of an accelerator one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its use in the community.

  4. Case study on the use of PSA methods: Human reliability analysis

    International Nuclear Information System (INIS)

    1991-04-01

    The overall objective of treating human reliability in a probabilistic safety analysis is to ensure that the key human interactions of typical crews are accurately and systematically incorporated into the study in a traceable manner. An additional objective is to make the human reliability analysis (HRA) as realistic as possible, taking into account the emergency procedures, the man-machine interface, the focus of the training process, and the knowledge and experience of the crews. Section 3 of the paper provides an overview of this analytical process, which leads to three more detailed example problems described in Section 4. Section 5 discusses a peer review process. References are presented that are useful in performing HRAs. In addition, appendices are provided for definitions, selected data and a generic list of performance shaping factors. 35 refs, figs and tabs

  5. Human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-08-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (T and M) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses these contributions on a comparative basis. The aim of this analysis was to compare the procedures, modelling techniques and quantification methods used, to obtain insight into the causes and magnitude of the variability observed in the results, to try to identify preferred human reliability assessment approaches, and to get an understanding of the current state of the art in the field, identifying the limitations that are still inherent in the different approaches

  6. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  7. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics ... process. How well these two components are orchestrated will determine the level of success an organization has in...

  8. A procedure for the determination of Po-210 in water samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004 the Environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In the case of ²¹⁰Po, this started with the collection and review of about 130 papers from the scientific literature. Based on this review, two candidate methods for the chemical separation of ²¹⁰Po from water samples were selected for testing, refinement and validation in accordance with ISO guidelines. A comprehensive methodology for the calculation of results, including quantification of measurement uncertainty, was also developed. This report presents the final procedure which was developed based on that work

  9. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  10. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
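
    To illustrate the second approach mentioned above (deriving system reliability from component reliability through a fault tree), here is a hedged Python sketch of a two-gate tree. The tree structure and failure probabilities are invented and do not correspond to the report's fictitious device.

```python
# Top-event probability of a small fault tree with independent basic events.
def or_gate(*p):
    """At least one input event occurs: 1 - prod(1 - p_i)."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """All input events occur: prod(p_i)."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

p_controller = 2e-3    # controller failure probability over the mission (assumed)
p_leg = 5e-2           # failure probability of one redundant switching leg (assumed)

# The device fails if the controller fails OR both redundant legs fail.
p_top = or_gate(p_controller, and_gate(p_leg, p_leg))
print(f"system failure probability: {p_top:.3e}")   # about 4.5e-3
```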

  11. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  12. Description of some variables in a behavior analytic therapy supervision procedure

    Directory of Open Access Journals (Sweden)

    Sandra Bernadete da Silva Moreira

    2003-01-01

    In this work, a descriptive study of the free, ongoing verbal interaction between a therapy supervisor and a beginning therapist was carried out, aiming to identify variables involved in the supervision procedure adopted. The participants' verbal behavior was divided into functional classes of responses, named "verbalization categories", from which all vocal responses could be classified. The results showed a regularity in the supervisor's verbal behavior, while the therapist's and the client's behaviors changed over the course of the supervision meetings and therapy sessions. The analysis of a free verbal interaction in a dyad allowed inferences to be made about some of the controlling variables in this sort of interaction.

  13. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  14. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    ...exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, Bayesian analysis. ... An introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. ... includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future...

  15. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated..., and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown...
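
    As an illustration of a reliability profile of the kind mentioned above, the hedged Python sketch below recomputes the reliability index of a simple R - S limit state as corrosion gradually reduces the mean resistance. The numbers and the linear degradation law are invented, not taken from the paper.

```python
# Reliability index beta(t) for a corroding member; normal R and S assumed.
import math

mu_R0, sig_R = 2000.0, 200.0     # initial resistance and its std. dev. (assumed units)
mu_S,  sig_S = 1200.0, 150.0     # load effect and its std. dev. (assumed)
loss_per_year = 0.003            # 0.3 % loss of mean resistance per year (assumed)

for year in (0, 25, 50, 75, 100):
    mu_R = mu_R0 * (1.0 - loss_per_year * year)
    beta = (mu_R - mu_S) / math.hypot(sig_R, sig_S)
    print(f"year {year:3d}: beta = {beta:.2f}")
```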

  16. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  17. Operator reliability assessment system (OPERAS)

    International Nuclear Information System (INIS)

    Singh, A.; Spurgin, A.J.; Martin, T.; Welsch, J.; Hallam, J.W.

    1991-01-01

    OPERAS is personal-computer (PC) based software for collecting and processing simulator data on control-room operator responses during requalification training scenarios. The data collection scheme is based upon an approach developed earlier during the EPRI Operator Reliability Experiments project. The software allows automated data collection from the simulator, thus minimizing the simulator staff time and resources needed to collect, maintain and process data that can be useful in monitoring, assessing and enhancing crew reliability and effectiveness. The system is designed to provide the data and output information in the form of user-friendly charts, tables and figures for use by plant staff. The OPERAS prototype software has been implemented at the Diablo Canyon (PWR) and Millstone (BWR) plants and is currently being used to collect operator response data. Data collected from the simulator include plant-state variables such as reactor pressure and temperature, malfunctions, times at which annunciators are activated, operator actions, and observations of crew behavior by training staff. The data and systematic analytical results provided by the OPERAS system can help utility probabilistic risk analysis (PRA) and training staff monitor and assess the reliability of their crews more objectively

  18. Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients

    Science.gov (United States)

    Andersson, Björn; Xin, Tao

    2018-01-01

    In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…

  19. Human factors reliability Benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-06-01

    The Joint Research Centre of the European Commission has organized a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organized around two study cases: (1) analysis of routine functional Test and Maintenance (T and M) procedures: with the aim of assessing the probability of test induced failures, the probability of failures to remain unrevealed and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient: with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report contains the final summary reports produced by the participants in the exercise

  20. Analytical application of thiosulfatobismuthates(3)

    International Nuclear Information System (INIS)

    Kobylecka, J.; Cyganski, A.

    1980-01-01

    The analytical application of caesium-sodium thiosulfatobismuthate(3) is presented. Gravimetric, complexometric and thermal methods of caesium determination have been developed. The gravimetric method is based on precipitation of caesium as Cs₂Na[Bi(S₂O₃)₃], filtration and desiccation of the precipitate at about 100 °C. In the complexometric procedure the precipitate was dissolved in nitric acid and the bismuth was titrated with EDTA solution. In the thermal method the precipitate was heated up to 320 °C while the released sulphur dioxide was absorbed in sodium tetrachloromercurate and the acid formed was determined alkalimetrically. The described rapid methods have satisfactory precision and accuracy. (author)

  1. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  2. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at the Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance Initiative in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to three major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program includes: tools to prevent human error; managerial observation and coaching; human factor analysis; quick information about events involving the human factor; human reliability timelines and performance indicators; and basic, periodic and extraordinary training in human factor reliability. (authors)

  3. Of plants and reliability

    International Nuclear Information System (INIS)

    Schneider Horst

    2009-01-01

    Behind the political statements made about the transformer event at the Kruemmel nuclear power station (KKK) in the summer of 2009 there are fundamental issues of atomic law. Pursuant to Articles 20 and 28 of its Basic Law, Germany is a state in which the rule of law applies. Consequently, the aspects of atomic law associated with the incident merit a closer look, all the more so as the items concerned have been known for many years. Important aspects in the debate about the Kruemmel nuclear power plant are the fact that the transformer is considered part of the nuclear power station under atomic law and thus a ''plant'' subject to surveillance by the nuclear regulatory agencies, on the one hand, and the reliability under atomic law of the operator and the executive personnel responsible, on the other hand. Both ''plant'' and ''reliability'' are terms focusing on nuclear safety. Hence the question to what extent safety was affected in the Kruemmel incident. The classification of the event as 0 = no or only a very slight safety impact on the INES scale (INES = International Nuclear Event Scale) should not be used to put aside the safety issue once and for all. Points of fact and their technical significance must be considered prior to any legal assessment. Legal assessments and regulations are associated with facts and circumstances. Any legal examination is based on the facts as determined and elucidated. Any other procedure would be tantamount to an inadmissible legal advance conviction. Now, what is the position of political statements, i.e. political assessments and political responsibility? If everything is done the correct way, they come at the end, after exploration of the facts and evaluation under applicable law. Sometimes things are handled differently, with consequences which are not very helpful. In the light of the provisions about the rule of law as laid down in the Basic Law, the new federal government should be made to observe the proper sequence of

  4. Solution standards for quality control of nuclear-material analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.

    1981-01-01

    Analytical chemistry measurement control depends upon reliable solution standards. At the Savannah River Plant Control Laboratory over a thousand analytical measurements are made daily for process control, product specification, accountability, and nuclear safety. Large quantities of solution standards are required for a measurement quality control program covering the many different analytical chemistry methods. Uranium, plutonium, neptunium, and americium metals or oxides produced at the Savannah River Plant are dissolved to prepare stock solutions for working or Quality Control Standards (QCS). Because extensive analytical effort is required to characterize or confirm these solutions, they are prepared in large quantities. These stock solutions are diluted and blended with different chemicals and/or each other to synthesize QCS that match the matrices of different process streams. The target uncertainty of a standard's reference value is 10% of the limit of error of the methods used for routine measurements. Standard Reference Materials from NBS are used according to special procedures to calibrate the methods used in measuring the uranium and plutonium standards so that traceability can be established. Special precautions are required to minimize the effects of temperature, radiolysis, and evaporation. Standard reference values are periodically corrected to eliminate systematic errors caused by evaporation or decay products. Measurement control is achieved by requiring analysts to analyze a blind QCS each shift a measurement system is used on plant samples. Computer evaluation determines whether or not a measurement is within the ±3 sigma control limits. Monthly evaluations of the QCS measurements are made to determine current bias correction factors for accountability measurements and to detect significant changes in the bias and precision statistics. The evaluations are also used to plan activities for improving the reliability of the analytical chemistry measurements
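
    A hedged Python sketch of the kind of control check described above follows: each blind QCS result is tested against ±3 sigma limits around the standard's reference value, and the same data yield a monthly bias correction factor. The reference value, sigma and results are invented.

```python
# 3-sigma control check for blind quality control standard (QCS) measurements.
ref_value = 10.00     # reference value of the QCS (assumed units)
sigma     = 0.05      # historical standard deviation of the method (assumed)

def in_control(measured, ref=ref_value, s=sigma, k=3.0):
    return abs(measured - ref) <= k * s

shift_results = [10.03, 9.97, 10.02, 9.84, 10.01]
for x in shift_results:
    flag = "OK" if in_control(x) else "OUT OF CONTROL"
    print(f"{x:6.2f}  {flag}")

# Monthly bias correction factor: reference value over the mean of the results.
bias_factor = ref_value / (sum(shift_results) / len(shift_results))
print(f"bias correction factor: {bias_factor:.4f}")
```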

  5. Reliability analysis of prestressed concrete containment structures

    International Nuclear Information System (INIS)

    Jiang, J.; Zhao, Y.; Sun, J.

    1993-01-01

    The reliability analysis of prestressed concrete containment structures subjected to combinations of static and dynamic loads, with consideration of uncertainties in structural and load parameters, is presented. Limit state probabilities for given parameters are calculated using the procedure developed at BNL, while those with consideration of parameter uncertainties are calculated by a fast integration method for time-variant structural reliability. The limit state surface of the prestressed concrete containment is constructed directly, incorporating the prestress. The sensitivities of the Cholesky decomposition matrix and the natural vibration characteristics are calculated by simplified procedures. (author)

  6. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and a basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft; therefore, it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the functional division of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields a system reliability matrix, and the reliability of the network system can be deduced by integrating all the reliability indexes in this matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures. The analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the SpaceWire network system design phase.

  7. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision

  8. Assessing the Impact of Imperfect Diagnosis on Service Reliability

    DEFF Research Database (Denmark)

    Grønbæk, Lars Jesper; Schwefel, Hans-Peter; Kjærgaard, Jens Kristian

    2010-01-01

    ..., representative diagnosis performance metrics have been defined and their closed-form solutions obtained for the Markov model. These equations enable model parameterization from traces of implemented diagnosis components. The diagnosis model has been integrated in a reliability model assessing the impact... of the diagnosis functions for the studied reliability problem. In a simulation study we finally analyze trade-off properties of diagnosis heuristics from the literature, map them to the analytic Markov model, and investigate its suitability for service reliability optimization...
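
    To illustrate how imperfect diagnosis can enter a reliability model, the hedged Python sketch below solves a small three-state Markov availability model in closed form: a fraction (1 - c) of failures go undetected and are only found later by the diagnosis function. The state structure and rates are illustrative, not the paper's model.

```python
# Steady-state availability with imperfect failure detection (coverage c).
# States: up --(c*lmbda)--> failed-detected --(mu)--> up
#         up --((1-c)*lmbda)--> failed-undetected --(delta)--> failed-detected
def availability(lmbda, mu, delta, c):
    """Closed-form solution of the balance equations of the chain above."""
    return 1.0 / (1.0 + lmbda / mu + (1.0 - c) * lmbda / delta)

lmbda = 1e-3   # failure rate, per hour (assumed)
mu    = 0.1    # repair rate, per hour (assumed)
delta = 0.01   # detection rate of a latent failure, per hour (assumed)

for c in (1.0, 0.95, 0.8):
    print(f"coverage {c:.2f}: availability = {availability(lmbda, mu, delta, c):.5f}")
```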

  9. Analytical measurements of fission products during a severe nuclear accident

    Science.gov (United States)

    Doizi, D.; Reymond la Ruinaz, S.; Haykal, I.; Manceron, L.; Perrin, A.; Boudon, V.; Vander Auwera, J.; tchana, F. Kwabia; Faye, M.

    2018-01-01

    The Fukushima accident emphasized the fact that ways to monitor in real time the evolution of a nuclear reactor during a severe accident remain to be developed. No fission products were monitored during twelve days; only dose rates were measured, which is not sufficient to carry out an online diagnosis of the event. The first measurements were announced with little reliability for low-volatility fission products. In order to improve the safety of nuclear plants and minimize the industrial, ecological and health consequences of a severe accident, it is necessary to develop new reliable measurement systems, operating as early and as close to the emission source of fission products as possible. Through the French program ANR « Projet d'Investissement d'Avenir », the aim of the DECA-PF project (diagnosis of core degradation from fission products measurements) is to monitor in real time the release of the major fission products (krypton, xenon, gaseous forms of iodine and ruthenium) outside the nuclear reactor containment. These products are released at different times during a nuclear accident and at different states of the nuclear core degradation. Thus, monitoring these fission products gives information on the situation inside the containment and helps to apply the Severe Accident Management procedures. Analytical techniques have been proposed and evaluated. The results are discussed here.

  10. Analytical measurements of fission products during a severe nuclear accident

    Directory of Open Access Journals (Sweden)

    Doizi D.

    2018-01-01

    The Fukushima accident emphasized the fact that ways to monitor in real time the evolution of a nuclear reactor during a severe accident remain to be developed. No fission products were monitored during twelve days; only dose rates were measured, which is not sufficient to carry out an online diagnosis of the event. The first measurements were announced with little reliability for low-volatility fission products. In order to improve the safety of nuclear plants and minimize the industrial, ecological and health consequences of a severe accident, it is necessary to develop new reliable measurement systems, operating as early and as close to the emission source of fission products as possible. Through the French program ANR « Projet d'Investissement d'Avenir », the aim of the DECA-PF project (diagnosis of core degradation from fission products measurements) is to monitor in real time the release of the major fission products (krypton, xenon, gaseous forms of iodine and ruthenium) outside the nuclear reactor containment. These products are released at different times during a nuclear accident and at different states of the nuclear core degradation. Thus, monitoring these fission products gives information on the situation inside the containment and helps to apply the Severe Accident Management procedures. Analytical techniques have been proposed and evaluated. The results are discussed here.

  11. A reliability program approach to operational safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques is being formulated for potential application in the nuclear power industry. Methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed and a review of current nuclear risk-dominant issues conducted. The need for a reliability approach to address dependent system failures, operating and emergency procedures and human performance, and develop a plant-specific performance data base for safety decision making is demonstrated. Current research has concentrated on developing a Reliability Program approach for the operating phase of a nuclear plant's lifecycle. The approach incorporates performance monitoring and evaluation activities with dedicated tasks that integrate these activities with operation, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the Reliability Program. (orig./HP)

  12. Analytical toxicology of emerging drugs of abuse--an update.

    Science.gov (United States)

    Meyer, Markus R; Peters, Frank T

    2012-12-01

    The steady increase of new drugs of abuse on the illicit drug market is a great challenge for analytical toxicologists. Because most of these new drugs or drug classes are not included in established analytical methods targeting classic drugs of abuse, analytical procedures must be adapted or new procedures must be developed to cover such new compounds. This review summarizes procedures for the analysis of these drugs of abuse published from January 2009 to January 2012, covering the following classes of emerging drugs of abuse: β-keto-amphetamines, pyrrolidinophenones, tryptamines, and synthetic cannabinoids.

  13. RADCHEM - Radiochemical procedures for the determination of Sr, U, Pu, Am and Cm

    Energy Technology Data Exchange (ETDEWEB)

    Sidhu, R. [Inst. for Energy Technology (Norway)

    2006-04-15

    An accurate determination of radionuclides from various sources in the environment is essential for assessment of the potential hazards and suitable countermeasures both in case of accidents, authorised release and routine surveillance. Reliable radiochemical separation and detection techniques are needed for accurate determination of alpha and beta emitters. Rapid analytical methods are needed in case of an accident for early decision-making. The objective of this project has been to compare and evaluate radiochemical procedures used at Nordic laboratories for the determination of strontium, uranium, plutonium, americium and curium. To gather detailed information on the procedures in use, a questionnaire regarding various aspects of radionuclide determination was developed and distributed to all (sixteen) relevant laboratories in the Nordic countries. The responses and the procedures used by each laboratory were then discussed among those who answered the questionnaire. This report summarizes the findings and gives recommendations on suitable practice. (au)

  14. RADCHEM - Radiochemical procedures for the determination of Sr, U, Pu, Am and Cm

    International Nuclear Information System (INIS)

    Sidhu, R.

    2006-04-01

    An accurate determination of radionuclides from various sources in the environment is essential for assessment of the potential hazards and suitable countermeasures both in case of accidents, authorised release and routine surveillance. Reliable radiochemical separation and detection techniques are needed for accurate determination of alpha and beta emitters. Rapid analytical methods are needed in case of an accident for early decision-making. The objective of this project has been to compare and evaluate radiochemical procedures used at Nordic laboratories for the determination of strontium, uranium, plutonium, americium and curium. To gather detailed information on the procedures in use, a questionnaire regarding various aspects of radionuclide determination was developed and distributed to all (sixteen) relevant laboratories in the Nordic countries. The responses and the procedures used by each laboratory were then discussed among those who answered the questionnaire. This report summarizes the findings and gives recommendations on suitable practice. (au)

  15. A Procedure for the Sequential Determination of Radionuclides in Environmental Samples. Liquid Scintillation Counting and Alpha Spectrometry for 90Sr, 241Am and Pu Radioisotopes

    International Nuclear Information System (INIS)

    2014-01-01

    Since 2004, IAEA activities related to the terrestrial environment have aimed at the development of a set of procedures to determine radionuclides in environmental samples. Reliable, comparable and ‘fit for purpose’ results are an essential requirement for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of analytical data. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available for reference to both the analyst and the customer. This publication describes a combined procedure for the sequential determination of ⁹⁰Sr, ²⁴¹Am and Pu radioisotopes in environmental samples. The method is based on the chemical separation of strontium, americium and plutonium using ion exchange chromatography, extraction chromatography and precipitation followed by alpha spectrometric and liquid scintillation counting detection. The method was tested and validated in terms of repeatability and trueness in accordance with International Organization for Standardization (ISO) guidelines using reference materials and proficiency test samples. Reproducibility tests were performed later at the IAEA Terrestrial Environment Laboratory. The calculations of the massic activity, uncertainty budget, decision threshold and detection limit are also described in this publication. The procedure is introduced for the determination of ⁹⁰Sr, ²⁴¹Am and Pu radioisotopes in environmental samples such as soil, sediment, air filter and vegetation samples. It is expected to be of general use to a wide range of laboratories, including the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network, for routine environmental monitoring purposes
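
    A hedged, much simplified Python sketch of the style of calculation referred to above (massic activity with a first-order uncertainty budget) is given below; the actual procedure follows the ISO guidance in far more detail, and every number here is invented.

```python
# Massic activity from gross/background counts with a simple uncertainty budget.
import math

N_g, t_g = 850, 60000.0     # gross counts and counting time, s (assumed)
N_b, t_b = 120, 60000.0     # background counts and counting time, s (assumed)
eps, u_eps = 0.30, 0.01     # detection efficiency and its standard uncertainty (assumed)
eta, u_eta = 0.85, 0.03     # chemical recovery (from tracer) and its uncertainty (assumed)
m = 0.100                   # sample mass, kg (its uncertainty neglected here)

net_rate = N_g / t_g - N_b / t_b            # net count rate, 1/s
a = net_rate / (eps * eta * m)              # massic activity, Bq/kg

# Combined standard uncertainty by quadrature: Poisson counting + efficiency + recovery.
u_rate = math.sqrt(N_g / t_g**2 + N_b / t_b**2)
u_rel = math.sqrt((u_rate / net_rate)**2 + (u_eps / eps)**2 + (u_eta / eta)**2)
print(f"a = {a:.3f} Bq/kg,  u(a) = {a * u_rel:.3f} Bq/kg")
```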

  16. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models as well as large basis models of nuclear structure to be performed

  17. Using Analytic Hierarchy Process in Textbook Evaluation

    Science.gov (United States)

    Kato, Shigeo

    2014-01-01

    This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…

  18. Appendix 1: Analytical Techniques (Online supplementary material ...

    Indian Academy of Sciences (India)

    HP

    Further details of analytical techniques are given in http://www.actlabs.com. Zircon U–Pb dating and trace element analysis. The zircons were separated using standard procedures including crushing (in iron mortar and pestle), sieving (375 to 75 micron), tabling, heavy liquid separation (bromoform and methylene iodide) ...

  19. Analytical Electron Microscope

    Data.gov (United States)

    Federal Laboratory Consortium — The Titan 80-300 is a transmission electron microscope (TEM) equipped with spectroscopic detectors to allow chemical, elemental, and other analytical measurements to...

  20. Course on Advanced Analytical Chemistry and Chromatography

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Fristrup, Peter; Nielsen, Kristian Fog

    2011-01-01

    Methods of analytical chemistry constitute an integral part of decision making in chemical research, and students must master a high degree of knowledge in order to perform reliable analyses. At the DTU departments of chemistry it was thus decided to develop a course that was attractive to master... students from different directions of study, to Ph.D. students, and to professionals who need an update of their current skills and knowledge. A course of 10 ECTS points was devised with the purpose of introducing students to analytical chemistry and chromatography, with the aim of including theory...

  1. Integration of human reliability analysis into the probabilistic risk assessment process: phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1985-01-01

    The US Nuclear Regulatory Commission and Pacific Northwest Laboratory initiated a research program in 1984 to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  2. Integration of human reliability analysis into the probabilistic risk assessment process: Phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1984-10-01

    A research program was initiated to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  3. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  4. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    Science.gov (United States)

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we rank analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios (metrological, economic and environmental) by applying different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable, although the details of the ranking results differ between the three scenarios. A second run of rankings was done for scenarios that include only the metrological, only the economic or only the environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and that it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
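
    For readers unfamiliar with the method, the hedged Python sketch below runs a tiny PROMETHEE II ranking of three hypothetical analytical procedures on three criteria (lower values preferred), using the simple 'usual' preference function. The alternatives, criterion values and weights are invented and are not those of the study.

```python
# Minimal PROMETHEE II: weighted pairwise preferences -> net outranking flows.
alternatives = {
    "GC-MS": {"LOD": 5.0, "cost": 80.0, "waste": 30.0},
    "HPLC":  {"LOD": 8.0, "cost": 50.0, "waste": 60.0},
    "CE":    {"LOD": 9.0, "cost": 30.0, "waste": 10.0},
}
weights = {"LOD": 0.5, "cost": 0.25, "waste": 0.25}   # one illustrative scenario

def preference(a, b):
    """Weighted preference of a over b ('usual' function: 1 if strictly smaller)."""
    return sum(w for crit, w in weights.items()
               if alternatives[a][crit] < alternatives[b][crit])

names = list(alternatives)
n = len(names)
net_flow = {}
for a in names:
    phi_plus  = sum(preference(a, b) for b in names if b != a) / (n - 1)
    phi_minus = sum(preference(b, a) for b in names if b != a) / (n - 1)
    net_flow[a] = phi_plus - phi_minus

for a in sorted(net_flow, key=net_flow.get, reverse=True):
    print(f"{a:6s} net flow = {net_flow[a]:+.2f}")
```

    Re-running the ranking with different weight vectors mimics the scenario dependence discussed above.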

  5. Environmental procedures

    International Nuclear Information System (INIS)

    1992-01-01

    The European Bank has pledged in its Agreement to place environmental management at the forefront of its operations to promote sustainable economic development in central and eastern Europe. The Bank's environmental policy is set out in the document titled, Environmental Management: The Bank's Policy Approach. This document, Environmental Procedures, presents the procedures which the European Bank has adopted to implement this policy approach with respect to its operations. The environmental procedures aim to: ensure that throughout the project approval process, those in positions of responsibility for approving projects are aware of the environmental implications of the project, and can take these into account when making decisions; avoid potential liabilities that could undermine the success of a project for its sponsors and the Bank; ensure that environmental costs are estimated along with other costs and liabilities; and identify opportunities for environmental enhancement associated with projects. The review of environmental aspects of projects is conducted by many Bank staff members throughout the project's life. This document defines the responsibilities of the people and groups involved in implementing the environmental procedures. Annexes contain Environmental Management: The Bank's Policy Approach, examples of environmental documentation for the project file and other ancillary information

  6. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
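
    The hedged Python sketch below shows the flavour of such a trade-off study: whether mission reliability benefits more from a whole spare robot or from a spare copy of the least reliable module. The module reliabilities and the 2-out-of-n mission structure are invented for illustration.

```python
# Spare robot versus spare module for a simple multi-robot mission.
from math import comb

modules = {"mobility": 0.95, "arm": 0.90, "comms": 0.99}   # per-mission reliabilities (assumed)

def robot_reliability(arm_spares=0):
    """Series system of modules; spare arms make the arm 1-out-of-(1 + spares)."""
    r_arm = 1.0 - (1.0 - modules["arm"]) ** (1 + arm_spares)
    return modules["mobility"] * r_arm * modules["comms"]

def team_reliability(n_robots, k_needed, r_robot):
    """Mission succeeds if at least k of n identical, independent robots survive."""
    return sum(comb(n_robots, i) * r_robot**i * (1.0 - r_robot)**(n_robots - i)
               for i in range(k_needed, n_robots + 1))

r = robot_reliability()
print(f"2-of-2 team, no spares:          {team_reliability(2, 2, r):.4f}")
print(f"2-of-3 team (whole spare robot): {team_reliability(3, 2, r):.4f}")
print(f"2-of-2 team, spare arm each:     {team_reliability(2, 2, robot_reliability(1)):.4f}")
```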

  7. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach

  8. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques have been developed in response to the needs of the various engineering disciplines; nevertheless, many would argue that a great deal of work on reliability was done before the word itself was used in its current sense. The military, space and nuclear industries were the first to become involved in this topic, yet this quiet revolution in favour of higher product reliability has not been confined to those environments; it has spread to industry as a whole. Mass production, characteristic of modern industry, led four decades ago to a fall in the reliability of its products, on the one hand because of mass production itself and, on the other, because of newly introduced and not yet stabilized industrial techniques. Industry had to adapt to these two new requirements, creating products of medium complexity while assuring a level of reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. With this philosophy in mind, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, providing a unifying scientific basis for the entire subject. It consists of eight chapters plus numerous statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience, serving as a textbook for courses ranging from academic to industrial. (author)

  9. Subsea HIPPS design procedure

    International Nuclear Information System (INIS)

    Aaroe, R.; Lund, B.F.; Onshus, T.

    1995-01-01

    The paper is based on a feasibility study investigating the possibilities of using a HIPPS (High Integrity Pressure Protection System) to protect a subsea pipeline that is not rated for full wellhead shut-in pressure. The study was called the Subsea OPPS Feasibility Study, and was performed by SINTEF, Norway. Here, OPPS is an acronym for Overpressure Pipeline Protection System. A design procedure for a subsea HIPPS is described, based on the experience and knowledge gained through the ''Subsea OPPS Feasibility Study''. Before a subsea HIPPS can be applied, its technical feasibility, reliability and profitability must be demonstrated. The subsea HIPPS design procedure will help to organize and plan the design activities both with respect to development and verification of a subsea HIPPS. The paper also gives examples of how some of the discussed design steps were performed in the Subsea OPPS Feasibility Study. Finally, further work required to apply a subsea HIPPS is discussed

  10. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant

  11. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective in enhancing the reliability of various circuit units. The authors provide readers with techniques for state-of-the-art and future technologies, covering technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides a comprehensive review of various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes a thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  12. Sample preparation procedures utilized in microbial metabolomics: An overview.

    Science.gov (United States)

    Patejko, Małgorzata; Jacyna, Julia; Markuszewski, Michał J

    2017-02-01

    Bacteria are remarkably diverse in terms of their size, structure and biochemical properties. Due to this fact, it is hard to develop a universal method for handling bacterial cultures during metabolomic analysis. The choice of suitable processing methods constitutes a key element in any analysis, because only an appropriate selection of procedures can provide accurate results leading to reliable conclusions. Because of that, every analytical experiment concerning bacteria requires an individually and very carefully planned research methodology. Although every study varies in terms of sample preparation, there are a few general steps to follow when planning an experiment: sampling, separation of cells from the growth medium, stopping their metabolism, and extraction. As a result of extraction, all intracellular metabolites should be washed out from the cell environment. Moreover, the extraction method used must not cause any chemical decomposition or degradation of the metabolome. Furthermore, the chosen extraction method should be compatible with the analytical technique, so that it does not disturb or prolong subsequent sample preparation steps. For these reasons, we see a need to summarize the sample preparation procedures currently utilized in microbial metabolomic studies. In the presented overview, papers concerning the analysis of extra- and intracellular metabolites, published over the last decade, are discussed. The presented work gives some basic guidelines that might be useful when planning experiments in microbial metabolomics. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Reliability studies in a developing technology

    International Nuclear Information System (INIS)

    Mitchell, L.A.; Osgood, C.; Radcliffe, S.J.

    1975-01-01

    The standard methods of reliability analysis can only be applied if valid failure statistics are available. In a developing technology the statistics which have been accumulated, over many years of conventional experience, are often rendered useless by environmental effects. Thus new data, which take account of the new environment, are required. This paper discusses the problem of optimizing the acquisition of these data when time-scales and resources are limited. It is concluded that the most fruitful strategy in assessing the reliability of mechanisms is to study the failures of individual joints whilst developing, where necessary, analytical tools to facilitate the use of these data. The approach is illustrated by examples from the field of tribology. Failures of rolling element bearings in moist, high-pressure carbon dioxide illustrate the important effects of apparently minor changes in the environment. New analytical techniques are developed from a study of friction failures in sliding joints. (author)

  14. Structural reliability analysis and seismic risk assessment

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability in the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of stationary Gaussian process with a zero mean and a Kanai-Tajimi Spectrum. All possible seismic hazard at a site represented by a hazard curve is also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and ground earthquake acceleration are presented and a fragility curve for PRA studies is also constructed
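
    A minimal numerical sketch of the kind of fragility-curve construction described above is shown below. It assumes a lognormal fragility with median capacity and logarithmic standard deviation, as is common in PRA practice; the median, beta and hazard-curve shape are hypothetical values, not those of the paper.

    ```python
    import numpy as np
    from scipy.stats import norm

    def fragility(pga, a_median=0.6, beta=0.4):
        """Lognormal fragility: probability of reaching the limit state given
        peak ground acceleration `pga` (in g). a_median and beta are hypothetical."""
        return norm.cdf(np.log(pga / a_median) / beta)

    # Hypothetical hazard curve: annual frequency of exceeding a given PGA
    pga_grid = np.linspace(0.05, 2.0, 200)
    hazard = 1e-3 * (0.1 / pga_grid) ** 2.5        # assumed power-law shape

    # Limit state frequency = integral of fragility times the hazard density
    hazard_density = -np.gradient(hazard, pga_grid)
    limit_state_freq = np.trapz(fragility(pga_grid) * hazard_density, pga_grid)
    print(f"annual limit state frequency ~ {limit_state_freq:.2e}")
    ```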

  15. Analytical approximations for wide and narrow resonances

    International Nuclear Information System (INIS)

    Suster, Luis Carlos; Martinez, Aquilino Senra; Silva, Fernando Carvalho da

    2005-01-01

    This paper aims at developing analytical expressions for the adjoint neutron spectrum in the resonance energy region, taking into account both narrow and wide resonance approximations, in order to reduce the numerical computations involved. These analytical expressions, besides reducing computing time, are very simple from a mathematical point of view. The results obtained with this analytical formulation were compared to a reference solution obtained with a numerical method previously developed to solve the neutron balance adjoint equations. Narrow and wide resonances of U-238 were treated and the analytical procedure gave satisfactory results as compared with the reference solution, for the resonance energy range. The adjoint neutron spectrum is useful to determine the neutron resonance absorption, so that multigroup adjoint cross sections used by the adjoint diffusion equation can be obtained. (author)
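
    For context, the standard forward-flux forms of the two approximations are recalled below (textbook results for a resonant absorber mixed with a moderator of background cross section sigma_0 per absorber atom; the paper itself derives the analogous adjoint expressions, which are not reproduced here).

    ```latex
    % Narrow resonance (NR) approximation for the scalar flux:
    \[ \phi_{\mathrm{NR}}(E) \;\approx\; \frac{\sigma_p + \sigma_0}{\sigma_t(E) + \sigma_0}\,\frac{1}{E} \]
    % Wide resonance (WR, infinite absorber mass) approximation:
    \[ \phi_{\mathrm{WR}}(E) \;\approx\; \frac{\sigma_0}{\sigma_a(E) + \sigma_0}\,\frac{1}{E} \]
    ```

    Here sigma_p is the absorber potential scattering cross section, and sigma_t(E) and sigma_a(E) are its total and absorption cross sections.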

  16. Analytical approximations for wide and narrow resonances

    Energy Technology Data Exchange (ETDEWEB)

    Suster, Luis Carlos; Martinez, Aquilino Senra; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear]. E-mail: aquilino@lmp.ufrj.br

    2005-07-01

    This paper aims at developing analytical expressions for the adjoint neutron spectrum in the resonance energy region, taking into account both narrow and wide resonance approximations, in order to reduce the numerical computations involved. These analytical expressions, besides reducing computing time, are very simple from a mathematical point of view. The results obtained with this analytical formulation were compared to a reference solution obtained with a numerical method previously developed to solve the neutron balance adjoint equations. Narrow and wide resonances of U-238 were treated and the analytical procedure gave satisfactory results as compared with the reference solution, for the resonance energy range. The adjoint neutron spectrum is useful to determine the neutron resonance absorption, so that multigroup adjoint cross sections used by the adjoint diffusion equation can be obtained. (author)

  17. AK-SYS: An adaptation of the AK-MCS method for system reliability

    International Nuclear Information System (INIS)

    Fauriat, W.; Gayton, N.

    2014-01-01

    A lot of research work has been proposed over the last two decades to evaluate the probability of failure of a structure involving a very time-consuming mechanical model. Surrogate model approaches based on Kriging, such as the Efficient Global Reliability Analysis (EGRA) or the Active learning and Kriging-based Monte-Carlo Simulation (AK-MCS) methods, are very efficient and each has advantages of its own. EGRA is well suited to evaluating small probabilities, as the surrogate can be used to classify any population. AK-MCS is built in relation to a given population and requires no optimization program for the active learning procedure to be performed. It is therefore easier to implement and more likely to spend computational effort on areas with a significant probability content. When assessing system reliability, analytical approaches and first-order approximation are widely used in the literature. However, in the present paper we rather focus on sampling techniques and, considering the recent adaptation of the EGRA method for systems, a strategy is presented to adapt the AK-MCS method for system reliability. The AK-SYS method, “Active learning and Kriging-based SYStem reliability method”, is presented. Its high efficiency and accuracy are illustrated via various examples
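
    A bare-bones sketch of the sampling side of such methods is given below: plain Monte Carlo over a candidate population, with a hypothetical performance function standing in for the expensive model or its Kriging surrogate. This is not the AK-SYS algorithm itself, only the population-classification step it accelerates.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def g_system(x):
        """Hypothetical system performance function: a series system fails
        when the smallest component margin drops below zero."""
        g1 = 3.5 - x[:, 0]                   # component 1 margin (assumed)
        g2 = 3.0 + x[:, 0] - 2.0 * x[:, 1]   # component 2 margin (assumed)
        return np.minimum(g1, g2)            # series system: weakest component governs

    # Candidate population; in AK-MCS/AK-SYS a Kriging model is trained to
    # classify these points with only a few true model evaluations.
    x = rng.standard_normal((100_000, 2))
    pf = np.mean(g_system(x) <= 0.0)
    print(f"estimated failure probability ~ {pf:.4f}")
    ```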

  18. Designing Glass Panels for Economy and Reliability

    Science.gov (United States)

    Moore, D. M.

    1983-01-01

    Analytical method determines probability of failure of rectangular glass plates subjected to uniformly distributed loads such as those from wind, earthquake, snow, and deadweight. Developed as aid in design of protective glass covers for solar-cell arrays and solar collectors, method is also useful in estimating the reliability of large windows in buildings exposed to high winds and is adapted to nonlinear stress analysis of simply supported plates of any elastic material.
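
    The report's specific formulation is not reproduced here; a Weibull surface-flaw model of the following general form is the usual basis for such glass failure-probability calculations and is shown only as a hedged illustration.

    ```latex
    \[ P_f \;=\; 1 - \exp\!\left[-k \int_{A} \sigma^{m}(x,y)\,\mathrm{d}A\right] \]
    ```

    Here sigma(x,y) is the maximum-principal tensile stress over the plate surface A obtained from the (possibly nonlinear) plate analysis, and k and m are Weibull surface-flaw parameters fitted to strength data.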

  19. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.

  20. Signals: Applying Academic Analytics

    Science.gov (United States)

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  1. Analytic Moufang-transformations

    International Nuclear Information System (INIS)

    Paal, Eh.N.

    1988-01-01

    The paper is intended as an introduction to the concept of an analytic birepresentation (S,T) of an analytic Moufang loop. To describe the deviation of (S,T) from associativity, the associators of (S,T) are defined and certain constraints on them, called the minimality conditions of (S,T), are established

  2. Quine's "Strictly Vegetarian" Analyticity

    NARCIS (Netherlands)

    Decock, L.B.

    2017-01-01

    I analyze Quine’s later writings on analyticity from a linguistic point of view. In Word and Object Quine made room for a “strictly vegetarian” notion of analyticity. In later years, he developed this notion into two more precise notions, which I have coined “stimulus analyticity” and “behaviorist

  3. Learning analytics dashboard applications

    NARCIS (Netherlands)

    Verbert, K.; Duval, E.; Klerkx, J.; Govaerts, S.; Santos, J.L.

    2013-01-01

    This article introduces learning analytics dashboards that visualize learning traces for learners and teachers. We present a conceptual framework that helps to analyze learning analytics applications for these kinds of users. We then present our own work in this area and compare with 15 related

  4. Learning Analytics Considered Harmful

    Science.gov (United States)

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  5. Analytical mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  6. Analytical mass spectrometry. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  7. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557) are used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
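
    A small sketch of the kind of computation NEWTONP performs is given below, inverted here by simple bisection rather than Newton's method and written in Python rather than C; the k-out-of-n system and target value are hypothetical and do not reflect the program's actual interface.

    ```python
    from math import comb

    def system_reliability(p, n, k):
        """Probability that at least k of n independent components (each with
        success probability p) work: a cumulative binomial tail."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def required_component_reliability(target, n, k, tol=1e-10):
        """Find the component probability p that yields the target system reliability."""
        lo, hi = 0.0, 1.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if system_reliability(mid, n, k) < target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Example: 2-out-of-3 system that must reach 0.999 system reliability
    print(round(required_component_reliability(0.999, n=3, k=2), 5))
    ```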

  8. Neutron logging reliability techniques and apparatus

    International Nuclear Information System (INIS)

    Johnstone, C.W.

    1978-01-01

    This invention relates in general to neutron logging of earth formations, and in particular, to novel apparatus and procedures for determining the validity, or reliability, of data derived at least in part by logging neutron characteristics of earth formations and, if desired, for affording verifiably accurate indications of such data

  9. Neutron logging reliability techniques and apparatus

    International Nuclear Information System (INIS)

    Johnstone, C.W.

    1974-01-01

    This invention relates in general to neutron logging of earth formations, and in particular, to novel apparatus and procedures for determining the validity, or reliability, of data derived at least in part by logging neutron characteristics of earth formations and, if desired, for affording verifiably accurate indications of such data. (author)

  10. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with other laboratories through external quality control. In this way it has a tool to check whether the set objectives are being met and, in case of errors, to allow corrective actions to be taken and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they are meeting pre-determined specifications and, if not, apply the appropriate corrective actions
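
    A minimal sketch of the core calculations such a protocol rests on is shown below. The 1.65 coverage factor for total error is one common convention; the laboratory in the article may use different rules, and the control results and target value are hypothetical.

    ```python
    import statistics

    def qc_metrics(measured, target):
        """Bias, imprecision (CV) and total error for one control material."""
        mean = statistics.mean(measured)
        sd = statistics.stdev(measured)
        cv = 100.0 * sd / mean                       # random error, %
        bias = 100.0 * (mean - target) / target      # systematic error, %
        total_error = abs(bias) + 1.65 * cv          # common total-error convention
        return mean, sd, cv, bias, total_error

    # Hypothetical glucose control results (mg/dL) against an assigned target value
    results = [101.2, 99.8, 100.5, 102.1, 98.9, 100.7, 101.5, 99.4]
    mean, sd, cv, bias, te = qc_metrics(results, target=100.0)
    print(f"mean={mean:.1f} sd={sd:.2f} CV={cv:.2f}% bias={bias:.2f}% TE={te:.2f}%")
    ```

    Comparing the computed total error with the quality specification then drives the decision to accept the run or apply corrective actions.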

  11. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  12. Radiochemical procedures

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Modern counting instrumentation has largely obviated the need for separation processes in radiochemical analysis, but problems in low-level radioactivity measurement, environmental-type analyses, and special situations have in recent years caused a renaissance of the need for separation techniques. Most of the radiochemical procedures, based on the classic work of the Manhattan Project chemists of the 1940s, were published in the National Nuclear Energy Series (NNES). Improvements such as new solvent extraction and ion exchange separations have been added to these methods throughout the years. Recently the Los Alamos group has reissued their collected Radiochemical Procedures, containing a short summary and review of basic inorganic chemistry - 'Chemistry of the Elements on the Basis of Electronic Configuration'. (A.L.)

  13. Seamless Digital Environment - Plan for Data Analytics Use Case Study

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Bly, Aaron Douglas

    2016-01-01

    The U.S Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed in order to provide a roadmap or model for Nuclear Power Plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project's published report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses things to consider when building an architecture to support the increasing needs and demands of data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick and reliable manner. A common method is to create a ''one stop shop'' application that a user can go to get all the data they need. The creation of this leads to the need to create a Seamless Digital Environment (SDE) to integrate all the ''siloed'' data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics, employing information from computer-based procedure enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors out of the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to

  14. Training benefits of research on operator reliability

    International Nuclear Information System (INIS)

    Worledge, D.H.

    1989-01-01

    The purpose of the EPRI Operator Reliability Experiments (ORE) Program is to collect data for use in reliability and safety studies of nuclear power plant operation which more realistically take credit for operator performance in preventing core damage. The three objectives in fulfilling this purpose are: to obtain quantitative/qualitative performance data on operating crew responses in the control room for potential accident sequences by using plant simulators; to test the human cognitive reliability (HCR) correlation; and to develop a data collection analysis procedure. This paper discusses the background to this program, data collection and analysis, and the results of quantitative/qualitative insights stemming from initial work. Special attention is paid to how this program impacts upon simulator use and assessment of simulator fidelity. Attention is also paid to the use of data collection procedures to assist training departments in assessing the quality of their training programs

  15. Improving the safety and reliability of Monju

    International Nuclear Information System (INIS)

    Itou, Kazumoto; Maeda, Hiroshi; Moriyama, Masatoshi

    1998-01-01

    A comprehensive safety review has been performed at Monju to determine why the secondary sodium leakage accident occurred, and we investigated how to improve the situation based on the results of the review. The safety review focused on five aspects of whether the facilities for dealing with a sodium leakage accident were adequate: the reliability of the detection method, the reliability of the method for preventing the spread of a sodium leak, whether the documented operating procedures are adequate, whether the quality assurance system, program, and actions were properly performed, and so on. As a result, we established for Monju a better method of dealing with sodium leakage accidents, rapid detection of sodium leakage, improved sodium drain facilities, and ways to reduce damage to Monju systems after an accident. We also improved the operating procedures and quality assurance actions to increase the safety and reliability of Monju. (author)

  16. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  17. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  18. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  19. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  20. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  1. Reliability of thermal interface materials: A review

    International Nuclear Information System (INIS)

    Due, Jens; Robinson, Anthony J.

    2013-01-01

    Thermal interface materials (TIMs) are used extensively to improve thermal conduction across two mating parts. They are particularly crucial in electronics thermal management since excessive junction-to-ambient thermal resistances can cause elevated temperatures which can negatively influence device performance and reliability. Of particular interest to electronic package designers is the thermal resistance of the TIM layer at the end of its design life. Estimations of this allow the package to be designed to perform adequately over its entire useful life. To this end, TIM reliability studies have been performed using accelerated stress tests. This paper reviews the body of work which has been performed on TIM reliability. It focuses on the various test methodologies with commentary on the results which have been obtained for the different TIM materials. Based on the information available in the open literature, a test procedure is proposed for TIM selection based on beginning and end of life performance. - Highlights: ► This paper reviews the body of work which has been performed on TIM reliability. ► Test methodologies for reliability testing are outlined. ► Reliability results for the different TIM materials are discussed. ► A test procedure is proposed for TIM selection based on beginning-of-life and end-of-life performance.

  2. 8. All Polish Conference on Analytical Chemistry: Analytical Chemistry for the Community of the 21. Century

    International Nuclear Information System (INIS)

    Koscielniak, P.; Wieczorek, M.; Kozak, J.

    2010-01-01

    Book of Abstracts contains short descriptions of lectures, communications and posters presented during 8th All Polish Conference on Analytical Chemistry (Cracow, 4-9.07.2010). Scientific programme consisted of: basic analytical problems, preparation of the samples, chemometry and metrology, miniaturization of the analytical procedures, environmental analysis, medicinal analyses, industrial analyses, food analyses, biochemical analyses, analysis of relicts of the past. Several posters were devoted to the radiochemical separations, radiochemical analysis, environmental behaviour of the elements important for the nuclear science and the professional tests.

  3. Analytical standards for accountability of uranium hexafluoride - 1972

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    An analytical standard for the accountability of uranium hexafluoride is presented that includes procedures for subsampling, determination of uranium, determination of metallic impurities and isotopic analysis by gas and thermal ionization mass spectrometry

  4. Towards analytical mix design for large-stone asphalt mixes.

    CSIR Research Space (South Africa)

    Rust, FC

    1992-08-01

    This paper addresses the development of an analytically based design procedure for large-aggregate asphalt and its application in thirteen trial sections. The physical and engineering properties of the various materials are discussed and related...

  5. Predictive analytics and child protection: constraints and opportunities.

    Science.gov (United States)

    Russell, Jesse

    2015-08-01

    This paper considers how predictive analytics might inform, assist, and improve decision making in child protection. Predictive analytics represents recent increases in data quantity and data diversity, along with advances in computing technology. While the use of data and statistical modeling is not new to child protection decision making, its use in child protection is experiencing growth, and efforts to leverage predictive analytics for better decision-making in child protection are increasing. Past experiences, constraints and opportunities are reviewed. For predictive analytics to make the most impact on child protection practice and outcomes, it must embrace established criteria of validity, equity, reliability, and usefulness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  7. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
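
    For illustration, one of the simplest Bayesian reliability calculations of the kind covered in such a course is the conjugate gamma-Poisson update of a constant failure rate; the prior parameters and observed data below are hypothetical.

    ```python
    from scipy.stats import gamma

    # Prior belief about the failure rate lambda (per hour): Gamma(a0, rate=b0)
    a0, b0 = 2.0, 1000.0          # assumed prior: roughly 2 failures "seen" in 1000 h

    # Observed data: n failures in t operating hours
    n, t = 1, 5000.0

    # Posterior is again a Gamma distribution with updated shape and rate
    a_post, b_post = a0 + n, b0 + t
    post = gamma(a_post, scale=1.0 / b_post)

    print("posterior mean failure rate:", a_post / b_post)
    print("90% credible interval:", post.ppf(0.05), "-", post.ppf(0.95))
    ```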

  8. Evaluation of Evidence from Analytical Procedures in Auditing

    DEFF Research Database (Denmark)

    Holm, Claus

    The purpose of the dissertation is to contribute to the existing knowledge in the area of auditing concerned with the evaluation of audit evidence. The dissertation focuses on the evaluation of evidence obtained through the use of analytical procedures in auditing. The conceptual meaning of audit...

  9. Analytical procedure for the radiometric determination of uranium in ores

    International Nuclear Information System (INIS)

    Bone, S.J.; Porritt, R.E.J.

    1976-06-01

    Two methods are described for the non-destructive determination of uranium in ores: a beta-gamma measuring method and a gamma-spectrometric one. The first has the advantage that the analysis is not influenced by a radioactive imbalance in the sample (say, by loss of radium as a result of chemical decomposition of the ores) and that it can be carried out with comparatively simple apparatus. It is, however, relatively inaccurate (±25%) and should only be used as a surveying method. The gamma-spectrometric analysis (accuracy about ±10%) gives information about any imbalance present between U-238 and Ra-226 and thus enables an appropriate correction to be made. A thorium contribution with its decay products can also be corrected for. (RB) [de

  10. Analytic Procedures For Designing and Evaluating Decision Aids.

    Science.gov (United States)

    1980-04-01

    the taxonomy of decision characteristics. Chapter 5 applies the taxonomies to the information processing functions needed for AAW decisions, and...rationality emphasizes the extent to which organizations and other social institutions consist of individuals who pursue individual objectives by means of...adaptive rationality is always wrong or naive; most of us know persons that seem to be naturally good decision-makers. There is no logic that guarantees

  11. Analytical Chemistry Laboratory (ACL) procedure compendium. Volume 4, Organic methods

    Energy Technology Data Exchange (ETDEWEB)

    1993-08-01

    This interim notice covers the following: extractable organic halides in solids, total organic halides, analysis by gas chromatography/Fourier transform-infrared spectroscopy, hexadecane extracts for volatile organic compounds, GC/MS analysis of VOCs, GC/MS analysis of methanol extracts of cryogenic vapor samples, screening of semivolatile organic extracts, GPC cleanup for semivolatiles, sample preparation for GC/MS for semi-VOCs, analysis for pesticides/PCBs by GC with electron capture detection, sample preparation for pesticides/PCBs in water and soil sediment, report preparation, Florisil column cleanup for pesticide/PCBs, silica gel and acid-base partition cleanup of samples for semi-VOCs, concentrate acid wash cleanup, carbon determination in solids using Coulometrics' CO2 coulometer, determination of total carbon/total organic carbon/total inorganic carbon in radioactive liquids/soils/sludges by hot persulfate method, analysis of solids for carbonates using Coulometrics' Model 5011 coulometer, and Soxhlet extraction.

  12. A Procedure for the Sequential Determination of Radionuclides in Phosphogypsum: Liquid Scintillation Counting and Alpha Spectrometry for 210Po, 210Pb, 226Ra, Th and U Radioisotopes

    International Nuclear Information System (INIS)

    2014-01-01

    Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of such analytical data. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. In this publication, a combined procedure for the sequential determination of 210Po, 210Pb, 226Ra, Th and U radioisotopes in phosphogypsum is described. The method is based on the dissolution of small amounts of phosphogypsum by microwave digestion, followed by sequential separation of 210Po, 210Pb, Th and U radioisotopes by selective extraction chromatography using Sr, TEVA and UTEVA resins. Radium-226 is separated from interfering elements using Ba(Ra)SO4 co-precipitation. Lead-210 is determined by liquid scintillation counting. The alpha source of 210Po is prepared by autodeposition on a silver plate. The alpha sources of Th and U are prepared by electrodeposition on a stainless steel plate. A comprehensive methodology for the calculation of results, including the quantification of measurement uncertainty, was also developed. The procedure is introduced as a recommended procedure and validated in terms of trueness, repeatability and reproducibility in accordance with ISO guidelines

  13. Reliability of hospital cost profiles in inpatient surgery.

    Science.gov (United States)

    Grenda, Tyler R; Krell, Robert W; Dimick, Justin B

    2016-02-01

    With increased policy emphasis on shifting risk from payers to providers through mechanisms such as bundled payments and accountable care organizations, hospitals are increasingly in need of metrics to understand their costs relative to peers. However, it is unclear whether Medicare payments for surgery can reliably compare hospital costs. We used national Medicare data to assess patients undergoing colectomy, pancreatectomy, and open incisional hernia repair from 2009 to 2010 (n = 339,882 patients). We first calculated risk-adjusted hospital total episode payments for each procedure. We then used hierarchical modeling techniques to estimate the reliability of total episode payments for each procedure and explored the impact of hospital caseload on payment reliability. Finally, we quantified the number of hospitals meeting published reliability benchmarks. Mean risk-adjusted total episode payments ranged from $13,262 (standard deviation [SD] $14,523) for incisional hernia repair to $25,055 (SD $22,549) for pancreatectomy. The reliability of hospital episode payments varied widely across procedures and depended on sample size. For example, mean episode payment reliability for colectomy (mean caseload, 157) was 0.80 (SD 0.18), whereas for pancreatectomy (mean caseload, 13) the mean reliability was 0.45 (SD 0.27). Many hospitals met published reliability benchmarks for each procedure. For example, 90% of hospitals met reliability benchmarks for colectomy, 40% for pancreatectomy, and 66% for incisional hernia repair. Episode payments for inpatient surgery are a reliable measure of hospital costs for commonly performed procedures, but are less reliable for lower volume operations. These findings suggest that hospital cost profiles based on Medicare claims data may be used to benchmark efficiency, especially for more common procedures. Copyright © 2016 Elsevier Inc. All rights reserved.
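
    The reliability statistic referred to here is, in the usual hierarchical-modeling sense, a signal-to-noise ratio of the general form below; the exact estimator used in the study may differ.

    ```latex
    \[ \text{reliability}_h \;=\; \frac{\sigma^2_{\text{between}}}{\sigma^2_{\text{between}} + \sigma^2_{\text{within}}/n_h} \]
    ```

    Here sigma^2_between is the variance in true hospital-level episode payments, sigma^2_within the within-hospital (patient-level) variance, and n_h the hospital's caseload, which is why reliability rises with procedure volume.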

  14. Analytically solvable models of reaction-diffusion systems

    Energy Technology Data Exchange (ETDEWEB)

    Zemskov, E P; Kassner, K [Institut fuer Theoretische Physik, Otto-von-Guericke-Universitaet, Universitaetsplatz 2, 39106 Magdeburg (Germany)

    2004-05-01

    We consider a class of analytically solvable models of reaction-diffusion systems. An analytical treatment is possible because the nonlinear reaction term is approximated by a piecewise linear function. As particular examples we choose front and pulse solutions to illustrate the matching procedure in the one-dimensional case.
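
    The class of models referred to is of the generic reaction-diffusion form below; the piecewise-linear caricature of a bistable reaction term shown here is one common choice and is given only for orientation, as the specific kinetics treated in the paper may differ.

    ```latex
    \[ \partial_t u \;=\; D\,\partial_{xx} u \;+\; f(u), \qquad f(u) \;=\; -u \;+\; \theta(u - a) \]
    ```

    Here theta is the Heaviside step function and 0 < a < 1. Because f is linear on each side of u = a, front and pulse profiles can be written piecewise as exponentials and joined at u = a, which is the kind of matching procedure mentioned in the abstract.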

  15. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques have been developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  16. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used in evaluating analytical chemistry measurements quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features which are not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision); to evaluate the measurement system; and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort

  17. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  18. Development of integrated analytical data management system

    International Nuclear Information System (INIS)

    Onishi, Koichi; Wachi, Isamu; Hiroki, Toshio

    1986-01-01

    The Analysis Subsection of the Technical Service Section, Tokai Reprocessing Plant, Tokai Works, is engaged in the analysis activities required for the management of processes and measurements in the plant. Recently, it has become desirable to increase the reliability of analytical data and to perform analyses more rapidly in order to cope with the increasing number of analytical tasks. To this end, on-line data processing has been promoted and advanced analytical equipment has been introduced in order to enhance automation. In the present study, an integrated analytical data management system is developed which serves to improve the reliability of analytical data as well as to enable rapid retrieval and automatic compilation of these data. Fabrication of a basic model of the system has been nearly completed and test operation has already been started. In selecting the hardware to be used, examinations were made of the ease of system extension, the Japanese language processing function for improving the man-machine interface, the large-capacity auxiliary memory system, and the database processing function. The existing analysis work was reviewed in establishing the basic design of the system. According to this basic design, the system can perform such tasks as the handling of application slips received from clients as well as the recording, sending, filing and retrieval of analysis results. (Nogami, K.)

  19. Life cycle management of analytical methods.

    Science.gov (United States)

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment to operation and, finally, retirement. In recent years, increasing interest in this concept has also developed for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. It appears that regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, discuss the introduction of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, which strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

    One can also speak of reliability with respect to materials. While for the reliability of components the MTBF (mean time between failures) is regarded as the main criterion, for materials it is replaced by possible failure mechanisms such as physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of the system. The main tasks of the reliability analysis of materials are therefore the prediction of the various failure causes, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  1. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature of the uncertainties and their interplay is then developed, step-by-step. The concepts presented are illustrated by numerous examples throughout the text.
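
    As background for the probabilistic code format mentioned above, the central quantity in first-order structural reliability methods is the reliability index beta; the relations below are the standard FORM result, not a formulation specific to this book.

    ```latex
    \[ \beta \;=\; \min_{\,g(\mathbf{u}) = 0} \lVert \mathbf{u} \rVert, \qquad P_f \;\approx\; \Phi(-\beta) \]
    ```

    Here g is the limit state function expressed in standard normal variables u and Phi is the standard normal distribution function; partial safety factors can then be calibrated so that designs satisfying the code achieve a target beta.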

  2. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

    Many results in mechanical design are obtained from a modelling of physical reality and from a numerical solution that leads to the evaluation of needs and resources. The goal of the reliability analysis is to evaluate the confidence that can be placed in the chosen design through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: the sensitivity analysis and the reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS)

  3. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

    RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report not only states the facts of the Significant System Events (ESS), but also underlines the main elements concerning the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  4. Analytical quality control [An IAEA service

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1973-07-01

    In analytical chemistry the determination of small or trace amounts of elements or compounds in different types of materials is increasingly important. The results of these findings have a great influence on different fields of science, and on human life. Their reliability, precision and accuracy must, therefore, be checked by analytical quality control measures. The International Atomic Energy Agency (IAEA) set up an Analytical Quality Control Service (AQCS) in 1962 to assist laboratories in Member States in the assessment of their reliability in radionuclide analysis, and in other branches of applied analysis in which radionuclides may be used as analytical implements. For practical reasons, most analytical laboratories are not in a position to check accuracy internally, as frequently resources are available for only one method; standardized sample material, particularly in the case of trace analysis, is not available and can be prepared by the institutes themselves only in exceptional cases; intercomparisons are organized rather seldom and many important types of analysis are so far not covered. AQCS assistance is provided by the shipment to laboratories of standard reference materials containing known quantities of different trace elements or radionuclides, as well as by the organization of analytical intercomparisons in which the participating laboratories are provided with aliquots of homogenized material of unknown composition for analysis. In the latter case the laboratories report their data to the Agency's laboratory, which calculates averages and distributions of results and advises each laboratory of its performance relative to all the others. Throughout the years several dozens of intercomparisons have been organized and many thousands of samples provided. The service offered, as a consequence, has grown enormously. The programme for 1973 and 1974, which is currently being distributed to Member States, will contain 31 different types of materials.

  5. Radionuclides in analytical chemistry

    International Nuclear Information System (INIS)

    Tousset, J.

    1984-01-01

    Applications of radionuclides in analytical chemistry are reviewed in this article: tracers, radioactive sources and activation analysis. Examples are given in all these fields and it is concluded that these methods should be used more widely [fr

  6. Mobility Data Analytics Center.

    Science.gov (United States)

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning the massive data are the key to the data engine. The ultimate goal of underst...

  7. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  8. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants, the GO methodology being one of them. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. This analysis revealed some disadvantages of the GO methodology. In the GO methodology, a signal is either on-to-off or off-to-on; the GO method therefore identifies the time point at which the state of a system changes and cannot treat a system whose state changes as off-on-off. In addition, several computer runs are required to obtain the time-dependent failure probability of a system. To overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modelling method (chart) and the calculation procedure are similar to those of the GO methodology, but the meaning of signal and time point, and the definitions of the operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given
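
    To make the chart-and-operator idea concrete, the sketch below propagates signal success probabilities through a small chart with generic AND/OR operators at a sequence of time points. The operator set, signal representation and numbers are illustrative assumptions made here; they are not the published GO-FLOW operator definitions.

        # Minimal sketch of chart-style signal propagation over discrete time points.
        # Generic AND/OR operators are used for illustration only; the actual GO-FLOW
        # operator set and its treatment of signals are defined in the cited papers.
        from math import prod

        def and_gate(inputs):
            # All input signals must be "on": probabilities multiply (independence assumed).
            return prod(inputs)

        def or_gate(inputs):
            # At least one input signal "on": complement of all inputs being "off".
            return 1.0 - prod(1.0 - p for p in inputs)

        # Success probability of each source signal at a series of time points.
        time_points = [0, 1, 2, 3]                      # arbitrary labels, e.g. phases of a mission
        pump = [1.00, 0.98, 0.96, 0.94]                 # pump running
        valve = [1.00, 0.99, 0.99, 0.98]                # valve open on demand
        backup = [1.00, 0.95, 0.95, 0.95]               # backup train available

        for t, (p, v, b) in zip(time_points, zip(pump, valve, backup)):
            main_train = and_gate([p, v])               # main train needs pump AND valve
            system = or_gate([main_train, b])           # system succeeds via main train OR backup
            print(f"t={t}: P(system success) = {system:.4f}")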

  9. Encyclopedia of analytical surfaces

    CERN Document Server

    Krivoshapko, S N

    2015-01-01

    This encyclopedia presents an all-embracing collection of analytical surface classes. It provides concise definitions and descriptions for more than 500 surfaces and categorizes them in 38 classes of analytical surfaces. All classes are cross-referenced to the original literature in an excellent bibliography. The encyclopedia is of particular interest to structural and civil engineers and serves as a valuable reference for mathematicians.

  10. Intermediate algebra & analytic geometry

    CERN Document Server

    Gondin, William R

    1967-01-01

    Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system

  11. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

    Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which have been outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on an understanding of the different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions
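
    One common way to turn a requirement-versus-achievement comparison into a number is the interference (load-strength) calculation: reliability is the probability that achieved performance meets or exceeds the requirement. The sketch below evaluates that probability for normally distributed requirement and achievement; treating the Q and H functions of Green and Bourne as these two distributions is an illustrative assumption, not a restatement of their exact formulation, and the parameter values are invented.

        # Illustrative interference calculation: R = P(achievement >= requirement).
        # Mapping this onto the paper's Q and H functions is an assumption made here
        # for illustration; means and standard deviations below are invented.
        from math import erf, sqrt

        def normal_cdf(x):
            return 0.5 * (1.0 + erf(x / sqrt(2.0)))

        def interference_reliability(mu_req, sd_req, mu_ach, sd_ach):
            """Reliability for independent, normally distributed requirement and achievement."""
            margin_mean = mu_ach - mu_req
            margin_sd = sqrt(sd_req**2 + sd_ach**2)
            return normal_cdf(margin_mean / margin_sd)

        if __name__ == "__main__":
            # e.g. required flow 100 units (sd 5) vs. achievable flow 120 units (sd 8)
            print(f"R = {interference_reliability(100, 5, 120, 8):.4f}")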

  12. 18 CFR 39.4 - Funding of the Electric Reliability Organization.

    Science.gov (United States)

    2010-04-01

    Section 39.4, Conservation of Power and Water Resources, FEDERAL ENERGY... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS. § 39.4 Funding of the Electric Reliability Organization. (a) Any...

  13. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the effort at the site is devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes; the site is therefore a large chemical plant, and there are many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that performs analyses for R&D efforts at the lab, acts as backup to the site Analytical Laboratories Department, and develops analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications

  14. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program uploads them to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program estimates the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple Web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to obtain complete rating data. I would welcome other researchers revising and enhancing the program.
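
    The Spearman-Brown prophecy step mentioned above is a simple closed-form adjustment: if a single rating has reliability r, the reliability of the average of k ratings is kr / (1 + (k - 1)r). A minimal sketch follows; in the actual utility the single-rating reliability would come from Ebel's algorithm, and the numbers here are invented for illustration.

        # Spearman-Brown prophecy formula: reliability of the mean of k ratings,
        # given the reliability r of a single rating. The example value of r is
        # invented; in the Web utility it would come from Ebel's algorithm.
        def spearman_brown(single_rating_reliability, n_ratings):
            r = single_rating_reliability
            k = n_ratings
            return (k * r) / (1.0 + (k - 1.0) * r)

        if __name__ == "__main__":
            r_single = 0.45                      # hypothetical single-judge reliability
            for k in (1, 2, 4, 8):
                print(f"average of {k} ratings: reliability = {spearman_brown(r_single, k):.3f}")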

  15. Structural systems reliability analysis

    International Nuclear Information System (INIS)

    Frangopol, D.

    1975-01-01

    For an exact evaluation of the reliability of a structure it appears necessary to determine the distribution densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long period of observation. Where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP) [de
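
    When the correlations between failure events are unknown, simple bounds of this kind can be written down directly. For a series system with component failure probabilities p_i, the system failure probability lies between max(p_i) (fully dependent components) and 1 - prod(1 - p_i) (independent components). The sketch below evaluates these classical bounds with invented numbers; it is a generic illustration of bounding, not the specific bounds derived in the cited paper.

        # Classical simple bounds on the failure probability of a series system
        # (fails if ANY component fails) with positively correlated components.
        # Component probabilities are invented for illustration; these are generic
        # bounds, not the specific ones derived in the cited paper.
        from math import prod

        def series_failure_bounds(component_failure_probs):
            lower = max(component_failure_probs)                          # fully dependent components
            upper = 1.0 - prod(1.0 - p for p in component_failure_probs)  # independent components
            return lower, upper

        if __name__ == "__main__":
            p = [1e-3, 5e-4, 2e-3]
            lo, hi = series_failure_bounds(p)
            print(f"{lo:.3e} <= P(system failure) <= {hi:.3e}")
            print(f"reliability between {1 - hi:.6f} and {1 - lo:.6f}")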

  16. Reliability and maintainability

    International Nuclear Information System (INIS)

    1994-01-01

    Several communications at this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by means of functional analysis; operation characteristic analysis for a power industry plant park, as a function of influence parameters

  17. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective of the report is to improve the failure data used in reliability calculations performed as part of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as on information provided by the operation and maintenance staff of each plant. The report presents charts of reliability data for pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  18. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics and electrical circuits, without considerable programming effort specific to each discipline. To achieve this objective, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by exploiting the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.

  19. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; the reliability of non-repairable and repairable systems; reliability sampling tests; failure analysis by FMEA and FTA, with case studies; accelerated life testing, including its basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies for replacement and inspection.
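
    Of the topics listed, the relationship between the reliability function and the failure rate is the most compact to illustrate: for a hazard (failure) rate lambda(t), the reliability is R(t) = exp(-integral from 0 to t of lambda(u) du), which for a Weibull life distribution reduces to R(t) = exp(-(t/eta)^beta). The sketch below evaluates both quantities; the parameter values are invented for illustration and the book's own notation may differ.

        # Reliability function and hazard rate for a Weibull life distribution
        # (the exponential distribution is the special case beta = 1), using
        # R(t) = exp(-integral of the hazard). Parameter values are invented.
        from math import exp

        def weibull_reliability(t, beta, eta):
            """R(t) for a Weibull distribution with shape beta and scale eta."""
            return exp(-((t / eta) ** beta))

        def weibull_hazard(t, beta, eta):
            """Instantaneous failure rate lambda(t) = (beta/eta) * (t/eta)**(beta-1)."""
            return (beta / eta) * (t / eta) ** (beta - 1)

        if __name__ == "__main__":
            beta, eta = 1.5, 1000.0        # shape > 1 models wear-out; eta in hours
            for t in (100.0, 500.0, 1000.0, 2000.0):
                r = weibull_reliability(t, beta, eta)
                h = weibull_hazard(t, beta, eta)
                print(f"t = {t:6.0f} h: R(t) = {r:.4f}, hazard = {h:.2e} per hour")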

  20. Implementing self sustained quality control procedures in a clinical laboratory.

    Science.gov (United States)

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory: it maintains the excellence of laboratory standards, supports proper disease diagnosis and patient care, and strengthens the health care system overall. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per the guidelines for preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis and included the measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules over a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure which can be performed by laboratories with minimum technology, expenditure and expertise, improving the reliability and validity of test reports.
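
    The day-to-day evaluation described above amounts to comparing each control result with the established mean and standard deviation and flagging results that violate simple control rules. The sketch below applies two widely used Westgard-style rules (1-3s and 2-2s) to a series of control values; the article does not state exactly which rules were applied, so the rule choice, function names and data are illustrative assumptions.

        # Illustrative Levey-Jennings evaluation of daily control-serum results.
        # The 1-3s and 2-2s rules shown here are common Westgard rules chosen for
        # illustration; the article does not specify which rules were actually used.
        import statistics

        def evaluate_controls(baseline_values, daily_values):
            mean = statistics.mean(baseline_values)
            sd = statistics.stdev(baseline_values)
            z_scores = [(x - mean) / sd for x in daily_values]

            violations = []
            for i, z in enumerate(z_scores):
                if abs(z) > 3:                                   # 1-3s rule: reject the run
                    violations.append((i, "1-3s", z))
                elif i > 0 and z > 2 and z_scores[i - 1] > 2:    # 2-2s rule, same side (+)
                    violations.append((i, "2-2s", z))
                elif i > 0 and z < -2 and z_scores[i - 1] < -2:  # 2-2s rule, same side (-)
                    violations.append((i, "2-2s", z))
            return mean, sd, violations

        if __name__ == "__main__":
            baseline = [5.1, 5.0, 4.9, 5.2, 5.0, 5.1, 4.8, 5.0]   # e.g. glucose in control serum, mmol/L
            daily = [5.0, 5.3, 5.6, 5.5, 4.2]
            mean, sd, violations = evaluate_controls(baseline, daily)
            print(f"target: {mean:.2f} +/- {sd:.2f}")
            for day, rule, z in violations:
                print(f"day {day + 1}: rule {rule} violated (z = {z:+.2f})")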