WorldWideScience

Sample records for isotopic analysis software

  1. Spectral analysis software improves confidence in plant and soil water stable isotope analyses performed by isotope ratio infrared spectroscopy (IRIS).

    Science.gov (United States)

    West, A G; Goldsmith, G R; Matimati, I; Dawson, T E

    2011-08-30

    Previous studies have demonstrated the potential for large errors to occur when analyzing waters containing organic contaminants using isotope ratio infrared spectroscopy (IRIS). In an attempt to address this problem, IRIS manufacturers now provide post-processing spectral analysis software capable of identifying samples with the types of spectral interference that compromise their stable isotope analysis. Here we report two independent tests of this post-processing spectral analysis software on two IRIS systems, OA-ICOS (Los Gatos Research Inc.) and WS-CRDS (Picarro Inc.). Following a methodology similar to a previous study, we cryogenically extracted plant leaf water and soil water and measured the δ(2)H and δ(18)O values of identical samples by isotope ratio mass spectrometry (IRMS) and IRIS. As an additional test, we analyzed plant stem waters and tap waters by IRMS and IRIS in an independent laboratory. For all tests we assumed that the IRMS value represented the "true" value against which we could compare the stable isotope results from the IRIS methods. Samples showing significant deviations from the IRMS value (>2σ) were considered to be contaminated and representative of spectral interference in the IRIS measurement. Over the two studies, 83% of plant species were considered contaminated on OA-ICOS and 58% on WS-CRDS. Post-analysis, spectra were examined using the manufacturer's spectral analysis software to see whether the software correctly identified contaminated samples. In our tests the software performed well, identifying all the samples with major errors. However, some false negatives indicate that user evaluation and testing of the software are necessary. Repeat sampling of plants showed considerable variation in the discrepancies between IRIS and IRMS. As such, we recommend that spectral analysis of IRIS data be incorporated into standard post-processing routines. Furthermore, we suggest that the results from spectral analysis be …
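
    A minimal Python sketch of the >2σ screening criterion described above; the precision value and all sample data are invented for illustration and are not the study's numbers:

        import numpy as np

        SIGMA = 0.3  # assumed analytical precision of the IRMS-IRIS comparison (per mil)

        d18O_irms = np.array([-5.2, -4.8, -6.1, -5.5, -9.7])  # IRMS ("true") values
        d18O_iris = np.array([-5.3, -4.9, -6.0, -2.1, -9.8])  # paired IRIS values

        deviation = d18O_iris - d18O_irms
        contaminated = np.abs(deviation) > 2 * SIGMA  # flag likely spectral interference
        print(contaminated)  # -> [False False False  True False]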

  2. Uranium Isotopic Analysis with the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    Vo, D.T.; Sampson, T.E.

    1999-01-01

    FRAM is the acronym for Fixed-Energy Response-Function Analysis with Multiple efficiencies. This software was developed at Los Alamos National Laboratory, originally for plutonium isotopic analysis. Later, it was adapted for uranium isotopic analysis in addition to plutonium. It is a code based on self-calibration, using several gamma-ray peaks to determine the isotopic ratios. A versatile parameter database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type.

  3. Achievements in testing of the MGA and FRAM isotopic software codes under the DOE/NNSA-IRSN cooperation of gamma-ray isotopic measurement systems

    International Nuclear Information System (INIS)

    Vo, Duc; Wang, Tzu-Fang; Funk, Pierre; Weber, Anne-Laure; Pepin, Nicolas; Karcher, Anna

    2009-01-01

    DOE/NNSA and IRSN collaborated on a study of gamma-ray instruments and analysis methods used to perform isotopic measurements of special nuclear materials. The two agencies agreed to collaborate on the project in response to inconsistencies that were found in the various versions of software and hardware used to determine the isotopic abundances of uranium and plutonium. IRSN used software developed internally to test the MGA and FRAM isotopic analysis codes for criteria used to stop data acquisition. The stop-criterion test revealed several unusual behaviors in both the MGA and FRAM software codes.

  4. Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled High-Resolution Gamma Spectrometry Systems Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Dreyer, Jonathan G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wang, Tzu-Fang [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vo, Duc T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Funk, Pierre F. [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France); Weber, Anne-Laure [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France)

    2017-07-20

    Under a 2006 agreement between the Department of Energy (DOE) of the United States of America and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) of France, the National Nuclear Security Administration (NNSA) within DOE and IRSN initiated a collaboration to improve isotopic identification and analysis of nuclear material [i.e., plutonium (Pu) and uranium (U)]. The specific aim of the collaborative project was to develop new versions of two types of isotopic identification and analysis software: (1) the fixed-energy response-function analysis for multiple energies (FRAM) codes and (2) multi-group analysis (MGA) codes. The project is entitled Action Sheet 4 – Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled, High-Resolution Gamma Spectrometry Systems (Action Sheet 4). FRAM and MGA/U235HI are software codes used to analyze isotopic ratios of U and Pu. FRAM is an application that uses parameter sets for the analysis of U or Pu. MGA and U235HI are two separate applications that analyze Pu or U, respectively. They have traditionally been used by safeguards practitioners to analyze gamma spectra acquired with high-resolution gamma spectrometry (HRGS) systems that are cooled by liquid nitrogen. However, it was discovered that these analysis programs were not as accurate when used on spectra acquired with a newer generation of more portable, electrically cooled HRGS (ECHRGS) systems. In response to this need, DOE/NNSA and IRSN collaborated to update the FRAM and U235HI codes to improve their performance with newer ECHRGS systems. Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) performed this work for DOE/NNSA.

  5. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
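
    For illustration, a coarse-grained (1 Da) convolution implementation of the polynomial approach — a generic sketch in the spirit of, but not taken from, the MIDAs code; the abundance values are rounded IUPAC figures:

        import numpy as np

        # isotope probability vectors indexed by nominal mass shift from the lightest isotope
        ISOTOPES = {
            "C": np.array([0.9893, 0.0107]),             # 12C, 13C
            "H": np.array([0.999885, 0.000115]),         # 1H, 2H
            "O": np.array([0.99757, 0.00038, 0.00205]),  # 16O, 17O, 18O
        }

        def isotopic_distribution(formula):
            """formula: dict such as {"C": 6, "H": 12, "O": 6}; returns relative abundances."""
            dist = np.array([1.0])
            for element, count in formula.items():
                for _ in range(count):
                    dist = np.convolve(dist, ISOTOPES[element])  # polynomial multiplication
            return dist / dist.sum()

        print(isotopic_distribution({"C": 6, "H": 12, "O": 6})[:4])  # glucose, first peaks

    Repeated convolution is the direct polynomial-multiplication reading of the method; production codes accelerate it (e.g. by binary squaring) and track exact masses as well as probabilities.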

  6. Real-time software for multi-isotopic source term estimation

    International Nuclear Information System (INIS)

    Goloubenkov, A.; Borodin, R.; Sohier, A.

    1996-01-01

    Consideration is given to the development of software for one of the crucial components of RODOS: assessment of the source rate (SR) from indirect measurements. Four components of the software are described in the paper. The first is a GRID system, which allows stochastic meteorological and radioactivity fields to be prepared from measured data. The second is an atmospheric transport model that can be adapted to emulate practically any gamma dose/spectrum detector. The third is a method that allows space-time and quantitative discrepancies between measured and modelled data to be taken into account simultaneously; it is based on a preference scheme selected by an expert. The last component is a special optimization method for calculating the multi-isotopic SR and its uncertainties. Results of a validation of the software using tracer experiment data, and a Chernobyl source estimation for the main dose-forming isotopes, are included in the paper.

  7. Gamma-ray isotopic analysis development at Los Alamos

    Energy Technology Data Exchange (ETDEWEB)

    Thomas E. Sampson

    1999-11-01

    This report describes the development history and characteristics of software developed in the Safeguards Science and Technology group at Los Alamos for gamma-ray isotopic analysis. This software analyzes the gamma-ray spectrum from measurements performed on actinide samples (principally plutonium and uranium) of arbitrary size, geometry, and physical and chemical composition. The results are obtained without calibration using only fundamental tabulated nuclear constants. Characteristics of the current software versions are discussed in some detail and many examples of implemented measurement systems are shown.

  8. Development of a code for the isotopic analysis of Uranium

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2013-05-15

    To strengthen the national nuclear nonproliferation regime by establishing a nuclear forensic system, techniques for nuclear material analysis and the categorization of important domestic nuclear materials are being developed. MGAU and FRAM are commercial software packages for the isotopic analysis of uranium by γ-spectroscopy, but the diversity of detection geometries and effects such as self-attenuation and coincidence summing call for an analysis tool under continual improvement and modification. Hence, the development of another code for HPGe γ- and x-ray spectrum analysis was started in this study. Analysis of the 87-101 keV region of the uranium spectrum is attempted, based on isotopic responses similar to those developed in MGAU. Development of the code begins with spectral fitting.

  9. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    Directory of Open Access Journals (Sweden)

    de los Santos-Villalobos Sergio

    2017-01-01

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track sediments and identify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed only recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open source tool, the CSSIARv1.00 Software, for working with data sets generated by the CSSI technique to assess soil apportionment.

  10. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    Science.gov (United States)

    de los Santos-Villalobos, Sergio; Bravo-Linares, Claudio; dos Anjos, Roberto Meigikos; Cardoso, Renan; Gibbs, Max; Swales, Andrew; Mabit, Lionel; Dercon, Gerd

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track sediments and identify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed only recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open source tool, the CSSIARv1.00 Software, for working with data sets generated by the CSSI technique to assess soil apportionment.

  11. Measurement system analysis (MSA) of the isotopic ratio for uranium isotope enrichment process control

    Energy Technology Data Exchange (ETDEWEB)

    Medeiros, Josue C. de; Barbosa, Rodrigo A.; Carnaval, Joao Paulo R., E-mail: josue@inb.gov.br, E-mail: rodrigobarbosa@inb.gov.br, E-mail: joaocarnaval@inb.gov.br [Industrias Nucleares do Brasil (INB), Rezende, RJ (Brazil)

    2013-07-01

    Currently, one of the stages in nuclear fuel cycle development is the uranium isotope enrichment process, which will provide the low enriched uranium needed to produce nuclear fuel for 100% of the Angra 1 and 20% of the Angra 2 demand. Determination of the isotopic ratio n({sup 235}U)/n({sup 238}U) in uranium hexafluoride (UF{sub 6}, used as process gas) is essential for controlling the enrichment process of isotopic separation by gaseous centrifugation cascades. The uranium hexafluoride process is performed by continuous gas feeding into a separation unit that uses the centrifugal force principle, establishing a density gradient in a gas containing components of different molecular weights. The elemental separation effect occurs in a single ultracentrifuge, which results in a partial separation of the feed into two fractions: one enriched (product) and another depleted (waste) in the desired isotope ({sup 235}UF{sub 6}). Industrias Nucleares do Brasil (INB) has used quadrupole mass spectrometry (QMS) with electron impact (EI) ionization to perform the isotopic ratio n({sup 235}U)/n({sup 238}U) analysis in the process. Decisions on adjustments and changes to the input variables are based on the results of these analyses. A study of stability, bias and linearity has been performed in order to evaluate the applied method and the variations and systematic errors in the measurement system. The software used for the above analyses was Minitab 15. (author)
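
    For orientation, converting a calibrated ion-current ratio into an enrichment value is a one-line computation; the ratio below is invented, not INB process data:

        R = 0.0725  # calibrated n(235U)/n(238U) ion-current ratio from the QMS (invented)

        atom_fraction_235 = R / (1.0 + R)  # two-isotope approximation; 234U, 236U neglected
        print(f"235U abundance: {100 * atom_fraction_235:.2f} atom %")  # -> 6.76 atom %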

  12. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of multi-isotope stripping and radiation isopleth mapping are presented, together with the techniques utilized for these operations.

  13. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited to software architecture. • A template for failure modes in a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis for safety-related application software is described in this paper. The target system is the software code installed in the Automatic Test and Interface Processor (ATIP) of a digital reactor protection system (DRPS). For the ATIP software safety analysis, an overall safety or hazard analysis is first performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effect Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. Applied to the ATIP software code, which had been integrated and had passed a very rigorous system test procedure, the software FMEA analysis proved able to provide valuable results (i.e., software defects) that could not be identified during the various system tests.

  14. Development of Emittance Analysis Software for Ion Beam Characterization

    International Nuclear Information System (INIS)

    Padilla, M.J.; Liu, Yuan

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally, a high-quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate
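
    The rms emittance and Twiss parameters named above reduce to second moments of the measured (x, x') distribution; the sketch below is the generic textbook computation on synthetic data, not the HRIBF code:

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(0.0, 1.0, 10000)             # position (mm)
        xp = 0.5 * x + rng.normal(0.0, 0.2, 10000)  # divergence (mrad), correlated with x

        xc, xpc = x - x.mean(), xp - xp.mean()      # centered coordinates
        eps = np.sqrt((xc**2).mean() * (xpc**2).mean() - (xc * xpc).mean() ** 2)
        alpha = -(xc * xpc).mean() / eps            # Twiss parameters from the same moments
        beta = (xc**2).mean() / eps
        print(f"rms emittance = {eps:.3f} mm*mrad, alpha = {alpha:.2f}, beta = {beta:.2f}")

    Noise handling (such as the SCUBEEx exclusion mentioned above) matters because every stray count in the tails inflates these second moments.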

  15. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.

  16. Potku – New analysis software for heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Arstila, K.; Julin, J.; Laitinen, M.I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-01-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments
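
    For context, the ToF-to-energy conversion at the core of a ToF-ERD event list is elementary kinematics; the flight length and timing below are invented, not Potku's calibration:

        AMU = 1.66053906660e-27  # kg
        EV = 1.602176634e-19     # J

        def energy_from_tof(mass_amu, tof_ns, flight_length_m=0.684):
            """Kinetic energy (MeV) of a recoil of given mass from its time of flight."""
            v = flight_length_m / (tof_ns * 1e-9)  # m/s, non-relativistic
            return 0.5 * mass_amu * AMU * v**2 / EV / 1e6

        print(energy_from_tof(28.0, tof_ns=65.0))  # e.g. a 28Si recoil, ~16 MeV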

  17. Potku – New analysis software for heavy ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arstila, K., E-mail: kai.arstila@jyu.fi [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Julin, J.; Laitinen, M.I. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T. [Department of Mathematical Information Technology, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Sajavaara, T. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland)

    2014-07-15

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.

  18. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which the software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement, which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  19. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for the handling and reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with rapid, high-throughput analysis.
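
    As an example of the kind of data reduction the report addresses, here is the widely used exponential-law mass-bias correction for Sr, normalized to the canonical 86Sr/88Sr; this is a generic illustration with invented measured ratios, not the report's actual reduction scripts:

        import math

        M86, M87, M88 = 85.9092607, 86.9088775, 87.9056123  # atomic masses (u)
        R86_88_TRUE = 0.1194                                 # canonical 86Sr/88Sr

        meas_86_88 = 0.1175  # measured 86Sr/88Sr (invented)
        meas_87_86 = 0.7125  # measured 87Sr/86Sr (invented)

        # exponential law: R_true = R_meas * (m1/m2)**beta
        beta = math.log(R86_88_TRUE / meas_86_88) / math.log(M86 / M88)
        corr_87_86 = meas_87_86 * (M87 / M86) ** beta
        print(f"mass-bias-corrected 87Sr/86Sr = {corr_87_86:.5f}")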

  20. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for the handling and reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with rapid, high-throughput analysis.

  1. Isotope dilution analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fudge, A.

    1978-12-15

    The following aspects of isotope dilution analysis are covered in this report: fundamental aspects of the technique; elements of interest in the nuclear field; choice and standardization of the spike nuclide; pre-treatment to achieve isotopic exchange and chemical separation; sensitivity; selectivity; and accuracy.
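
    The quantitative core of the technique is the standard single-spike dilution relation, reproduced here in its common textbook form (generic notation, not necessarily the report's):

        n_x \;=\; n_{sp}\,\frac{B_{sp}}{B_x}\cdot\frac{R_{sp}-R_b}{R_b-R_x}

    where n_x is the amount of analyte, n_{sp} the known amount of spike, R the ratio of reference isotope to spike isotope in the spike (sp), the blend (b) and the sample (x), and B the atom fraction of the spike isotope.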

  2. iMS2Flux – a high–throughput processing tool for stable isotope labeled mass spectrometric data used for metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Poskar C Hart

    2012-11-01

    Background: Metabolic flux analysis has become an established method in systems biology and functional genomics. The most common approach for determining intracellular metabolic fluxes is to utilize mass spectrometry in combination with stable isotope labeling experiments. However, before the mass spectrometric data can be used, it has to be corrected for biases caused by naturally occurring stable isotopes, by the analytical technique(s) employed, or by the biological sample itself. Finally, the MS data and the labeling information it contains have to be assembled into a data format usable by flux analysis software (of which several dedicated packages exist). Currently, the processing of mass spectrometric data is time-consuming and error-prone, requiring peak-by-peak cut-and-paste analysis and manual curation. In order to facilitate high-throughput metabolic flux analysis, the automation of multiple steps in the analytical workflow is necessary. Results: Here we describe iMS2Flux, software developed to automate, standardize and connect the data flow between mass spectrometric measurements and flux analysis programs. This tool streamlines the transfer of data from extraction via correction tools to 13C-Flux software by processing MS data from stable isotope labeling experiments. It allows the correction of large and heterogeneous MS datasets for the presence of naturally occurring stable isotopes, initial biomass and several mass spectrometry effects. Before and after data correction, several checks can be performed to ensure accurate data. The corrected data may be returned in a variety of formats, including those used by metabolic flux analysis software such as 13CFLUX, OpenFLUX and 13CFLUX2. Conclusion: iMS2Flux is a versatile, easy to use tool for the automated processing of mass spectrometric data containing isotope labeling information. It represents the core framework for a standardized workflow and data processing. Due to its flexibility …
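
    A minimal sketch of the natural-abundance correction step that iMS2Flux automates: the measured mass-isotopomer distribution (MID) is modeled as a correction matrix applied to the true labeling distribution and recovered by least squares. The fragment size and MID values are invented:

        import numpy as np
        from math import comb

        P13C = 0.0107  # natural 13C abundance

        def correction_matrix(n_atoms):
            """C[i, j] = probability that j labeled atoms are observed at mass shift i."""
            c = np.zeros((n_atoms + 1, n_atoms + 1))
            for j in range(n_atoms + 1):
                for k in range(n_atoms - j + 1):  # k natural 13C among the unlabeled atoms
                    c[j + k, j] = comb(n_atoms - j, k) * P13C**k * (1 - P13C) ** (n_atoms - j - k)
            return c

        measured = np.array([0.62, 0.25, 0.10, 0.03])  # M+0..M+3 of a 3-carbon fragment
        corrected, *_ = np.linalg.lstsq(correction_matrix(3), measured, rcond=None)
        corrected = np.clip(corrected, 0.0, None)
        print(corrected / corrected.sum())  # natural-abundance-corrected labeling

    Real tools also correct for heteroatoms (N, O, Si from derivatization) and instrument effects, which enlarges the correction matrix in the same spirit.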

  3. Software ASPRO-NUC. Gamma-ray spectrometry, routine NAA, isotope identification and data management

    International Nuclear Information System (INIS)

    Kolotov, V.P.; Atrashkevich, V.V.

    1995-01-01

    The ASPRO-NUC software is based on new, improved algorithms suggested and tested in the laboratory and intended for routine analysis. The package consists of the program ASPRO for gamma-ray spectrum processing (peak search, deconvolution of multiplets by the method of moments, computation of correction coefficients for the geometry and material of the radioactive source), a program for isotope identification, and a program for NAA by relative standardization. All output information is loaded into a database (Paradox v.3.5 format) for supporting queries, creating reports, planning routine analyses, estimating expenses, supporting a network of analytical survey, etc. The ASPRO-NUC package also includes a vast nuclear database containing evaluated decay and activation data (reactor, generator of fast neutrons, Cf-252 source). The database environment allows a gamma spectrometer to be easily integrated into a flexible information shell and a logical system for information management to be created. (author) 15 refs.; 2 figs.; 2 tabs

  4. Determination Of Uranium Isotope Composition Using A Micro Computer With An IEEE-488 Interface For Mass Spectrometer Analysis

    International Nuclear Information System (INIS)

    Prajitno; Taftazani, Agus; Yusuf

    1996-01-01

    A mass spectrometry method can be used for qualitative or quantitative analysis. For qualitative analysis, identification of unknown materials by a mass spectrometer requires definite assignment of mass numbers to peaks on the chart. In quantitative analysis, a mass spectrometer is used to determine the isotopic composition of material in the sample. The analysis system of the mass spectrometer at PPNY-BATAN is based on comparison of the ion current intensities entering the collector and has been used to analyse isotopic composition. Calculation of the isotopic composition had previously been done manually. To increase performance and to avoid manual data processing, a micro computer and an IEEE-488 interface have been installed and a software package has been written, so that the determination of the isotopic composition of material in the sample is faster and more efficient. The accuracy of analysis using this program on the standard sample U3O8 NBS 010 is between 93.87% and 99.98%

  5. Actinide isotopic analysis systems

    International Nuclear Information System (INIS)

    Koenig, Z.M.; Ruhter, W.D.; Gunnink, R.

    1990-01-01

    This manual provides instructions and procedures for using the Lawrence Livermore National Laboratory's two-detector actinide isotopic analysis system to measure plutonium samples with other possible actinides (including uranium, americium, and neptunium) by gamma-ray spectrometry. The computer program that controls the system and analyzes the gamma-ray spectral data is driven by a menu of one-, two-, or three-letter options chosen by the operator. Provided in this manual are descriptions of these options and their functions, plus detailed instructions (operator dialog) for choosing among the options. Also provided are general instructions for calibrating the actinide isotopic analysis system and for monitoring its performance. The inventory measurement of a sample's total plutonium and other actinide content is determined by two nondestructive measurements. One is a calorimetry measurement of the sample's heat or power output, and the other is a gamma-ray spectrometry measurement of its relative isotopic abundances. The isotopic measurements needed to interpret the observed calorimetric power measurement are the relative abundances of various plutonium and uranium isotopes and americium-241. The actinide analysis system carries out these measurements. 8 figs
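
    The way the two nondestructive measurements combine can be stated compactly; this is the standard calorimetric assay relation in generic textbook form, not text quoted from the manual:

        P_{\mathrm{eff}} = \sum_i R_i\,P_i, \qquad m_{\mathrm{Pu}} = \frac{W}{P_{\mathrm{eff}}}

    where W is the measured thermal power in watts, R_i are the isotopic mass fractions from gamma-ray spectrometry (plutonium isotopes and americium-241, relative to total plutonium), P_i are the tabulated specific powers in W/g, and m_Pu is the total plutonium mass.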

  6. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including applications of magnetic resonance, chromatography and refractometry, are considered [ru]

  7. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a …

  8. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed using two metric-based software reliability analysis methods: a state-transition-diagram-based method and a test-coverage-based method. The procedures for the software reliability analysis using the two methods, and the analysis results, are provided in this report. It is found that the two methods are complementary, and further research on combining the two methods, so that software reliability analysis benefits from this complementary effect, is recommended

  9. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  10. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  11. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up from standard software components, in the same way as a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, also in safety-related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is and what is particular about the reliability assessment of such software. Two techniques very commonly used in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of software reliability in such systems are discussed. Finally, some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)

  12. Calcium Isotope Analysis by Mass Spectrometry

    Science.gov (United States)

    Boulyga, S.; Richter, S.

    2010-12-01

    The variations in the isotopic composition of calcium caused by fractionation in heterogeneous systems and by nuclear reactions can provide insight into numerous biological, geological, and cosmic processes, and therefore isotopic analysis finds a wide spectrum of applications in cosmo- and geochemistry and in paleoclimatic, nutritional, and biomedical studies. The measurement of calcium isotopic abundances in natural samples has challenged analysts for more than three decades. Practically all Ca isotopes suffer from significant isobaric interferences, and the low-abundance isotopes can be particularly affected by neighboring major isotopes. The extent of natural variations of the stable isotopes appears to be relatively limited, and highly precise techniques are required to resolve isotopic effects. Isotope fractionation during sample preparation and measurement, as well as instrumental mass bias, can significantly exceed the small isotope abundance variations in samples and therefore has to be investigated. Not surprisingly, the TIMS procedure developed by Russell et al. (Russell et al., 1978. Geochim Cosmochim Acta 42: 1075-1090) for Ca isotope measurements was considered revolutionary for isotopic measurements in general, and that approach is used nowadays (with small modifications) for practically all isotopic systems and with different mass spectrometric techniques. Nevertheless, despite several decades of calcium research and the corresponding development of mass spectrometers, the available precision and accuracy are still not always sufficient to achieve the challenging goals. This presentation discusses figures of merit of presently used analytical methods and instrumentation, and attempts to critically assess their limitations. Additionally, the availability of Ca isotope reference materials will be discussed.

  13. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. The checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that the recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)

  14. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  15. Fault tree analysis of KNICS RPS software

    International Nuclear Information System (INIS)

    Park, Gee Yong; Kwon, Kee Choon; Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun; Lee, Dae Hyung

    2008-01-01

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis

  16. Isotopic Abundance and Chemical Purity Analysis of Stable Isotope Deuterium Labeled Sudan I

    Directory of Open Access Journals (Sweden)

    CAI Yin-ping; LEI Wen; ZHENG Bo; DU Xiao-ning

    2014-02-01

    It is important to analyze the isotopic abundance and chemical purity of Sudan I-D5, the internal standard for isotope dilution mass spectrometry. The isotopic abundance of Sudan I-D5 was determined by a "mass cluster" classification method and LC-MS. Repeatability and reproducibility experiments were carried out using different mass spectrometers and different operators. The RSD was less than 0.1%, so the repeatability and reproducibility were satisfactory. The accuracy and precision of the isotopic abundance analysis method were good according to the results of the F test and t test. High performance liquid chromatography (HPLC) was used to determine the chemical purity of Sudan I-D5 by the external standard method.

  17. On-Orbit Software Analysis

    Science.gov (United States)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: Discovery and verification of software program properties and dependencies, Detection and isolation of software defects across different versions of software, and Compilation of historical data and technical expertise for future applications

  18. Applications of stable isotope analysis in mammalian ecology.

    Science.gov (United States)

    Walter, W David; Kurle, Carolyn M; Hopkins, John B

    2014-01-01

    In this editorial, we provide a brief introduction and summarize the 10 research articles included in this Special Issue on Applications of stable isotope analysis in mammalian ecology. The first three articles report correction and discrimination factors that can be used to more accurately estimate the diets of extinct and extant mammals using stable isotope analysis. The remaining seven applied research articles use stable isotope analysis to address a variety of wildlife conservation and management questions from the oceans to the mountains.

  19. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis had not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results with those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)
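
    A toy Python illustration of the spectral-deconvolution idea — fitting the whole measured spectrum as a non-negative combination of full library responses rather than using single peak areas; the Gaussian library shapes are synthetic, not SDAT's detector response functions:

        import numpy as np
        from scipy.optimize import nnls

        channels = np.arange(256)
        gauss = lambda mu, sig: np.exp(-0.5 * ((channels - mu) / sig) ** 2)

        # library: full-spectrum responses of two hypothetical radionuclides
        library = np.column_stack([
            gauss(80, 4) + 0.3 * gauss(40, 10),   # nuclide A: peak plus scatter continuum
            gauss(150, 5) + 0.2 * gauss(60, 12),  # nuclide B
        ])

        true_activities = np.array([5.0, 2.0])
        rng = np.random.default_rng(1)
        measured = library @ true_activities + rng.poisson(1.0, 256)  # add counting noise

        activities, _ = nnls(library, measured)  # non-negative least-squares unmixing
        print(activities)  # recovered activities of A and B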

  20. Nuclear Fuel Depletion Analysis Using Matlab Software

    Science.gov (United States)

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order IVPs are frequently used in many parts of engineering and the sciences. In this article, we present a code comprising three computer programs, coupled to the Matlab software, that solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and the fuel depletion (production of the 239Pu isotope) in a Pressurized Water Nuclear Reactor (PWR) is computed by the present code.
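
    The depletion problem named above is exactly such a coupled first-order IVP; the sketch below transcribes a simplified 239Pu production chain into Python with SciPy (the article itself works in Matlab), using rough illustrative cross sections and flux:

        from scipy.integrate import solve_ivp

        PHI = 3e13           # neutron flux (n/cm^2/s), assumed
        SIG_C_238 = 2.7e-24  # 238U capture cross section (cm^2), ~2.7 b
        SIG_A_239 = 1.0e-21  # 239Pu absorption cross section (cm^2), ~1000 b

        def rhs(t, y):
            n238, n239 = y  # short-lived 239U/239Np intermediates are lumped out
            return [-SIG_C_238 * PHI * n238,
                    SIG_C_238 * PHI * n238 - SIG_A_239 * PHI * n239]

        sol = solve_ivp(rhs, (0.0, 3.15e7), [1.0, 0.0], rtol=1e-8)  # one year of irradiation
        print(sol.y[1, -1])  # 239Pu atoms per initial 238U atom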

  1. Measurement of isotope abundance variations in nature by gravimetric spiking isotope dilution analysis (GS-IDA).

    Science.gov (United States)

    Chew, Gina; Walczyk, Thomas

    2013-04-02

    Subtle variations in the isotopic composition of elements carry unique information about physical and chemical processes in nature and are now exploited widely in diverse areas of research. Reliable measurement of natural isotope abundance variations is among the biggest challenges in inorganic mass spectrometry as they are highly sensitive to methodological bias. For decades, double spiking of the sample with a mix of two stable isotopes has been considered the reference technique for measuring such variations both by multicollector-inductively coupled plasma mass spectrometry (MC-ICPMS) and multicollector-thermal ionization mass spectrometry (MC-TIMS). However, this technique can only be applied to elements having at least four stable isotopes. Here we present a novel approach that requires measurement of three isotope signals only and which is more robust than the conventional double spiking technique. This became possible by gravimetric mixing of the sample with an isotopic spike in different proportions and by applying principles of isotope dilution for data analysis (GS-IDA). The potential and principle use of the technique is demonstrated for Mg in human urine using MC-TIMS for isotopic analysis. Mg is an element inaccessible to double spiking methods as it consists of three stable isotopes only and shows great potential for metabolically induced isotope effects waiting to be explored.
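
    The gravimetric mixing at the heart of GS-IDA can be written as a two-component isotope mass balance; the form below is a generic sketch of that principle, not the authors' exact formalism:

        x_i^{\mathrm{mix}} = \frac{n_s\,x_i^{s} + n_{sp}\,x_i^{sp}}{n_s + n_{sp}}

    where x_i are atom fractions of isotope i and n_s, n_{sp} are the molar amounts of sample and spike fixed by weighing. Measuring three isotope signals in blends of several gravimetrically known proportions overdetermines this balance, which is presumably what allows instrumental fractionation to be separated from the natural isotope signal.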

  2. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) for the system was performed. A feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A checklist for management control was produced via a walk-through technique. Based on the evaluation of the checklist, the activities to be performed in the requirement phase were determined. In the design phase, a hazard analysis was performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on the FMEA were checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm was selected as a sample for applying the FTA method to software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system was enhanced throughout all phases of the software life cycle

  3. SWEPP Gamma-Ray Spectrometer System software design description

    International Nuclear Information System (INIS)

    Femec, D.A.; Killian, E.W.

    1994-08-01

    To assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP), the SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory. The SGRS system software controls turntable and detector system activities. In addition to determining the concentrations of gamma-ray-emitting radionuclides, this software also calculates attenuation-corrected isotopic mass ratios of specific interest. This document describes the software design for the data acquisition and analysis software associated with the SGRS system.

  4. SWEPP Gamma-Ray Spectrometer System software design description

    Energy Technology Data Exchange (ETDEWEB)

    Femec, D.A.; Killian, E.W.

    1994-08-01

    To assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP), the SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory. The SGRS system software controls turntable and detector system activities. In addition to determining the concentrations of gamma-ray-emitting radionuclides, this software also calculates attenuation-corrected isotopic mass ratios of specific interest. This document describes the software design for the data acquisition and analysis software associated with the SGRS system.

  5. Stable isotope analysis

    International Nuclear Information System (INIS)

    Tibari, Elghali; Taous, Fouad; Marah, Hamid

    2014-01-01

    This report presents results of stable isotope analyses carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal. The analyses cover 127 samples. Oxygen-18 and deuterium in water were analyzed by infrared laser spectroscopy using an LGR DLT-100 with an autosampler. The results are expressed as δ values (‰) relative to V-SMOW, to ±0.3‰ for oxygen-18 and ±1‰ for deuterium.

  6. HTSSIP: An R package for analysis of high throughput sequencing data from nucleic acid stable isotope probing (SIP) experiments.

    Directory of Open Access Journals (Sweden)

    Nicholas D Youngblut

    Combining high throughput sequencing with stable isotope probing (HTS-SIP) is a powerful method for mapping in situ metabolic processes to thousands of microbial taxa. However, accurately mapping metabolic processes to taxa is complex and challenging. Multiple HTS-SIP data analysis methods have been developed, including high-resolution stable isotope probing (HR-SIP), multi-window high-resolution stable isotope probing (MW-HR-SIP), quantitative stable isotope probing (qSIP), and ΔBD. Currently, there is no publicly available software designed specifically for analyzing HTS-SIP data. To address this shortfall, we have developed the HTSSIP R package, an open-source, cross-platform toolset for conducting HTS-SIP analyses in a straightforward and easily reproducible manner. The HTSSIP package, along with full documentation and examples, is available from CRAN at https://cran.r-project.org/web/packages/HTSSIP/index.html and Github at https://github.com/buckleylab/HTSSIP.

  7. New Isotope Analysis Method: Atom Trap Mass Spectrometry

    International Nuclear Information System (INIS)

    Ko, Kwang Hoon; Park, Hyun Min; Han, Jae Min; Kim, Taek Soo; Cha, Yong Ho; Lim, Gwon; Jeong, Do Young

    2011-01-01

    Trace isotope analysis has played an important role in science, archaeological dating, geology, biology and the nuclear industry. Fission products such as Sr-90, Cs-135 and Kr-85 can be released to the environment during nuclear accidents and reprocessing plant operation. The analysis of artificially produced radioactive isotopes is therefore of interest to the nuclear industry, but these isotopes are difficult to detect because their natural abundances are below 10^-10. In general, radiochemical methods have been applied to detect ultra-trace radioisotopes, but they have the disadvantages of long measurement times for long-lived radioisotopes and toxic chemical processes for the purification. Accelerator mass spectrometry has high isotope selectivity, but the system is huge and its selectivity is affected by isobars. Laser-based methods such as RIMS (Resonance Ionization Mass Spectrometry) have the advantage of being free of isobar effects, but a highly isotope-selective system is still large. Recently, ATTA (Atom Trap Trace Analysis) has been successfully applied to detect the ultra-trace isotopes Kr-81 and Kr-85. ATTA is an isobar-effect-free detection method with high isotope selectivity, and the system is small. However, it requires a steady atomic beam source during detection and does not allow simultaneous detection of several isotopes. In this presentation, we introduce a new isotope detection method that couples an atom trap with mass spectrometry: Atom Trap Mass Spectrometry (ATMS). We expect it to overcome the disadvantage of ATTA while retaining the advantages of both ATTA and mass spectrometry. The basic concept and the system design will be presented, along with the experimental status of ATMS.

  8. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  9. Gamma-Ray Spectrum Analysis Software GDA

    International Nuclear Information System (INIS)

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as software package version 1.02, named GDA, an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch setup diskettes and a user's manual. GDA can be installed on a personal computer running the Windows 95 or Windows NT 4.0 operating system; a computer with an 80486 CPU and 8 megabytes of memory is sufficient.

  10. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for the spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  11. Simulation and Analysis of Isotope Separation System for Fusion Fuel Recovery System

    Science.gov (United States)

    Senevirathna, Bathiya; Gentile, Charles

    2011-10-01

    This paper presents results of a simulation of the Fuel Recovery System (FRS) for the Laser Inertial Fusion Engine (LIFE) reactor. The LIFE reaction will produce exhaust gases that will need to be recycled in the FRS along with xenon, the chamber's intervention gas. Solids and liquids will first be removed, and then vapor traps are used to remove large gas molecules such as lead. The gas will be reacted with lithium at high temperatures to extract the hydrogen isotopes protium, deuterium, and tritium in hydride form. The hydrogen isotopes will be recovered using a lithium blanket processing system already in place, and this product will be sent to the Isotope Separation System (ISS). The ISS was modeled in software to analyze its effectiveness. Aspen HYSYS was chosen for this purpose because of its widespread use in industrial gas processing systems. Reactants and corresponding chemical reactions had to be initialized in the software. The ISS primarily consists of four cryogenic distillation columns, and these were modeled in HYSYS based on design requirements. Fractional compositions of the distillate and liquid products were analyzed and used to optimize the overall system.

  12. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review and analysis of books, journals and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software quality...

  13. Stable isotope analysis in primatology: a critical review.

    Science.gov (United States)

    Sandberg, Paul A; Loudon, James E; Sponheimer, Matt

    2012-11-01

    Stable isotope analysis has become an important tool in ecology over the last 25 years. A wealth of ecological information is stored in animal tissues in the relative abundances of the stable isotopes of several elements, particularly carbon and nitrogen, because these isotopes navigate through ecological processes in predictable ways. Stable carbon and nitrogen isotopes have been measured in most primate taxonomic groups and have yielded information about dietary content, dietary variability, and habitat use. Stable isotopes have recently proven useful for addressing more fine-grained questions about niche dynamics and anthropogenic effects on feeding ecology. Here, we discuss stable carbon and nitrogen isotope systematics and critically review the published stable carbon and nitrogen isotope data for modern primates with a focus on the problems and prospects for future stable isotope applications in primatology. © 2012 Wiley Periodicals, Inc.

  14. A guide for the laboratory information management system (LIMS) for light stable isotopes--Versions 7 and 8

    Science.gov (United States)

    Coplen, Tyler B.

    2000-01-01

    The reliability and accuracy of isotopic data can be improved by utilizing database software to (i) store information about samples, (ii) store the results of mass spectrometric isotope-ratio analyses of samples, (iii) calculate analytical results using standardized algorithms stored in a database, (iv) normalize stable isotopic data to international scales using isotopic reference materials, and (v) generate multi-sheet paper templates for convenient sample loading of automated mass-spectrometer sample preparation manifolds. Such a database program, the Laboratory Information Management System (LIMS) for Light Stable Isotopes, is presented herein. Major benefits of this system include (i) a dramatic improvement in quality assurance, (ii) an increase in laboratory efficiency, (iii) a reduction in workload due to the elimination or reduction of retyping of data by laboratory personnel, and (iv) a decrease in errors in data reported to sample submitters. Such a database provides a complete record of when and how often laboratory reference materials have been analyzed and provides a record of what correction factors have been used through time. It provides an audit trail for laboratories. LIMS for Light Stable Isotopes is available for both Microsoft Office 97 Professional and Microsoft Office 2000 Professional as versions 7 and 8, respectively. Both source code (mdb file) and precompiled executable files (mde) are available. Numerous improvements have been made for continuous flow isotopic analysis in this version (specifically 7.13 for Microsoft Access 97 and 8.13 for Microsoft Access 2000). It is much easier to import isotopic results from Finnigan ISODAT worksheets, even worksheets on which corrections for amount of sample (linearity corrections) have been added. The capability to determine blank corrections using isotope mass balance from analyses of elemental analyzer samples has been added. It is now possible to calculate and apply drift corrections to isotopic
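
    The scale normalization in item (iv) is conventionally a two-point linear calibration: the measured δ values of two reference materials are mapped onto their accepted values, and the same line is applied to unknowns. A minimal sketch of that calculation (not code from LIMS; the raw instrument values below are invented, while the anchor values are the defined VSMOW and SLAP δ18O values):

      # Two-point normalization of raw delta values onto an international
      # scale, as in item (iv). Illustrative sketch, not code from LIMS;
      # the raw instrument values are invented.

      def normalize(delta_raw, ref1_raw, ref1_true, ref2_raw, ref2_true):
          """Linearly map a raw delta value onto the scale fixed by two
          isotopic reference materials with accepted (true) values."""
          slope = (ref2_true - ref1_true) / (ref2_raw - ref1_raw)
          return ref1_true + slope * (delta_raw - ref1_raw)

      # d18O onto the VSMOW-SLAP scale: VSMOW is 0 permil and SLAP is
      # -55.5 permil by definition.
      d18o = normalize(-12.0, ref1_raw=0.3, ref1_true=0.0,
                       ref2_raw=-54.1, ref2_true=-55.5)
      print(f"{d18o:.2f} permil")  # -12.55 permil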

  15. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  16. IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.

    Science.gov (United States)

    Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles

    2014-01-01

    The growing demand for 13C-metabolic flux analysis (13C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming 13C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to 13C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, 1H NMR, 13C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire 13C-MFA workflow from the experimental design to the flux map including important practical considerations. IsoDesign makes the experimental design of 13C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/. © 2013 Wiley Periodicals, Inc.

  17. Isotope analysis of lithium by thermionic mass spectrometry

    International Nuclear Information System (INIS)

    Kakazu, M.H.; Sarkis, J.E.S.

    1991-04-01

    An analytical mass spectrometric method for the isotope analysis of lithium has been studied. The analyses were carried out using a single-focusing thermionic mass spectrometer (Varian MAT TH5) with a 90° magnetic sector field and 21.4 cm deflection radius, equipped with a dual Re-filament thermal ionization ion source. The effect of different lithium chemical forms, such as carbonate, chloride, nitrate and sulfate, upon the 6Li/7Li isotopic ratio has been studied. Isotopic fractionation of lithium was studied in terms of the time of analysis. The results obtained with lithium carbonate yielded a precision of ±0.1% and an accuracy of ±0.6%, whereas the other chemical forms yielded precisions of ±0.5% and accuracies of ±2%. A fractionation correction factor, K=1.005, considered constant, was obtained for different samples of the lithium carbonate isotopic standard CBNM IRM 016. (author)

  18. Metabolic Flux Analysis in Isotope Labeling Experiments Using the Adjoint Approach.

    Science.gov (United States)

    Mottelet, Stephane; Gaullier, Gil; Sadaka, Georges

    2017-01-01

    Comprehension of metabolic pathways is considerably enhanced by metabolic flux analysis in isotope labeling experiments (MFA-ILE). The balance equations are given by hundreds of algebraic (stationary MFA) or ordinary differential equations (nonstationary MFA), and reducing the number of operations is therefore a crucial part of reducing the computation cost. The main bottleneck for deterministic algorithms is the computation of derivatives, particularly for nonstationary MFA. In this article, we explain how the overall identification process may be sped up by using the adjoint approach to compute the gradient of the residual sum of squares. The proposed approach shows significant improvements in terms of complexity and computation time when compared with the usual (direct) approach. Numerical results are obtained for the central metabolic pathways of Escherichia coli and are validated against reference software in the stationary case. The methods and algorithms described in this paper are included in the sysmetab software package distributed under an Open Source license at http://forge.scilab.org/index.php/p/sysmetab/.
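
    The gain from the adjoint approach can be seen on a toy stationary problem. If the labeling state x solves A(p)x = b and the objective is J = (1/2)||Cx - y||^2, the direct approach needs one linear solve per flux parameter, whereas the adjoint approach needs a single solve with the transpose of A. A minimal numerical sketch of this idea (a random test system, unrelated to the sysmetab implementation):

      # Adjoint vs. direct gradient of a residual sum of squares subject
      # to a linear state equation A(p) x = b, a toy stand-in for
      # stationary MFA. Unrelated to the sysmetab implementation.
      import numpy as np

      rng = np.random.default_rng(0)
      n, m = 5, 3                      # state size, number of parameters
      A0 = rng.normal(size=(n, n)) + 5 * np.eye(n)
      dA = rng.normal(size=(m, n, n))  # dA/dp_i for each parameter i
      b = rng.normal(size=n)
      C = rng.normal(size=(2, n))
      y = rng.normal(size=2)
      p = rng.normal(size=m)

      A = A0 + np.tensordot(p, dA, axes=1)
      x = np.linalg.solve(A, b)
      r = C @ x - y                    # residual, J = 0.5 * r.r

      # Direct approach: one extra linear solve per parameter.
      g_direct = np.array(
          [-(r @ C @ np.linalg.solve(A, dA[i] @ x)) for i in range(m)])

      # Adjoint approach: a single solve with A^T, then dot products.
      lam = np.linalg.solve(A.T, C.T @ r)
      g_adjoint = np.array([-(lam @ (dA[i] @ x)) for i in range(m)])

      print(np.allclose(g_direct, g_adjoint))  # True

    For networks with hundreds of fluxes, the single adjoint solve replaces hundreds of linearized solves, which is the source of the speed-up described above.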

  19. Basic methods of isotope analysis; Osnovnye metody analiza izotopov

    Energy Technology Data Exchange (ETDEWEB)

    Ochkin, A V; Rozenkevich, M B

    2000-07-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered.

  20. Isotope analysis in petroleum exploration

    International Nuclear Information System (INIS)

    Rodrigues, R.

    1982-01-01

    A study of isotopic analysis in petroleum exploration performed at the Petrobras Research Center is presented. The results of petroleum recovery in some Brazilian basins and shelves are commented on. (L.H.L.L.) [pt

  1. Analysis of barium by isotope mass spectrometry

    International Nuclear Information System (INIS)

    Long Kaiming; Jia Baoting; Liu Xuemei

    2004-01-01

    The isotopic abundance ratios of barium at the sub-microgram level are analyzed by thermal surface ionization mass spectrometry (TIMS). The rhenium strips used for sample preparation are first treated to eliminate possible barium background interference. During the preparation of the barium samples, phosphoric acid is added as an emitting and stabilizing reagent; its addition increases the collection efficiency and the strength and stability of the barium ion current. A relative standard deviation of 0.02% for the isotopic abundance ratio of 137Ba to 138Ba is achieved when the 138Ba ion current is (1-3) x 10^-12 A. The experimental results also demonstrate that the isotope fractionation effect is negligibly small in the isotopic analysis of barium.

  2. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. To save time and manpower and to minimize error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum, and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
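
    The automatic identification step described above, matching peaks found in the spectrum against a nuclide library within an energy tolerance, can be illustrated with a toy search. A hypothetical sketch (the library entries and tolerance are illustrative, not those of the actual program):

      # Toy matching of measured gamma peak energies against a nuclide
      # library within an energy tolerance. Library entries and tolerance
      # are illustrative, not those of the software described above.
      LIBRARY = {                      # nuclide: characteristic lines (keV)
          "Na-22": [511.0, 1274.5],
          "Co-60": [1173.2, 1332.5],
          "Cs-137": [661.7],
      }

      def identify(peaks_kev, tolerance=1.5):
          """Return nuclides for which every library line matches a peak."""
          hits = []
          for nuclide, lines in LIBRARY.items():
              if all(any(abs(p - e) <= tolerance for p in peaks_kev)
                     for e in lines):
                  hits.append(nuclide)
          return hits

      print(identify([661.9, 1173.0, 1332.1]))  # ['Co-60', 'Cs-137']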

  3. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. To save time and manpower and to minimize error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum, and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  4. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology science. GIS combines digital image analysis and database systems, which makes it widely applicable and highly demanding of skills. There is a lot of commercial GIS software that is well advertised and whose functionality is well known, while open source software is often overlooked. This diploma work analyzes the open source GIS software available on the Internet, in the scope of different projects interr...

  5. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    Science.gov (United States)

    Harilal, S. S.; Brumfield, B. E.; LaHaye, N. L.; Hartig, K. C.; Phillips, M. C.

    2018-06-01

    Rapid, in-field, and non-contact isotopic analysis of solid materials is extremely important to a large number of applications, such as nuclear nonproliferation monitoring and forensics, geochemistry, archaeology, and biochemistry. Presently, isotopic measurements for these and many other fields are performed in laboratory settings. Rapid, in-field, and non-contact isotopic analysis of solid material is possible with optical spectroscopy tools when combined with laser ablation. Laser ablation generates a transient vapor of any solid material when a powerful laser interacts with a sample of interest. Analysis of atoms, ions, and molecules in a laser-produced plasma using optical spectroscopy tools can provide isotopic information with the advantages of real-time analysis, standoff capability, and no sample preparation requirement. Both emission and absorption spectroscopy methods can be used for isotopic analysis of solid materials. However, applying optical spectroscopy to the measurement of isotope ratios from solid materials presents numerous challenges. Isotope shifts arise primarily due to variation in nuclear charge distribution caused by different numbers of neutrons, but the small proportional nuclear mass differences between nuclei of various isotopes lead to correspondingly small differences in optical transition wavelengths. Along with this, various line broadening mechanisms in laser-produced plasmas and instrumental broadening generated by the detection system are technical challenges frequently encountered with emission-based optical diagnostics. These challenges can be overcome by measuring the isotope shifts associated with the vibronic emission bands from molecules or by using the techniques of laser-based absorption/fluorescence spectroscopy to marginalize the effect of instrumental broadening. Absorption and fluorescence spectroscopy probe the ground state atoms existing in the plasma when it is cooler, which inherently provides narrower
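
    The scale of the broadening problem can be estimated from the standard Doppler width formula, FWHM = nu0 * sqrt(8*kT*ln2/(m*c^2)). A rough sketch with illustrative numbers (a uranium line near 424 nm and a plasma temperature of roughly 0.5 eV; these are assumptions for illustration, not values from the paper):

      # Doppler FWHM of an emission line in a laser-produced plasma,
      # for comparison with optical isotope shifts. Illustrative numbers.
      import math

      k_B = 1.380649e-23    # J/K
      c = 2.99792458e8      # m/s
      amu = 1.66053907e-27  # kg

      def doppler_fwhm_hz(wavelength_m, temperature_k, mass_amu):
          """FWHM = nu0 * sqrt(8 kT ln2 / (m c^2))."""
          nu0 = c / wavelength_m
          return nu0 * math.sqrt(
              8 * k_B * temperature_k * math.log(2) / (mass_amu * amu * c**2))

      # A uranium line near 424 nm at ~6000 K (roughly 0.5 eV):
      fwhm = doppler_fwhm_hz(424e-9, 6000.0, 238.0)
      print(f"Doppler FWHM ~ {fwhm / 1e9:.1f} GHz")  # ~2.5 GHz

    At a few gigahertz, this width is of the same order as many uranium optical isotope shifts, which is why hot-plasma emission measurements struggle to resolve them.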

  6. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the "posed smile" as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely, in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After loading records into the software, the operator should mark the points and lines displayed in the system's guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the Smile Analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress.

  7. Analysis method for beta-gamma coincidence spectra from radio-xenon isotopes

    International Nuclear Information System (INIS)

    Yang Wenjing; Yin Jingpeng; Huang Xiongliang; Cheng Zhiwei; Shen Maoquan; Zhang Yang

    2012-01-01

    Radio-xenon isotope monitoring is an important method for verification of the CTBT; it includes the measurement methods of HPGe γ spectrometry and β-γ coincidence. The article describes the analytic flowchart and method for three-dimensional beta-gamma coincidence spectra from β-γ systems, analyzes in detail the principles and methods for defining the regions of interest of coincidence spectra and for subtracting the interference, and finally gives the formulas for the radioactivity of xenon isotopes and the minimum detectable concentrations. Study of the principles of three-dimensional beta-gamma coincidence spectra can supply the foundation for designing the software of β-γ coincidence systems. (authors)
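
    Minimum detectable concentration formulas of the kind referred to above typically follow from the Currie detection limit, L_D = 2.71 + 4.65*sqrt(B), applied to the counts in a coincidence region of interest. A generic sketch (textbook form with invented parameters; the paper's exact formula may differ):

      # Currie-style minimum detectable concentration (MDC) for a
      # beta-gamma coincidence region of interest. Generic textbook form
      # with invented parameters, not necessarily the paper's formula.
      import math

      def mdc_mbq_per_m3(background_counts, efficiency, branching_ratio,
                         live_time_s, sample_volume_m3):
          """MDC from the Currie limit L_D = 2.71 + 4.65*sqrt(B)."""
          ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)
          activity_bq = ld_counts / (efficiency * branching_ratio * live_time_s)
          return 1e3 * activity_bq / sample_volume_m3  # mBq per cubic metre

      # 50 background counts in the ROI, 30% coincidence efficiency,
      # branching ratio 0.8, 24 h count, 10 m^3 of sampled air:
      print(f"MDC ~ {mdc_mbq_per_m3(50, 0.30, 0.8, 86400, 10):.2f} mBq/m^3")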

  8. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given, without any recommendation for particular software or methods for gamma ray spectra analysis.

  9. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  10. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  11. Advanced concepts for gamma-ray isotopic analysis and instrumentation

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.

    1994-07-01

    The Safeguards Technology Program at the Lawrence Livermore National Laboratory is developing actinide isotopic analysis technologies in response to needs concerning flexibility of analysis, robustness of analysis, ease of use, automation and portability. Recent developments, such as the Intelligent Actinide Analysis System (IAAS), begin to address these issues. We are continuing to develop enhancements to this and other instruments that improve ease of use, automation and portability. Requests to analyze samples with unusual isotopics, contamination, or containers have made us aware of the need for more flexible and robust analysis. We have modified the MGA program to extend its plutonium isotopic analysis capability to samples with greater 241Am content or U isotopics. We are looking at methods for dealing with tantalum or lead contamination and with contamination by high-energy gamma emitters, such as 233U. We are looking at ways to allow the program to use additional information about the sample to further extend the domain of analyzable samples. These unusual analyses will come from the domain of samples that need to be measured because of complex reconfiguration or environmental cleanup.

  12. Ion Mobility Mass Spectrometry Direct Isotope Abundance Analysis

    International Nuclear Information System (INIS)

    Manard, Manuel J.; Weeks, Stephan; Kyle, Kevin

    2010-01-01

    The nuclear forensics community is currently engaged in the analysis of illicit nuclear or radioactive material for the purposes of nonproliferation and attribution. One technique commonly employed for gathering nuclear forensics information is isotope analysis. At present, the state-of-the-art methodology for obtaining isotopic distributions is thermal ionization mass spectrometry (TIMS). Although TIMS is highly accurate at determining isotope distributions, the technique requires an elementally pure sample to perform the measurement. The required radiochemical separations give rise to sample preparation times that can be in excess of one to two weeks. Clearly, the nuclear forensics community is in need of instrumentation and methods that can expedite its decision making process in the event of a radiological release or nuclear detonation. Accordingly, we are developing instrumentation that couples a high-resolution ion mobility (IM) drift cell to the front end of a mass spectrometer (MS). The IM cell provides a means of separating ions based upon their collision cross-section and mass-to-charge ratio (m/z). Two analytes with the same m/z but different collision cross-sections (shapes) would exit the cell at different times, essentially enabling the cell to function in a manner similar to a gas chromatography (GC) column. Thus, molecular and atomic isobaric interferences can be effectively removed from the ion beam. The mobility-selected chemical species can then be introduced to the MS for high-resolution mass analysis to generate isotopic distributions of the target analytes. The outcome would be an IM/MS system capable of accurately measuring isotopic distributions while concurrently eliminating isobaric interferences and laboratory radiochemical sample preparation. The overall objective of this project is developing instrumentation and methods to produce near real-time isotope distributions with a modular mass spectrometric system that performs the required gas-phase chemistry and
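
    The drift-cell separation described here is governed by the Mason-Schamp relation between mobility and collision cross-section, K = (3q/16N) * sqrt(2*pi/(mu*kT)) / CCS. A rough sketch showing how two hypothetical isobars with different cross-sections separate in time (all parameter values are illustrative, not those of the instrument under development):

      # Mason-Schamp estimate of mobility and drift time from a collision
      # cross-section (CCS). All values are illustrative.
      import math

      kB, e, amu = 1.380649e-23, 1.602176634e-19, 1.66053907e-27

      def drift_time_s(mass_da, ccs_m2, gas_mass_da=4.0, t_k=300.0,
                       p_pa=101325.0, charge=1, length_m=0.1,
                       field_v_m=2e4):
          n = p_pa / (kB * t_k)        # buffer gas number density
          mu = (mass_da * gas_mass_da / (mass_da + gas_mass_da)) * amu
          k_mob = (3 * charge * e / (16 * n * ccs_m2)
                   * math.sqrt(2 * math.pi / (mu * kB * t_k)))
          return length_m / (k_mob * field_v_m)

      # Two hypothetical isobars at m/z 200 differing only in shape (CCS):
      for ccs_A2 in (100.0, 120.0):
          t = drift_time_s(200.0, ccs_A2 * 1e-20)   # 1 A^2 = 1e-20 m^2
          print(f"CCS {ccs_A2:.0f} A^2 -> drift time {t * 1e3:.1f} ms")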

  13. Romanian wines characterization with CF-IRMS (Continuous Flow Isotope Ratio Mass Spectrometry) isotopic analysis

    International Nuclear Information System (INIS)

    Costinel, Diana; Ionete, Roxana Elena; Vremera, Raluca; Stanciu, Vasile

    2007-01-01

    Wine growing has been known for centuries in Romania. The country has been favored by its geographical position in south-eastern Europe, by its proximity to the Black Sea, as well as by the specificity of the local soil and climate. Alongside France, Italy, Spain and Germany, countries in this area like Romania could also be called 'a vine homeland' in Europe. High quality wines produced in this region have been an object of trade ever since ancient times. Under current EU research projects, it is necessary to develop new methods of evidencing wine adulteration and ensuring safety. The use of mass spectrometry (MS) to determine the ratios of stable isotopes in bio-molecules now provides the means to prove the botanical and geographical origin of a wide variety of foodstuffs - and therefore, to authenticate them and eliminate fraud. Isotope analysis has been officially adopted by the EU as a means of controlling adulteration of wine. Adulteration of wine can happen in many ways, e.g. addition of non-grape ethanol, addition of non-grape sugar, water or other unauthorized substances, undeclared mixing of wines from different wards, geographical areas or countries, and mislabelling of variety and age. The present paper emphasizes the isotopic analysis of D/H, 18O/16O and 13C/12C in wines, using a new-generation isotope ratio MS, the Finnigan Delta V Plus, coupled with three flexible continuous-flow preparation devices (GasBench II, TC Elemental Analyser and GC-C/TC). Authentication of wines is thus an important problem to which isotopic analysis has made a significant contribution. (authors)
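
    The δ notation used in such measurements is the per-mil deviation of the sample isotope ratio from an international reference ratio, δ = (R_sample/R_reference - 1) × 1000. A small sketch (the sample ratio is invented; the VPDB ratio shown is a commonly cited value):

      # Delta notation: per-mil deviation of a sample's isotope ratio
      # from the reference ratio. The VPDB 13C/12C value below is a
      # commonly cited figure; the sample ratio is invented.
      R_VPDB_13C = 0.011180

      def delta_permil(r_sample, r_reference):
          return (r_sample / r_reference - 1.0) * 1000.0

      # Ethanol from C3 grapes typically sits somewhere near -27 permil:
      print(f"d13C = {delta_permil(0.010878, R_VPDB_13C):+.1f} permil")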

  14. Software criticality analysis of COTS/SOUP

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-09-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  15. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  16. DeuteRater: a tool for quantifying peptide isotope precision and kinetic proteomics.

    Science.gov (United States)

    Naylor, Bradley C; Porter, Michael T; Wilson, Elise; Herring, Adam; Lofthouse, Spencer; Hannemann, Austin; Piccolo, Stephen R; Rockwood, Alan L; Price, John C

    2017-05-15

    Using mass spectrometry to measure the concentration and turnover of the individual proteins in a proteome enables the calculation of individual synthesis and degradation rates for each protein. Software to analyze concentration is readily available, but software to analyze turnover is lacking. Data analysis workflows typically don't access the full breadth of information about instrument precision and accuracy that is present in each peptide isotopic envelope measurement. This method utilizes both isotope distribution and changes in neutromer spacing, which benefits the analysis of both concentration and turnover. We have developed a data analysis tool, DeuteRater, to measure protein turnover from metabolic D2O labeling. DeuteRater uses theoretical predictions for label-dependent change in isotope abundance and inter-peak (neutromer) spacing within the isotope envelope to calculate protein turnover rate. We have also used these metrics to evaluate the accuracy and precision of peptide measurements and thereby determined the optimal data acquisition parameters of different instruments, as well as the effect of data processing steps. We show that these combined measurements can be used to remove noise and increase confidence in the protein turnover measurement for each protein. Source code and ReadMe for Python 2 and 3 versions of DeuteRater are available at https://github.com/JC-Price/DeuteRater . Data are at https://chorusproject.org/pages/index.html (project number 1147). Critical intermediate calculation files are provided as Tables S3 and S4. The software has only been tested on Windows machines. jcprice@chem.byu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
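
    Turnover rates of the kind DeuteRater reports are commonly obtained by fitting the labeled (newly synthesized) fraction of each protein to first-order kinetics, f(t) = 1 - exp(-kt). A generic sketch of such a fit (synthetic data; DeuteRater's actual model additionally exploits neutromer spacing):

      # Fit first-order turnover, f(t) = 1 - exp(-k t), to the fraction
      # of newly synthesized protein inferred from D2O labeling time
      # points. Generic illustration with synthetic data; DeuteRater's
      # model additionally exploits neutromer spacing in the envelope.
      import numpy as np
      from scipy.optimize import curve_fit

      def fraction_new(t, k):
          return 1.0 - np.exp(-k * t)

      t_days = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])
      f_obs = np.array([0.00, 0.18, 0.33, 0.55, 0.80, 0.96])

      (k_fit,), _ = curve_fit(fraction_new, t_days, f_obs, p0=[0.1])
      print(f"k = {k_fit:.3f} /day, half-life = {np.log(2) / k_fit:.1f} days")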

  17. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact on missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  18. SWEPP gamma-ray spectrometer system software user's guide

    International Nuclear Information System (INIS)

    Femec, D.A.

    1994-08-01

    The SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurement and Development Unit of the Idaho National Engineering Laboratory to assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP). In addition to determining the concentrations of gamma-ray-emitting radionuclides, the software also calculates attenuation-corrected isotopic mass ratios of specific interest, and provides controls for SGRS hardware as required. This document serves as a user's guide for the data acquisition and analysis software associated with the SGRS system.

  19. Selective laser ionization for mass-spectral isotopic analysis

    International Nuclear Information System (INIS)

    Miller, C.M.; Nogar, N.S.; Downey, S.W.

    1983-01-01

    Resonant enhancement of the ionization process can provide a high degree of elemental selectivity, thus eliminating or drastically reducing the interference problem. In addition, extension of this method to isotopically selective ionization has the potential for greatly increasing the range of isotope ratios that can be determined experimentally. This gain can be realized by reducing or eliminating the tailing of the signal from the high-abundance isotope into that of the low-abundance isotope, augmenting the dispersion of the mass spectrometer. We briefly discuss the hardware and techniques used in both our pulsed and cw RIMS experiments. Results are presented for both cw ionization experiments on Lu/Yb mixtures and spectroscopic studies of multicolor RIMS of Tc. Lastly, we discuss practical limits of cw RIMS analysis in terms of detection limits and measurable isotope ratios.

  20. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  1. Software safety analysis practice in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shyu, S. S.

    2010-10-01

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  2. Advances in isotopic analysis for food authenticity testing

    DEFF Research Database (Denmark)

    Laursen, Kristian Holst; Bontempo, L.; Camin, Federica

    2016-01-01

    Stable isotope analysis has been used for food authenticity testing for more than 30 years and is today being utilized on a routine basis for a wide variety of food commodities. During the past decade, major analytical method developments have been made and the fundamental understanding... authenticity testing is currently developing even further. In this chapter, we aim to provide an overview of the latest developments in stable isotope analysis for food authenticity testing. As several review articles and book chapters have recently addressed this topic, we will primarily focus on relevant... literature from the past 5 years. We will focus on well-established methods for food authenticity testing using stable isotopes but will also include recent methodological developments, new applications, and current and future challenges.

  3. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this medical department, and also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software; this will make the process of calculating teleroentgenograms easier, because methodological points will be placed automatically.

  4. Determination of marble provenance: limits of isotopic analysis

    International Nuclear Information System (INIS)

    Germann, K.; Holzmann, G.; Winkler, F.J.

    1980-01-01

    Provenance determination of Thessalian stelae marbles using C/O isotopic analysis proved to be misleading, as the isotopic composition even in very small quarrying areas is heterogeneous and isotopic coincidence of marbles from very distant sources occurs. Therefore, additional geological features must be taken into consideration, and preference should be given to combinations of both petrographical and geochemical properties. Geological field work to establish the range of possible marble sources and the variability within these sources is one of the prerequisites of provenance studies. (author)

  5. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates

    NARCIS (Netherlands)

    Moerdijk-Poortvliet, Tanja C. W.; Schierbeek, Henk; Houtekamer, Marco; van Engeland, Tom; Derrien, Delphine; Stal, Lucas J.; Boschker, Henricus T. S.

    2015-01-01

    We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ13C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence, although

  6. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates

    NARCIS (Netherlands)

    Moerdijk-Poortvliet, T.C.W.; Schierbeek, H.; Houtekamer, M.; van Engeland, T.; Derrien, D.; Stal, L.J.; Boschker, H.T.S.

    2015-01-01

    We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ13C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence, although

  7. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates

    NARCIS (Netherlands)

    Moerdijk-Poortvliet, T.C.W.; Schierbeek, H.; Houtekamer, M.; van Engeland, T.; Derrien, D.; Stal, L.J.; Boschker, H.T.S.

    2015-01-01

    Rationale: We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ13C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence,

  8. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  9. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

    This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for the analysis and visualization of graphs, and describes the overall architecture of the application and the basic principles of construction and operation of its main modules. Furthermore, a description of the developed graph storage, oriented to the storage and processing of large-scale graphs, is presented. The developed community detection algorithm and the implemented automatic graph layout algorithms are the main functionality of the product. The main advantage of the developed software is high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating with big graphs and have high productivity.

  10. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Background: Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results: We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions: The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
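
    The core calculation in fixed-effects meta-analysis of association summary statistics is inverse-variance weighting of the per-study effect estimates. A minimal sketch of that calculation for one SNP (invented summary statistics; this is not GWAMA code):

      # Fixed-effects inverse-variance meta-analysis for one SNP, the
      # basic operation behind GWAS meta-analysis tools such as GWAMA.
      # Summary statistics below are invented.
      import math

      betas = [0.12, 0.08, 0.15]   # per-study effect estimates
      ses = [0.05, 0.04, 0.06]     # per-study standard errors

      weights = [1.0 / se ** 2 for se in ses]
      beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
      se = math.sqrt(1.0 / sum(weights))
      z = beta / se
      p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value

      print(f"beta={beta:.3f} se={se:.3f} z={z:.2f} p={p:.1e}")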

  11. Isotopic analysis of radioactive waste packages (an inexpensive approach)

    International Nuclear Information System (INIS)

    Padula, D.A.; Richmond, J.S.

    1983-01-01

    A computer printout of the isotopic analysis for all radioactive waste packages containing resins or other aqueous filter media is now required at the disposal sites at Barnwell, South Carolina, and Beatty, Nevada. Richland, Washington, requires an isotopic analysis for all radioactive waste packages. The NRC (Nuclear Regulatory Commission), through 10 CFR 61, will require shippers of radioactive waste to classify and label for disposal all radioactive waste forms. These forms include resins, filters, sludges, and dry active waste (trash). The waste classification is to be based upon 10 CFR 61 (Section 1-7). The isotopes upon which waste classification is to be based are tabulated. 7 references, 8 tables

  12. Gamma-ray spectral analysis software designed for extreme ease of use or unattended operation

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Romine, W.A.

    1993-07-01

    We are developing isotopic analysis software in the Safeguards Technology Program that advances usability in two complementary directions. The first direction is toward graphical user interfaces (GUIs) for very easy-to-use applications. The second is toward a minimal user interface, but with additional features for unattended or fully automatic applications. We are developing a GUI-based spectral viewing engine that is currently running in the MS-Windows environment. We intend to use this core application to provide the common user interface for our data analysis, and subsequently data acquisition and instrument control, applications. We are also investigating sets of cases where the MGA methodology produces reduced-accuracy results, incorrect error estimates, or incorrect results. We try to determine the root cause of each problem and extend the methodology, or replace portions of it, so that MGA will function over a wider domain of analysis without requiring intervention and analysis by a spectroscopist. This effort is necessary for applications where such intervention is inconvenient or impractical.

  13. A new paradigm for the development of analysis software

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2012-01-01

    For the CANDU industry, analysis software is an important tool for scientists and engineers to examine issues related to safety, operation, and design. However, the software quality assurance approach currently used for these tools assumes the software is the delivered product. In this paper, we present a model that shifts the emphasis from software being the end-product to software being support for the end-product, the science. We describe a novel software development paradigm that supports this shift and provides the groundwork for re-examining the quality assurance practices used for analysis software. (author)

  14. Isotopic analysis of uranium by thermionic mass spectrometry

    International Nuclear Information System (INIS)

    Moraes, N.M.P. de.

    1979-01-01

    Uranium isotopic ratio measurements by thermionic mass spectrometry are presented. Emphasis is given to the investigation of the parameters that directly affect the precision and accuracy of the results. Optimized procedures, namely chemical processing, sample loading on the filaments, vaporization, ionization and measurement of the ion currents, are established. Adequate statistical analysis of the data for the calculation of the internal and external variances and the mean standard deviation is presented. These procedures are applied to natural and NBS isotopic-standard uranium samples. The results obtained agree with the certified values within the specified limits. The 235U/238U isotopic ratio values determined for NBS U-500 and for a series of standard samples with variable isotopic composition are used to calculate the mass discrimination factor. [pt

  15. ATTA - A new method of ultrasensitive isotope trace analysis

    International Nuclear Information System (INIS)

    Bailey, K.; Chen, C.Y.; Du, X.; Li, Y.M.; Lu, Z.-T.; O'Connor, T.P.; Young, L.

    2000-01-01

    A new method of ultrasensitive isotope trace analysis has been developed. This method, based on the technique of laser manipulation of neutral atoms, has been used to count individual 85Kr and 81Kr atoms present in a natural krypton gas sample with isotopic abundances in the range of 10^-11 and 10^-13, respectively. This method is free of contamination from other isotopes and elements and can be applied to various different isotope tracers for a wide range of applications. The demonstrated detection efficiency is 1x10^-7. System improvements could increase the efficiency by many orders of magnitude.

  16. Stable isotope analysis of Dacryoconarid carbonate microfossils: a new tool for Devonian oxygen and carbon isotope stratigraphy.

    Science.gov (United States)

    Frappier, Amy Benoit; Lindemann, Richard H; Frappier, Brian R

    2015-04-30

    Dacryoconarids are extinct marine zooplankton known from abundant, globally distributed calcite microfossils in the Devonian, but their shell stable isotope composition has not been previously explored. Devonian stable isotope stratigraphy is currently limited to less common invertebrates or bulk rock analyses of uncertain provenance. As with Cenozoic planktonic foraminifera, isotopic analysis of dacryoconarid shells could facilitate higher-resolution, geographically widespread stable isotope records of paleoenvironmental change, including marine hypoxia events, climate changes, and biocrises. We explored the use of dacryoconarid isotope stratigraphy as a viable method for interpreting paleoenvironments. We applied an established method for determining stable isotope ratios (δ13C, δ18O values) of small carbonate microfossils to very well-preserved dacryoconarid shells. We analyzed individual calcite shells representing five common genera using a Kiel carbonate device coupled to a MAT 253 isotope ratio mass spectrometer. Calcite shell δ13C and δ18O values were compared by taxonomic group, rock unit, and locality. Single dacryoconarid calcite shells are suitable for stable isotope analysis using a Kiel-IRMS setup. The dacryoconarid shell δ13C values (-4.7 to 2.3‰) and δ18O values (-10.3 to -4.8‰) were consistent across taxa, independent of shell size or part, but varied systematically through time. Lower fossil δ18O values were associated with warmer water temperature, and more variable δ13C values were associated with major bioevents. Dacryoconarid δ13C and δ18O values differed from bulk rock carbonate values. Dacryoconarid individual microfossil δ13C and δ18O values are highly sensitive to paleoenvironmental changes, thus providing a promising avenue for stable isotope chemostratigraphy to better resolve regional to global paleoceanographic changes throughout the upper Silurian to the upper Devonian. Our results

  17. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  18. Isotope Ratio Monitoring Gas Chromatography Mass Spectrometry (IRM-GCMS)

    International Nuclear Information System (INIS)

    Freeman, K.H.; Ricci, S.A.; Studley, A.; Hayes, J.M.

    1989-01-01

    On Earth, the C-13 content of organic compounds is depleted by roughly 13 to 23 permil from atmospheric carbon dioxide. This difference is largely due to isotope effects associated with the fixation of inorganic carbon by photosynthetic organisms. If life once existed on Mars, then it is reasonable to expect to observe a similar fractionation. Although the strongly oxidizing conditions on the surface of Mars make preservation of ancient organic material unlikely, carbon-isotope evidence for the existence of life on Mars may still be preserved. Carbon depleted in C-13 could be preserved either in organic compounds within buried sediments, or in carbonate minerals produced by the oxidation of organic material. A technique is introduced for rapid and precise measurement of the C-13 contents of individual organic compounds. A gas chromatograph is coupled to an isotope-ratio mass spectrometer through a combustion interface, enabling on-line isotopic analysis of isolated compounds. The isotope ratios are determined by integration of ion currents over the course of each chromatographic peak. Software incorporates automatic peak determination, corrections for background, and deconvolution of overlapped peaks. Overall performance of the instrument was evaluated by the analysis of a mixture of high-purity n-alkanes of known isotopic composition. Isotopic values measured via IRM-GCMS averaged within 0.55 permil of their conventionally measured values.
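    To illustrate the integration step described above, the following sketch computes a raw 13C/12C ratio from simulated m/z 44 and m/z 45 ion-current traces, with a linear background interpolated across the peak window. It is a minimal illustration with invented signals and peak limits, not the instrument software described in the abstract.

```python
import numpy as np

def peak_ratio(t, i44, i45, t_start, t_end):
    """Integrate background-corrected m/z 44 and m/z 45 ion currents
    over one chromatographic peak and return the raw 45/44 area ratio."""
    inside = (t >= t_start) & (t <= t_end)
    dt = t[1] - t[0]                      # uniform time base assumed
    areas = []
    for trace in (i44, i45):
        # Linear baseline interpolated from the signal flanking the peak.
        left = trace[t < t_start][-5:].mean()
        right = trace[t > t_end][:5].mean()
        baseline = np.linspace(left, right, inside.sum())
        areas.append(np.sum(trace[inside] - baseline) * dt)
    return areas[1] / areas[0]

# Synthetic example: a Gaussian peak riding on a slowly drifting baseline.
t = np.linspace(0.0, 60.0, 600)
peak = np.exp(-0.5 * ((t - 30.0) / 2.0) ** 2)
i44 = 1000.0 * peak + 5.0 + 0.01 * t      # major isotopologue channel
i45 = 11.2 * peak + 0.06 + 0.0001 * t     # minor isotopologue channel
print(f"raw 45/44 ratio: {peak_ratio(t, i44, i45, 24.0, 36.0):.5f}")
```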

  19. UNiquant, a Program for Quantitative Proteomics Analysis Using Stable Isotope Labeling

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Xin; Tolmachev, Aleksey V.; Shen, Yulei; Liu, Miao; Huang, Lin; Zhang, Zhixin; Anderson, Gordon A.; Smith, Richard D.; Chan, Wing C.; Hinrichs, Steven; Fu, Kai; Ding, Shi-Jian

    2011-03-04

    We present UNiquant, a new software program for analyzing stable isotope labeling (SIL) based quantitative proteomics data. UNiquant surpassed the performance of two other platforms, MaxQuant and Mascot Distiller, using complex proteome mixtures having either known or unknown heavy/light ratios. UNiquant is compatible with a broad spectrum of search engines and SIL methods, providing outstanding peptide pair identification and accurate measurement of the relative peptide/protein abundance.
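    As a rough illustration of the kind of quantitation a SIL pipeline performs downstream of peptide identification, the sketch below estimates one peptide's heavy/light ratio from paired extracted-ion intensities. The data and the regression-through-the-origin estimator are illustrative assumptions, not UNiquant's actual algorithm.

```python
import numpy as np

# Hypothetical extracted-ion intensities for one identified peptide pair
# across consecutive MS1 scans: light (unlabeled) and heavy (SIL) channels.
light = np.array([1.2e5, 4.8e5, 9.1e5, 5.0e5, 1.1e5])
heavy = np.array([0.6e5, 2.5e5, 4.4e5, 2.6e5, 0.5e5])

# Regression through the origin weights intense scans more heavily,
# which suppresses noise from scans where both channels are weak.
ratio = heavy @ light / (light @ light)
print(f"estimated heavy/light ratio: {ratio:.3f}")
```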

  20. Biometrics from the carbon isotope ratio analysis of amino acids in human hair.

    Science.gov (United States)

    Jackson, Glen P; An, Yan; Konstantynova, Kateryna I; Rashaid, Ayat H B

    2015-01-01

    This study compares and contrasts the ability to classify individuals into different grouping factors through either bulk isotope ratio analysis or amino-acid-specific isotope ratio analysis of human hair. Using LC-IRMS, we measured the isotope ratios of 14 amino acids in hair proteins independently, and leucine/isoleucine as a co-eluting pair, to provide 15 variables for classification. Multivariate analysis confirmed that the essential amino acids and non-essential amino acids were mostly independent variables in the classification rules, thereby enabling the separation of dietary factors of isotope intake from intrinsic or phenotypic factors of isotope fractionation. Multivariate analysis revealed at least two potential sources of non-dietary factors influencing the carbon isotope ratio values of the amino acids in human hair: body mass index (BMI) and age. These results provide evidence that compound-specific isotope ratio analysis has the potential to go beyond region-of-origin or geospatial movements of individuals (obtainable through bulk isotope measurements) to the provision of physical and characteristic traits of individuals, such as age and BMI. Further development and refinement, for example to cover genetic, metabolic, disease and hormonal factors, could ultimately be of great assistance in forensic and clinical casework. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
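    A classification exercise of the kind described can be sketched as follows: a linear discriminant rule trained on 15 amino-acid isotope variables and scored by cross-validation. Everything here (data, group labels, effect sizes) is invented for illustration; the abstract does not specify which multivariate procedure the authors used.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Invented delta-13C values (permil) for 15 amino-acid variables per
# hair sample, with a binary grouping factor (e.g. a BMI class).
X = rng.normal(-20.0, 2.0, size=(40, 15))
y = np.repeat([0, 1], 20)
X[y == 1, :5] += 1.5          # impose a group offset on five variables

# Cross-validated accuracy of the linear discriminant classification rule.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```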

  1. Software safety analysis application in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Yih, S.; Wang, L. H.; Liao, B. C.; Lin, J. M.; Kao, T. M.

    2010-01-01

    This work performed a software safety analysis (SSA) in the installation phase of the Lungmen nuclear power plant (LMNPP) in Taiwan, under the cooperation of INER and TPC. The US Nuclear Regulatory Commission (USNRC) requests licensees to perform software safety analysis (SSA) and software verification and validation (SV and V) in each phase of the software development life cycle, per Branch Technical Position (BTP) 7-14. In this work, 37 safety-grade digital instrumentation and control (I and C) systems were analyzed by Failure Mode and Effects Analysis (FMEA), as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The FMEA showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (authors)

  2. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system and will also incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  3. SIMS analysis of isotopic impurities in ion implants

    International Nuclear Information System (INIS)

    Sykes, D.E.; Blunt, R.T.

    1986-01-01

    The n-type dopant species Si and Se used for ion implantation in GaAs are multi-isotopic, with the most abundant isotope not chosen because of potential interferences with residual gases. SIMS analysis of a range of 29 Si implants produced by several designs of ion implanter all showed significant 28 Si impurity with a different depth distribution from that of the deliberately implanted 29 Si isotope. This effect was observed to varying degrees with all fifteen implanters examined and in every 29 Si implant analysed to date; 29 Si + , 29 Si ++ and 30 Si implants all show the same effect. In the case of Se implantation, poor mass resolution results in the implantation of all isotopes with the same implant distribution (i.e. energy), whilst implants carried out with good mass resolution show the implantation of all isotopes with the characteristic lower depth distribution of the impurity isotopes, as found in the Si implants. This effect has also been observed in p-type implants into GaAs (Mg) and for Ga implanted in Si. A tentative explanation of the effect is proposed. (author)

  4. Krypton isotope analysis using near-resonant stimulated Raman spectroscopy

    International Nuclear Information System (INIS)

    Whitehead, C.A.; Cannon, B.D.; Wacker, J.F.

    1994-12-01

    A method for measuring low relative abundances of 85 Kr in one-liter or smaller samples of air has been under development at Pacific Northwest Laboratory. The goal of the Krypton Isotope Laser Analysis (KILA) method is to measure ratios of 10 -10 or less of 85 Kr to the more abundant stable krypton. Mass spectrometry and beta counting are the main competing technologies used in rare-gas trace analysis and are limited in application by such factors as sample size, counting times, and selectivity. The use of high-resolution lasers to probe hyperfine levels to determine isotopic abundance has received much attention recently. In this study, we report our progress on identifying and implementing techniques for trace 85 Kr analysis on small gas samples in a static cell, as well as limitations on sensitivity and selectivity for the technique. High-resolution pulsed and cw lasers are employed in a laser-induced fluorescence technique that preserves the original sample. This technique is based on resonant isotopic depletion spectroscopy (RIDS), in which one isotope is optically depleted while preserving the population of a less abundant isotope. The KILA method consists of three steps. In the first step, the 1s 5 metastable level of krypton is populated via radiative cascade following two-photon excitation of the 2p 6 energy level. Next, using RIDS, the stable krypton isotopes are optically depleted to the ground state through the 1s 4 level, with the bulk of the 85 Kr population being preserved. Finally, the remaining metastable population is probed to determine the 85 Kr concentration. The experimental requirements for each of these steps are outlined below.

  5. U and Pb isotope analysis of uraninite and galena by ion microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Evins, L.Z.; Sunde, T.; Schoeberg, H. [Swedish Museum of Natural History, Stockholm (Sweden). Laboratory for Isotope Geology; Fayek, M. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Geological Sciences

    2001-10-01

    Accurate isotopic analysis of minerals by ion microprobe, or SIMS (Secondary Ion Mass Spectrometry), usually requires a standard to correct for instrumental mass bias effects that occur during analysis. We have calibrated two uraninite crystals and one galena crystal to be used as ion probe standards. As part of this study we describe the analytical procedures and problems encountered while trying to establish fractionation factors for U and Pb isotopes measured in galena and uraninite. Only the intra-element isotopic mass fractionation is considered, not the inter-element fractionation. Galena and uraninite were analysed by TIMS (Thermal Ionisation Mass Spectrometry) prior to SIMS. One uraninite crystal (P88) comes from Sweden and is ca 900 Ma old; the other, from Maine, USA (LAMNH-30222), is ca 350 Ma old. The galena sample comes from the Paleoproterozoic ore district Bergslagen in Sweden. SIMS analyses were performed at two different laboratories: the NORDSIM facility in Stockholm, which has a high-resolution Cameca IMS 1270 ion microprobe, and the Oak Ridge National Laboratory (ORNL) in Tennessee, which has a Cameca IMS 4f ion microprobe. The results show that during the analysis of galena, Pb isotopes fractionate in favour of the lighter isotope by as much as 0.5%/amu. A Pb isotope fractionation factor for uraninite was more difficult to calculate, probably due to the formation of hydride interferences encountered during analysis with the Cameca IMS 1270 ion microprobe. However, drying the sample in vacuum prior to analysis, and using high-energy filtering and a cold trap during analysis, can minimise these hydride interferences. A large fractionation of U isotopes of ca 1.4%/amu in favour of the lighter isotope was calculated for uraninite.

  6. U and Pb isotope analysis of uraninite and galena by ion microprobe

    International Nuclear Information System (INIS)

    Evins, L.Z.; Sunde, T.; Schoeberg, H.; Fayek, M.

    2001-10-01

    Accurate isotopic analysis of minerals by ion microprobe, or SIMS (Secondary Ion Mass Spectrometry), usually requires a standard to correct for instrumental mass bias effects that occur during analysis. We have calibrated two uraninite crystals and one galena crystal to be used as ion probe standards. As part of this study we describe the analytical procedures and problems encountered while trying to establish fractionation factors for U and Pb isotopes measured in galena and uraninite. Only the intra-element isotopic mass fractionation is considered, not the inter-element fractionation. Galena and uraninite were analysed by TIMS (Thermal Ionisation Mass Spectrometry) prior to SIMS. One uraninite crystal (P88) comes from Sweden and is ca 900 Ma old; the other, from Maine, USA (LAMNH-30222), is ca 350 Ma old. The galena sample comes from the Paleoproterozoic ore district Bergslagen in Sweden. SIMS analyses were performed at two different laboratories: the NORDSIM facility in Stockholm, which has a high-resolution Cameca IMS 1270 ion microprobe, and the Oak Ridge National Laboratory (ORNL) in Tennessee, which has a Cameca IMS 4f ion microprobe. The results show that during the analysis of galena, Pb isotopes fractionate in favour of the lighter isotope by as much as 0.5%/amu. A Pb isotope fractionation factor for uraninite was more difficult to calculate, probably due to the formation of hydride interferences encountered during analysis with the Cameca IMS 1270 ion microprobe. However, drying the sample in vacuum prior to analysis, and using high-energy filtering and a cold trap during analysis, can minimise these hydride interferences. A large fractionation of U isotopes of ca 1.4%/amu in favour of the lighter isotope was calculated for uraninite.

  7. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud computing on private and public clouds, for both legacy and contemporary experiments.

  8. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.

  9. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.

  10. Software development for the simulation and design of the cryogenic distillation cascade used for hydrogen isotope separation

    Energy Technology Data Exchange (ETDEWEB)

    Draghia, Mirela Mihaela, E-mail: mirela.draghia@istech-ro.com; Pasca, Gheorghe; Porcariu, Florina

    2016-11-01

    Highlights: • Software for the design and simulation of a cryogenic distillation cascade. • The simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. • Useful information relevant to the ITER Isotope Separation System. - Abstract: The hydrogen isotope separation system (ISS) based on cryogenic distillation is one of the key systems of the fuel cycle of a fusion reactor. As in the ITER ISS, one of the main systems of a Water Detritiation Facility for a CANDU reactor is cryogenic distillation. Development work on CANDU water detritiation has shown that a cascade of four cryogenic distillation columns is required in order to achieve the required decontamination factor of the heavy water and a tritium enrichment of up to 99.9%. This paper presents the results of the design and simulation activities in support of the development of the Cernavoda Tritium Removal Facility (CTRF). Besides the main features of the software developed in house, an introduction to the main issues of a CANDU tritium removal facility relevant to the ITER ISS is provided as well. Based on the input data (e.g., the flow rates, the composition of the gas supplied to the cryogenic distillation cascade, the pressure drop along the column, the liquid inventory), the simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. The approach to the static and dynamic simulation of a cryogenic distillation process is based on the theoretical-plate model, and the calculations are performed incrementally, plate by plate.

  11. Software development for the simulation and design of the cryogenic distillation cascade used for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Draghia, Mirela Mihaela; Pasca, Gheorghe; Porcariu, Florina

    2016-01-01

    Highlights: • Software for the design and simulation of a cryogenic distillation cascade. • The simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. • Useful information relevant to the ITER Isotope Separation System. - Abstract: The hydrogen isotope separation system (ISS) based on cryogenic distillation is one of the key systems of the fuel cycle of a fusion reactor. As in the ITER ISS, one of the main systems of a Water Detritiation Facility for a CANDU reactor is cryogenic distillation. Development work on CANDU water detritiation has shown that a cascade of four cryogenic distillation columns is required in order to achieve the required decontamination factor of the heavy water and a tritium enrichment of up to 99.9%. This paper presents the results of the design and simulation activities in support of the development of the Cernavoda Tritium Removal Facility (CTRF). Besides the main features of the software developed in house, an introduction to the main issues of a CANDU tritium removal facility relevant to the ITER ISS is provided as well. Based on the input data (e.g., the flow rates, the composition of the gas supplied to the cryogenic distillation cascade, the pressure drop along the column, the liquid inventory), the simulation provides the distribution of all the molecular species involved along each cryogenic distillation column, as well as the temperature profile along the columns. The approach to the static and dynamic simulation of a cryogenic distillation process is based on the theoretical-plate model, and the calculations are performed incrementally, plate by plate.
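    The plate-by-plate idea can be illustrated with a much-simplified sketch. The real simulation tracks all the hydrogen isotopologues plus flows, pressure drop and temperature; the version below steps a single binary split through ideal stages at total reflux (the Fenske relation), with an invented feed composition and separation factor.

```python
def plate_profile(x_feed, alpha, n_plates):
    """Ideal-stage profile at total reflux: each theoretical plate
    multiplies the mole ratio of the more volatile isotopologue by the
    elementary separation factor alpha (Fenske relation)."""
    xs = [x_feed]
    for _ in range(n_plates):
        r = alpha * xs[-1] / (1.0 - xs[-1])   # enriched mole ratio
        xs.append(r / (1.0 + r))              # back to a mole fraction
    return xs

# Invented numbers: trace-species mole fraction entering a column with a
# per-stage separation factor of 1.5 and 30 theoretical plates.
profile = plate_profile(x_feed=1e-4, alpha=1.5, n_plates=30)
print(f"mole fraction after 30 plates: {profile[-1]:.3e}")
```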

  12. Food certification based on isotopic analysis, according to the European standards

    International Nuclear Information System (INIS)

    Costinel, Diana; Ionete, Roxana Elena; Vremera, Raluca; Stanciu, Vasile; Iordache, Andreea

    2007-01-01

    Full text: Under current EU research projects, several public research institutions, universities and private companies are collaborating to develop new methods of detecting food adulteration and thereby assessing food safety. The use of mass spectrometry (MS) to determine the ratio of stable isotopes in bio-molecules now provides the means to prove the natural origin of a wide variety of foodstuffs, and therefore to identify fraud and consequently to reject improper products or certify food quality. Isotope analysis has been officially adopted by the EU as a means of controlling the adulteration of some foodstuffs. A network of research organizations developed the use of isotopic analysis to support training and technology transfer and to encourage uptake of the technique. Proficiency-testing schemes were also developed to ensure the correct use of isotopic techniques in national testing laboratories. In addition, ensuring food quality and safety is a requirement that must be fulfilled for integration into the EU. The present paper emphasizes isotopic analysis of D/H, 18 O/ 16 O and 13 C/ 12 C in food (honey, juice, wines) using a new-generation isotope ratio MS, the Finnigan Delta V Plus, coupled to three flexible continuous-flow preparation devices (GasBench II, TC Elemental Analyser and GC-C/TC). (authors)

  13. The Need to Support and Maintain Legacy Software: Ensuring Ongoing Support for the Isotopics Codes

    International Nuclear Information System (INIS)

    Weber, A.-L.; Funk, P.; McGinnis, B.; Vo, D.; Wang, T.-F.; Peerani, P.; Zsigrai, J.; )

    2015-01-01

    For about four decades, gamma evaluation codes for plutonium and uranium isotope abundance measurements have been a key component of international, regional and domestic safeguards inspections. However, the development of these codes still relies upon a very limited number of experts. This led the safeguards authorities to express concerns and to request continuity of knowledge and maintenance capability for the codes. The presentation describes initiatives undertaken in the past ten years to ensure ongoing support for the isotopic codes. As a follow-up to the 2005 international workshop, the IAEA issued a roadmap for future developments of gamma codes, followed by a request for support in this field to several MSSPs (namely JNT A 01684). The international working group on gamma spectrometry techniques for U and Pu isotopics (IWG-GST) was launched by the European, French and US MSSPs in 2007 to respond to the needs expressed by the IAEA and other national or international inspectorates. Its activities started with the organization in 2008 of a workshop on gamma spectrometry analysis codes for U and Pu isotopics. The working group is currently developing an international database of reference spectra that will be made available to the community of users and developers. In parallel, IRSN contributes to JNT A 01684 by advising the IAEA on establishing a procedure for validating a new version of the isotopics codes against the previous version. The most recent initiative, proposed by the IAEA, consists of organizing an inter-comparison exercise to assess the performance of U and Pu isotopics and mass assay techniques based on medium-resolution gamma spectrometry (MRGS). All these initiatives contributed to the continuity of knowledge and maintenance of the gamma isotopic codes, but further efforts are needed to ensure the long-term sustainability of the codes. (author)

  14. Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation

    Science.gov (United States)

    Muhammad, Syahidah; Frew, Russell; Hayman, Alan

    2015-02-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, the alkanes, to generate carbon and hydrogen isotopic data for the compositions of the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H values of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that the diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin, i.e. the very subtle differences in isotopic values between the samples.

  15. Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation

    Directory of Open Access Journals (Sweden)

    Syahidah Akmal Muhammad

    2015-02-01

    Full Text Available Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, the alkanes, to generate carbon and hydrogen isotopic data for the compositions of the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H values of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that the diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin, i.e. the very subtle differences in isotopic values between the samples.

  16. Compound-specific isotope analysis of diesel fuels in a forensic investigation.

    Science.gov (United States)

    Muhammad, Syahidah A; Frew, Russell D; Hayman, Alan R

    2015-01-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, the alkanes, to generate carbon and hydrogen isotopic data for the compositions of the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ(13)C and δ(2)H values of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that the diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin, i.e., the very subtle differences in isotopic values between the samples.
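    The multivariate step described in these records might look like the following sketch: standardize dual-element isotope profiles of an alkane suite, project with PCA, and inspect the grouping. The sample matrix and source offsets are invented, and the papers do not specify which multivariate method was used.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Invented data: rows are diesel samples; columns alternate delta-13C and
# delta-2H values (permil) for a suite of ten n-alkanes (C10..C19).
source_a = rng.normal([-29.0, -110.0] * 10, 0.3, size=(8, 20))
source_b = rng.normal([-28.2, -118.0] * 10, 0.3, size=(8, 20))
X = np.vstack([source_a, source_b])

# Standardize (the C and H delta scales differ) and project onto two PCs;
# samples from the two sources should separate along the first component.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print(scores[:, 0].round(2))
```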

  17. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition and instrument control are performed automatically by a dedicated system computer in real time, with subsequent automatic data reduction and reporting. Separation of the isotopes is achieved by varying the ion-accelerating high voltage under accurate computer control.

  18. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harilal, Sivanandan S.; Brumfield, Brian E.; LaHaye, Nicole L.; Hartig, Kyle C.; Phillips, Mark C.

    2018-04-20

    This review article covers the present status of isotope detection through emission, absorption, and fluorescence spectroscopy of atoms and molecules in a laser-produced plasma formed from a solid sample. A description of the physics behind isotope shifts in atoms and molecules is presented, followed by the physics behind solid sampling of laser ablation plumes, optical methods for isotope measurements, the physical conditions of laser-produced plasma plumes suitable for isotopic analysis, and the current status of the field. Finally, concluding remarks are made on the gaps left by previous works in the literature and on suggestions for future work.

  19. System and method for high precision isotope ratio destructive analysis

    Science.gov (United States)

    Bushaw, Bruce A; Anheier, Norman C; Phillips, Jon R

    2013-07-02

    A system and process are disclosed that provide high accuracy and high precision destructive analysis measurements for isotope ratio determination of relative isotope abundance distributions in liquids, solids, and particulate samples. The invention utilizes a collinear probe beam to interrogate a laser ablated plume. This invention provides enhanced single-shot detection sensitivity approaching the femtogram range, and isotope ratios that can be determined at approximately 1% or better precision and accuracy (relative standard deviation).

  20. Using MASHA+TIMEPIX Setup for Registration Beta Decay Isotopes Produced in Heavy Ion Induced Reactions

    Science.gov (United States)

    Rodin, A. M.; Belozerov, A. V.; Chernysheva, E. V.; Dmitriev, S. N.; Gulyaev, A. V.; Gulyaeva, A. V.; Itkis, M. G.; Novoselov, A. S.; Oganessian, Yu. Ts.; Salamatin, V. S.; Stepantsov, S. V.; Vedeneev, V. Yu.; Yukhimchuk, S. A.; Krupa, L.; Granja, C.; Pospisil, S.; Kliman, J.; Motycak, S.; Sivacek, I.

    2015-06-01

    Radon and mercury isotopes were produced in multi-nucleon transfer (48Ca + 232Th) and complete fusion (48Ca + natNd) reactions, respectively. The isotopes with given masses were detected using two detectors: a multi-strip detector of the well type (made by CANBERRA) and a position-sensitive, quantum-counting hybrid pixel detector of the TIMEPIX type. The isotopes implanted into the detectors then emit alpha and beta particles until reaching long-lived isotopes. The positions of the isotopes, the tracks, and the time and energy of the beta particles were measured and analyzed. New software for particle recognition and analysis of the experimental data was developed and used. It was shown that the MASHA+TIMEPIX setup is a powerful instrument for the investigation of neutron-rich isotopes far from stability limits.

  1. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX have been scientifically validated and are complete. However, the outputs of VSOP and MCNPX are huge, complex text files. During analysis, users need additional tools such as Microsoft Excel to present the results informatively. This research developed user interface software for the outputs of VSOP and MCNPX. The VSOP output is used to support neutronic analysis, and the MCNPX output is used to support burn-up analysis. Software development followed iterative development methods, which allow for revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up and the masses of Pu{sup 239} and Pu{sup 241}. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). The values are visualized graphically to support analysis.

  2. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX have been scientifically validated and are complete. However, the outputs of VSOP and MCNPX are huge, complex text files. During analysis, users need additional tools such as Microsoft Excel to present the results informatively. This research developed user interface software for the outputs of VSOP and MCNPX. The VSOP output is used to support neutronic analysis, and the MCNPX output is used to support burn-up analysis. Software development followed iterative development methods, which allow for revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up and the masses of Pu 239 and Pu 241 . Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). The values are visualized graphically to support analysis.
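    A minimal sketch of this kind of post-processor is shown below: stream a large output file line by line and pull out every k-eff value with a regular expression. The line format and file name are hypothetical; the actual VSOP and MCNPX output layouts differ and are what the described software targets.

```python
import re
from pathlib import Path

# Hypothetical output-line format, e.g. "k-eff = 1.02345"; the real
# VSOP/MCNPX layouts differ.
KEFF_RE = re.compile(r"k-eff\s*=\s*([0-9]+\.[0-9]+)")

def extract_keff(path):
    """Stream a large text output once, collecting every k-eff value;
    streaming avoids loading the whole file into memory."""
    values = []
    with Path(path).open() as fh:
        for line in fh:
            match = KEFF_RE.search(line)
            if match:
                values.append(float(match.group(1)))
    return values

# Example (file name hypothetical):
# print(extract_keff("mcnpx_output.txt"))
```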

  3. Hg stable isotope analysis by the double-spike method.

    Science.gov (United States)

    Mead, Chris; Johnson, Thomas M

    2010-06-01

    Recent publications suggest great potential for the analysis of Hg stable isotope abundances to elucidate sources and/or chemical processes that control the environmental impact of mercury. We have developed a new MC-ICP-MS method for the analysis of mercury isotope ratios using the double-spike approach, in which a solution containing enriched (196)Hg and (204)Hg is mixed with samples and provides a means to correct for instrumental mass bias and most isotopic fractionation that may occur during sample preparation and introduction into the instrument. Even large amounts of isotopic fractionation induced by sample preparation and introduction (e.g., by batch reactors) are corrected for. This may greatly enhance various Hg pre-concentration methods by correcting for minor fractionation that may occur during preparation and by removing the need to demonstrate 100% recovery. Current precision, when ratios are normalized to the daily average, is 0.06 per thousand, 0.06 per thousand, 0.05 per thousand, and 0.05 per thousand (2sigma) for (202)Hg/(198)Hg, (201)Hg/(198)Hg, (200)Hg/(198)Hg, and (199)Hg/(198)Hg, respectively. This is slightly better than previously published methods. Additionally, this precision was attained despite the presence of large amounts of other Hg isotopes (e.g., 5.0 atom percent (198)Hg) in the spike solution; substantially better precision could be achieved if purer (196)Hg were used.
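    A full double-spike reduction solves simultaneously for the natural fractionation, the instrumental mass bias and the spike/sample mixing proportion; that inversion is beyond a short sketch. The snippet below shows only the simpler exponential-law mass-bias step that such schemes build on, with invented ratios and approximate isotope masses.

```python
import numpy as np

def exp_law_corrected(r_meas, masses, r_ref_meas, r_ref_true, ref_masses):
    """Exponential-law correction: r_meas = r_true * (m_num/m_den)**beta.
    beta is derived from a ratio of known composition and then applied
    to the ratio of interest."""
    beta = np.log(r_ref_meas / r_ref_true) / np.log(ref_masses[0] / ref_masses[1])
    return r_meas / (masses[0] / masses[1]) ** beta

# Invented measured ratios; masses are approximate atomic masses (u).
r_202_198 = exp_law_corrected(
    r_meas=2.9820, masses=(201.9706, 197.9668),
    r_ref_meas=0.1582, r_ref_true=0.1601, ref_masses=(195.9658, 203.9734),
)
print(f"mass-bias-corrected 202Hg/198Hg: {r_202_198:.4f}")
```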

  4. Results of Am isotopic ratio analysis in irradiated MOX fuels

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Shin-ichi; Osaka, Masahiko; Mitsugashira, Toshiaki; Konno, Koichi [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Kajitani, Mikio

    1997-04-01

    For the analysis of a small quantity of americium, it is necessary to separate it from curium, which has similar chemical properties. As a chemical separation method for americium and curium, oxidation of americium with pentavalent bismuth and subsequent co-precipitation of trivalent curium with BiPO{sub 4} were applied to analyze americium in irradiated MOX fuels which contained about 30wt% plutonium and 0.9wt% {sup 241}Am before irradiation and were irradiated up to 26.2GWd/t in the experimental fast reactor Joyo. The purpose of this study is to measure the isotopic ratios of americium and to evaluate the change of the isotopic ratios with irradiation. The following results were obtained: (1) The isotopic ratios of americium ({sup 241}Am, {sup 242m}Am and {sup 243}Am) can be analyzed in the MOX fuels by isolating the americium. The isotopic ratios of {sup 242m}Am and {sup 243}Am increase up to 0.62 at% and 0.82 at%, respectively, at maximum burnup. (2) The results of the isotopic analysis indicate that the content of {sup 241}Am decreases, whereas {sup 242m}Am and {sup 243}Am increase linearly with increasing burnup. (author)

  5. RangerMaster trademark: Real-time pattern recognition software for in-field analysis of radiation sources

    International Nuclear Information System (INIS)

    Murray, W.S.; Ziemba, F.; Szluk, N.

    1998-01-01

    RangerMaster trademark is the embedded firmware for Quantrad Sensor's integrated nuclear instrument package, the Ranger trademark. The Ranger trademark, which is both a gamma-ray and neutron detection system, was originally developed at Los Alamos National Laboratory for in situ surveys at the Plutonium Facility to confirm the presence of nuclear materials. The new RangerMaster trademark software expands the library of isotopes and simplifies the operation of the instrument by providing an easy mode suitable for untrained operators. The expanded library of the Ranger trademark now includes the medical isotopes 99 Tc, 201 Tl, 111 In, 67 Ga, 133 Xe, 103 Pd, and 131 I; the industrial isotopes 241 Am, 57 Co, 133 Ba, 137 Cs, 40 K, 60 Co, 232 Th, 226 Ra, and 207 Bi; and the nuclear materials 235 U, 238 U, 233 U, and 239 Pu. To accomplish isotopic identification, a simulated spectrum for each of the isotopes was generated using SYNTH. The SYNTH spectra formed the basis for the knowledge-based expert system and the selection of the regions of interest that are used in the pattern recognition system. The knowledge-based pattern recognition system was tested against actual spectra under field conditions.

  6. RangerMasterTM: real-time pattern recognition software for in-field analysis of radiation sources

    International Nuclear Information System (INIS)

    Murray, W.S.; Ziemba, F.; Szluk, N.

    1998-01-01

    RangerMaster TM is the embedded firmware for Quantrad Sensor's integrated nuclear instrument package, the Ranger TM . The Ranger TM , which is both a gamma-ray and neutron detection system, was originally developed at Los Alamos National Laboratory for in situ surveys at the Plutonium Facility to confirm the presence of nuclear materials. The new RangerMaster TM software expands the library of isotopes and simplifies the operation of the instrument by providing an 'easy' mode suitable for untrained operators. The expanded library of the Ranger TM now includes the medical isotopes 99 Tc, 201 Tl, 111 In, 67 Ga, 133 Xe, 103 Pd, and 131 I; the industrial isotopes 241 Am, 57 Co, 133 Ba, 137 Cs, 40 K, 60 Co, 232 Th, 226 Ra, and 207 Bi; and the nuclear materials 235 U, 238 U, 233 U, and 239 Pu. To accomplish isotopic identification, a simulated spectrum for each of the isotopes was generated using SYNTH. The SYNTH spectra formed the basis for the knowledge-based expert system and the selection of the regions of interest that are used in the pattern recognition system. The knowledge-based pattern recognition system was tested against actual spectra under field conditions. (author)

  7. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that must meet mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and the textual form of the requirements allowed automatic generation of a message flow graph for the (sub)system, called the “big-picture model”. Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could have undermined the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big-picture model” improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  8. Finite mixture models for the computation of isotope ratios in mixed isotopic samples

    Science.gov (United States)

    Koffler, Daniel; Laaha, Gregor; Leisch, Friedrich; Kappel, Stefanie; Prohaska, Thomas

    2013-04-01

    As parameters of the algorithm, the maximum number of ratios and the minimum relative group size of data points belonging to each ratio have to be defined. Computation of the models can be done with statistical software. In this study, Leisch and Grün's flexmix package [2] for the statistical open-source software R was applied. A code example is available in the electronic supplementary material of Kappel et al. [1]. In order to demonstrate the usefulness of finite mixture models in fields dealing with the computation of multiple isotope ratios in mixed samples, a transparent example based on simulated data is presented, and problems regarding small group sizes are illustrated. In addition, the application of finite mixture models to isotope ratio data measured in uranium oxide particles is shown. The results indicate that finite mixture models perform well in computing isotope ratios relative to traditional estimation procedures and can be recommended for more objective and straightforward calculation of isotope ratios in geochemistry than is current practice. [1] S. Kappel, S. Boulyga, L. Dorta, D. Günther, B. Hattendorf, D. Koffler, G. Laaha, F. Leisch and T. Prohaska: Evaluation Strategies for Isotope Ratio Measurements of Single Particles by LA-MC-ICPMS, Analytical and Bioanalytical Chemistry, 2013, accepted for publication 2012-12-18 (doi: 10.1007/s00216-012-6674-3). [2] B. Grün and F. Leisch: Fitting finite mixtures of generalized linear regressions in R. Computational Statistics & Data Analysis, 51(11), 5247-5252, 2007 (doi: 10.1016/j.csda.2006.08.014).
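    The study itself used R's flexmix; as an analogous sketch in Python, the snippet below fits Gaussian mixtures with one to four components to simulated single-particle ratio data and keeps the BIC-best model, mirroring the model-selection idea. All data values are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Invented single-particle 235U/238U ratio estimates from a mixed sample
# containing two uranium populations.
ratios = np.concatenate([
    rng.normal(0.0073, 0.0002, 60),   # natural-like particles
    rng.normal(0.0340, 0.0010, 25),   # enriched particles
]).reshape(-1, 1)

# Fit candidate models with 1..4 components and keep the BIC-best one.
models = [GaussianMixture(k, random_state=0).fit(ratios) for k in range(1, 5)]
best = min(models, key=lambda m: m.bic(ratios))
print("components:", best.n_components)
print("component means:", best.means_.ravel().round(4))
```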

  9. EDS operator and control software

    International Nuclear Information System (INIS)

    Ott, L.L.

    1985-04-01

    The Enrichment Diagnostic System (EDS) was developed at Lawrence Livermore National Laboratory (LLNL) to acquire, display and analyze large quantities of transient data for a real-time Advanced Vapor Laser Isotope Separation (AVLIS) experiment. Major topics discussed in this paper are the EDS operator interface (SHELL) program, the data acquisition and analysis scheduling software, and the graphics software. The workstation concept used in EDS, the software used to configure a user's workstation, and the ownership and management of a diagnostic are described. An EDS diagnostic is a combination of hardware and software designed to study specific aspects of the process. Overall system performance is discussed from the standpoint of scheduling techniques, evaluation tools, optimization techniques, and program-to-program communication methods. EDS is based on a data-driven design which keeps the need to modify software to a minimum. This design requires a fast and reliable database management system. A third-party database management product, Berkeley Software System Database, written explicitly for HP1000s, is used for all EDS databases. All graphics are produced with an in-house graphics product, the Device Independent Graphics Library (DIGLIB). Examples of devices supported by DIGLIB are Versatec printer/plotters, Raster Technologies graphic display controllers, and HP terminals (HP264x and HP262x). The benefits derived from using HP hardware and software, as well as obstacles imposed by the HP environment, are presented in relation to EDS development and implementation.

  10. SWEPP gamma-ray spectrometer system software test plan and report

    International Nuclear Information System (INIS)

    Femec, D.A.

    1994-09-01

    The SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory to assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP). In addition to determining the concentrations of gamma-ray-emitting radionuclides, the software also calculates attenuation-corrected isotopic mass ratios of specific interest, and provides controls for SGRS hardware as required. This document presents the test plan and report for the data acquisition and analysis software associated with the SGRS system
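    The activity-to-mass step such software performs can be sketched in a few lines: convert a net photopeak area into an activity using efficiency, branching ratio and an attenuation correction, then into a mass via specific activity. All numbers below are illustrative placeholders, not SGRS calibration data.

```python
def isotope_mass_g(net_counts, live_time_s, efficiency, branching_ratio,
                   attenuation_factor, specific_activity_bq_per_g):
    """Net peak area -> activity -> mass. attenuation_factor is the
    fraction of photons transmitted through the waste matrix (< 1)."""
    activity_bq = net_counts / (live_time_s * efficiency
                                * branching_ratio * attenuation_factor)
    return activity_bq / specific_activity_bq_per_g

# Illustrative values only.
mass = isotope_mass_g(net_counts=5.2e4, live_time_s=600.0, efficiency=2.1e-3,
                      branching_ratio=0.36, attenuation_factor=0.45,
                      specific_activity_bq_per_g=2.3e9)
print(f"estimated isotope mass: {mass:.3e} g")
```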

  11. Cavity Ring-down Spectroscopy for Carbon Isotope Analysis with 2 μm Diode Laser

    International Nuclear Information System (INIS)

    Hiromoto, K.; Tomita, H.; Watanabe, K.; Kawarabayashi, J.; Iguchi, T.

    2009-01-01

    We have built a prototype for carbon isotope analysis of CO 2 in air, based on CRDS with a 2 μm diode laser. The carbon isotope ratio was obtained as (1.085±0.012)x10 -2 , which agrees, within uncertainty, with the isotope ratio measured by a magnetic sector-type mass spectrometer. We have thus demonstrated carbon isotope analysis based on CRDS with a 2 μm tunable diode laser.
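    The core measurement in CRDS is the ring-down time of each decay transient; absorption follows from the difference between the on-resonance decay rate and the empty-cavity decay rate, and the isotope ratio from the absorptions of a 13CO2 line and a 12CO2 line scaled by their line strengths. The sketch below fits a single simulated transient; all parameter values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def ringdown(t, amplitude, tau, offset):
    """Single-exponential cavity ring-down transient."""
    return amplitude * np.exp(-t / tau) + offset

rng = np.random.default_rng(3)
t = np.linspace(0.0, 50e-6, 500)                      # 50 us record
signal = ringdown(t, 1.0, 8e-6, 0.01) + rng.normal(0.0, 0.005, t.size)

(amp, tau, off), _ = curve_fit(ringdown, t, signal, p0=(1.0, 5e-6, 0.0))

# Absorption coefficient from the decay-rate difference with respect to
# an (assumed) empty-cavity ring-down time tau0.
c = 2.998e8          # speed of light, m/s
tau0 = 12e-6         # hypothetical empty-cavity ring-down time, s
alpha = (1.0 / tau - 1.0 / tau0) / c
print(f"tau = {tau * 1e6:.2f} us, alpha = {alpha:.3e} m^-1")
```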

  12. Existing and emerging technologies for measuring stable isotope labelled retinol in biological samples: isotope dilution analysis of body retinol stores.

    Science.gov (United States)

    Preston, Tom

    2014-01-01

    This paper discusses some of the recent improvements in instrumentation used for stable isotope tracer measurements in the context of measuring retinol stores, in vivo. Tracer costs, together with concerns that larger tracer doses may perturb the parameter under study, demand that ever more sensitive mass spectrometric techniques are developed. GCMS is the most widely used technique. It has high sensitivity in terms of sample amount and uses high resolution GC, yet its ability to detect low isotope ratios is limited by background noise. LCMSMS may become more accessible for tracer studies. Its ability to measure low level stable isotope tracers may prove superior to GCMS, but it is isotope ratio MS (IRMS) that has been designed specifically for low level stable isotope analysis through accurate analysis of tracer:tracee ratios (the tracee being the unlabelled species). Compound-specific isotope analysis, where GC is interfaced to IRMS, is gaining popularity. Here, individual 13C-labelled compounds are separated by GC, combusted to CO2 and transferred on-line for ratiometric analysis by IRMS at the ppm level. However, commercially-available 13C-labelled retinol tracers are 2 - 4 times more expensive than deuterated tracers. For 2H-labelled compounds, GC-pyrolysis-IRMS has now become more generally available as an operating mode on the same IRMS instrument. Here, individual compounds are separated by GC and pyrolysed to H2 at high temperature for analysis by IRMS. It is predicted that GC-pyrolysis-IRMS will facilitate low level tracer procedures to measure body retinol stores, as has been accomplished in the case of fatty acids and amino acids. Sample size requirements for GC-P-IRMS may exceed those of GCMS, but this paper discusses sample preparation procedures and predicts improvements, particularly in the efficiency of sample introduction.
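    At its core, the isotope dilution principle discussed here is a one-line calculation: after the tracer has fully equilibrated with the body pool, the pool size is the dose divided by the measured tracer:tracee ratio. The sketch below shows only that idealized relation with invented numbers; real retinol protocols add corrections for absorption, partitioning and turnover.

```python
def pool_size_mg(dose_mg, tracer_to_tracee_ratio):
    """Idealized isotope dilution: tracee pool = dose / (tracer:tracee),
    assuming complete equilibration and no tracer loss."""
    return dose_mg / tracer_to_tracee_ratio

# Invented example: 1 mg labelled retinol dose; measured ratio 3.3e-3.
print(f"estimated body retinol pool: {pool_size_mg(1.0, 3.3e-3):.0f} mg")
```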

  13. On the interference of Kr during carbon isotope analysis of methane using continuous-flow combustion–isotope ratio mass spectrometry

    NARCIS (Netherlands)

    Schmitt, J.; Seth, B.; Bock, M; van der Veen, C.; Möller, L.; Sapart, C.J.; Prokopiou, M.; Sowers, T.; Röckmann, T.; Fischer, H

    2013-01-01

    Stable carbon isotope analysis of methane (δ13C of CH4) on atmospheric samples is one key method to constrain the current and past atmospheric CH4 budget. A frequently applied measurement technique is gas chromatography (GC) isotope ratio mass spectrometry (IRMS) coupled to a combustion unit.

  14. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  15. Applications of stable isotope analysis in foodstuffs surveillance and environmental research

    International Nuclear Information System (INIS)

    Pichlmayer, F.; Blochberger, F.

    1991-12-01

    The instrumental coupling of elemental analysis and mass spectrometry, constituting a convenient tool for isotope ratio measurements of the bioelements in solid or liquid samples, is now well established. Advantages of this technique compared with the previously usual wet-chemistry sample preparation are speed of analysis, easy operation and lower sample consumption. The performance of the system is described and some applications are given. Detection of foodstuff adulteration is mainly based on the natural carbon isotope differences between C 3 - and C 4 -plants. In the field of environmental research, the existing small isotopic variations of carbon, nitrogen and sulfur in nature, which depend on substance origin and history, are used as an intrinsic signature of the sample under consideration. Examples of source apportionment or exclusion with the help of this natural isotopic tracer method are dealt with. (authors)

  16. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software system for predicting the fatigue lives of mechanical components and structures was developed. The software has several characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system which can assist users in obtaining more appropriate results. The software can be used in an environment consisting of commercial finite element packages. Using the software, fatigue analyses of a SAE keyhole specimen and an automobile knuckle were carried out. The results were observed to agree well with those from commercial packages.

  17. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    Science.gov (United States)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry's emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS (International Monitoring System) along with possible treaty violations. Another civil source of radioxenon emissions which contributes to the global background is radiopharmaceutical production companies. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks, as well as at the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high-resolution gamma spectra were collected every 15 minutes using an HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon approximately 200 km from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, the spectrum was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity together with other spectrum parameters were saved into the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful to identify the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity

  18. Elementary study on γ analysis software for low level measurement

    International Nuclear Information System (INIS)

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulties of using popular γ analysis software for low-level measurement are discussed. The ROI report file of the ORTEC operating system was chosen as the interface file for writing γ analysis software for low-level measurement. The authors give a software flowchart and an application example and discuss the remaining problems.

  19. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure are proposed.

  20. Potential of isotope analysis (C, Cl) to identify dechlorination mechanisms

    Science.gov (United States)

    Cretnik, Stefan; Thoreson, Kristen; Bernstein, Anat; Ebert, Karin; Buchner, Daniel; Laskov, Christine; Haderlein, Stefan; Shouakar-Stash, Orfan; Kliegman, Sarah; McNeill, Kristopher; Elsner, Martin

    2013-04-01

    Chloroethenes are commonly used in industrial applications, and detected as carcinogenic contaminants in the environment. Their dehalogenation is of environmental importance in remediation processes. A frequently encountered problem, however, is the accumulation of toxic degradation products such as cis-dichloroethylene (cis-DCE) at contaminated sites. Several studies have addressed the reductive dehalogenation reactions using biotic and abiotic model systems, but a crucial question in this context has remained open: Do environmental transformations occur by the same mechanism as in their corresponding in vitro model systems? The presented study shows the potential to close this research gap using the latest developments in compound-specific chlorine isotope analysis, which make it possible to routinely measure chlorine isotope fractionation of chloroethenes in environmental samples and complex reaction mixtures.1,2 In particular, such chlorine isotope analysis enables the measurement of isotope fractionation for two elements (i.e., C and Cl) in chloroethenes. When isotope values of both elements are plotted against each other, different slopes reflect different underlying mechanisms and are remarkably insensitive towards masking. Our results suggest that different microbial strains (G. lovleyi strain SZ, D. hafniense Y51) and the isolated cofactor cobalamin employ similar mechanisms of reductive dechlorination of TCE. In contrast, evidence for a different mechanism was obtained with cobaloxime, cautioning against its use as a model for biodegradation. The study shows the potential of the dual isotope approach as a tool to directly compare transformation mechanisms of environmental scenarios, biotic transformations, and their putative chemical lab scale systems. Furthermore, it serves as an essential reference when using the dual isotope approach to assess the fate of chlorinated compounds in the environment.
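    The dual-isotope evaluation described above amounts to regressing the chlorine isotope signature against the carbon one; the slope characterizes the mechanism and is insensitive to masking. A minimal sketch, with invented δ13C/δ37Cl pairs standing in for measured data:

```python
import numpy as np

# Invented isotope signatures (per mil) along a degradation time series.
delta13C = np.array([-30.0, -28.5, -26.9, -25.2, -23.8])
delta37Cl = np.array([-1.0, -0.4, 0.3, 1.0, 1.6])

# The dual-isotope slope: different mechanisms give different slopes.
slope, intercept = np.polyfit(delta13C, delta37Cl, 1)
print(f"dual-isotope slope = {slope:.2f}")
```

    Comparing such slopes between field samples, microbial cultures and model systems (e.g. cobalamin versus cobaloxime) is the comparison made in the study.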

  1. Software hazard analysis for nuclear digital protection system by Colored Petri Net

    International Nuclear Information System (INIS)

    Bai, Tao; Chen, Wei-Hua; Liu, Zhen; Gao, Feng

    2017-01-01

    Highlights: •A dynamic hazard analysis method is proposed for safety-critical software. •The mechanism relies on Colored Petri Net. •Complex interactions between software and hardware are captured properly. •Common failure modes in software are identified effectively. -- Abstract: The software safety of a nuclear digital protection system is critical for the safety of nuclear power plants, as any software defect may result in severe damage. In order to ensure the safety and reliability of safety-critical digital system products and their applications, software hazard analysis is required to be performed during the software development lifecycle. A dynamic software hazard modeling and analysis method based on Colored Petri Net is proposed and applied to the safety-critical control software of a nuclear digital protection system in this paper. The analysis results show that the proposed method can explain the complex interactions between software and hardware and identify potential common-cause failures in software properly and effectively. Moreover, the method can find the dominant software-induced hazards to safety control actions, which aids in increasing software quality.

  2. Suma-alpha software description. Study of its applications to detection problems and environmental radioactivity measurements

    International Nuclear Information System (INIS)

    Gasco, C.; Perez, C.

    2010-01-01

    The suma-espectros software has been developed by TECNASA/CIEMAT to add counts automatically from alpha spectra, energy by energy, with the purpose of evaluating the real background of alpha spectrometers, studying its temporal variations, and increasing the possibilities of detecting isotopes that would otherwise be missed because of the elapsed measurement time, among other applications. The programme is written in Visual Basic and can export data to Excel spreadsheets for later treatment. By default the software uses a fixed channel range for adding the counts energy by energy, but it can be adapted to the analysis of different isotopes and backgrounds simply by changing a text file incorporated into the programme. The management of the programme is described so that users can apply it immediately. The software has the advantage of emitting a summed spectrum in cnf format that can be used by Alpha Analyst (Genie 2K) for deconvoluting spectra or performing calculations. (Author) 3 refs.
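    The core operation, adding many spectra channel by channel, is simple to state precisely. A minimal sketch, assuming one count per line in plain-text spectra (the cnf input/output handling of the actual programme is not reproduced):

```python
import numpy as np

def sum_spectra(paths, n_channels=1024):
    # Add counts energy by energy (channel by channel) over many spectra.
    total = np.zeros(n_channels, dtype=np.int64)
    for path in paths:
        counts = np.loadtxt(path, dtype=np.int64)
        total += counts[:n_channels]
    return total

# summed = sum_spectra(["bkg_001.txt", "bkg_002.txt"])  # hypothetical files
```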

  3. Computer codes for problems of isotope and radiation research

    International Nuclear Information System (INIS)

    Remer, M.

    1986-12-01

    A survey is given of computer codes for problems in isotope and radiation research. Altogether 44 codes are described as titles with abstracts. 17 of them are in the INIS scope and are processed individually. The subjects are indicated in the chapter headings: 1) analysis of tracer experiments, 2) spectrum calculations, 3) calculations of ion and electron trajectories, 4) evaluation of gamma irradiation plants, and 5) general software

  4. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods having an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, ranging from business management to the design of spaceships. The popularity and diverse use of the analysis method has led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has aroused the urge to apply the FMEA methodology also to software-based systems. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for the use of reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  5. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitation of these vulnerabilities by attackers costs millions of dollars to businesses and individuals. Unfortunately, the most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  6. Optical spectroscopy versus mass spectrometry: The race for fieldable isotopic analysis

    International Nuclear Information System (INIS)

    Barshick, C.M.; Young, J.P.; Shaw, R.W.

    1995-01-01

    Several techniques have been developed to provide on-site isotopic analyses, including decay-counting and mass spectrometry, as well as methods that rely on the accessibility of optical transitions for isotopic selectivity (e.g., laser-induced fluorescence and optogalvanic spectroscopy). The authors have been investigating both mass spectrometry and optogalvanic spectroscopy for several years. Although others have considered these techniques for isotopic analysis, the authors have focussed on the use of a dc glow discharge for atomization and ionization, and a demountable discharge cell for rapid sample exchange. The authors' goal is a fieldable instrument that provides useful uranium isotope ratio information

  7. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed "toolbox-equivalent". The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis "toolbox" codes. The implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  8. New developments on COSI6, the simulation software for fuel cycle analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Maryan; Boucher, Lionel [CEA, DEN, DER, SPRC, LECy, Centre de Cadarache, Batiment 230, Saint-Paul-lez-Durance, 13108 (France)

    2009-06-15

    COSI is a code simulating a pool of nuclear electricity generating plants with its associated fuel cycle facilities. This code has been designed to study various short, medium and long term options for the introduction of various types of nuclear reactors and for the use of associated nuclear materials. COSI calculates the mass and the isotopic composition of all the materials, in each part of the nuclear park, at any time. Following the complete renewal of the code in 2006, new developments have been implemented to improve the physical models, to increase user convenience and to enlarge the scope of utilisation. COSI can now be coupled with CESAR 5 (JEFF 2.2: 200 FP available), allowing waste package calculations to be performed: high level waste (glasses), intermediate level waste (compacted waste), liquid and gaseous waste coming from processing, reactor high level waste. Those packages are managed in new kinds of facilities: finished warehouse, interim storage, waste disposal. The calculable features for waste packages are: mass (initial heavy metal), isotopic content, package mass, package volume, number of packages, activity, radiotoxicity, decay heat, necessary area. Developments on COSI are still ongoing. The reactivity equivalence function (based on the Baker formula) is now available: - for the thorium cycle: compositions for [Th+Pu] or [Th+U] can be calculated taking into account all nuclide impacts; - for ADS: compositions for [MA+Pu] can be calculated taking into account all nuclide impacts. The physical model is also being developed: nuclear data (branching ratios, fission energies, fission yields) will be different between thermal reactors and fast reactors. The first step towards proliferation resistance methodology implementation is the evaluation of physical data needed for proliferation evaluation: heating rate from Pu in material (Watt/kg), weight fraction of even isotopes

  9. Direct uranium isotope ratio analysis of single micrometer-sized glass particles

    International Nuclear Information System (INIS)

    Kappel, Stefanie; Boulyga, Sergei F.; Prohaska, Thomas

    2012-01-01

    We present the application of nanosecond laser ablation (LA) coupled to a ‘Nu Plasma HR’ multi-collector inductively coupled plasma mass spectrometer (MC-ICP-MS) for the direct analysis of U isotope ratios in single, 10–20 μm-sized, U-doped glass particles. Method development included studies with respect to (1) external correction of the measured U isotope ratios in glass particles, (2) the applied laser ablation carrier gas (i.e. Ar versus He) and (3) the accurate determination of the lower-abundance 236U/238U isotope ratios (i.e. 10^-5). In addition, a data processing procedure was developed for evaluation of transient signals, which is of potential use for routine application of the developed method. We demonstrate that the developed method is reliable and well suited for determining U isotope ratios of individual particles. Analyses of twenty-eight S1 glass particles, measured under optimized conditions, yielded average biases of less than 0.6% from the certified values for 234U/238U and 235U/238U ratios. Experimental results obtained for 236U/238U isotope ratios deviated by less than -2.5% from the certified values. Expanded relative total combined standard uncertainties U_c (k = 2) of 2.6%, 1.4% and 5.8% were calculated for 234U/238U, 235U/238U and 236U/238U, respectively. - Highlights: ► LA-MC-ICP-MS was fully validated for the direct analysis of individual particles. ► Traceability was established by using an IRMM glass particle reference material. ► Measured U isotope ratios were in agreement with the certified range. ► A comprehensive total combined uncertainty evaluation was performed. ► The analysis of 236U/238U isotope ratios was improved by using a deceleration filter.
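    The expanded relative combined uncertainty U_c(k = 2) quoted above is, in the usual GUM treatment, the quadrature sum of relative standard uncertainty components multiplied by a coverage factor k = 2. A minimal sketch of that arithmetic; the component values are invented, since the record does not list its uncertainty budget:

```python
import math

# Illustrative relative standard uncertainty components (fractions, invented).
components = {
    "counting statistics": 0.006,
    "external isotope-ratio correction": 0.004,
    "blank and background": 0.002,
}

u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined, k = 1
U = 2.0 * u_c                                              # expanded, k = 2
print(f"relative U_c(k=2) = {100.0 * U:.1f}%")
```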

  10. pH-dependent equilibrium isotope fractionation associated with the compound specific nitrogen and carbon isotope analysis of substituted anilines by SPME-GC/IRMS.

    Science.gov (United States)

    Skarpeli-Liati, Marita; Turgeon, Aurora; Garr, Ashley N; Arnold, William A; Cramer, Christopher J; Hofstetter, Thomas B

    2011-03-01

    Solid-phase microextraction (SPME) coupled to gas chromatography/isotope ratio mass spectrometry (GC/IRMS) was used to elucidate the effects of N-atom protonation on the analysis of N and C isotope signatures of selected aromatic amines. Precise and accurate isotope ratios were measured using polydimethylsiloxane/divinylbenzene (PDMS/DVB) as the SPME fiber material at solution pH values that exceeded the pK(a) of the substituted aniline's conjugate acid by two pH units. Deviations of δ(15)N and δ(13)C values from reference measurements by elemental analyzer IRMS were small. Under these conditions, the detection limits for accurate isotope ratio measurements were between 0.64 and 2.1 mg L(-1) for δ(15)N and between 0.13 and 0.54 mg L(-1) for δ(13)C, respectively. Substantial inverse N isotope fractionation was observed by SPME-GC/IRMS as the fraction of protonated species increased with decreasing pH, leading to deviations of -20‰, while the corresponding δ(13)C values were largely invariant. From isotope ratio analysis at different solution pHs and theoretical calculations by density functional theory, we derived equilibrium isotope effects, EIEs, pertinent to aromatic amine protonation of 0.980 and 1.001 for N and C, respectively, which were very similar for all compounds investigated. Our work shows that N-atom protonation can compromise accurate compound-specific N isotope analysis of aromatic amines.
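    The pH dependence described above can be rationalized with a two-pool isotope mass balance: SPME extracts the neutral species, whose δ(15)N is depleted relative to the total as the protonated fraction grows, with the offset set by the EIE. A sketch under stated assumptions (an aniline pK(a) of 4.6 and a total δ(15)N of 0‰ are invented for illustration; the equilibrium treatment is an approximation):

```python
def protonated_fraction(pH, pKa):
    # Henderson-Hasselbalch fraction of the protonated (anilinium) species.
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

def measured_delta15N(pH, pKa, delta_total=0.0, EIE=0.980):
    # delta15N of the neutral fraction that SPME extracts, assuming isotopic
    # equilibrium between the two species. An EIE < 1 enriches the protonated
    # pool in 15N by ~20 per mil, depleting the extracted neutral pool.
    eps = (1.0 / EIE - 1.0) * 1000.0
    x = protonated_fraction(pH, pKa)
    return delta_total - x * eps

for pH in (7.0, 5.0, 4.0, 3.0):          # pKa ~ 4.6 assumed for aniline
    print(pH, round(measured_delta15N(pH, 4.6), 1))
```

    Well below the pK(a) this reproduces the roughly -20‰ deviation reported above.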

  11. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  12. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  13. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a Mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), Cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: - PIA for ISO (1995) - SAS for XMM-Newton (1999) - Hipe for Herschel (2009) - EXIA for EXOSAT (1983) Following goals have guided the architecture: - Support for all operations, post-operations and archive/legacy phases. - Support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon - AWS). - Support for expert users, requiring full capabilities. - Provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt gathered in this project.

  14. Oxygen isotope analysis of plant water without extraction procedure

    International Nuclear Information System (INIS)

    Gan, K.S.; Wong, S.C.; Farquhar, G.D.; Yong, J.W.H.

    2001-01-01

    Isotopic analyses of plant water (mainly xylem, phloem and leaf water) are gaining importance as the isotopic signals reflect plant-environment interactions, affect the oxygen isotopic composition of atmospheric O2 and CO2, and are eventually incorporated into plant organic matter. Conventionally, such isotopic measurements require a time-consuming process of isolating the plant water by azeotropic distillation or vacuum extraction, which does not complement the speed of isotope analysis provided by continuous-flow IRMS (isotope-ratio mass spectrometry), especially when large data sets are needed for statistical calculations in biological studies. Further, a substantial amount of plant material is needed for water extraction, and leaf samples would invariably include unenriched water from the fine veins. To measure sub-microlitre amounts of leaf mesophyll water, a new approach is undertaken in which a small disc of fresh leaf is cut using a specially designed leaf punch and pyrolysed directly in an IRMS. By comparison with results from pyrolysis of the dry matter of the same leaf, the 18O content of leaf water can be determined without extraction from fresh leaves. This method is validated using a range of cellulose-water mixtures to simulate the constituents of fresh leaf. Cotton leaf water δ18O obtained from both methods, fresh leaf pyrolysis and azeotropic distillation, will be compared. The pyrolysis technique provides a robust approach to measure the isotopic content of water or any volatile present in a homogeneous solution or solid hydrous substance.
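    The comparison of fresh-leaf and dry-matter pyrolysis implies a simple two-pool oxygen mass balance from which the leaf-water δ18O can be back-calculated. A minimal sketch, assuming the linear delta approximation and an invented fraction of pyrolysed oxygen originating from water (the record does not quote these numbers):

```python
def leaf_water_d18O(d18O_fresh, d18O_dry, f_water_oxygen):
    # Two-pool oxygen isotope mass balance (linear delta approximation):
    # d_fresh = f * d_water + (1 - f) * d_dry, solved for the water pool.
    f = f_water_oxygen
    return (d18O_fresh - (1.0 - f) * d18O_dry) / f

# Invented illustration: 70% of the pyrolysed oxygen comes from leaf water.
print(round(leaf_water_d18O(12.0, 27.0, 0.70), 1))  # -> 5.6 (per mil)
```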

  15. Maintaining high precision of isotope ratio analysis over extended periods of time.

    Science.gov (United States)

    Brand, Willi A

    2009-06-01

    Stable isotope ratios are reliable and long-lasting process tracers. In order to compare data from different locations or different sampling times at a high level of precision, a measurement strategy must include reliable traceability to an international stable isotope scale via a reference material (RM). Since these international RMs are available in low quantities only, we have developed our own analysis schemes involving laboratory working RMs. In addition, quality assurance RMs are used to control the long-term performance of the delta-value assignments. The analysis schemes allow the construction of quality assurance performance charts over years of operation. In this contribution, the performance of three typical techniques established in IsoLab at the MPI-BGC in Jena is discussed. The techniques are (1) isotope ratio mass spectrometry with an elemental analyser for delta(15)N and delta(13)C analysis of bulk (organic) material, (2) high precision delta(13)C and delta(18)O analysis of CO(2) in clean-air samples, and (3) stable isotope analysis of water samples using a high-temperature reaction with carbon. In addition, reference strategies on a laser ablation system for high-spatial-resolution delta(13)C analysis in tree rings are exemplified briefly.

  16. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  17. Development of isotope dilution gamma-ray spectrometry for plutonium analysis

    Energy Technology Data Exchange (ETDEWEB)

    Li, T.K.; Parker, J.L. (Los Alamos National Lab., NM (United States)); Kuno, Y.; Sato, S.; Kurosawa, A.; Akiyama, T. (Power Reactor and Nuclear Fuel Development Corp., Tokai, Ibaraki (Japan))

    1991-01-01

    We are studying the feasibility of determining the plutonium concentration and isotopic distribution of highly radioactive, spent-fuel dissolver solutions by employing high-resolution gamma-ray spectrometry. The study involves gamma-ray plutonium isotopic analysis for both dissolver and spiked dissolver solution samples, after plutonium is eluted through an ion-exchange column and absorbed in a small resin bead bag. The spike is well characterized, dry plutonium containing ~98% 239Pu. By using measured isotopic information, the concentration of elemental plutonium in the dissolver solution can be determined. Both the plutonium concentration and the isotopic composition of the dissolver solution obtained from this study agree well with values obtained by traditional isotope dilution mass spectrometry (IDMS). Because it is rapid, easy to operate and maintain, and costs less, this new technique could be an alternative method to IDMS for input accountability and verification measurements in reprocessing plants. 7 refs., 4 figs., 4 tabs.
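    The isotope dilution principle behind this measurement can be made concrete with a small sketch: mixing a well-characterized spike with the sample shifts the measured 239Pu abundance, and the unknown plutonium mass follows from a mass balance. The numbers and the mass-fraction simplification below are invented for illustration, not taken from the record:

```python
def sample_pu_mass(spike_mass_mg, a_spike, a_sample, a_blend):
    # Isotope dilution mass balance on the 239Pu abundance (as mass
    # fractions): m_x = m_s * (a_s - a_b) / (a_b - a_x).
    return spike_mass_mg * (a_spike - a_blend) / (a_blend - a_sample)

# Invented example: ~98% 239Pu spike, sample at 60% 239Pu, blend measured
# at 80% by gamma-ray isotopic analysis.
m_x = sample_pu_mass(1.00, a_spike=0.98, a_sample=0.60, a_blend=0.80)
print(f"Pu in aliquot: {m_x:.2f} mg")  # -> 0.90 mg
```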

  18. Isotopic analysis of boron by thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Kakazu, M.H.; Sarkis, J.E.S.; Souza, I.M.S.

    1991-07-01

    This paper presents a methodology for the isotopic analysis of boron by the thermal ionization mass spectrometry technique, through ion intensity measurements of Na2BO2+ in H3BO3, B° and B4C. The samples were loaded on single tantalum filaments by different methods. In the case of H3BO3, neutralization with NaOH was used; for B4C, alkaline fusion with Na2CO3; and for B°, dissolution with a 1:1 nitric-sulfuric acid mixture followed by neutralization with NaOH. The isotopic ratio measurements were obtained with a Faraday cup detector, with an external precision of ±0.4% and an accuracy of ±0.1% relative to the H3BO3 isotopic standard NBS 951. The effects of isotopic fractionation were studied as a function of time during the analyses and of the different chemical forms of deposition. (author)

  19. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-10-01

    Full Text Available The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between orthogonality and the complexity of homogeneous software components is analyzed. Groups of similar software products are then built, belonging to the orthogonality intervals. The results of the analysis are presented in graphical form. Aspects of the functioning of the software product allocated for the orthogonality are detailed.

  20. Enhanced forensic discrimination of pollutants by position-specific isotope analysis using isotope ratio monitoring by (13)C nuclear magnetic resonance spectrometry.

    Science.gov (United States)

    Julien, Maxime; Nun, Pierrick; Höhener, Patrick; Parinet, Julien; Robins, Richard J; Remaud, Gérald S

    2016-01-15

    In forensic environmental investigations the main issue concerns inferring the original source of the pollutant in order to determine the liable party. Isotope measurements in geochemistry, combined with complementary techniques for contaminant identification, have contributed significantly to source determination at polluted sites. In this work we have determined the intramolecular (13)C profiles of several molecules well known as pollutants. By giving additional analytical parameters, position-specific isotope analysis performed by isotope ratio monitoring by (13)C nuclear magnetic resonance (irm-(13)C NMR) spectrometry gives new information to help in answering the major question: what is the origin of the detected contaminant? We have shown that isotope profiling of the core of a molecule reveals both the raw materials and the process used in its manufacture. It can also reveal processes occurring between the contamination site 'source' and the sampling site. Thus, irm-(13)C NMR is shown to be a very good complement to compound-specific isotope analysis currently performed by mass spectrometry for assessing polluted sites involving substantial spills of pollutant. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Characterization of wines according the geographical origin by analysis of isotopes and minerals and the influence of harvest on the isotope values.

    Science.gov (United States)

    Dutra, S V; Adami, L; Marcon, A R; Carnieli, G J; Roani, C A; Spinelli, F R; Leonardelli, S; Vanderlinde, R

    2013-12-01

    We studied Brazilian wines produced by microvinification from Cabernet Sauvignon and Merlot grapes, vintages 2007 and 2008, from the Serra Gaúcha, Campanha and Serra do Sudeste regions, in order to differentiate them according to geographical origin by using isotope and mineral element analyses. In addition, the influence of the vintage on isotope values was verified. Isotope analysis was performed by isotope ratio mass spectrometry (IRMS), and minerals were determined by flame atomic absorption (FAA) spectrometry. The best parameters to classify the wines in the 2008 vintage were Rb and Li. The results of the δ(13)C of wine ethanol, Rb and Li showed a significant difference between the varieties regardless of the region studied. The δ(18)O values of water and δ(13)C of ethanol showed significant differences, regardless of the variety. Discriminant analysis of isotope and mineral values allowed approximately 80% of the wines from the three regions studied to be classified correctly. Copyright © 2013 Elsevier Ltd. All rights reserved.
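    As an illustration of the discriminant analysis step, the sketch below fits a linear discriminant classifier to a wines-by-variables matrix and reports cross-validated accuracy. It uses scikit-learn, and the data, variable selection and validation scheme are invented stand-ins (the study's ~80% figure was obtained from its own measurements):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Invented stand-in data: [d18O water, d13C ethanol, Rb, Li] per wine.
X = rng.normal(size=(60, 4))
y = np.repeat(["Serra Gaucha", "Campanha", "Serra do Sudeste"], 20)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.0%}")  # random data -> ~33%
```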

  2. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  3. Isotope analysis of micro metal particles by adopting laser-ablation mass spectrometry

    International Nuclear Information System (INIS)

    Song, Kyu Seok; Ha, Young Kyung; Han, Sun Ho; Park, Yong Joon; Kim, Won Ho

    2005-01-01

    The isotope analysis of microparticles in environmental samples as well as laboratory samples is an important task. Special care is necessary in the particle analysis of swipe samples. Microparticles are normally analyzed either by dissolving them in solvents and adopting conventional analytical methods, or by direct analysis methods such as laser-ablation ICP mass spectrometry (LA-ICP-MS), SIMS and SNMS (sputtered neutral mass spectrometry). But LA-ICP-MS uses a large amount of sample because the laser beam is normally focused tightly on the target particle for complete ablation. SIMS and SNMS utilize ion beams to generate sample ions from the particle; however, the number of ions generated by an ion beam in SIMS is less than 5% of the total generated particles. SNMS is also an excellent analytical technique for particle analysis, but both an ion beam and a frequency-tunable laser system are required for the analysis. Recently, direct analysis of elements as well as isotopes using laser ablation has been recognized as one of the most efficient detection technologies for particle samples. Laser-ablation mass spectrometry requires only one laser source, without frequency tunability, and no sample pretreatment; it is therefore one of the simplest analysis techniques for solid samples as well as particles. In this study, as part of the development of new isotope analysis techniques for particle samples, direct laser ablation was adopted with mass spectrometry. Zinc and gadolinium were chosen as target samples, since these elements have isotopes with minor abundance (0.62% for Zn and 0.2% for Gd). The preliminary results indicate that the isotopes of these two elements can be analyzed to within 10% of natural abundance with good mass resolution by direct laser-ablation mass spectrometry.

  4. High burn-up plutonium isotopic compositions recommended for use in shielding analysis

    International Nuclear Information System (INIS)

    Zimmerman, M.G.

    1977-06-01

    Isotopic compositions for plutonium generated and recycled in LWR's were estimated for use in shielding calculations. The values were obtained by averaging isotopic values from many sources in the literature. These isotopic values should provide the basis for a reasonable prediction of exposure rates from the range of LWR fuel expected in the future. The isotopic compositions given are meant to be used for shielding calculations, and the values are not necessarily applicable to other forms of analysis, such as inventory assessment or criticality safety. 11 tables, 2 figs

  5. Software para análise quantitativa da deglutição Swallowing quantitative analysis software

    Directory of Open Access Journals (Sweden)

    André Augusto Spadotto

    2008-02-01

    Full Text Available OBJECTIVE: The present paper is aimed at introducing a software package to allow a detailed analysis of the swallowing dynamics. MATERIALS AND METHODS: The sample included ten (six male and four female) stroke patients, with mean age of 57.6 years. Swallowing videofluoroscopy was performed and images were digitized for posterior analysis of the pharyngeal transit time with the aid of a chronometer and the software. RESULTS: Differences were observed in the average pharyngeal swallowing transit time as a result of measurements with chronometer and software. CONCLUSION: This software is a useful tool for the analysis of parameters such as swallowing time and speed, allowing a better understanding of the swallowing dynamics, both in the clinical approach of patients with oropharyngeal dysphagia and for scientific research purposes.

  6. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  7. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital instrumentation and control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be incorporated complementarily, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages are in completeness and complexity

  8. High precision analysis of trace lithium isotope by thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Tang Lei; Liu Xuemei; Long Kaiming; Liu Zhao; Yang Tianli

    2010-01-01

    A high-precision method for the analysis of nanogram quantities of lithium by thermal ionization mass spectrometry is developed. Through double-filament measurement, a phosphoric acid ion enhancer and a sample pre-baking technique, the precision of trace lithium analysis is improved. For a 100 ng lithium isotope standard sample, the relative standard deviation is better than 0.086%; for a 10 ng lithium isotope standard sample, it is better than 0.90%. (authors)

  9. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation should be performed for the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, we can analyze the safety of software on the basis of fault tree synthesis. (authors)
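    To make the qualitative/quantitative distinction concrete, here is a minimal fault tree evaluator for AND/OR gates over independent basic events. The tree structure, event names and probabilities are illustrative assumptions, not taken from the paper:

```python
def evaluate(node, p_basic):
    # Recursively evaluate the probability of a fault tree node, assuming
    # statistically independent basic events.
    kind = node[0]
    if kind == "basic":
        return p_basic[node[1]]
    probs = [evaluate(child, p_basic) for child in node[1]]
    if kind == "AND":                     # all children must fail
        out = 1.0
        for p in probs:
            out *= p
        return out
    if kind == "OR":                      # any child failing suffices
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(kind)

# Hypothetical top event: protection signal lost if (logic fault AND
# watchdog fault) OR input-module fault.
tree = ("OR", [("AND", [("basic", "logic"), ("basic", "watchdog")]),
               ("basic", "input_module")])
p = {"logic": 1e-4, "watchdog": 1e-3, "input_module": 1e-5}
print(f"top event probability ~ {evaluate(tree, p):.2e}")
```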

  10. Effective Results Analysis for the Similar Software Products’ Orthogonality

    OpenAIRE

    Ion Ivan; Daniel Milodin

    2009-01-01

    The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between orthogonality and the complexity of homogeneous software components is analyzed. Groups of similar software products are then built, belonging to the orthogonality intervals. The results of the analysis are presented in graphical form. Aspects of the functioning of the software product allocated for the orthogonality are detailed.

  11. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with a user support system that arranges, easily and with high reliability, the tremendous amount of data these packages require. (author) 3 refs., 7 figs., 2 tabs.

  12. Software design specification and analysis (NuFDS) approach for safety-critical software based on programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on the Programmable Logic Controller (PLC). Among the software development phases, the design phase plays an important role in connecting the requirements phase and the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is presented in a straightforward manner. It consists of four major specifications: Database, Software Architecture, System Behavior, and PLC Hardware Configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for the formal design analysis in the NuFDS approach. In addition, for tool support, we are developing the NuSDS tool based on the NuFDS approach, especially for software design specification in the nuclear field.

  13. Isotope analysis of closely adjacent minerals

    International Nuclear Information System (INIS)

    Smith, M.P.

    1990-01-01

    This patent describes a method of determining an indicator of at least one of hydrocarbon formation, migration, and accumulation during mineral development. It comprises: searching for a class of minerals in a mineral specimen comprising more than one class of minerals; identifying in the mineral specimen a target sample of the thus searched for class; directing thermally pyrolyzing laser beam radiation onto surface mineral substance of the target sample in the mineral specimen releasing surface mineral substance pyrolysate gases therefrom; and determining isotope composition essentially of the surface mineral substance from analyzing the pyrolysate gases released from the thus pyrolyzed target sample, the isotope composition including isotope(s) selected from the group consisting of carbon, hydrogen, and oxygen isotopes; determining an indicator of at least one of hydrocarbon formation, migration, and accumulation during mineral development of the target mineral from thus determined isotope composition of surface mineral substance pyrolysate

  14. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

    This User's Manual is an attempt to provide, for teaching and training purposes, a series of well-thought-out demonstration experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  15. Microcalorimeter Q-spectroscopy for rapid isotopic analysis of trace actinide samples

    Energy Technology Data Exchange (ETDEWEB)

    Croce, M.P., E-mail: mpcroce@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM (United States); Bond, E.M.; Hoover, A.S.; Kunde, G.J.; Mocko, V.; Rabin, M.W.; Weisse-Bernstein, N.R.; Wolfsberg, L.E. [Los Alamos National Laboratory, Los Alamos, NM (United States); Bennett, D.A.; Hays-Wehle, J.; Schmidt, D.R.; Ullom, J.N. [National Institute of Standards and Technology, Boulder, CO (United States)

    2015-06-01

    We are developing superconducting transition-edge sensor (TES) microcalorimeters that are optimized for rapid isotopic analysis of trace actinide samples by Q-spectroscopy. By designing mechanically robust TESs and simplified detector assembly methods, we have developed a detector for Q-spectroscopy of actinides that can be assembled in minutes. We have characterized the effects of each simplification and present the results. Finally, we show results of isotopic analysis of plutonium samples with Q-spectroscopy detectors and compare the results to mass spectrometry.

  16. Microcalorimeter Q-spectroscopy for rapid isotopic analysis of trace actinide samples

    International Nuclear Information System (INIS)

    Croce, M.P.; Bond, E.M.; Hoover, A.S.; Kunde, G.J.; Mocko, V.; Rabin, M.W.; Weisse-Bernstein, N.R.; Wolfsberg, L.E.; Bennett, D.A.; Hays-Wehle, J.; Schmidt, D.R.; Ullom, J.N.

    2015-01-01

    We are developing superconducting transition-edge sensor (TES) microcalorimeters that are optimized for rapid isotopic analysis of trace actinide samples by Q-spectroscopy. By designing mechanically robust TESs and simplified detector assembly methods, we have developed a detector for Q-spectroscopy of actinides that can be assembled in minutes. We have characterized the effects of each simplification and present the results. Finally, we show results of isotopic analysis of plutonium samples with Q-spectroscopy detectors and compare the results to mass spectrometry

  17. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision-making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.

  18. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis and, in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL).

  19. Potential application of gas chromatography to the analysis of hydrogen isotopes

    International Nuclear Information System (INIS)

    Warner, D.K.; Sprague, R.E.; Bohl, D.R.

    1976-01-01

    Gas chromatography is used at Mound Laboratory for the analysis of hydrogen isotopic impurities in gas mixtures. This instrumentation was used to study the applicability of the gas chromatography technique to the determination of the major components of hydrogen isotopic gas mixtures. The results of this study, including chromatograms and precision data, are presented

  20. PuMA: the Porous Microstructure Analysis software

    Science.gov (United States)

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed in order to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualizations. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random walk method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as NASA software under a US & Foreign release.
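    The simplest of the listed computations, porosity, is just the void-voxel fraction of a segmented tomography volume. A minimal sketch under stated assumptions (random stand-in data and an assumed grayscale segmentation threshold; this is not PuMA's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
grayscale = rng.integers(0, 256, size=(100, 100, 100))  # stand-in tomography
solid = grayscale >= 128                                # assumed threshold

# Porosity is the fraction of void voxels; its complement is the
# solid volume fraction.
porosity = 1.0 - solid.mean()
print(f"porosity ~ {porosity:.3f}, solid fraction ~ {solid.mean():.3f}")
```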

  1. Discrimination of ginseng cultivation regions using light stable isotope analysis.

    Science.gov (United States)

    Kim, Kiwook; Song, Joo-Hyun; Heo, Sang-Cheol; Lee, Jin-Hee; Jung, In-Woo; Min, Ji-Sook

    2015-10-01

    Korean ginseng is considered a precious health food in Asia. Today, ginseng farms are frequently compromised by pervasive theft. Studies of the characteristics of ginseng according to growth region are therefore required in order to deter ginseng thieves and prevent theft. In this study, 6 regions were selected on the basis of Korean regional criteria (si, gun, gu), and two ginseng farms were randomly selected from each of the 6 regions. Then 4-6 samples of ginseng were acquired from each ginseng farm. The stable isotopic compositions of H, O, C, and N of the collected ginseng samples were analyzed. Differences in the hydrogen isotope ratios could be used to distinguish regions, and differences in the nitrogen isotope ratios yielded characteristic information regarding the farms from which the samples were obtained. Thus, stable isotope values could be used to differentiate samples according to region. Stable isotope analysis therefore serves as a powerful tool to discriminate the regional origin of Korean ginseng samples from across Korea. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, a new software package for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In the artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. For the real signals, the results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to characterise the performance of the proposed software. The preliminary performance was judged satisfactory in that the results fully matched the requirements: in the tests on artificial signals, all simulated events were detected by the software. The performance indexes computed against the obstetricians' evaluations are, on the contrary, less satisfactory: sensitivity was 93%, positive predictive value 82% and accuracy 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
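    For reference, the three indexes quoted above are simple functions of the confusion-matrix counts. The event counts in this sketch are invented to land near the reported values, not taken from the paper:

```python
# Confusion-matrix counts over detected events (invented for illustration).
TP, FP, FN, TN = 93, 20, 7, 10

sensitivity = TP / (TP + FN)                 # detected events / true events
ppv = TP / (TP + FP)                         # true detections / all detections
accuracy = (TP + TN) / (TP + FP + FN + TN)   # correct decisions / all decisions
print(f"sensitivity={sensitivity:.0%}, PPV={ppv:.0%}, accuracy={accuracy:.0%}")
```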

  3. Dissolution of barite for the analysis of strontium isotopes and other chemical and isotopic variations using aqueous sodium carbonate

    Science.gov (United States)

    Breit, G.N.; Simmons, E.C.; Goldhaber, M.B.

    1985-01-01

    A simple procedure for preparing barite samples for chemical and isotopic analysis is described. Sulfate ion in barite, in the presence of high concentrations of aqueous sodium carbonate, is replaced by carbonate. This replacement forms insoluble carbonates with the cations commonly found in barite: Ba, Sr, Ca and Pb. Sulfate released into solution by the carbonate replacement is separated by filtration; the aqueous sulfate can then be reprecipitated for analysis of the sulfur and oxygen isotopes. The cations in the carbonate phase can be dissolved by acidifying the solid residue, and Sr can be separated from the solution for Sr isotope analysis by ion-exchange chromatography. The sodium carbonate used contains amounts of Sr that will affect almost all barite 87Sr/86Sr ratios by less than 0.00001 at 1.95σ of the mean. The procedure is preferred over other techniques used for preparing barite samples for the determination of 87Sr/86Sr ratios because it is simple, rapid and enables simultaneous determination of many compositional parameters on the same material. © 1985.

  4. Physical and Human Controls on the Carbon Composition of Organic Matter in Tropical Rivers: An Integrated Analysis of Landscape Properties and River Isotopic Composition

    Energy Technology Data Exchange (ETDEWEB)

    Ballester, M. V.R.; Victoria, R. L.; Krusche, A. V. [Centro de Energia Nuclear na Agricultura, Universidade de Sao Paulo, Piracicaba (Brazil); Bernardes, M. [Universidade Federal Fluminense, Rio de Janeiro (Brazil); Neill, C.; Deegan, L. [Marine Biological Laboratory, Woods Hole, MA (United States); Richey, J. E. [University of Washington, Seattle, WA (United States)

    2013-05-15

    To evaluate physical and human controls on the carbon composition of organic matter in tropical rivers, we applied an integrated analysis of landscape properties including soil properties, land cover and riverine isotopic composition. Our main objective was to establish the relationship between basin attributes and the forms, fluxes and composition of dissolved and particulate organic matter in river channels. A physical template was developed as a GIS-based comprehensive tool to support the understanding of the biogeochemistry of the surface waters of two tropical rivers: the Ji-Parana (western Amazonia) and the Piracicaba (southeastern Brazil). For each river we divided the basin into drainage units, organized according to river network morphology and degree of land use impact. Each sector corresponded to a sampling point where river isotopic composition was analysed. River sites and basin characteristics were calculated using datasets compiled as layers in the ArcGIS geographical information system and ERDAS IMAGINE image-processing software. Each delineated drainage area was individually characterized in terms of topography, soils, river network and land use. The carbon stable isotopic composition of dissolved organic matter (DOM) and particulate organic matter (POM) was determined at several sites along the main tributaries and small streams. The effects of land use on fluvial carbon composition were quantified by linear regression analysis relating basin cover to river isotopic composition. The results showed that relatively recent land cover changes have already had an impact on the composition of riverine DOM and POM, indicating that, as in natural ecosystems, vegetation plays a key role in the composition of riverine organic matter in agricultural ecosystems. (author)
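
    A minimal sketch of the kind of linear regression described above, relating basin land cover to riverine δ13C; all numbers are hypothetical and the variable names are illustrative, not the authors' data.

```python
import numpy as np

# Hypothetical data: fraction of pasture cover per drainage unit and the
# δ13C of riverine POM measured at the corresponding sampling point.
pasture_fraction = np.array([0.05, 0.20, 0.35, 0.55, 0.70, 0.90])
d13c_pom = np.array([-28.1, -27.0, -25.8, -24.2, -22.9, -21.3])  # ‰ vs VPDB

# Ordinary least-squares fit: d13c = slope * cover + intercept.
slope, intercept = np.polyfit(pasture_fraction, d13c_pom, deg=1)
r = np.corrcoef(pasture_fraction, d13c_pom)[0, 1]
print(f"slope = {slope:.2f} ‰ per unit cover, intercept = {intercept:.2f} ‰, "
      f"r^2 = {r**2:.3f}")
```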

  5. An effective technique for the software requirements analysis of NPP safety-critical systems, based on software inspection, requirements traceability, and formal specification

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Junbeom; Cha, Sung Deok; Yoo, Yeong Jae

    2005-01-01

    A thorough requirements analysis is indispensable for developing and implementing safety-critical software systems such as nuclear power plant (NPP) software systems, because a single error in the requirements can generate serious software faults. However, it is very difficult to completely analyze system requirements. In this paper, an effective technique for software requirements analysis is suggested. For requirements verification and validation (V and V) tasks, our technique uses software inspection, requirements traceability, and formal specification with structural decomposition. Software inspection and requirements traceability analysis are widely considered the most effective software V and V methods. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear field, as in other fields, because of their mathematical nature. In this work, we propose an integrated environment (IE) approach for requirements, which enables easy inspection by combining requirements traceability with effective use of a formal method. The paper also introduces computer-aided tools supporting the IE approach. Called the nuclear software inspection support and requirements traceability (NuSISRT) tool, it incorporates software inspection, requirements traceability, and formal specification capabilities. We designed NuSISRT to partially automate software inspection and the analysis of requirements traceability. In addition, for formal specification and analysis, we used the formal requirements specification and analysis tool for nuclear engineering (NuSRS).

  6. A portable medium-resolution gamma-ray spectrometer and analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Lavietes, A.D.; McQuaid, J.H.; Ruhter, W.D.; Buckley, W.M.; Clark, D-L. [Lawrence Livermore National Lab., CA (United States); Paulus, T.J. [EG and G ORTEC, Oak Ridge, TN (United States)

    1996-07-01

    There is a strong need for portable radiometric instrumentation that can both accurately confirm the presence of nuclear materials and allow isotopic analysis of radionuclides in the field. To fulfill this need, the Safeguards Technology Program at LLNL has developed a hand-held, non-cryogenic, low-power gamma-ray and x-ray measurement and analysis instrument that can both search for and then accurately verify the presence of nuclear materials. We report on the use of cadmium zinc telluride (CZT) detectors, detector electronics, and the new field-portable instrument being developed. We also describe the isotopic analysis that allows enrichment measurements to be made accurately in the field. These systems provide capability for safeguards inspection and verification applications and could find application in counter-smuggling operations.

  7. A portable medium-resolution gamma-ray spectrometer and analysis software

    International Nuclear Information System (INIS)

    Lavietes, A.D.; McQuaid, J.H.; Ruhter, W.D.; Buckley, W.M.; Clark, D-L.; Paulus, T.J.

    1996-07-01

    There is a strong need for portable radiometric instrumentation that can both accurately confirm the presence of nuclear materials and allow isotopic analysis of radionuclides in the field. To fulfill this need, the Safeguards Technology Program at LLNL has developed a hand-held, non-cryogenic, low-power gamma-ray and x-ray measurement and analysis instrument that can both search for and then accurately verify the presence of nuclear materials. We report on the use of cadmium zinc telluride (CZT) detectors, detector electronics, and the new field-portable instrument being developed. We also describe the isotopic analysis that allows enrichment measurements to be made accurately in the field. These systems provide capability for safeguards inspection and verification applications and could find application in counter-smuggling operations.

  8. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. The quality and cost of a software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate software fault trees automatically. NuFTA also generates logical formulae summarizing the causes of each failure, and we plan to make use of these formulae through formal verification techniques.

  9. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  10. Principles of isotopic analysis by mass spectrometry

    International Nuclear Information System (INIS)

    Herrmann, M.

    1980-01-01

    The use of magnetic sector field mass spectrometers in isotopic analysis, especially for nitrogen gas, is outlined. Two measuring methods are pointed out: the scanning mode for significantly enriched samples and the double collector method for samples near the natural abundance of 15N. The calculation formulas are derived and advice is given for corrections. (author)
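
    The record's calculation formulas are not reproduced here, but the standard δ-notation relationships for nitrogen are easy to state; the sketch below assumes the usual atmospheric-N2 reference ratio and is illustrative rather than taken from the paper.

```python
R_AIR_15N = 0.0036765  # 15N/14N of atmospheric N2, the δ15N reference

def delta15N(r_sample: float) -> float:
    """δ15N in ‰ from a measured 15N/14N isotope ratio."""
    return (r_sample / R_AIR_15N - 1.0) * 1000.0

def atom_percent(r_sample: float) -> float:
    """15N abundance in atom % from the same isotope ratio."""
    return 100.0 * r_sample / (1.0 + r_sample)

# An enriched sample, for illustration only.
r = 0.0040
print(f"δ15N = {delta15N(r):.1f} ‰, 15N = {atom_percent(r):.4f} atom %")
```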

  11. BIM Software Capability and Interoperability Analysis: An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  12. Quantifying inter-laboratory variability in stable isotope analysis of ancient skeletal remains.

    Directory of Open Access Journals (Sweden)

    William J Pestle

    Full Text Available Over the past forty years, stable isotope analysis of bone (and tooth) collagen and hydroxyapatite has become a mainstay of archaeological and paleoanthropological reconstructions of paleodiet and paleoenvironment. Despite this method's frequent use across anthropological subdisciplines (and beyond), the present work represents the first attempt at gauging the effects of inter-laboratory variability engendered by differences in a) sample preparation, and b) analysis (instrumentation, working standards, and data calibration). Replicate analyses of a 14C-dated ancient human bone by twenty-one archaeological and paleoecological stable isotope laboratories revealed significant inter-laboratory isotopic variation for both collagen and carbonate. For bone collagen, we found a sizeable range of 1.8‰ for δ13Ccol and 1.9‰ for δ15Ncol among laboratories, but an interpretatively insignificant average pairwise difference of 0.2‰ and 0.4‰ for δ13Ccol and δ15Ncol respectively. For bone hydroxyapatite the observed range increased to a troublingly large 3.5‰ for δ13Cap and 6.7‰ for δ18Oap, with average pairwise differences of 0.6‰ for δ13Cap and a disquieting 2.0‰ for δ18Oap. In order to assess the effects of preparation versus analysis on isotopic variability among laboratories, a subset of the samples prepared by the participating laboratories were analyzed a second time on the same instrument. Based on this duplicate analysis, it was determined that roughly half of the isotopic variability among laboratories could be attributed to differences in sample preparation, with the other half resulting from differences in analysis (instrumentation, working standards, and data calibration). These findings have serious implications for choices made in the preparation and extraction of target biomolecules, the comparison of results obtained from different laboratories, and the interpretation of small differences in bone collagen and hydroxyapatite isotope values

  13. Quantifying inter-laboratory variability in stable isotope analysis of ancient skeletal remains.

    Science.gov (United States)

    Pestle, William J; Crowley, Brooke E; Weirauch, Matthew T

    2014-01-01

    Over the past forty years, stable isotope analysis of bone (and tooth) collagen and hydroxyapatite has become a mainstay of archaeological and paleoanthropological reconstructions of paleodiet and paleoenvironment. Despite this method's frequent use across anthropological subdisciplines (and beyond), the present work represents the first attempt at gauging the effects of inter-laboratory variability engendered by differences in a) sample preparation, and b) analysis (instrumentation, working standards, and data calibration). Replicate analyses of a 14C-dated ancient human bone by twenty-one archaeological and paleoecological stable isotope laboratories revealed significant inter-laboratory isotopic variation for both collagen and carbonate. For bone collagen, we found a sizeable range of 1.8‰ for δ13Ccol and 1.9‰ for δ15Ncol among laboratories, but an interpretatively insignificant average pairwise difference of 0.2‰ and 0.4‰ for δ13Ccol and δ15Ncol respectively. For bone hydroxyapatite the observed range increased to a troublingly large 3.5‰ for δ13Cap and 6.7‰ for δ18Oap, with average pairwise differences of 0.6‰ for δ13Cap and a disquieting 2.0‰ for δ18Oap. In order to assess the effects of preparation versus analysis on isotopic variability among laboratories, a subset of the samples prepared by the participating laboratories were analyzed a second time on the same instrument. Based on this duplicate analysis, it was determined that roughly half of the isotopic variability among laboratories could be attributed to differences in sample preparation, with the other half resulting from differences in analysis (instrumentation, working standards, and data calibration). These findings have serious implications for choices made in the preparation and extraction of target biomolecules, the comparison of results obtained from different laboratories, and the interpretation of small differences in bone collagen and hydroxyapatite isotope values
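
    A minimal sketch of the two summary statistics reported above (inter-laboratory range and average pairwise difference), using hypothetical lab values rather than the study's data.

```python
import numpy as np
from itertools import combinations

# Hypothetical δ13Ccol results (‰) for one bone reported by several labs.
lab_values = np.array([-19.2, -19.5, -18.9, -19.1, -20.0, -19.4])

spread = lab_values.max() - lab_values.min()      # inter-laboratory range
pairwise = [abs(a - b) for a, b in combinations(lab_values, 2)]
mean_pairwise = np.mean(pairwise)                 # average pairwise difference

print(f"range = {spread:.1f} ‰, mean pairwise difference = {mean_pairwise:.2f} ‰")
```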

  14. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, the B/S structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  15. Chromatographic speciation of Cr(III)-species, inter-species equilibrium isotope fractionation and improved chemical purification strategies for high-precision isotope analysis.

    Science.gov (United States)

    Larsen, K K; Wielandt, D; Schiller, M; Bizzarro, M

    2016-04-22

    Chromatographic purification of chromium (Cr), which is required for high-precision isotope analysis, is complicated by the presence of multiple Cr-species with different effective charges in the acid-digested sample aliquots. The differing ion exchange selectivity and sluggish reaction rates of these species can result in incomplete Cr recovery during chromatographic purification. Because of large mass-dependent inter-species isotope fractionation, incomplete recovery can affect the accuracy of high-precision Cr isotope analysis. Here, we demonstrate widely differing cation distribution coefficients of Cr(III)-species (Cr³⁺, CrCl²⁺ and CrCl₂⁺) with equilibrium mass-dependent isotope fractionation spanning a range of ∼1‰/amu and consistent with theory. The heaviest isotopes partition into Cr³⁺, intermediates into CrCl²⁺ and the lightest into CrCl₂⁺/CrCl₃°. Thus, for a typical reported loss of ∼25% Cr (in the form of Cr³⁺) through chromatographic purification, this translates into a 185 ppm/amu offset in the stable Cr isotope ratio of the residual sample. Depending on the validity of the mass-bias correction during isotope analysis, this further results in artificial mass-independent effects in the mass-bias corrected ⁵³Cr/⁵²Cr (μ⁵³Cr* of 5.2 ppm) and ⁵⁴Cr/⁵²Cr (μ⁵⁴Cr* of 13.5 ppm) components used to infer chronometric and nucleosynthetic information in meteorites. To mitigate these fractionation effects, we developed strategic chemical sample pre-treatment procedures that ensure high and reproducible Cr recovery. This is achieved either through 1) effective promotion of Cr³⁺ by >5 days exposure to HNO3-H2O2 solutions at room temperature, resulting in >∼98% Cr recovery for most types of sample matrices tested using a cationic chromatographic retention strategy, or 2) formation of Cr(III)-Cl complexes through exposure to concentrated HCl at high temperature (>120 °C) for several hours, resulting in >97.5% Cr recovery using a

  16. Lead isotopic compositions of environmental certified reference materials for an inter-laboratory comparison of lead isotope analysis

    International Nuclear Information System (INIS)

    Aung, Nyein Nyein; Uryu, Tsutomu; Yoshinaga, Jun

    2004-01-01

    Lead isotope ratios, viz. 207Pb/206Pb and 208Pb/206Pb, of the commercially available certified reference materials (CRMs) issued in Japan are presented with the objective of providing a data set that will be useful for the quality assurance of analytical procedures, instrumental performance and method validation in laboratories involved in environmental lead isotope ratio analysis. The analytical method used in the present study was inductively coupled plasma quadrupole mass spectrometry (ICP-QMS), preceded by acid digestion and with/without chemical separation of lead from the matrix. The precision of the measurements, in terms of the relative standard deviation (RSD) of triplicate analyses, was 0.19% and 0.14% for 207Pb/206Pb and 208Pb/206Pb, respectively. The trueness of the lead isotope ratio measurements of the present study was tested with a few CRMs that have been analyzed by other analytical methods and reported in the literature. The lead isotopic ratios of 18 environmental matrix CRMs (including 6 CRMs analyzed for our method validation) are presented and the distribution of their ratios is briefly discussed. (author)

  17. Analysis of growth and tissue replacement rates by stable sulfur isotope turnover.

    Science.gov (United States)

    Arneson, L. S.; Macko, S. A.; Macavoy, S. E.

    2003-12-01

    Stable isotope analysis has become a powerful tool to study animal ecology. Analysis of stable isotope ratios of elements such as carbon, nitrogen, sulfur, hydrogen and oxygen has been used to trace migratory routes, reconstruct dietary sources and determine the physiological condition of individual animals. The isotopes most commonly used are carbon, due to differential carbon fractionation in C3 and C4 plants, and nitrogen, due to the approximately 3‰ enrichment in 15N per trophic level. Although all cells express sulfur-containing compounds, such as cysteine, methionine, and coenzyme A, the turnover rate of sulfur in tissues has not been examined in most studies, owing to the difficulty in determining the δ34S signature. In this study, we have assessed the rate of sulfur isotopic turnover in mouse tissues following a diet change from a terrestrial (7‰) to a marine (19‰) source. Turnover models reflecting both growth rate and metabolic tissue replacement will be developed for blood, liver, fat and muscle tissues.
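
    The turnover models mentioned above are not specified in the abstract; the sketch below uses the widely cited one-compartment exponential model in which growth and metabolic replacement contribute additively to the turnover rate. That choice, and the rate constants, are assumptions made for illustration.

```python
import numpy as np

def tissue_delta(t_days, d_initial, d_final, k_growth, k_metabolic):
    """Exponential isotope turnover after a diet switch.

    δ(t) = δ_f + (δ_i - δ_f) * exp(-(k_g + k_m) * t): growth (dilution by
    new tissue) and metabolic replacement both drive the tissue toward the
    isotopic value of the new diet.
    """
    return d_final + (d_initial - d_final) * np.exp(
        -(k_growth + k_metabolic) * t_days
    )

# Diet switch from a terrestrial (≈7‰) to a marine (≈19‰) δ34S source;
# the rate constants here are purely illustrative.
t = np.array([0, 10, 30, 60, 120])
print(tissue_delta(t, d_initial=7.0, d_final=19.0,
                   k_growth=0.01, k_metabolic=0.02))
```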

  18. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  19. Design and validation of Segment - freely available software for cardiovascular image analysis

    International Nuclear Information System (INIS)

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-01

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page (http://segment.heiberg.se). Segment

  20. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, such as the residence time distribution (RTD), together with the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for processing data from radiotracer experiments.
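
    One standard piece of RTD data processing such software performs is moment analysis of the measured tracer response; the sketch below is a generic implementation of that analysis, not the package described in the record, and the test curve is synthetic.

```python
import numpy as np

def rtd_moments(t, c):
    """Mean residence time and variance from a measured tracer response.

    E(t) = C(t) / ∫C dt;  τ = ∫ t·E(t) dt;  σ² = ∫ (t - τ)²·E(t) dt
    """
    def integrate(y):  # trapezoidal rule, version-independent
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

    area = integrate(c)
    e = c / area                        # normalised RTD curve E(t)
    tau = integrate(t * e)              # mean residence time
    var = integrate((t - tau) ** 2 * e)
    return tau, var

# Synthetic impulse response of an ideal stirred vessel with τ = 5.
t = np.linspace(0.0, 60.0, 601)
c = np.exp(-t / 5.0)
tau, var = rtd_moments(t, c)
print(f"tau ≈ {tau:.2f}, variance ≈ {var:.2f}")   # ≈ 5 and 25
```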

  1. In Situ Carbon Isotope Analysis by Laser Ablation MC-ICP-MS.

    Science.gov (United States)

    Chen, Wei; Lu, Jue; Jiang, Shao-Yong; Zhao, Kui-Dong; Duan, Deng-Fei

    2017-12-19

    Carbon isotopes have been widely used in tracing a wide variety of geological and environmental processes. The carbon isotope composition of bulk rocks and minerals has conventionally been analyzed by isotope ratio mass spectrometry (IRMS) and, more recently, secondary ionization mass spectrometry (SIMS) has been widely used to determine the carbon isotope composition of carbon-bearing solid materials with good spatial resolution. Here, we present a new method that couples a RESOlution S155 193 nm laser ablation system with a Nu Plasma II MC-ICP-MS, with the aim of measuring carbon isotopes in situ in carbonate minerals (i.e., calcite and aragonite). Under routine operating conditions for δ13C analysis, instrumental bias generally drifts by 0.8‰-2.0‰ in a typical analytical session of 2-3 h. Using a magmatic calcite as the standard, the carbon isotopic composition was determined for a suite of calcite samples with δ13C values in the range of -6.94‰ to 1.48‰. The obtained δ13C data are comparable to IRMS values, and the combined standard uncertainty for magmatic calcite is small enough that laser ablation MC-ICP-MS can serve as an appropriate method to analyze carbon isotopes of carbonate minerals in situ.
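
    One common way to handle the session drift described above is sample-standard bracketing; the sketch below is a generic illustration of that correction, not the authors' reduction scheme, and the ratios and times are invented.

```python
import numpy as np

def bracket_correct(t_sample, r_sample, t_std, r_std, r_true_std):
    """Sample-standard bracketing for drifting instrumental bias.

    The bias factor measured on the standard is interpolated linearly in
    time to each unknown, then divided out of the raw sample ratio.
    """
    bias = np.asarray(r_std) / r_true_std       # measured/true per standard run
    bias_at_sample = np.interp(t_sample, t_std, bias)
    return np.asarray(r_sample) / bias_at_sample

# Invented numbers only: raw 13C/12C ratios drifting over a session.
t_std = [0, 30, 60]                             # minutes
r_std = [0.011237, 0.011243, 0.011248]          # standard measured at t_std
corrected = bracket_correct(t_sample=[15, 45],
                            r_sample=[0.011230, 0.011236],
                            t_std=t_std, r_std=r_std,
                            r_true_std=0.011180)  # assumed known value
print(corrected)
```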

  2. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people adapt to the system, and for standardizing and exchanging software, while preserving the flexibility to allow for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  3. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
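
    For orientation, the sketch below implements a plain 1D global gamma evaluation in the usual Low et al. (1998) formulation; it is a didactic simplification (exhaustive search over 1D profiles) rather than any of the tested packages, and the dose profiles are synthetic.

```python
import numpy as np

def gamma_pass_rate(x, d_ref, d_eval, dta_mm=3.0, dd_percent=3.0):
    """1D global gamma analysis (Low et al. 1998 formulation).

    For each evaluated point the gamma value is the minimum over all
    reference points of sqrt((Δx/DTA)² + (ΔD/ΔD_crit)²); the dose
    criterion is a percentage of the global maximum reference dose.
    """
    dd_crit = dd_percent / 100.0 * d_ref.max()
    gammas = np.empty_like(d_eval)
    for i, (xi, di) in enumerate(zip(x, d_eval)):
        dist = (x - xi) / dta_mm
        dose = (d_ref - di) / dd_crit
        gammas[i] = np.sqrt(dist**2 + dose**2).min()
    return 100.0 * np.mean(gammas <= 1.0)

# Toy profiles: evaluated dose shifted by 1 mm relative to reference.
x = np.arange(0.0, 100.0, 1.0)                   # mm
d_ref = np.exp(-((x - 50.0) / 15.0) ** 2)
d_eval = np.exp(-((x - 51.0) / 15.0) ** 2)
print(f"pass rate = {gamma_pass_rate(x, d_ref, d_eval):.1f}%")
```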

  4. 13C metabolic flux analysis: optimal design of isotopic labeling experiments.

    Science.gov (United States)

    Antoniewicz, Maciek R

    2013-12-01

    Measuring fluxes by 13C metabolic flux analysis (13C-MFA) has become a key activity in chemical and pharmaceutical biotechnology. Optimal design of isotopic labeling experiments is of central importance to 13C-MFA as it determines the precision with which fluxes can be estimated. Traditional methods for selecting isotopic tracers and labeling measurements did not fully utilize the power of 13C-MFA. Recently, new approaches were developed for optimal design of isotopic labeling experiments based on parallel labeling experiments and algorithms for rational selection of tracers. In addition, advanced isotopic labeling measurements were developed based on tandem mass spectrometry. Combined, these approaches can dramatically improve the quality of 13C-MFA results with important applications in metabolic engineering and biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Impact analysis of a hydrogen isotopes container

    International Nuclear Information System (INIS)

    Lee, M. S.; Hwang, C. S.; Jeong, H. S.

    2003-01-01

    A container used for radioactive materials containing hydrogen isotopes is evaluated in view of a hypothetical accident. Computational analysis is a cost-effective tool to minimize testing and streamline the regulatory procedures, and it supports experimental programs to qualify the container for the safe transport of radioactive materials. The numerical analysis of a 9 m free drop onto a flat, unyielding, horizontal surface has been performed using the explicit finite element computer program ABAQUS. In particular, free-drop simulations for a 30° tilted condition are precisely estimated.

  6. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars, cell phones, and even more critical activities like aeronautics and health sciences. In this context software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area in software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software pieces. In many cases different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived to develop tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed prioritising the possibility of adding new specification languages and analysis tools and enabling a synergistic relation of the techniques under a graphical interface satisfying several well-known usability-enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  7. Isotope dilution analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.; Lesny, J.; Korenova, Z.; Klas, J.; Klehr, E.H.

    1986-01-01

    Isotope dilution analysis has been used for the determination of several trace elements - especially metals - in a variety of environmental samples, including aerosols, water, soils, biological materials and geological materials. Variations of the basic concept include classical IDA, substoichiometric IDA, and more recently, sub-superequivalence IDA. Each variation has its advantages and limitations. A periodic chart has been used to identify those elements which have been measured in environmental samples using one or more of these methods. (author)
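
    As a reminder of the arithmetic behind the classical variant mentioned above, the sketch below applies the standard specific-activity dilution equation; the numbers are invented for illustration.

```python
def classical_ida(m_tracer_mg, s_tracer, s_mixture):
    """Classical isotope dilution: unknown mass from the drop in specific activity.

    A known mass of radiotracer (specific activity S_t) is mixed with the
    unknown; total activity is conserved, so m_x = m_t * (S_t / S_m - 1).
    """
    return m_tracer_mg * (s_tracer / s_mixture - 1.0)

# Invented values: 2 mg of tracer at 1000 cpm/mg diluted to 250 cpm/mg.
print(classical_ida(2.0, 1000.0, 250.0))   # 6.0 mg of analyte in the sample
```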

  8. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
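
    A minimal sketch of building a recurrence plot from an execution trace and computing the simplest recurrence measure; the encoding of instructions as numeric IDs and the threshold are assumptions, and this is not the paper's implementation.

```python
import numpy as np

def recurrence_matrix(trace, eps):
    """Binary recurrence plot R[i, j] = 1 iff |x_i - x_j| < eps.

    For an execution trace the samples could be, e.g., numeric opcode IDs
    of the executed assembly instructions.
    """
    x = np.asarray(trace, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(np.uint8)

def recurrence_rate(r):
    """Fraction of recurrent points, one of the simplest RQA measures."""
    return r.mean()

trace = [3, 7, 3, 7, 3, 9, 1, 3, 7, 3]   # toy instruction-ID sequence
r = recurrence_matrix(trace, eps=0.5)
print(recurrence_rate(r))
```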

  9. Peak fitting and identification software library for high resolution gamma-ray spectra

    International Nuclear Information System (INIS)

    Uher, Josef; Roach, Greg; Tickner, James

    2010-01-01

    A new gamma-ray spectral analysis software package is under development in our laboratory. It can be operated as a stand-alone program or called as a software library from Java, C, C++ and MATLAB™ environments. It provides an advanced graphical user interface for data acquisition, spectral analysis and radioisotope identification. The code uses a peak-fitting function that includes peak asymmetry, Compton continuum and flexible background terms. Peak-fitting function parameters can be calibrated as functions of energy. Each parameter can be constrained to improve the fitting of overlapping peaks. All of these features can be adjusted by the user. To assist with peak identification, the code can automatically measure half-lives of single or multiple overlapping peaks from a time series of spectra. It implements library-based peak identification, with options for restricting the search based on radioisotope half-lives and reaction types. The software also improves the reliability of isotope identification by utilizing Monte Carlo simulation results.
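
    As a simplified illustration of what such a peak-fitting function does, the hedged sketch below fits a plain Gaussian on a linear background with SciPy, omitting the asymmetry and Compton terms the record mentions; the spectrum is synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak_model(e, area, centroid, sigma, b0, b1):
    """Gaussian photopeak on a linear background (no tailing term here)."""
    gauss = (area / (sigma * np.sqrt(2 * np.pi))
             * np.exp(-0.5 * ((e - centroid) / sigma) ** 2))
    return gauss + b0 + b1 * e

# Synthetic spectrum region around a 661.7 keV peak.
e = np.linspace(650.0, 675.0, 251)
rng = np.random.default_rng(1)
counts = peak_model(e, 5000.0, 661.7, 0.9, 40.0, -0.02) \
         + rng.normal(0, 5, e.size)

# Initial guesses: centroid at the maximum channel, sigma ~1 keV.
p0 = [1000.0, float(e[np.argmax(counts)]), 1.0, float(counts.min()), 0.0]
popt, pcov = curve_fit(peak_model, e, counts, p0=p0)
print(f"centroid = {popt[1]:.2f} keV, area = {popt[0]:.0f} counts")
```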

  10. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in the PLCs and FPGAs used to develop I and C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and points out that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (Hazard and Operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is a useful technique for applying guide phrases; HAZOP is sometimes used to analyze the safety of software. The NUREG/CR-6430 analysis method has been used for PLC software development in Korean nuclear power plants, where appropriate guide phrases and analysis processes were selected for efficient application, and these studies identified NUREG/CR-6430 as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words, and we compare the two. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP approach using general guide words, but it is sufficiently applicable to the analysis of FPGA software requirements specifications.

  11. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  12. Hanford Isotope Project strategic business analysis Cesium-137 (Cs-137)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    The purpose of this business analysis is to address the beneficial reuse of Cesium 137 (Cs-137) in order to utilize a valuable national asset and possibly save millions of tax dollars. Food irradiation is the front runner application along with other uses. This business analysis supports the objectives of the Department of Energy National Isotope Strategy distributed in August 1994 which describes the DOE plans for the production and distribution of isotope products and services. As part of the Department`s mission as stated in that document. ``The Department of Energy will also continue to produce and distribute other radioisotopes and enriched stable isotopes for medical diagnostics and therapeutics, industrial, agricultural, and other useful applications on a businesslike basis. This is consistent with the goals and objectives of the National Performance Review. The Department will endeavor to look at opportunities for private sector to co-fund or invest in new ventures. Also, the Department will seek to divest from ventures that can more profitably or reliably be operated by the private sector.``

  13. Hanford Isotope Project strategic business analysis Cesium-137 (Cs-137)

    International Nuclear Information System (INIS)

    1995-10-01

    The purpose of this business analysis is to address the beneficial reuse of Cesium-137 (Cs-137) in order to utilize a valuable national asset and possibly save millions of tax dollars. Food irradiation is the front-runner application, along with other uses. This business analysis supports the objectives of the Department of Energy National Isotope Strategy distributed in August 1994, which describes the DOE plans for the production and distribution of isotope products and services as part of the Department's mission as stated in that document: “The Department of Energy will also continue to produce and distribute other radioisotopes and enriched stable isotopes for medical diagnostics and therapeutics, industrial, agricultural, and other useful applications on a businesslike basis. This is consistent with the goals and objectives of the National Performance Review. The Department will endeavor to look at opportunities for the private sector to co-fund or invest in new ventures. Also, the Department will seek to divest from ventures that can more profitably or reliably be operated by the private sector.”

  14. Basic concepts and formulations for isotope geochemical modelling of groundwater systems

    International Nuclear Information System (INIS)

    Kalin, R.M.

    1996-01-01

    This chapter describes the basic chemical principles and methodologies for geochemical models and their use in the field of isotope hydrology. Examples of calculation procedures are given on actual field data. Summary information on available PC software for geochemical modeling is included. The specific software, NETPATH, which can be used for chemical speciation, mass balance and isotope balance along a flow path in groundwater systems, is discussed at some length with an illustrative example of its application to field data. (author). Refs, 14 figs, 15 tabs
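
    As a flavour of the isotope mass-balance arithmetic that codes like NETPATH carry out along a flow path, the sketch below solves the elementary two-end-member mixing equation; the end-member values are illustrative, and real NETPATH models handle many more species and constraints.

```python
def mixing_fraction(delta_mix, delta_a, delta_b):
    """Two-component isotope mass balance along a flow path.

    δ_mix = f·δ_a + (1 - f)·δ_b  →  f = (δ_mix - δ_b) / (δ_a - δ_b)
    """
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Illustrative: groundwater DIC at -12‰ sourced from soil CO2 (-23‰)
# and carbonate mineral dissolution (0‰) end members.
f_soil = mixing_fraction(-12.0, -23.0, 0.0)
print(f"soil-CO2 derived carbon fraction ≈ {f_soil:.2f}")
```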

  15. Basic concepts and formulations for isotope geochemical modelling of groundwater systems

    Energy Technology Data Exchange (ETDEWEB)

    Kalin, R M [The Queen's University, Belfast, Northern Ireland (United Kingdom). Dept. of Civil Engineering

    1996-10-01

    This chapter describes the basic chemical principles and methodologies for geochemical models and their use in the field of isotope hydrology. Examples of calculation procedures are given on actual field data. Summary information on available PC software for geochemical modeling is included. The specific software, NETPATH, which can be used for chemical speciation, mass balance and isotope balance along a flow path in groundwater systems, is discussed at some length with an illustrative example of its application to field data. (author). Refs, 14 figs, 15 tabs.

  16. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remain a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  17. Neutron activation analysis-comparative (NAAC)

    International Nuclear Information System (INIS)

    Zimmer, W.H.

    1979-01-01

    A software system for the reduction of comparative neutron activation analysis data is presented. Libraries are constructed to contain the elemental composition and isotopic nuclear data of an unlimited number of standards. Unknown sample data are ratioed against these standards through standard calibrations. Interfering-peak corrections, second-order activation-product corrections, and deconvolution of multiplets are applied automatically. Passive gamma-energy analysis can be performed with the same software. 3 figures
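
    The comparator principle behind such software reduces to a ratio of decay-corrected specific activities; the sketch below shows that arithmetic under the simplifying assumption of identical irradiation and counting geometry for sample and standard, with invented numbers.

```python
import numpy as np

def comparator_naa(a_sample, m_sample, a_std, m_std, c_std,
                   half_life_s, dt_sample_s, dt_std_s):
    """Comparative NAA: concentration by ratio to a co-irradiated standard.

    Peak areas are decay-corrected back to a common reference time with
    A0 = A * exp(λ * t_decay) before the specific-activity ratio is formed.
    """
    lam = np.log(2.0) / half_life_s
    a0_sample = a_sample * np.exp(lam * dt_sample_s)
    a0_std = a_std * np.exp(lam * dt_std_s)
    return c_std * (a0_sample / m_sample) / (a0_std / m_std)

# Invented numbers for a 198Au comparator (half-life ≈ 2.7 d);
# c_std in µg/g, masses in g, peak areas in counts.
print(comparator_naa(a_sample=15200, m_sample=0.100,
                     a_std=30100, m_std=0.050, c_std=10.0,
                     half_life_s=2.7 * 86400,
                     dt_sample_s=3600, dt_std_s=7200))
```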

  18. Development of design and analysis software for advanced nuclear system

    International Nuclear Information System (INIS)

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, which are necessary software tools and data libraries for advanced nuclear system design and analysis, was developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and related database management. The development of this software series was proposed as an exercise in the development of nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  19. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One response from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach that reduces the need for costly human expertise to perform risk analysis in software, a common requirement in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  20. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    International Nuclear Information System (INIS)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun

    2016-01-01

    We use the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a complete system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: “The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results.” From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects are, to date, still a research problem. Since the success of software development is based on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing in terms of efficiency. Lessons learned and experience from similar systems are important for hazard analysis work. No major hazard has been issued for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  1. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    We use the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a complete system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: “The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results.” From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects are, to date, still a research problem. Since the success of software development is based on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing in terms of efficiency. Lessons learned and experience from similar systems are important for hazard analysis work. No major hazard has been issued for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  2. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Science.gov (United States)

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). The software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinct after phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained by manual cell counting performed by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
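
    For orientation, the two morphometric parameters named above can be computed from per-cell measurements as follows; the definition of polygonality as the percentage of six-sided cells is the common convention and an assumption here, as are the sample values.

```python
import numpy as np

def morphometrics(cell_areas, cell_sides):
    """Two standard corneal endothelium morphometric parameters.

    CV = std/mean of cell areas; polygonality (hexagonality) is usually
    reported as the percentage of six-sided cells.
    """
    areas = np.asarray(cell_areas, dtype=float)
    cv = areas.std(ddof=1) / areas.mean()
    hexagonality = 100.0 * np.mean(np.asarray(cell_sides) == 6)
    return cv, hexagonality

# Hypothetical per-cell areas (µm²) and side counts from one image.
cv, hexa = morphometrics([310, 290, 350, 305, 400, 280],
                         [6, 6, 5, 6, 7, 6])
print(f"CV = {cv:.2f}, hexagonality = {hexa:.0f}%")
```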

  3. Quantitative mass-spectrometric analysis of hydrogen helium isotope mixtures

    International Nuclear Information System (INIS)

    Langer, U.

    1998-12-01

    This work deals with the mass-spectrometric method for the quantitative analysis of hydrogen-helium isotope mixtures, with special attention to fusion plasma diagnostics. The aim was to use low-resolution mass spectrometry, a standard measuring method that is well established in science and industry. This task is solved by means of vector mass spectrometry, in which a mass spectrum is measured repeatedly but with stepwise variation of the parameter settings of a quadrupole mass spectrometer. In this way, interfering mass spectra can be decomposed and, moreover, it is possible to analyze underdetermined mass spectra of complex hydrogen-helium isotope mixtures. Experimental investigations are presented which show that several different parameter settings are suitable for this method. With an optimal choice of parameter settings, hydrogen-helium isotope mixtures can be analyzed with an accuracy of 1-3%. In practice, a low sensitivity for small helium concentrations has to be noted. To cope with this, a method for selective hydrogen pressure reduction has been developed. Experimental investigations and calculations show that small helium amounts (about 1%) in a hydrogen atmosphere can be analyzed with an accuracy of 3-10%. Finally, this work deals with the effects of measuring and calibration errors on the resulting error in spectrum decomposition. This aspect has been investigated both in general mass-spectrometric gas analysis and in the analysis of hydrogen-helium mixtures by means of vector mass spectrometry. (author)

  4. Hydrology of Bishop Creek, California: An Isotopic Analysis

    Science.gov (United States)

    Michael L. Space; John W. Hess; Stanley D. Smith

    1989-01-01

    Five power generation plants along an eleven-kilometer stretch divert Bishop Creek water for hydroelectric power. Stream diversion may be adversely affecting the riparian vegetation. Stable isotopic analysis is employed to determine surface water/groundwater interactions along the creek. Surface water originates primarily from three headwater lakes. Discharge into...

  5. Study of gamma ray analysis software's. Application to activation analysis of geological samples

    International Nuclear Information System (INIS)

    Silva, Luiz Roberto Nogueira da

    1998-01-01

    A comparative evaluation of the gamma-ray analysis software VISPECT, in relation to two commercial gamma-ray analysis software packages, OMNIGAM (EG and G Ortec) and SAMPO 90 (Canberra), was performed. For this evaluation, artificial gamma-ray spectra were created, presenting peaks of different intensities located at four different regions of the spectrum. Multiplet peaks with equal and different intensities, but with different channel separations, were also created. The results obtained showed a good performance of VISPECT in detecting and analysing single and multiplet peaks of different intensities in the gamma-ray spectrum. Neutron activation analysis of the geological reference material GS-N (IWG-GIT) and of the granite G-94, used in a Proficiency Testing Trial of Analytical Geochemistry Laboratories, was also performed, in order to evaluate the VISPECT software in the analysis of real samples. The results obtained using VISPECT were as good as or better than those obtained using the other programs. (author)

  6. Development of interactive software for fuel management analysis

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1986-01-01

    Electronic computation plays a central part in engineering analysis of all types. Utilization of microcomputers for calculations that were formerly carried out on large mainframe computers presents a unique opportunity to develop software that not only takes advantage of the lower cost of using these machines, but also increases the efficiency of the engineers performing these calculations. This paper reviews the use of electronic computers in engineering analysis, discusses the potential for microcomputer utilization in this area, and describes a series of steps to be followed in software development that can yield significant gains in engineering design efficiency

  7. Recent developments in application of stable isotope analysis on agro-product authenticity and traceability.

    Science.gov (United States)

    Zhao, Yan; Zhang, Bin; Chen, Gang; Chen, Ailiang; Yang, Shuming; Ye, Zhihua

    2014-02-15

    With the globalisation of agro-product markets and convenient transportation of food across countries and continents, the potential for distribution of mis-labelled products increases accordingly, highlighting the need for measures to identify the origin of food. High quality food with identified geographic origin is a concern not only for consumers, but also for farmers, retailers and administrative authorities. Currently, stable isotope ratio analysis, in combination with other chemical methods, is gradually becoming a promising approach for agro-product authenticity and traceability. In the last five years, a growing number of research papers have been published on tracing agro-products by stable isotope ratio analysis and by techniques combining it with other instruments. In these reports, the global variation of stable isotope compositions has been investigated, covering light elements such as C, N, H, O and S, as well as heavier isotope systems such as Sr and B. Several factors have also been considered, including latitude, altitude, evaporation and climate conditions. In the present paper, an overview is provided on the authenticity and traceability of agro-products from both animal and plant sources by stable isotope ratio analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Lead isotope ratio analysis of bullet samples by using quadrupole ICP-MS

    International Nuclear Information System (INIS)

    Tamura, Shu-ichi; Hokura, Akiko; Nakai, Izumi; Oishi, Masahiro

    2006-01-01

    The measurement conditions for precise analysis of lead stable isotope ratios using an ICP-MS equipped with a quadrupole mass spectrometer were studied in order to apply the technique to the forensic identification of bullet samples. The values of the relative standard deviation obtained for the 208Pb/206Pb, 207Pb/206Pb and 204Pb/206Pb ratios were lower than 0.2% after optimization of the analytical conditions, including an optimum lead concentration of the sample solution of about 70 ppb and an integration time of 15 s per mass. This method was applied to an analysis of lead in bullets for rifles and handguns; the stable isotope ratios of lead were found to be suitable for the identification of bullets. This study has demonstrated that lead isotope ratios measured using a quadrupole ICP-MS are useful for the practical analysis of bullet samples in forensic science. (author)
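
    For illustration, a minimal sketch of the precision figure quoted above: the relative standard deviation of an isotope ratio computed from replicate sweeps (all counts invented):

    ```python
    # Hedged sketch: RSD of 208Pb/206Pb from synthetic replicate ICP-MS counts.
    import statistics

    pb208 = [52110, 52480, 51960, 52230, 52390]   # counts at m/z 208
    pb206 = [21310, 21450, 21270, 21330, 21410]   # counts at m/z 206

    ratios = [a / b for a, b in zip(pb208, pb206)]
    rsd = statistics.stdev(ratios) / statistics.mean(ratios) * 100
    print(f"208Pb/206Pb = {statistics.mean(ratios):.4f}, RSD = {rsd:.2f}%")
    ```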

  9. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
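
    A schematic illustration of the Map-Reduce structure described above (this is not actual MAUS code; the field names are invented): mappers transform each "spill" record independently, so they parallelize, and a reducer aggregates run-level results:

    ```python
    # Hedged sketch of a map-reduce pipeline over JSON-like spill records.
    from functools import reduce

    spills = [{"tof_hits": 3}, {"tof_hits": 5}, {"tof_hits": 2}]

    def mapper(spill):
        spill["hit_flag"] = spill["tof_hits"] > 2   # per-spill processing step
        return spill

    def reducer(summary, spill):                    # accumulate run statistics
        summary["total_hits"] += spill["tof_hits"]
        return summary

    processed = map(mapper, spills)
    print(reduce(reducer, processed, {"total_hits": 0}))
    ```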

  10. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data, to analyze the data automatically, statistically and graphically, and to study and share the data. Methods: Based on data obtained earlier, the analysis software was written using VC++.NET as the development tool. The software first transfers data from EXCEL into a database. It has an additional data-append function, so operators can incorporate new monitoring data easily. Results: After the monitoring data saved as EXCEL files by the original researchers were transferred into a database, they became easily accessible. The software provides a tool for distribution analysis of tritium. Conclusion: This software is a first attempt at data analysis of tritium levels in food and environmental water in China. Data archiving, searching and analysis become easy and direct with the software. (authors)

  11. Hydrogen isotope analysis by quadrupole mass spectrometry

    International Nuclear Information System (INIS)

    Ellefson, R.E.; Moddeman, W.E.; Dylla, H.F.

    1981-03-01

    The analysis of isotopes of hydrogen (H, D, T) and helium (3He, 4He) and selected impurities using a quadrupole mass spectrometer (QMS) has been investigated as a method of measuring the purity of tritium gas for injection into the Tokamak Fusion Test Reactor (TFTR). A QMS was used at low resolution, m/Δm, to analyze 3He and 4He in HT/D2 mixtures.

  12. Incorporation of a Cuban radiological station to the global net of isotopes in precipitations

    International Nuclear Information System (INIS)

    Dominguez L, O.; Ramos V, E.O.; Prendes A, M.; Alonso A, D.; Caveda R, C.A.

    2006-01-01

    Since March 2002, the West station of the National Network of Environmental Radiological Surveillance, located in the Center of Protection and Hygiene of the Radiations, has belonged to the Global Network of Isotopes in Precipitation. The isotopic information obtained from the analysis of the monthly monitored precipitation samples (oxygen-18, deuterium and tritium) is stored in a database, which is available through the Internet. For acceptance into the Global Network, it was necessary to incorporate surface meteorological variables into the station's monitoring. Software was also developed for calculating water vapour pressure from the measured humidity and temperature values. The results obtained in 2002, recently published, fall within the range of values reported for these isotopes in the Caribbean area. (Author)
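
    The record does not state which formula the station software uses; a hedged sketch with the common Magnus approximation illustrates the vapour-pressure calculation from humidity and temperature:

    ```python
    # Hedged sketch: actual water vapour pressure (hPa) via the Magnus
    # approximation (assumed here; not confirmed by the record).
    import math

    def vapour_pressure_hpa(temp_c, rel_humidity_pct):
        saturation = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
        return saturation * rel_humidity_pct / 100.0

    print(f"{vapour_pressure_hpa(27.0, 80.0):.1f} hPa")  # typical tropical values
    ```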

  13. Optimizing sample pretreatment for compound-specific stable carbon isotopic analysis of amino sugars in marine sediment

    Science.gov (United States)

    Zhu, R.; Lin, Y.-S.; Lipp, J. S.; Meador, T. B.; Hinrichs, K.-U.

    2014-09-01

    Amino sugars are quantitatively significant constituents of soil and marine sediment, but their sources and turnover in environmental samples remain poorly understood. The stable carbon isotopic composition of amino sugars can provide information on the lifestyles of their source organisms and can be monitored during incubations with labeled substrates to estimate the turnover rates of microbial populations. However, until now, such investigation has been carried out only with soil samples, partly because of the much lower abundance of amino sugars in marine environments. We therefore optimized a procedure for compound-specific isotopic analysis of amino sugars in marine sediment, employing gas chromatography-isotope ratio mass spectrometry. The whole procedure consisted of hydrolysis, neutralization, enrichment, and derivatization of amino sugars. Except for the derivatization step, the protocol introduced negligible isotopic fractionation, and the minimum requirement of amino sugar for isotopic analysis was 20 ng, i.e., equivalent to ~8 ng of amino sugar carbon. Compound-specific stable carbon isotopic analysis of amino sugars obtained from marine sediment extracts indicated that glucosamine and galactosamine were mainly derived from organic detritus, whereas muramic acid showed isotopic imprints from indigenous bacterial activities. The δ13C analysis of amino sugars provides a valuable addition to the biomarker-based characterization of microbial metabolism in the deep marine biosphere, which so far has been lipid oriented and biased towards the detection of archaeal signals.

  14. Is it really organic? – Multi-isotopic analysis as a tool to discriminate between organic and conventional plants

    DEFF Research Database (Denmark)

    Laursen, K.H.; Mihailova, A.; Kelly, S.D.

    2013-01-01

    Novel procedures for analytical authentication of organic plant products are urgently needed. Here we present the first study encompassing stable isotopes of hydrogen, carbon, nitrogen, oxygen, magnesium and sulphur as well as compound-specific nitrogen and oxygen isotope analysis of nitrate for discrimination of organically and conventionally grown plants. The study was based on wheat, barley, faba bean and potato produced in rigorously controlled long-term field trials comprising 144 experimental plots. Nitrogen isotope analysis revealed the use of animal manure, but was unable to discriminate between plants that were fertilised with synthetic nitrogen fertilisers or green manures from atmospheric nitrogen fixing legumes. This limitation was bypassed using oxygen isotope analysis of nitrate in potato tubers, while hydrogen isotope analysis allowed complete discrimination of organic and conventional plants.

  15. Isotope analysis by emission spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Artaud, J; Gerstenkorn, S [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires; Blaise, J [Centre National de la Recherche Scientifique (CNRS), Lab. Aime Cotton, 92 - Meudon-Bellevue (France)

    1959-07-01

    Quantitative analysis of isotope mixtures by emission spectroscopy rests on the phenomenon called 'isotope shift', that is, on the fact that the spectral lines produced by a mixture of isotopes of the same element are complex. Each spectral line is, indeed, the sum of several lines corresponding respectively to each isotope. The isotopic components therefore lie close to one another, and their separation is achieved by means of a Fabry-Perot etalon: the apparatus used to measure abundances is the Fabry-Perot photoelectric spectrometer, designed in 1948 by MM. JACQUINOT and DUFOUR. This method has been used to determine abundances in the case of helium, lithium, lead and uranium. In the case of lithium, the analysis line used depends on the composition of the isotopic mixture examined. For mixtures containing 7 to 93 per cent of one of the isotopes of lithium, this line is the blue lithium line, λ = 4603 Å. In other cases the red line λ = 6707 Å is preferable, although it only readily allows relative determinations. Helium presents no particular difficulty, and the analysis line selected was λ = 6678 Å. For lead, the line λ = 5201 Å makes it possible to determine the isotope abundances of the four lead isotopes notwithstanding the hyperfine structure of 207Pb. For uranium, the line λ = 5027 Å is used, and the method allows the composition of isotope mixtures whose 235U content is as low as 0.1 per cent to be determined. The relative precision is about 2 per cent for 235U contents above 1 per cent. For lower contents, the line λ = 5027 Å allows relative measurements when previously calibrated mixtures are used. (author)

  16. On-Site Inspection RadioIsotopic Spectroscopy (Osiris) System Development

    Energy Technology Data Exchange (ETDEWEB)

    Caffrey, Gus J. [Idaho National Laboratory, Idaho Falls, ID (United States); Egger, Ann E. [Idaho National Laboratory, Idaho Falls, ID (United States); Krebs, Kenneth M. [Idaho National Laboratory, Idaho Falls, ID (United States); Milbrath, B. D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, D. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Warren, G. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilmer, N. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-09-01

    We have designed and tested hardware and software for the acquisition and analysis of high-resolution gamma-ray spectra during on-site inspections under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The On-Site Inspection RadioIsotopic Spectroscopy (Osiris) software filters the spectral data to display only radioisotopic information relevant to CTBT on-site inspections, e.g., 132I. A set of over 100 fission-product spectra was employed for Osiris testing. These spectra were measured, where possible, or generated by modeling. The synthetic test spectral compositions include non-nuclear-explosion scenarios, e.g., a severe nuclear reactor accident, and nuclear-explosion scenarios such as a vented underground nuclear test. Comparing its computer-based analyses to expert visual analyses of the test spectra, Osiris correctly identifies CTBT-relevant fission-product isotopes at the 95% level or better. The Osiris gamma-ray spectrometer is a mechanically-cooled, battery-powered ORTEC Transpec-100, chosen to avoid the need for liquid nitrogen during on-site inspections. The spectrometer was used successfully during the recent 2014 CTBT Integrated Field Exercise in Jordan. The spectrometer is controlled and the spectral data analyzed by a Panasonic Toughbook notebook computer. To date, software development has been the main focus of the Osiris project. In FY2016-17, we plan to modify the Osiris hardware, integrate the Osiris software and hardware, and conduct rigorous field tests to ensure that the Osiris system will function correctly during CTBT on-site inspections. The planned development will raise Osiris to technology readiness level TRL-8, transfer the Osiris technology to a commercial manufacturer, and demonstrate Osiris to potential CTBT on-site inspectors.
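
    A hedged sketch of the kind of filtering such software performs (this is not the actual Osiris algorithm; the line library below is a small illustrative subset): keep only identified peaks that match CTBT-relevant fission-product gamma lines within an energy tolerance.

    ```python
    # Hedged sketch: match detected peak energies (keV) against a library of
    # inspection-relevant gamma lines.
    CTBT_RELEVANT_LINES_KEV = {"I-132": [667.7, 772.6],
                               "Ba-140": [537.3],
                               "Zr-95": [756.7]}

    def filter_relevant(peaks_kev, tolerance=1.0):
        hits = {}
        for isotope, lines in CTBT_RELEVANT_LINES_KEV.items():
            matched = [p for p in peaks_kev
                       if any(abs(p - line) <= tolerance for line in lines)]
            if matched:
                hits[isotope] = matched
        return hits

    print(filter_relevant([511.0, 667.5, 756.9, 1460.8]))
    ```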

  17. On-Site Inspection RadioIsotopic Spectroscopy (Osiris) System Development

    International Nuclear Information System (INIS)

    Caffrey, Gus J.; Egger, Ann E.; Krebs, Kenneth M.; Milbrath, B. D.; Jordan, D. V.; Warren, G. A.; Wilmer, N. G.

    2015-01-01

    We have designed and tested hardware and software for the acquisition and analysis of high-resolution gamma-ray spectra during on-site inspections under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The On-Site Inspection RadioIsotopic Spectroscopy (Osiris) software filters the spectral data to display only radioisotopic information relevant to CTBT on-site inspections, e.g., 132I. A set of over 100 fission-product spectra was employed for Osiris testing. These spectra were measured, where possible, or generated by modeling. The synthetic test spectral compositions include non-nuclear-explosion scenarios, e.g., a severe nuclear reactor accident, and nuclear-explosion scenarios such as a vented underground nuclear test. Comparing its computer-based analyses to expert visual analyses of the test spectra, Osiris correctly identifies CTBT-relevant fission-product isotopes at the 95% level or better. The Osiris gamma-ray spectrometer is a mechanically-cooled, battery-powered ORTEC Transpec-100, chosen to avoid the need for liquid nitrogen during on-site inspections. The spectrometer was used successfully during the recent 2014 CTBT Integrated Field Exercise in Jordan. The spectrometer is controlled and the spectral data analyzed by a Panasonic Toughbook notebook computer. To date, software development has been the main focus of the Osiris project. In FY2016-17, we plan to modify the Osiris hardware, integrate the Osiris software and hardware, and conduct rigorous field tests to ensure that the Osiris system will function correctly during CTBT on-site inspections. The planned development will raise Osiris to technology readiness level TRL-8, transfer the Osiris technology to a commercial manufacturer, and demonstrate Osiris to potential CTBT on-site inspectors.

  18. Stable isotope analysis of precipitation samples obtained via crowdsourcing reveals the spatiotemporal evolution of Superstorm Sandy.

    Directory of Open Access Journals (Sweden)

    Stephen P Good

    Full Text Available Extra-tropical cyclones, such as 2012 Superstorm Sandy, pose a significant climatic threat to the northeastern United States, yet prediction of hydrologic and thermodynamic processes within such systems is complicated by their interaction with mid-latitude water patterns as they move poleward. Fortunately, the evolution of these systems is also recorded in the stable isotope ratios of storm-associated precipitation and water vapor, and isotopic analysis provides constraints on difficult-to-observe cyclone dynamics. During Superstorm Sandy, a unique crowdsourced approach enabled 685 precipitation samples to be obtained for oxygen and hydrogen isotopic analysis, constituting the largest isotopic sampling of a synoptic-scale system to date. Isotopically, these waters span an enormous range of values (> 21‰ for δ(18)O, > 160‰ for δ(2)H) and exhibit strong spatiotemporal structure. Low isotope ratios occurred predominantly in the west and south quadrants of the storm, indicating robust isotopic distillation that tracked the intensity of the storm's warm core. Elevated values of deuterium-excess (> 25‰) were found primarily in the New England region after Sandy made landfall. Isotope mass balance calculations and Lagrangian back-trajectory analysis suggest that these samples reflect the moistening of dry continental air entrained from a mid-latitude trough. These results demonstrate the power of rapid-response isotope monitoring to elucidate the structure and dynamics of water cycling within synoptic-scale systems and improve our understanding of storm evolution, hydroclimatological impacts, and paleo-storm proxies.
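
    For reference, the deuterium-excess quantity discussed above is conventionally defined as d = δ2H - 8·δ18O (per mil). A minimal worked example with invented sample values:

    ```python
    # Hedged sketch: deuterium excess from paired δ2H and δ18O measurements.
    def deuterium_excess(delta_2h, delta_18o):
        return delta_2h - 8.0 * delta_18o

    samples = [(-55.0, -9.1), (-120.0, -18.3)]   # (δ2H, δ18O) pairs, per mil
    for d2h, d18o in samples:
        print(f"δ2H={d2h}, δ18O={d18o} -> d-excess={deuterium_excess(d2h, d18o):.1f}‰")
    ```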

  19. Stable isotope analysis of precipitation samples obtained via crowdsourcing reveals the spatiotemporal evolution of Superstorm Sandy.

    Science.gov (United States)

    Good, Stephen P; Mallia, Derek V; Lin, John C; Bowen, Gabriel J

    2014-01-01

    Extra-tropical cyclones, such as 2012 Superstorm Sandy, pose a significant climatic threat to the northeastern United States, yet prediction of hydrologic and thermodynamic processes within such systems is complicated by their interaction with mid-latitude water patterns as they move poleward. Fortunately, the evolution of these systems is also recorded in the stable isotope ratios of storm-associated precipitation and water vapor, and isotopic analysis provides constraints on difficult-to-observe cyclone dynamics. During Superstorm Sandy, a unique crowdsourced approach enabled 685 precipitation samples to be obtained for oxygen and hydrogen isotopic analysis, constituting the largest isotopic sampling of a synoptic-scale system to date. Isotopically, these waters span an enormous range of values (> 21‰ for δ(18)O, > 160‰ for δ(2)H) and exhibit strong spatiotemporal structure. Low isotope ratios occurred predominantly in the west and south quadrants of the storm, indicating robust isotopic distillation that tracked the intensity of the storm's warm core. Elevated values of deuterium-excess (> 25‰) were found primarily in the New England region after Sandy made landfall. Isotope mass balance calculations and Lagrangian back-trajectory analysis suggest that these samples reflect the moistening of dry continental air entrained from a mid-latitude trough. These results demonstrate the power of rapid-response isotope monitoring to elucidate the structure and dynamics of water cycling within synoptic-scale systems and improve our understanding of storm evolution, hydroclimatological impacts, and paleo-storm proxies.

  20. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
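
    A hedged sketch of one classic distortion measure such a tool can report, total harmonic distortion (THD) of a sine passed through a memoryless nonlinearity; the tanh clipper stands in for a "tube-like" device and is not taken from the paper:

    ```python
    # Hedged sketch: THD of a distorted 1 kHz sine via FFT harmonic magnitudes.
    import numpy as np

    fs, f0 = 48000, 1000.0
    t = np.arange(fs) / fs                       # one second of signal
    distorted = np.tanh(3.0 * np.sin(2 * np.pi * f0 * t))

    spectrum = np.abs(np.fft.rfft(distorted)) / len(t)
    bins = [int(k * f0 * len(t) / fs) for k in range(1, 6)]   # harmonics 1..5
    fundamental, *harmonics = [spectrum[b] for b in bins]
    thd = np.sqrt(sum(h**2 for h in harmonics)) / fundamental
    print(f"THD = {100 * thd:.1f}%")
    ```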

  1. BASTILLE - Better Analysis Software to Treat ILL Experiments - a unified, unifying approach to data reduction and analysis

    International Nuclear Information System (INIS)

    Johnson, M.

    2011-01-01

    Data reduction and analysis is a key component in the production of scientific results. If this component, like any other in the chain, is weak, the final output is compromised. The current situation for data reduction and analysis may be regarded as adequate, but it is variable, depending on the instrument, and should be improved. In particular the delivery of new and upgraded instruments in Millennium Phase I and those proposed for Phase II will bring new demands and challenges for software development. Failure to meet these challenges will hamper the exploitation of higher data rates and the delivery of new science. The proposed project is to provide a single, underpinning software infrastructure for data analysis, which would ensure: 1) a clear vision of software provision at ILL; 2) a clear role for the 'Computing for Science' Group (CS) in maintaining and developing the infrastructure and the codes; 3) a well-defined framework for recruiting and training CS staff; 4) ease and efficiency of development within a common, well-defined software environment; 5) safeguarding of key, existing software; and 6) ease of communication with other software like instrument control software to allow real-time data analysis and experiment control, or software from other institutes or sources

  2. OST: analysis tool for real time software by simulation of hardware and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems in nuclear installation control demands a high level of operational safety for the installation and for protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) will have at its disposal tools which permit controls to be made during the whole life of the software. The simulation and test tool (OST) which has been created is implemented entirely in software. It is used on VAX computers and can easily be ported to other computers [fr

  3. Development of software for the curimeter model CD-N102

    International Nuclear Information System (INIS)

    Dotres Llera, Armando

    2001-01-01

    The characteristics of the software for the Curimeter Model CD-N102 developed at CEADEN are presented. The software consists of two main parts: basic software for the electrometer block and application software for a PC. The basic software is totally independent of the PC and performs all the basic functions of the measurement process. The application software is optional and offers a friendlier interface and additional options to the user. Among these are the possibility of keeping a statistical record of the measurements in a database, creating labels, and introducing new isotopes and calibrating them. A more detailed explanation of both software components is given.

  4. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  5. Growth history of cultured pearl oysters based on stable oxygen isotope analysis

    Science.gov (United States)

    Nakashima, R.; Furuta, N.; Suzuki, A.; Kawahata, H.; Shikazono, N.

    2007-12-01

    We investigated the oxygen isotopic ratio in shells of the pearl oyster Pinctada martensii cultivated in embayments in Mie Prefecture, central Japan, to evaluate the biomineralization of shell structures of the species and its pearls in response to environmental change. Microsamples for oxygen isotope analysis were collected from the surfaces of shells (outer, middle, and inner shell layers) and pearls. Water temperature variations were estimated from the oxygen isotope values of the carbonate. Oxygen isotope profiles of the prismatic calcite of the outer shell layer reflected seasonal variations of water temperature, whereas those of nacreous aragonites of the middle and inner shell layers and pearls recorded temperatures from April to November, June to September, and July to September, respectively. Lower temperatures in autumn and winter might slow the growth of nacreous aragonites. The oxygen isotope values are controlled by both variations of water temperature and shell structures; the prismatic calcite of this species is useful for reconstructing seasonal changes of calcification temperature.
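
    The record does not give the equation used to estimate temperature from shell δ18O; a hedged sketch with a Grossman & Ku (1986)-style aragonite relation, assumed here only for illustration:

    ```python
    # Hedged sketch: calcification temperature from carbonate and water δ18O,
    # using an assumed aragonite paleotemperature relation.
    def temperature_c(delta18o_carbonate, delta18o_water):
        return 20.6 - 4.34 * (delta18o_carbonate - delta18o_water)

    # Seasonal cycle: lower carbonate δ18O corresponds to warmer water.
    for d18o_shell in (-1.5, 0.0, 1.0):
        print(d18o_shell, "->", round(temperature_c(d18o_shell, 0.0), 1), "°C")
    ```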

  6. Metal/glass composites for analysis of hydrogen isotopes by gas-chromatography

    International Nuclear Information System (INIS)

    Nicolae, Constantin Adrian; Sisu, Claudia; Stefanescu, Doina; Stanciu, Vasile

    1999-01-01

    The separation of hydrogen isotopes by cryogenic distillation or thermal diffusion is a key technology for tritium separation from heavy water in CANDU reactors and for the tritium fuel cycle in thermonuclear fusion reactors. In each process, analytical techniques for analyzing the hydrogen isotope mixture are required. Extensive experimental research has been carried out in order to produce the most suitable adsorbents and to establish the best operating conditions for selective separation and analysis of hydrogen isotopes by gas-chromatography. This paper describes the preparation of adsorbent materials used as stationary phases in the gas-chromatographic column for hydrogen isotope separation and the treatment (activation) of the stationary phases. Modified thermoresisting glass with Fe(NH4)2(SO4)2·6H2O and Cr2O3, respectively, has been experimentally investigated at 77 K for H2, HD and D2 separation, and the results of the chromatographic runs are reported and discussed. The gas-chromatographic apparatus used in this study is composed of a Hewlett-Packard 7620A gas-chromatograph equipped with a carrier gas flow rate controller and a thermal conductivity detector. The apparatus also comprises a Dewar vessel containing the separation column. The hydrogen isotopes H2, HD, D2 and their mixtures have been obtained in our laboratories. The best operating conditions and parameters of the Fe3+/glass adsorbent column, i.e. granulometry, column length, pressure-drop along the column, carrier gas flow rate and sample volume, have been studied by means of the analysis of retention times, separation factors and HETP. (authors)

  7. Assessing connectivity of estuarine fishes based on stable isotope ratio analysis

    Science.gov (United States)

    Herzka, Sharon Z.

    2005-07-01

    Assessing connectivity is fundamental to understanding the population dynamics of fishes. I propose that isotopic analyses can greatly contribute to studies of connectivity in estuarine fishes due to the high diversity of isotopic signatures found among estuarine habitats and the fact that variations in isotopic composition at the base of a food web are reflected in the tissues of consumers. Isotopic analysis can be used for identifying nursery habitats and estimating their contribution to adult populations. If movement to a new habitat is accompanied by a shift to foods of distinct isotopic composition, recent immigrants and residents can be distinguished based on their isotopic ratios. Movement patterns thus can be reconstructed based on information obtained from individuals. A key consideration is the rate of isotopic turnover, which determines the length of time that an immigrant to a given habitat will be distinguishable from a longtime resident. A literature survey indicated that few studies have measured turnover rates in fishes and that these have focused on larvae and juveniles. These studies reveal that biomass gain is the primary process driving turnover rates, while metabolic turnover is either minimal or undetectable. Using a simple dilution model and biomass-specific growth rates, I estimated that young fishes with fast growth rates will reflect the isotopic composition of a new diet within days or weeks. Older or slower-growing individuals may take years or never fully equilibrate. Future studies should evaluate the factors that influence turnover rates in fishes during various stages of the life cycle and in different tissues, as well as explore the potential for combining stable isotope and otolith microstructure analyses to examine the relationship between demographic parameters, movement and connectivity.
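
    A hedged sketch of the simple growth-dilution model described above: with metabolic turnover assumed negligible (as the surveyed studies suggest), an immigrant's tissue signature converges to the new diet as biomass accumulates. All δ values are invented:

    ```python
    # Hedged sketch: mass-balance mixing of old tissue with newly grown tissue.
    def tissue_delta(delta_initial, delta_new_diet, mass_initial, mass_now):
        f_old = mass_initial / mass_now
        return f_old * delta_initial + (1.0 - f_old) * delta_new_diet

    # A juvenile doubling its mass is already halfway to the new signature.
    for growth in (1.0, 2.0, 4.0, 8.0):
        print(f"x{growth:.0f} mass:",
              round(tissue_delta(-18.0, -12.0, 1.0, growth), 2))
    ```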

  8. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  9. A relational approach to support software architecture analysis

    NARCIS (Netherlands)

    Feijs, L.M.G.; Krikhaar, R.L.; van Ommering, R.C.

    1998-01-01

    This paper reports on our experience with a relational approach to support the analysis of existing software architectures. The analysis options provide for visualization and view calculation. The approach has been applied for reverse engineering. It is also possible to check concrete designs

  10. Isotope correlation techniques for verifying input accountability measurements at a reprocessing plant

    International Nuclear Information System (INIS)

    Umezawa, H.; Nakahara, Y.

    1983-01-01

    Isotope correlation techniques were studied to verify input accountability measurements at a reprocessing plant. On the basis of a historical data bank, the correlation between the plutonium-to-uranium ratio and isotopic variables was derived as a function of burnup. The burnup was itself determined from the isotopic ratios of uranium and plutonium, so data treatment was made in an iterative manner. The isotopic variables were defined to cover a wide spectrum of isotopes of uranium and plutonium. The isotope correlation techniques evaluated important parameters such as the fuel burnup, the most probable ratio of plutonium to uranium, and the amounts of uranium and plutonium in reprocessing batches in connection with fresh fuel fabrication data. In addition, the most probable values of isotope abundance of plutonium and uranium could be estimated from the plutonium-to-uranium ratio determined, and compared with the reported data for verification. A pocket-computer-based system was developed to enable inspectors to collect and evaluate data in a timely fashion at the input accountability measurement point by the isotope correlation techniques. The device is battery-powered and completely independent of the operator's system. The software of the system was written in BASIC. The data input can be stored on a cassette tape and transferred into a higher level computer. The correlations used for the analysis were given in the form of analytical functions. Coefficients for the functions were provided appropriate to the type of reactor and the initial enrichment of the fuel. (author)
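
    A schematic, hedged version of the iterative treatment described above; the correlation functions and coefficients below are invented placeholders, not the historical data bank's values:

    ```python
    # Hedged sketch: fixed-point iteration between burnup and the correlation.
    def estimate_burnup(isotopic_ratio, burnup_guess):
        # hypothetical correlation with a burnup-dependent correction term
        return 25.0 * isotopic_ratio * (1.0 + 0.001 * burnup_guess)

    def iterate(isotopic_ratio, tol=1e-9, max_iter=100):
        burnup = 10.0                       # initial guess, GWd/t
        for _ in range(max_iter):
            new = estimate_burnup(isotopic_ratio, burnup)
            if abs(new - burnup) < tol:
                return new
            burnup = new
        return burnup

    burnup = iterate(0.42)                  # e.g., a 240Pu/239Pu-type variable
    pu_u = 0.002 + 0.00025 * burnup         # hypothetical Pu/U-vs-burnup curve
    print(round(burnup, 3), "GWd/t ->", round(pu_u, 5), "Pu/U")
    ```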

  11. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

    As the determination of ultrahigh reliability figures for safety critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time and effort consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many ''general-purpose'' software analysis tools, both static and dynamic, which help in analyzing the source code. However, they are not designed to assess adherence to specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field which are based on digital techniques and implemented in high-level languages, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high-level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  12. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    VINCENT, ANDREW

    2005-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture

  13. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three-dimensional structures.

  14. How to do Meta-Analysis using HLM software

    OpenAIRE

    Petscher, Yaacov

    2013-01-01

    This is a step-by-step presentation of how to run a meta-analysis using HLM software. Because it is a variance-known model, it is not run through the GUI but in batch mode. These slides show how to prepare the data and run the analysis.

  15. Development of neutron activation analysis software

    International Nuclear Information System (INIS)

    Wang Liyu

    1987-10-01

    The software for quantitative neutron activation analysis was developed to run under the MS/DOS operating system. The programmes of the IBM/SPAN package include: spectra file transfer from and to a Canberra Series 35 multichannel analyzer, spectrum evaluation routines, calibration subprogrammes, and quantitative analysis. The programmes for spectrum analysis include a fitting routine for the separation of multiple lines, which reproduces the peak shape with a combination of Gaussian and exponential terms. The programmes were tested on an IBM/AT-compatible computer. The programmes and the sources are available cost-free for the IAEA projects of Technical Cooperation. 7 refs, 3 figs
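
    A hedged sketch of the kind of peak-shape model mentioned above, a Gaussian core with an exponential low-energy tail; the exact parameterization used by the package is not given in the record, so the join condition and values below are assumptions:

    ```python
    # Hedged sketch: Gaussian peak with an exponential tail matched at the join.
    import math

    def peak_shape(ch, centroid, height, sigma, tail_slope, tail_join=1.5):
        d = ch - centroid
        if d < -tail_join * sigma:
            # exponential continuation matched to the Gaussian at the join point
            join = math.exp(-tail_join**2 / 2.0)
            return height * join * math.exp(tail_slope * (d + tail_join * sigma))
        return height * math.exp(-d * d / (2.0 * sigma * sigma))

    for ch in range(95, 106):
        print(ch, round(peak_shape(ch, 100.0, 1000.0, 1.2, 0.8), 2))
    ```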

  16. ChelomEx: Isotope-assisted discovery of metal chelates in complex media using high-resolution LC-MS.

    Science.gov (United States)

    Baars, Oliver; Morel, François M M; Perlman, David H

    2014-11-18

    Chelating agents can control the speciation and reactivity of trace metals in biological, environmental, and laboratory-derived media. A large number of trace metals (including Fe, Cu, Zn, Hg, and others) show characteristic isotopic fingerprints that can be exploited for the discovery of known and unknown organic metal complexes and related chelating ligands in very complex sample matrices using high-resolution liquid chromatography mass spectrometry (LC-MS). However, there is currently no free open-source software available for this purpose. We present a novel software tool, ChelomEx, which identifies isotope pattern-matched chromatographic features associated with metal complexes along with free ligands and other related adducts in high-resolution LC-MS data. High sensitivity and exclusion of false positives are achieved by evaluation of the chromatographic coherence of the isotope pattern within chromatographic features, which we demonstrate through the analysis of bacterial culture media. A built-in graphical user interface and compound library aid in identification and efficient evaluation of results. ChelomEx is implemented in MatLab. The source code, binaries for MS Windows and MAC OS X as well as test LC-MS data are available for download at SourceForge (http://sourceforge.net/projects/chelomex).
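
    A hedged sketch of the isotope-fingerprint idea behind such a tool (not the actual ChelomEx algorithm): look for mass-peak pairs within one chromatographic feature whose mass difference and intensity ratio match a metal's natural isotope pattern, here 54Fe/56Fe. The peak list is invented:

    ```python
    # Hedged sketch: find candidate Fe-complex peak pairs by isotope pattern.
    FE_DELTA_MASS = 55.93494 - 53.93961      # mass of 56Fe minus 54Fe
    FE_RATIO = 0.0585 / 0.9175               # natural 54Fe/56Fe abundance ratio

    def find_fe_pairs(peaks, mz_tol=0.005, ratio_tol=0.3):
        """peaks: list of (m/z, intensity) from one chromatographic feature."""
        hits = []
        for mz_a, int_a in peaks:            # candidate 54Fe-bearing peak
            for mz_b, int_b in peaks:        # candidate 56Fe-bearing peak
                if abs((mz_b - mz_a) - FE_DELTA_MASS) < mz_tol and int_b > 0 \
                   and abs(int_a / int_b - FE_RATIO) < ratio_tol * FE_RATIO:
                    hits.append((mz_a, mz_b))
        return hits

    peaks = [(506.101, 6.3), (508.096, 100.0), (509.099, 24.0)]
    print(find_fe_pairs(peaks))
    ```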

  17. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  18. Calcium Isotope Analysis with "Peak Cut" Method on Column Chemistry

    Science.gov (United States)

    Zhu, H.; Zhang, Z.; Liu, F.; Li, X.

    2017-12-01

    To eliminate isobaric interferences from elemental and molecular isobars (e.g., 40K+, 48Ti+, 88Sr2+, 24Mg16O+, 27Al16O+) on Ca isotopes during mass determination, samples should be purified through ion-exchange column chemistry before analysis. However, large Ca isotopic fractionation has been observed during column chemistry (Russell and Papanastassiou, 1978; Zhu et al., 2016). Therefore, full recovery during column chemistry is greatly needed; otherwise, uncertainties would be caused by poor recovery (Zhu et al., 2016). Generally, matrix effects could be enhanced by full recovery, as other elements might overlap with the Ca cut during column chemistry. Matrix effects and full recovery are difficult to balance, and both need to be considered for high-precision analysis of stable Ca isotopes. Here, we investigate the influence of poor recovery on δ44/40Ca using TIMS with the double spike technique. The δ44/40Ca values of IAPSO seawater, ML3B-G and BHVO-2 in different Ca subcuts (e.g., 0-20, 20-40, 40-60, 60-80, 80-100%) with 20% Ca recovery on column chemistry display limited variation after correction by the 42Ca-43Ca double spike technique with the exponential law. Notably, the δ44/40Ca of each Ca subcut is consistent, within error, with the δ44/40Ca of the Ca cut with full recovery. Our results indicate that the 42Ca-43Ca double spike technique can properly correct the Ca isotopic fractionation that occurs during both column chemistry and thermal ionization mass spectrometry (TIMS) determination, because both fractionations follow the exponential law well. Therefore, we propose the "peak cut" method for Ca column chemistry on samples with complex matrix effects. Briefly, for samples with low Ca contents, we can add the double spike before column chemistry, collect only the middle of the Ca eluate, and abandon both sides of the Ca eluate that might overlap with other elements (e.g., K, Sr). This method would
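
    A hedged numeric illustration of the exponential mass-fractionation law underpinning the correction discussed above: a fractionated ratio relates to the true ratio by R_meas = R_true · (m_i/m_j)^β. The β value and ratios below are invented, and the inference of β from the spike ratios is not shown:

    ```python
    # Hedged sketch: exponential-law fractionation and its inversion.
    M40, M44 = 39.9626, 43.9555              # Ca isotope masses (u)

    def fractionate(r_true, m_num, m_den, beta):
        return r_true * (m_num / m_den) ** beta

    r44_40_true = 0.021229                   # illustrative 44Ca/40Ca
    beta = -1.8                              # fractionation during analysis
    r44_40_meas = fractionate(r44_40_true, M44, M40, beta)

    # Inverting the law (with beta inferred from the double spike) recovers the
    # true ratio regardless of where the fractionation occurred.
    recovered = r44_40_meas / (M44 / M40) ** beta
    print(round(r44_40_meas, 6), round(recovered, 6))
    ```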

  19. Analysis of stable isotope assisted metabolomics data acquired by GC-MS

    International Nuclear Information System (INIS)

    Wei, Xiaoli; Shi, Biyun; Koo, Imhoi; Yin, Xinmin; Lorkiewicz, Pawel; Suhail, Hamid; Rattan, Ramandeep; Giri, Shailendra; McClain, Craig J.

    2017-01-01

    Stable isotope assisted metabolomics (SIAM) measures the abundance levels of metabolites in a particular pathway using stable isotope tracers (e.g., 13C, 18O and/or 15N). We report a method termed the signature ion approach for analysis of SIAM data acquired on a GC-MS system equipped with an electron ionization (EI) ion source. The signature ion is a fragment ion in the EI mass spectrum of a derivatized metabolite that contains all atoms of the underivatized metabolite, except the hydrogen atoms lost during derivatization. In this approach, GC-MS data of metabolite standards were used to recognize the signature ion from the EI mass spectra acquired from stable isotope labeled samples, and a linear regression model was used to deconvolute the intensity of overlapping isotopologues. A mixture score function was also employed for cross-sample chromatographic peak list alignment, to recognize the chromatographic peaks generated by the same metabolite in different samples by simultaneously evaluating the similarity of retention time and EI mass spectrum of two chromatographic peaks. Analysis of a mixture of 16 13C-labeled and 16 unlabeled amino acids showed that the signature ion approach accurately identified and quantified all isotopologues. Analysis of polar metabolite extracts from cells respectively fed with uniform 13C-glucose and 13C-glutamine further demonstrated that this method can also be used to analyze the complex data acquired from biological samples. - Highlights: • A signature ion approach is developed for analysis of stable isotope GC-MS data. • GC-MS data of compound standards are used for selection of the signature ion. • Linear regression model is used to deconvolute the overlapping isotopologue peaks. • The developed method was tested by known compounds and biological samples.
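
    A hedged toy version of the linear-regression deconvolution step: observed intensities of an isotopologue cluster are modelled as a natural-abundance correction matrix times the labelled fractions. The matrix values are illustrative for a 3-carbon fragment, not taken from the paper:

    ```python
    # Hedged sketch: least-squares deconvolution of overlapping isotopologues.
    import numpy as np

    # Column j: theoretical pattern of the M+j species smeared upward in mass
    # by natural 13C abundance (truncated at M+2).
    A = np.array([[0.967, 0.000, 0.000],
                  [0.032, 0.967, 0.000],
                  [0.001, 0.032, 0.967]])
    observed = np.array([0.580, 0.230, 0.190])   # measured M+0, M+1, M+2

    fractions, *_ = np.linalg.lstsq(A, observed, rcond=None)
    fractions /= fractions.sum()                 # normalize labelled fractions
    print(fractions.round(3))
    ```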

  20. Rapid-swept CW cavity ring-down laser spectroscopy for carbon isotope analysis

    International Nuclear Information System (INIS)

    Tomita, Hideki; Watanabe, Kenichi; Takiguchi, Yu; Kawarabayashi, Jun; Iguchi, Tetsuo

    2006-01-01

    With the aim of developing a portable system for in-field isotope analysis, we investigate an isotope analysis method based on rapid-swept CW cavity ring-down laser spectroscopy, in which the concentration of a chemical species is derived from its photo-absorbance. Such a system can identify the isotopomer while still being quite compact. We have made basic experimental measurements of the overtone absorption lines of carbon dioxide (12C16O2, 13C16O2) by rapid-swept cavity ring-down spectroscopy with a CW infrared diode laser at 6,200 cm-1 (1.6 μm). The isotopic ratio has been obtained as (1.07±0.13)×10-2, in good agreement with the natural abundance within experimental uncertainty. The detection sensitivity in absorbance has been estimated to be 3×10-8 cm-1. (author)
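
    For reference, a minimal worked example of the standard ring-down relation behind the measurement: the absorption coefficient follows from the decay times with and without the absorber, α = (1/c)(1/τ − 1/τ₀). The decay times below are illustrative only:

    ```python
    # Hedged sketch: absorption coefficient from cavity ring-down times.
    C_CM_PER_S = 2.99792458e10

    def absorption_coefficient(tau_s, tau0_s):
        return (1.0 / C_CM_PER_S) * (1.0 / tau_s - 1.0 / tau0_s)

    tau0 = 35.0e-6                 # empty-cavity ring-down time (s)
    tau = 34.6e-6                  # with CO2 absorption on a 13C16O2 line
    print(f"alpha = {absorption_coefficient(tau, tau0):.2e} cm^-1")
    ```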

  1. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  2. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

    Designing and developing food nutrition statistical analysis software can automate nutrition calculation, improve nutrition professionals' working efficiency and support the informatization of nutrition publicity and education. In the software development process, software engineering methods and database technology are used to calculate daily human nutritional intake, and an intelligent system is used to evaluate the user's health…

  3. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    International Nuclear Information System (INIS)

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B.Jr.; Penaflor, B.G.

    1999-01-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson ''raw'' data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: Initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: Obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: Provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters

  4. Isotope analysis (δ13C) of pulpy whole apple juice

    Directory of Open Access Journals (Sweden)

    Ricardo Figueira

    2011-09-01

    Full Text Available The objectives of this study were to develop a method of isotope analysis to quantify the carbon of the C3 photosynthetic cycle in pulpy whole apple juice and to establish legal limits based on Brazilian legislation in order to identify beverages that do not conform to the standards of the Ministry of Agriculture, Livestock and Food Supply (MAPA). This beverage was produced in a laboratory according to Brazilian law. Pulpy juices adulterated by the addition of sugarcane were also produced. The isotope analyses measured the relative isotope enrichment of the juices, their pulpy fractions (internal standard) and purified sugar. From those results, the quantity of the C3 source was estimated by means of the isotope dilution equation. To determine the existence of adulteration in commercial juices, it was necessary to establish a legal limit in accordance with Brazilian law. Three brands of commercial juice were analyzed; one was classified as adulterated. The legal limit made it possible to clearly identify the juice that did not conform to Brazilian law. The methodology developed proved efficient for quantifying carbon of C3 origin in commercial pulpy apple juices.
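
    A hedged sketch of the two-source isotope dilution calculation: the C3 fraction follows from the δ13C of the juice sugar and the C3/C4 end members. The end-member values below are typical literature numbers, not the paper's measured standards:

    ```python
    # Hedged sketch: C3 fraction from a two-end-member mixing equation.
    def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-11.0):
        return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

    # Pure apple sugar (C3) vs. juice adulterated with cane sugar (C4).
    for delta in (-26.5, -19.0):
        print(f"δ13C = {delta}‰ -> C3 fraction = {c3_fraction(delta):.0%}")
    ```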

  5. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety critical software component will not lead to a fault in a more safety critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  6. Isotope dilution analysis for urinary fentanyl and its main metabolite, norfentanyl, in patients by isotopic fractionation using capillary gas chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Sera, Shoji; Goromaru, Tsuyoshi [Fukuyama Univ., Hiroshima (Japan). Faculty of Pharmacy and Pharmaceutical Sciences; Sameshima, Teruko; Kawasaki, Koichi; Oda, Toshiyuki

    1998-07-01

    Isotope dilution analysis was applied to determine urinary excretion of fentanyl (FT) and its main metabolite, norfentanyl (Nor-FT), by isotopic fractionation using a capillary gas chromatograph equipped with a surface ionization detector (SID). Urinary FT was determined quantitatively in the range of 0.4-40 ng/ml using deuterium-labeled FT (FT-2H19) as an internal standard. We also performed isotope dilution analysis of Nor-FT in urine. N-Alkylation was necessary to sensitively detect Nor-FT with SID. The methyl derivative was selected from 3 kinds of N-alkyl derivatives to increase sensitivity and peak resolution, and to prevent interference from urinary compounds. The Nor-FT concentration was quantitatively determined in the range of 10-400 ng/ml using deuterium-labeled Nor-FT (Nor-FT-2H10). No endogenous compounds or concomitant drugs interfered with the detection of FT and Nor-FT in the urine of patients. The present method will be useful for pharmacokinetic studies and the evaluation of drug interactions in FT metabolism. (author)

  7. Isotope dilution analysis for urinary fentanyl and its main metabolite, norfentanyl, in patients by isotopic fractionation using capillary gas chromatography

    International Nuclear Information System (INIS)

    Sera, Shoji; Goromaru, Tsuyoshi; Sameshima, Teruko; Kawasaki, Koichi; Oda, Toshiyuki

    1998-01-01

    Isotope dilution analysis was applied to determine urinary excretion of fentanyl (FT) and its main metabolite, norfentanyl (Nor-FT), by isotopic fractionation using a capillary gas chromatograph equipped with a surface ionization detector (SID). Urinary FT was determined quantitatively in the range of 0.4-40 ng/ml using deuterium-labeled FT (FT-2H19) as an internal standard. We also performed isotope dilution analysis of Nor-FT in urine. N-Alkylation was necessary to sensitively detect Nor-FT with SID. The methyl derivative was selected from 3 kinds of N-alkyl derivatives to increase sensitivity and peak resolution, and to prevent interference from urinary compounds. The Nor-FT concentration was quantitatively determined in the range of 10-400 ng/ml using deuterium-labeled Nor-FT (Nor-FT-2H10). No endogenous compounds or concomitant drugs interfered with the detection of FT and Nor-FT in the urine of patients. The present method will be useful for pharmacokinetic studies and the evaluation of drug interactions in FT metabolism. (author)
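
    A hedged sketch of how isotope dilution quantitation with a deuterated internal standard typically works: the analyte amount follows from the peak-area ratio of analyte to internal standard, scaled by a calibration slope. All numbers are illustrative only:

    ```python
    # Hedged sketch: isotope dilution quantitation via an area-ratio calibration.
    def concentration_ng_ml(area_analyte, area_is, is_conc_ng_ml, slope=1.0):
        """Assumes a linear response of area ratio vs. concentration ratio."""
        return slope * (area_analyte / area_is) * is_conc_ng_ml

    # Fentanyl vs. a deuterated internal standard spiked at 10 ng/ml.
    print(concentration_ng_ml(area_analyte=5400.0, area_is=12000.0,
                              is_conc_ng_ml=10.0))   # -> 4.5 ng/ml
    ```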

  8. Isotope ratio analysis by a combination of element analyzer and mass spectrometer

    International Nuclear Information System (INIS)

    Pichlmayer, F.

    1987-06-01

    The use of stable isotope ratios of carbon, nitrogen and sulfur as an analytical tool in many fields of research is of growing interest. A method has therefore been developed, consisting essentially of coupling an elemental analyzer with an isotope mass spectrometer, which enables the preparation of carbon dioxide, nitrogen and sulfur dioxide gases from any solid or liquid sample in a fast and easy way. Results of carbon isotope measurements in food analysis are presented, whereby it is possible to check the origin and treatment of sugar, oils, fats, mineral waters, spirituous liquors etc., and to detect adulterations as well. Applications in the field of environmental research are also given. (Author)
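
    The stable isotope ratios discussed here, and the δ values that recur throughout these records, follow the standard definition:

    ```latex
    % Per-mil deviation of a sample isotope ratio R (e.g., ^{13}C/^{12}C)
    % from that of an international reference standard
    \delta = \left(\frac{R_{sample}}{R_{standard}} - 1\right) \times 1000\;\text{‰}
    ```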

  9. Chemically modified glasses for analysis of hydrogen isotopes by gas-chromatography

    International Nuclear Information System (INIS)

    Stanciu, Vasile; Stefanescu, Doina

    1999-01-01

    Hydrogen isotope separation by methods such as cryogenic distillation or thermal diffusion is one of the key technologies for tritium separation from the heavy water of CANDU reactors and for the tritium fuel cycle of a thermonuclear fusion reactor. Each process requires analytical techniques for measuring the composition of hydrogen isotope mixtures. Extensive experimental research has been carried out in order to produce the most suitable adsorbent and to define the best operating conditions for the selective separation and analysis of hydrogen isotopes by gas chromatography. This paper describes the preparation of the adsorbent materials used as the stationary phase in the gas-chromatographic column for hydrogen isotope separation, and the treatment (activation) of the stationary phase. Thermo-resistant glasses modified with Fe(NH₄)₂(SO₄)₂·6H₂O and Cr₂O₃, respectively, were investigated experimentally at 77 K for H₂, HD and D₂ separation, and the results of the chromatographic runs are reported and discussed. The gas-chromatographic apparatus is composed of a Hewlett-Packard 7620A gas chromatograph equipped with a carrier-gas flow-rate controller and a thermal conductivity detector (TCD), together with a Dewar vessel containing the separation column. The hydrogen isotope species H₂, HD and D₂ and their mixtures were prepared in our laboratories. The best operating conditions of the Fe(III)/glass and Cr₂O₃/glass adsorbent columns, i.e. granulometry, column length, pressure drop along the column, carrier-gas flow rate and sample volume, were studied by means of analysis of the retention times, separation factors and HETP. (authors)
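
    The figures of merit named at the end of the abstract follow the standard chromatographic definitions (textbook forms, not taken from the paper):

    ```latex
    % Separation factor between two adjacent peaks (t_0 = dead time)
    \alpha = \frac{t_{R,2} - t_0}{t_{R,1} - t_0}, \qquad
    % Height equivalent to a theoretical plate for a column of length L
    \mathrm{HETP} = \frac{L}{N}, \quad
    N = 5.54\,\left(\frac{t_R}{w_{1/2}}\right)^{2}
    ```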

  10. ISODEP, A Fuel Depletion Analysis Code for Predicting Isotopic ...

    African Journals Online (AJOL)

    The trend of results was found to be consistent with those obtained by analytical and other numerical methods. Discovery and Innovation Vol. 13 no. 3/4 December (2001) pp. 184-195. KEY WORDS: depletion analysis, code, research reactor, simultaneous equations, decay of nuclides, radionuclides, isotope.
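
    The record names the core numerical task: solving the simultaneous decay and depletion equations for the nuclide inventory. A minimal sketch of such a system for an illustrative three-nuclide chain, integrated with SciPy (all nuclide data are placeholder values; this is not ISODEP code):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative three-nuclide chain N1 -> N2 -> N3; placeholder constants
    lam = np.array([1.0e-5, 5.0e-6, 0.0])        # decay constants lambda_i (1/s)
    phi_sigma = np.array([2.0e-9, 1.0e-9, 0.0])  # flux * capture cross-section (1/s)

    def depletion_rhs(t, N):
        """dN_i/dt = production from the parent - losses to decay and capture."""
        dN = np.empty_like(N)
        dN[0] = -(lam[0] + phi_sigma[0]) * N[0]
        dN[1] = lam[0] * N[0] - (lam[1] + phi_sigma[1]) * N[1]
        dN[2] = lam[1] * N[1] - (lam[2] + phi_sigma[2]) * N[2]
        return dN

    N0 = np.array([1.0e20, 0.0, 0.0])            # initial inventories (atoms)
    sol = solve_ivp(depletion_rhs, (0.0, 3.15e7), N0, method="LSODA", rtol=1e-8)
    print(sol.y[:, -1])                          # inventories after ~one year
    ```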

  11. Carbon isotopic analysis of atmospheric methane by isotope-ratio-monitoring gas chromatography-mass spectrometry

    Science.gov (United States)

    Merritt, Dawn A.; Hayes, J. M.; Des Marais, David J.

    1995-01-01

    Less than 15 min are required for the determination of δ¹³C_PDB with a precision of 0.2‰ (1σ, single measurement) in 5-mL samples of air containing CH₄ at natural levels (1.7 ppm). An analytical system is described that includes a sample-introduction unit incorporating a preparative gas chromatograph (GC) column for separation of CH₄ from N₂, O₂, and Ar. The 15-min procedure includes time for operation of that system, high-resolution chromatographic separation of the CH₄, on-line combustion and purification of the products, and isotopic calibration. Analyses of standards demonstrate that systematic errors are absent and that the observed δ values do not depend on sample size. For samples containing 100 ppm or more CH₄, preconcentration is not required and the analysis time is less than 5 min. The system utilizes a commercially available, high-sensitivity isotope-ratio mass spectrometer. For optimal conditions of sample handling and combustion, performance of the system is within a factor of 2 of the shot-noise limit. The potential exists, therefore, for analysis of samples as small as 15 pmol CH₄ with a standard deviation of less than 1‰.

  12. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end-product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of process knowledge as well as knowledge on measurement methods & tools) and a model library (consisting of process operational models), have been extended...... rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example....

  13. First stable isotope analysis of Asiatic wild ass tail hair from the Mongolian Gobi.

    Science.gov (United States)

    Horacek, Micha; Sturm, Martina Burnik; Kaczensky, Petra

    Stable isotope analysis has become a powerful tool to study feeding ecology, water use or movement patterns in contemporary, historic and ancient species. Tissues such as hair and teeth grow continuously, and when sampled longitudinally they can provide temporally explicit information on dietary regime and movement patterns. In an initial trial, we analysed a tail sample of an Asiatic wild ass (Equus hemionus) from the Mongolian Gobi. We found seasonal variations in H, C and N isotope patterns, likely the result of temporal variations in available feeds, water supply and possibly physiological status. Thus, stable isotope analysis shows promise for studying the comparative ecology of the three autochthonous equid species in the Mongolian Gobi.

  14. In-situ Isotopic Analysis at Nanoscale using Parallel Ion Electron Spectrometry: A Powerful New Paradigm for Correlative Microscopy

    Science.gov (United States)

    Yedra, Lluís; Eswara, Santhana; Dowsett, David; Wirtz, Tom

    2016-01-01

    Isotopic analysis is of paramount importance across the entire gamut of scientific research. To advance the frontiers of knowledge, a technique for nanoscale isotopic analysis is indispensable. Secondary Ion Mass Spectrometry (SIMS) is a well-established technique for analyzing isotopes, but its spatial resolution is fundamentally limited. Transmission Electron Microscopy (TEM) is a well-known method for high-resolution imaging down to the atomic scale. However, isotopic analysis in TEM is not possible. Here, we introduce a powerful new paradigm for in-situ correlative microscopy called Parallel Ion Electron Spectrometry, which synergizes SIMS with TEM. We demonstrate this technique by distinguishing lithium carbonate nanoparticles according to the isotopic label of lithium, viz. ⁶Li and ⁷Li, and imaging them at high resolution by TEM, adding a new dimension to correlative microscopy. PMID:27350565

  15. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers; Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting.

  16. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge required to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to address the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  17. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width is associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work automatically locate and fit the peaks. The fit can be made in several different ways; the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go further and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of ¹³⁷Cs, ⁶⁰Co, ¹³³Ba and ¹⁵²Eu. The results show that all of the automatic programs can be properly used in the task of finding and fitting peaks, with the exception of GammaVision; it was also possible to verify that the automatic peak-fitting programs performed as well as, and sometimes even better than, manual peak-fitting software. (author)
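
    The simplest fitting approach described, a Gaussian on a linear background, can be sketched in a few lines (a generic illustration on synthetic data, not the algorithm of any package named above):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss_peak(x, area, centroid, sigma, b0, b1):
        """Gaussian peak plus a linear background."""
        g = area / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
        return g + b0 + b1 * x

    # Synthetic region around a 661.7 keV (137Cs-like) line, for demonstration only
    x = np.linspace(640.0, 680.0, 200)
    rng = np.random.default_rng(0)
    y = gauss_peak(x, 5000.0, 661.7, 1.2, 50.0, -0.02) + rng.normal(0.0, 8.0, x.size)

    popt, pcov = curve_fit(gauss_peak, x, y, p0=[4000.0, 662.0, 1.0, 40.0, 0.0])
    area, centroid, sigma = popt[:3]
    print(f"counts = {area:.0f}, centroid = {centroid:.2f} keV, FWHM = {2.355 * sigma:.2f} keV")
    ```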

  18. Characterization of phenols biodegradation by compound specific stable isotope analysis

    Science.gov (United States)

    Wei, Xi; Gilevska, Tetyana; Wenzig, Felix; Hans, Richnow; Vogt, Carsten

    2015-04-01

    ...-cresol degradation and 2.2±0.3‰ for m-cresol degradation, respectively. The carbon isotope fractionation patterns of phenol degradation differed more profoundly. Oxygen-dependent monooxygenation of phenol by A. calcoaceticus as the initial reaction yielded ε_C values of -1.5±0.02‰. In contrast, anaerobic degradation initiated by ATP-dependent carboxylation, performed by Thauera aromatica DSM 6984, produced no detectable fractionation (ε_C 0±0.1‰). D. cetonica showed a slight inverse carbon isotope fractionation (ε_C 0.4±0.1‰). In conclusion, a validated method for compound-specific stable isotope analysis was developed for phenolic compounds, and the first data set of carbon enrichment factors for the biodegradation of phenol and cresols with different activation mechanisms was obtained in the present study. Carbon isotope fractionation analysis is a potentially powerful tool to monitor the degradation of phenolic compounds in the environment.
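
    The enrichment factors ε_C quoted here are conventionally derived from the Rayleigh fractionation model (standard form, not reproduced in the record), with ε_C obtained as the slope of the double-logarithmic plot:

    ```latex
    % R_t / R_0: isotope ratio of the residual substrate relative to the start
    % C_t / C_0: residual substrate concentration relative to the start
    \ln\frac{R_t}{R_0} = \frac{\varepsilon_C}{1000}\,\ln\frac{C_t}{C_0}
    ```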

  19. Stable-isotope analysis: a neglected tool for placing parasites in food webs.

    Science.gov (United States)

    Sabadel, A J M; Stumbo, A D; MacLeod, C D

    2018-02-28

    Parasites are often overlooked in the construction of food webs, despite their ubiquitous presence in almost every type of ecosystem. Researchers who do recognize their importance often struggle to include parasites using classical food-web theory, mainly due to the parasites' multiple hosts and life stages. A novel approach using compound-specific stable-isotope analysis promises to provide considerable insight into the energetic exchanges of parasite and host, which may solve some of the issues inherent in incorporating parasites using a classical approach. Understanding the role of parasites within food webs, and tracing the associated biomass transfers, are crucial to constructing new models that will expand our knowledge of food webs. This mini-review focuses on stable-isotope studies published in the past decade, and introduces compound-specific stable-isotope analysis as a powerful, but underutilized, newly developed tool that may answer many unresolved questions regarding the role of parasites in food webs.

  20. Development of a new method for hydrogen isotope analysis of trace hydrocarbons in natural gas samples

    Directory of Open Access Journals (Sweden)

    Xibin Wang

    2016-12-01

    Full Text Available A new method has been developed for the analysis of the hydrogen isotopic composition of trace hydrocarbons in natural gas samples, using solid-phase microextraction (SPME) combined with gas chromatography-isotope ratio mass spectrometry (GC/IRMS). In this study, the SPME technique was introduced to achieve enrichment of trace, low-abundance hydrocarbons and was coupled to GC/IRMS for hydrogen isotopic analysis. The main parameters, including the equilibration time, extraction temperature and fiber type, were systematically optimized. The results demonstrated not only a high extraction yield but also that no hydrogen isotopic fractionation occurred during the extraction process when the SPME device was fitted with a polydimethylsiloxane/divinylbenzene/carbon molecular sieve (PDMS/DVB/CAR) fiber. The application of the SPME-GC/IRMS method was evaluated using natural gas samples collected from different sedimentary basins; the standard deviation (SD) was better than 4‰ for reproducible measurements, and hydrogen isotope values from C1 to C9 were obtained with satisfactory repeatability. The SPME-GC/IRMS method fitted with a PDMS/DVB/CAR fiber is well suited for the preconcentration of trace hydrocarbons, and provides reliable hydrogen isotopic analysis of trace hydrocarbons in natural gas samples.

  1. Error analysis of isotope dilution mass spectrometry method with internal standard

    International Nuclear Information System (INIS)

    Rizhinskii, M.W.; Vitinskii, M.Y.

    1989-02-01

    Computation algorithms for the normalized isotopic ratios and element concentrations obtained by isotope dilution mass spectrometry with an internal standard are presented. A procedure based on Monte-Carlo calculation is proposed for predicting the magnitude of the errors to be expected. The estimation of systematic and random errors is carried out for the certification of uranium and plutonium reference materials, as well as for the use of those reference materials in the analysis of irradiated nuclear fuels. 4 refs, 11 figs, 2 tabs
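
    The record describes a Monte-Carlo procedure for predicting the expected errors. A minimal sketch of that idea for a simplified isotope dilution equation (the equation and all input uncertainties are generic placeholders, not those of the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000                          # number of Monte-Carlo trials

    # Simplified IDMS relation c_x = c_s * (R_s - R_m) / (R_m - R_x);
    # all inputs drawn as Gaussians with placeholder means and uncertainties
    c_s = rng.normal(1.000, 0.002, n)    # spike concentration
    R_s = rng.normal(100.0, 0.10, n)     # isotope ratio of the spike
    R_x = rng.normal(0.0073, 1.0e-5, n)  # isotope ratio of the sample
    R_m = rng.normal(1.50, 0.003, n)     # measured ratio of the blend

    c_x = c_s * (R_s - R_m) / (R_m - R_x)
    print(f"mean = {c_x.mean():.4f}, relative std = {100.0 * c_x.std() / c_x.mean():.3f} %")
    ```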

  2. Direct uranium isotope ratio analysis of single micrometer-sized glass particles

    OpenAIRE

    Kappel, Stefanie; Boulyga, Sergei F.; Prohaska, Thomas

    2012-01-01

    We present the application of nanosecond laser ablation (LA) coupled to a ‘Nu Plasma HR’ multi-collector inductively coupled plasma mass spectrometer (MC-ICP-MS) for the direct analysis of U isotope ratios in single, 10–20 μm-sized, U-doped glass particles. Method development included studies with respect to (1) external correction of the measured U isotope ratios in glass particles, (2) the applied laser ablation carrier gas (i.e. Ar versus He) and (3) the accurate determination of lower abu...

  3. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open-source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines automated and manual components that free the user from computation, while giving them considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
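
    The key automated step, assigning measured lines to catalogued transitions within a frequency tolerance, reduces to a nearest-match query; a minimal sketch (the catalog entries and tolerance are invented for illustration, and this is not SPECdata code):

    ```python
    def assign_lines(measured_mhz, catalog, tol_mhz=0.1):
        """Assign each measured frequency to the nearest catalog entry within tol."""
        assignments = []
        for f in measured_mhz:
            label, f_cat = min(catalog, key=lambda entry: abs(entry[1] - f))
            if abs(f_cat - f) <= tol_mhz:
                assignments.append((f, label, f_cat - f))    # assigned line
            else:
                assignments.append((f, None, None))          # candidate new species
        return assignments

    # Invented two-entry catalog; real catalogs hold (label, frequency) by the thousands
    catalog = [("HC3N J=5-4", 45490.31), ("CH3OH 1(0)-0(0)", 48372.46)]
    for line in assign_lines([45490.28, 47211.90], catalog):
        print(line)
    ```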

  4. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments, and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high-quality, reliable product, commensurate with the criticality of the application. Testing results demonstrating that this goal has been achieved will be presented, along with the impact of introducing a formal software engineering framework on the UNARM product.

  5. Optimization and Validation of the Developed Uranium Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    γ-ray spectroscopy is a representative non-destructive assay (NDA) technique for nuclear material, and is less time-consuming and less expensive than destructive analysis. Destructive techniques are more precise, but correction algorithms can improve the performance of γ-ray spectroscopy. For this reason, an analysis code for uranium isotopic analysis has been developed by the Applied Nuclear Physics Group at Seoul National University. Overlapping γ- and x-ray peaks in the 89-101 keV X{sub α}-region are fitted with Gaussian and Lorentzian distribution peak functions together with tail and background functions. In this study, optimizations of the full-energy peak efficiency calibration and of the fitting parameters for peak tail and background are performed, and validated with 24-hour acquisitions of CRM uranium samples. The analysis performance is improved for HEU samples, but further optimization of the fitting parameters is required for LEU sample analysis. In the future, optimization of the fitting parameters with various types of uranium samples will be performed, and {sup 234}U isotopic analysis algorithms and correction algorithms (coincidence effect, self-attenuation effect) will be developed.
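
    One common way to combine the Gaussian and Lorentzian peak components named above is a pseudo-Voigt shape with shared centroid x₀ (a generic textbook form, not necessarily the exact parameterization of this code):

    ```latex
    % eta in [0,1] weights the Lorentzian vs. Gaussian contribution;
    % tail(x) and bkg(x) are the empirical tail and background terms
    f(x) = \eta\,L(x; x_0, \gamma) + (1-\eta)\,G(x; x_0, \sigma)
           + \mathrm{tail}(x) + \mathrm{bkg}(x)
    ```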

  6. Development of Software for Measurement and Analysis of Solar Radiation

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software is under development in LabVIEW for use with a StellarNet spectrometer system connected to a computer via USB. LabVIEW has capabilities in hardware interfacing, graphical user interfacing and mathematical calculation, including array manipulation and processing. The software reads data from the StellarNet spectrometer in real time and then processes it for analysis. Several measurements and analyses of solar radiation have been performed. Solar radiation comprises mainly infrared, visible light and ultraviolet. From solar radiation spectrum data, information on weather and plant suitability can be gathered and analyzed. Furthermore, optimization of the utilization of solar radiation, and the related safety precautions, can be planned. Using this software, more research and development in the utilization and safety of solar radiation can be explored. (author)

  7. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software package, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze swallowing motion data. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and with an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00 (P < 0.001). The software is expected to be useful for researchers who are interested in swallowing motion analysis.

  8. Analysis of transuranic isotopes in irradiated U3Si2-Al fuel by alpha spectrometry

    International Nuclear Information System (INIS)

    Dian Anggraini; Aslina B Ginting; Arif Nugroho

    2011-01-01

    Separation and analysis of transuranic isotopes (uranium and plutonium) in an irradiated U₃Si₂-Al plate have been performed. The experiment included sample preparation (cutting, dissolving, filtering, dilution), separation of fission products from heavy elements, and analysis of the transuranic isotope content with an alpha spectrometer. The separation of transuranic isotopes (U, Pu) was done by two methods: a direct method and an ion exchange method with zeolite. Measurements of a standard transuranic isotope source (AMR 43) and standard U₃O₈ were performed in advance in order to determine the percentage of ²³⁵U recovery and the detector efficiency. The recovery of ²³⁵U was 92.58%, which fulfills the validation requirement, and the detector efficiency was 0.314. Based on the measured recovery and detector efficiency, the separation was done by direct electrodeposition of 250 µL of irradiated U₃Si₂-Al solution; the deposited sample was subsequently analyzed with the alpha spectrometer. The separation with the ion exchanger was done by mixing and shaking 300 µL of irradiated U₃Si₂-Al solution with 0.5 g of zeolite to separate the liquid phase from the solid phase; the liquid phase was then electrodeposited and analyzed with the alpha spectrometer. The two methods gave different results: the heavy-element (²³⁸U, ²³⁶U, ²³⁴U, ²³⁹Pu) content obtained by the direct method was 0.0525 g/g with ²³⁵U = 0.0076 g/g, while the separation using the zeolite ion exchanger gave a heavy-element content of 0.0253 g/g with ²³⁵U = 0.0092 g/g. (author)

  9. Enhanced understanding of ectoparasite: host trophic linkages on coral reefs through stable isotope analysis

    Science.gov (United States)

    Demopoulos, Amanda W. J.; Sikkel, Paul C.

    2015-01-01

    Parasitism, although the most common type of ecological interaction, is usually ignored in food web models and studies of trophic connectivity. Stable isotope analysis is widely used in assessing the flow of energy in ecological communities and thus is a potentially valuable tool in understanding the cryptic trophic relationships mediated by parasites. In an effort to assess the utility of stable isotope analysis in understanding the role of parasites in complex coral-reef trophic systems, we performed stable isotope analysis on three common Caribbean reef fish hosts and two kinds of ectoparasitic isopods: temporarily parasitic gnathiids (Gnathia marleyi) and permanently parasitic cymothoids (Anilocra). To further track the transfer of fish-derived carbon (energy) from parasites to parasite consumers, gnathiids from host fish were also fed to captive Pederson shrimp (Ancylomenes pedersoni) for at least 1 month. Parasitic isopods had δ¹³C and δ¹⁵N values similar to their host, comparable with results from the small number of other host–parasite studies that have employed stable isotopes. Adult gnathiids were enriched in ¹⁵N and depleted in ¹³C relative to juvenile gnathiids, providing insights into the potential isotopic fractionation associated with blood-meal assimilation and subsequent metamorphosis. Gnathiid-fed Pederson shrimp also had δ¹³C values consistent with their food source and were enriched in ¹⁵N as predicted due to trophic fractionation. These results further indicate that stable isotopes can be an effective tool in deciphering cryptic feeding relationships involving parasites and their consumers, and the role of parasites and cleaners in carbon transfer in coral-reef ecosystems specifically.

  10. Enhanced understanding of ectoparasite–host trophic linkages on coral reefs through stable isotope analysis

    Directory of Open Access Journals (Sweden)

    Amanda W.J. Demopoulos

    2015-04-01

    Full Text Available Parasitism, although the most common type of ecological interaction, is usually ignored in food web models and studies of trophic connectivity. Stable isotope analysis is widely used in assessing the flow of energy in ecological communities and thus is a potentially valuable tool in understanding the cryptic trophic relationships mediated by parasites. In an effort to assess the utility of stable isotope analysis in understanding the role of parasites in complex coral-reef trophic systems, we performed stable isotope analysis on three common Caribbean reef fish hosts and two kinds of ectoparasitic isopods: temporarily parasitic gnathiids (Gnathia marleyi) and permanently parasitic cymothoids (Anilocra). To further track the transfer of fish-derived carbon (energy) from parasites to parasite consumers, gnathiids from host fish were also fed to captive Pederson shrimp (Ancylomenes pedersoni) for at least 1 month. Parasitic isopods had δ¹³C and δ¹⁵N values similar to their host, comparable with results from the small number of other host–parasite studies that have employed stable isotopes. Adult gnathiids were enriched in ¹⁵N and depleted in ¹³C relative to juvenile gnathiids, providing insights into the potential isotopic fractionation associated with blood-meal assimilation and subsequent metamorphosis. Gnathiid-fed Pederson shrimp also had δ¹³C values consistent with their food source and were enriched in ¹⁵N as predicted due to trophic fractionation. These results further indicate that stable isotopes can be an effective tool in deciphering cryptic feeding relationships involving parasites and their consumers, and the role of parasites and cleaners in carbon transfer in coral-reef ecosystems specifically.

  11. Carbon isotope analysis in apple nectar beverages

    Directory of Open Access Journals (Sweden)

    Ricardo Figueira

    2013-03-01

    Full Text Available The aims of this study were to use isotope analysis to quantify the carbon of the C3 photosynthetic cycle in commercial apple nectars and to determine the legal limit for identifying beverages that do not conform to the safety standards established by the Brazilian Ministry of Agriculture, Livestock and Food Supply. These beverages (apple nectars) were produced in the laboratory according to the Brazilian legislation. Adulterated nectars were also produced, with an amount of pulp juice below the permitted threshold limit value. The δ¹³C values of the apple nectars and their fractions (pulp and purified sugar) were measured to quantify the C3 source percentage. In order to demonstrate the existence of adulteration, the values found were compared to the limit values established by Brazilian law. All commercial apple nectars analyzed were within the legal limits, which made it possible to identify the nectars that were in conformity with Brazilian law. The isotopic methodology developed proved efficient for quantifying the carbon of C3 origin in commercial apple nectars.

  12. Monitoring of the aerobe biodegradation of chlorinated organic solvents by stable isotope analysis

    Science.gov (United States)

    Horváth, Anikó; Futó, István; Palcsu, László

    2014-05-01

    Our chemical-biological basic research aims to eliminate chlorinated environmental contaminants from aquifers around industrial areas, in the frame of a research program supported by the European Social Fund (TÁMOP-4.2.2.A-11/1/KONV-2012-0043). The most careful and simplest approach is in situ biodegradation with the help of cultured, compound-specific strains. Numerous members of the genus Pseudomonas are well known for bioremediation: they can metabolize hazardous environmental chemicals such as gas oils, dyes, and organic solvents. Our research is based on the Pseudomonas putida F1 strain because of its ability to degrade halogenated hydrocarbons such as trichloroethylene. Several methods were investigated to estimate the rate of biodegradation, such as measurement of the pollutant concentration along the contamination pathway, microcosm studies, and compound-specific stable isotope analysis. In this area of the Transcarpathian basin we are pioneers in the stable isotope monitoring of biodegradation. The main goal is to determine stable isotope fractionation factors by stable isotope analysis, which can help us estimate the rate and effectiveness of the biodegradation. The subsequent research period includes investigation of the method, testing its feasibility and adapting it in the environment. Last but not least, the research gives an opportunity to identify the producer of the contaminant based on the stable isotope composition of the contaminant.

  13. PyPWA: A partial-wave/amplitude analysis software framework

    Science.gov (United States)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for partial-wave and amplitude analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: a general shell where amplitude parameters (or any parametric model) are estimated from the data; this branch also includes software to produce simulated data sets using the fitted amplitudes. The second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface to the computing resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.
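
    Parameter estimation of the kind the general shell performs, fitting amplitude parameters by maximizing a likelihood over observed events, can be sketched generically (a toy one-parameter model on placeholder data, not PyPWA's API):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def intensity(theta, a):
        """Toy one-parameter 'amplitude' model: I(theta) = 1 + a * cos^2(theta)."""
        return 1.0 + a * np.cos(theta) ** 2

    def nll(params, events):
        """Negative log-likelihood with numerical normalization over phase space."""
        a = params[0]
        grid = np.linspace(0.0, np.pi, 2001)
        norm = np.trapz(intensity(grid, a), grid)
        return -np.sum(np.log(intensity(events, a) / norm))

    rng = np.random.default_rng(1)
    events = rng.uniform(0.0, np.pi, 5000)       # placeholder "data"
    fit = minimize(nll, x0=[0.5], args=(events,), method="Nelder-Mead")
    print("fitted a =", fit.x[0])
    ```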

  14. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...

  15. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    International Nuclear Information System (INIS)

    Elmont, T.H.; Langner, Diana C.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Modenov, A.

    2005-01-01

    This report describes the software development for the plutonium attribute verification system, AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks of the software development now underway are formulated, and the development tasks are illustrated with software structural flowcharts, a measurement-system state diagram, and a description of the software. The current status of the AVNG software development is elucidated.

  16. Direct U isotope analysis in μm-sized particles by LA-MC-ICPMS

    International Nuclear Information System (INIS)

    Kappel, S.; Boulyga, S.F.; Prohaska, T.

    2009-01-01

    Full text: Knowledge of the isotopic composition of individual μm-sized hot particles is of great interest, especially for strengthened nuclear safeguards, in order to identify undeclared nuclear activities. We present the potential of a 'Nu Plasma HR' MC-ICPMS coupled to a New Wave 'UP 193' laser ablation (LA) system for the direct analysis of U isotope abundance ratios in individual μm-sized particles. The ability to determine ²³⁴U/²³⁸U and ²³⁵U/²³⁸U isotope ratios was successfully demonstrated in the NUSIMEP-6 interlaboratory comparison, which was organized by the IRMM (Geel, Belgium). (author)

  17. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    Science.gov (United States)

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas and of the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially more robust automated data analysis with enhanced accuracy is required. In this contribution, automatic data analysis software that allows fast and precise quantitative analysis of time-of-flight mass spectrometric data is presented and discussed in detail. A crucial part of this software is a robust and fast peak-finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10⁴. We show that the accuracy of isotope ratios is in fact proportional to SNR⁻¹. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution, and that it depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Science.gov (United States)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of these applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
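
    As a flavor of the SMT interface described here, a minimal satisfiability query in Z3's Python bindings (a generic toy constraint, not drawn from the talk):

    ```python
    from z3 import Ints, Solver, sat   # pip install z3-solver

    x, y = Ints("x y")
    s = Solver()
    # Is there a state satisfying these arithmetic constraints?
    s.add(x + y == 10, x > 0, y > 0, x * x + y * y < 60)

    if s.check() == sat:
        print("satisfiable:", s.model())   # a concrete witness, e.g. x=5, y=5
    else:
        print("unsatisfiable")
    ```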

  19. Isotopic analysis of bullet lead samples

    International Nuclear Information System (INIS)

    Sankar Das, M.; Venkatasubramanian, V.S.; Sreenivas, K.

    1976-01-01

    The possibility of using the isotopic composition of lead for the identification of bullet lead is investigated. Lead from several spent bullets was converted to lead sulphide and analysed for isotopic abundances using an MS-7 mass spectrometer. The variation in ²⁰⁴Pb abundance was too small to permit differentiation, while the range of variation of ²⁰⁶Pb and ²⁰⁷Pb, and the better precision of their analyses, permitted differentiating samples from one another. The correlation among the samples examined is pointed out. The method is complementary to characterisation of bullet leads by trace element composition. The possibility of using isotopically enriched lead for tagging bullet lead is pointed out. (author)

  20. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I&C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow; hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I&C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I&C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I&C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I&C software failure events from actual non-ABWR digital I&C software failure events reported to the LER of USNRC or the IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  1. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation, because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, like TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 Standards. Our outcomes, in general, may be used by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  2. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time-consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data, as well as crowdsourcing techniques, to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  3. Suitability of selected free-gas and dissolved-gas sampling containers for carbon isotopic analysis.

    Science.gov (United States)

    Eby, P; Gibson, J J; Yi, Y

    2015-07-15

    Storage trials were conducted for 2 to 3 months using a hydrocarbon and carbon dioxide gas mixture with known carbon isotopic composition to simulate typical hold times for gas samples prior to isotopic analysis. A range of containers (both pierced and unpierced) was periodically sampled to test for δ¹³C isotopic fractionation. Seventeen containers were tested for free-gas storage (20°C, 1 atm pressure) and 7 containers were tested for dissolved-gas storage, the latter prepared by bubbling free gas through tap water until saturated (20°C, 1 atm) and then preserved to avoid biological activity by acidifying to pH 2 with phosphoric acid and stored in the dark at 5°C. Samples were extracted using valves or by piercing septa, and then introduced into an isotope ratio mass spectrometer for compound-specific δ¹³C measurements. For free gas, stainless steel canisters and crimp-top glass serum bottles with butyl septa were most effective at preventing isotopic fractionation (pierced and unpierced), whereas silicone and PTFE-butyl septa allowed significant isotopic fractionation. FlexFoil and Tedlar bags were found to be effective only for storage of up to 1 month. For dissolved gas, crimp-top glass serum bottles with butyl septa were again effective, whereas silicone and PTFE-butyl were not. FlexFoil bags were reliable for up to 2 months. Our results suggest a range of preferred containers as well as several that did not perform very well for isotopic analysis. Overall, the results help establish better QA/QC procedures to avoid isotopic fractionation when storing environmental gas samples. Recommended containers for air transportation include steel canisters and glass serum bottles with butyl septa (pierced and unpierced). Copyright © 2015 John Wiley & Sons, Ltd.

  4. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C

    2001-01-01

    To compare two software versions provided by Lunar Co. for assessment of body composition analysis by DXA.

  5. Application of stable isotope analysis to study temporal changes in foraging ecology in a highly endangered amphibian.

    Directory of Open Access Journals (Sweden)

    J Hayley Gillespie

    Full Text Available Understanding dietary trends for endangered species may be essential to assessing the effects of ecological disturbances such as habitat modification, species introductions or global climate change. Documenting temporal variation in prey selection may also be crucial for understanding population dynamics. However, the rarity, secretive behaviours and obscure microhabitats of some endangered species can make direct foraging observations difficult or impossible. Furthermore, the lethality or invasiveness of some traditional methods of dietary analysis (e.g. gut contents analysis, gastric lavage makes them inappropriate for such species. Stable isotope analysis facilitates non-lethal, indirect analysis of animal diet that has unrealized potential in the conservation of endangered organisms, particularly amphibians.I determined proportional contributions of aquatic macroinvertebrate prey to the diet of an endangered aquatic salamander Eurycea sosorum over a two-year period using stable isotope analysis of (13/12C and (15/14N and the Bayesian stable isotope mixing model SIAR. I calculated Strauss' dietary electivity indices by comparing these proportions with changing relative abundance of potential prey species through time. Stable isotope analyses revealed that a previously unknown prey item (soft-bodied planarian flatworms in the genus Dugesia made up the majority of E. sosorum diet. Results also demonstrate that E. sosorum is an opportunistic forager capable of diet switching to include a greater proportion of alternative prey when Dugesia populations decline. There is also evidence of intra-population dietary variation.Effective application of stable isotope analysis can help circumvent two key limitations commonly experienced by researchers of endangered species: the inability to directly observe these species in nature and the invasiveness or lethality of traditional methods of dietary analysis. This study illustrates the feasibility of stable

  6. Copper and tin isotopic analysis of ancient bronzes for archaeological investigation: development and validation of a suitable analytical methodology.

    Science.gov (United States)

    Balliana, Eleonora; Aramendía, Maite; Resano, Martin; Barbante, Carlo; Vanhaecke, Frank

    2013-03-01

    Although in many cases Pb isotopic analysis can be relied on for provenance determination of ancient bronzes, sometimes the use of "non-traditional" isotopic systems, such as those of Cu and Sn, is required. The work reported on in this paper aimed at revising the methodology for Cu and Sn isotope ratio measurements in archaeological bronzes via optimization of the analytical procedures in terms of sample pre-treatment, measurement protocol, precision, and analytical uncertainty. For Cu isotopic analysis, both Zn and Ni were investigated for their merit as internal standard (IS) relied on for mass bias correction. The use of Ni as IS seems to be the most robust approach as Ni is less prone to contamination, has a lower abundance in bronzes and an ionization potential similar to that of Cu, and provides slightly better reproducibility values when applied to NIST SRM 976 Cu isotopic reference material. The possibility of carrying out direct isotopic analysis without prior Cu isolation (with AG-MP-1 anion exchange resin) was investigated by analysis of CRM IARM 91D bronze reference material, synthetic solutions, and archaeological bronzes. Both procedures (Cu isolation/no Cu isolation) provide similar δ⁶⁵Cu results with similar uncertainty budgets in all cases (±0.02-0.04 per mil in delta units, k = 2, n = 4). Direct isotopic analysis of Cu therefore seems feasible, without evidence of spectral interference or matrix-induced effect on the extent of mass bias. For Sn, a separation protocol relying on TRU-Spec anion exchange resin was optimized, providing a recovery close to 100 % without on-column fractionation. Cu was recovered quantitatively together with the bronze matrix with this isolation protocol. Isotopic analysis of this Cu fraction provides δ⁶⁵Cu results similar to those obtained upon isolation using AG-MP-1 resin. This means that Cu and Sn isotopic analysis of bronze alloys can therefore be carried out after a single chromatographic

  7. The use of prime implicants in dependability analysis of software controlled systems

    International Nuclear Information System (INIS)

    Yau, Michael; Apostolakis, George; Guarro, Sergio

    1998-01-01

    The behavior of software controlled systems is usually non-binary and dynamic. It is, thus, convenient to employ multi-valued logic to model these systems. Multi-valued logic functions can be used to represent the functional and temporal relationships between the software and hardware components. The resulting multi-valued logic model can be analyzed deductively, i.e. by tracking causality in reverse from undesirable 'top' events to identify faults that may be present in the system. The result of this deductive analysis is a set of prime implicants for a user-defined system top event. The prime implicants represent all the combinations of basic component conditions and software input conditions that may result in the top event; they are the extension to multi-valued logic of the concept of minimal cut sets that is used routinely in the analysis of binary fault trees. This paper discusses why prime implicants are needed in the dependability analysis of software controlled systems, how they are generated, and how they are used to identify faults in a software controlled system
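
    For the binary special case mentioned at the end, minimal cut sets of a fault tree, the bottom-up expansion can be sketched in a few lines (a generic MOCUS-style illustration with an invented tree, not the authors' multi-valued algorithm):

    ```python
    from itertools import product

    # Invented fault tree: gate name -> (type, children); leaves are basic events
    tree = {
        "TOP": ("OR",  ["G1", "E3"]),
        "G1":  ("AND", ["E1", "G2"]),
        "G2":  ("OR",  ["E2", "E3"]),
    }

    def cut_sets(node):
        """Return all cut sets (frozensets of basic events) that fail this node."""
        if node not in tree:                         # basic event
            return [frozenset([node])]
        kind, children = tree[node]
        child_sets = [cut_sets(c) for c in children]
        if kind == "OR":                             # any child's cut set suffices
            return [cs for sets in child_sets for cs in sets]
        combined = []                                # AND: combine one set per child
        for combo in product(*child_sets):
            combined.append(frozenset().union(*combo))
        return combined

    def minimal(sets):
        """Discard any cut set that strictly contains another (keep minimal ones)."""
        return [s for s in sets if not any(t < s for t in sets)]

    print(minimal(cut_sets("TOP")))   # -> [frozenset({'E1', 'E2'}), frozenset({'E3'})]
    ```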

  8. The use of prime implicants in dependability analysis of software controlled systems

    Energy Technology Data Exchange (ETDEWEB)

    Yau, Michael; Apostolakis, George; Guarro, Sergio

    1998-11-01

    The behavior of software controlled systems is usually non-binary and dynamic. It is, thus, convenient to employ multi-valued logic to model these systems. Multi-valued logic functions can be used to represent the functional and temporal relationships between the software and hardware components. The resulting multi-valued logic model can be analyzed deductively, i.e. by tracking causality in reverse from undesirable 'top' events to identify faults that may be present in the system. The result of this deductive analysis is a set of prime implicants for a user-defined system top event. The prime implicants represent all the combinations of basic component conditions and software input conditions that may result in the top event; they are the extension to multi-valued logic of the concept of minimal cut sets that is used routinely in the analysis of binary fault trees. This paper discusses why prime implicants are needed in the dependability analysis of software controlled systems, how they are generated, and how they are used to identify faults in a software controlled system.
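
    A minimal sketch may make the prime-implicant idea concrete. The fragment below enumerates prime implicants of an ordinary two-valued (Boolean) top-event function by Quine-McCluskey style merging; the multi-valued case treated in the paper generalizes the same merge-until-fixpoint idea. All names and the example top event are hypothetical illustrations, not the authors' code.

```python
from itertools import combinations

def prime_implicants(minterms, n_vars):
    """Brute-force Quine-McCluskey style merging for a binary function.

    minterms: integers on which the function (the 'top event') is 1.
    Returns implicants as strings over {'0', '1', '-'}; '-' means don't-care.
    """
    current = {format(m, f"0{n_vars}b") for m in minterms}
    primes = set()
    while current:
        merged, used = set(), set()
        for a, b in combinations(sorted(current), 2):
            # Two implicants merge if they differ in exactly one fixed bit.
            diff = [i for i in range(n_vars) if a[i] != b[i]]
            if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
                merged.add(a[:diff[0]] + '-' + a[diff[0] + 1:])
                used.update((a, b))
        primes |= current - used  # anything that merged no further is prime
        current = merged
    return primes

# Example: top event occurs for minterms 001, 011, 111 of variables (x, y, z)
print(sorted(prime_implicants({0b001, 0b011, 0b111}, 3)))  # ['-11', '0-1']
```

    Each returned pattern is a combination of basic conditions that by itself suffices to cause the top event and cannot be generalized further, which is exactly the multi-valued analogue of a minimal cut set described above.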

  9. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective...

  10. Capi text V.1--data analysis software for nailfold skin capillaroscopy.

    Science.gov (United States)

    Dobrev, Hristo P

    2007-01-01

    Nailfold skin capillaroscopy is a simple non-invasive method used to assess conditions of disturbed microcirculation such as Raynaud's phenomenon, acrocyanosis, perniones, connective tissue diseases, psoriasis, diabetes mellitus, neuropathy and vibration disease. To develop data analysis software aimed at assisting the documentation and analysis of a capillaroscopic investigation. SOFTWARE DESCRIPTION: The programme is based on a modular principle. The module "Nomenclatures" includes menus for the patients' data. The module "Examinations" includes menus for all general and specific aspects of the medical examination and capillaroscopic investigations. The modules "Settings" and "Information" include customization menus for the programme. The results of nailfold capillaroscopy can be printed in short or expanded form. This software allows physicians to perform quick searches using various specified criteria and to prepare analyses and reports. This software programme will facilitate any practitioner who performs nailfold skin capillaroscopy.

  11. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides automated image capture, test correlation, and autocorrelation analysis capabilities for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that allows a simple user interface and output capabilities in the windows environment.

  12. Isotopic analysis using optical spectroscopy (1963)

    International Nuclear Information System (INIS)

    Gerstenkorn, S.

    1963-01-01

    The isotopic displacement in the atomic lines of certain elements (H, He, Li, Ne, Sr, Hg, Pb, U, Pu) is used for the isotopic assay of these elements. The Fabry-Perot photo-electric interference spectrometer is shown to be particularly well adapted to this sort of problem: in each case we give, on the one hand, the essential results obtained with this apparatus and, on the other hand, the results previously obtained with a conventional apparatus (grating, photographic plate). Together, these results give an idea of the possibilities of optical spectroscopy: in the best case, the precision which may be expected is of the order of 1 to 2 per cent for isotopes whose concentration is about 1 per cent. (author) [fr

  13. An assessment of software for flow cytometry analysis in banana plants

    Directory of Open Access Journals (Sweden)

    Renata Alves Lara Silva

    2014-02-01

    Flow cytometry is a technique that yields rapid results in analyses of cell properties such as volume, morphological complexity and quantitative DNA content, and it is considered more convenient than other techniques. However, the analysis usually generates histograms marked by variations that can be produced by many factors, including differences between the software packages that capture the data generated by the flow cytometer. The objective of the present work was to evaluate the performance of four software products commonly used in flow cytometry, based on quantifications of DNA content and analyses of the coefficients of variation associated with the software outputs. Readings were obtained from 25 'NBA' (AA) banana leaf samples using the FACSCalibur (BD) flow cytometer, and 25 histograms from each software product (CellQuest™, WinMDI™, FlowJo™ and FCS Express™) were analyzed to obtain the estimated DNA content and the coefficient of variation (CV) of the estimates. The values of DNA content obtained from the software did not differ significantly. However, the CV analysis showed that the precision of the WinMDI™ software was low and that its CV values were underestimated, whereas the remaining software showed CV values in relatively close agreement with those found in the literature. The CellQuest™ software is recommended because it was developed by the same company that produces the flow cytometer used in the present study.
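
    Since the comparison above hinges on the coefficient of variation of the fluorescence peak, it is worth noting how little computation that statistic involves. A minimal sketch follows (NumPy assumed, synthetic data); real packages typically report the CV of a gated G0/G1 peak, but the definition is the same.

```python
import numpy as np

def peak_cv(fluorescence):
    """Plain coefficient of variation, CV = sd / mean * 100, of a gated peak."""
    f = np.asarray(fluorescence, dtype=float)
    return f.std(ddof=1) / f.mean() * 100.0

# Hypothetical G0/G1 peak channel values pooled from one histogram
rng = np.random.default_rng(42)
peak = rng.normal(loc=200.0, scale=6.0, size=2000)
print(f"CV = {peak_cv(peak):.2f} %")  # about 3 %, typical of a good run
```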

  14. Model extension and improvement for simulator-based software safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H.-W. [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China) and Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China)]. E-mail: hwhwang@iner.gov.tw; Shih Chunkuan [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China); Yih Swu [Department of Computer Science and Information Engineering, Ching Yun University, 229 Chien-Hsin Road, Jung-Li, Taoyuan County 320, Taiwan (China); Chen, M.-H. [Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Lin, J.-M. [Taiwan Power Company (TPC), 242 Roosevelt Road, Section 3, Taipei 100, Taiwan (China)

    2007-05-15

    One of the major concerns when employing digital I and C systems in nuclear power plants is that a digital system may introduce new failure modes, which differ from those of previous analog I and C systems. Various techniques are under development to analyze the hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these are static analysis methods that cannot capture dynamic behavior or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in IEEE Std 7-4.3.2-2003 (IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I and C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. The benchmark against the ABWR SAR proves that this modified model is capable of accomplishing dynamic system-level software safety analysis and performs better than the static methods. This improved plant simulation can then be further applied to hazard analysis of operator/digital I and C interface interaction failures, and to hardware-in-the-loop fault injection studies.

  15. Uses of stable isotopes

    International Nuclear Information System (INIS)

    Axente, Damian

    1998-01-01

    The most important fields of stable isotope use are presented, with examples: 1. Isotope dilution analysis: trace analysis, measurements of volumes and masses; 2. Stable isotopes as tracers: transport phenomena, environmental studies, agricultural research, authentication of products and objects, archaeometry, studies of reaction mechanisms, structure and function determination of complex biological entities, studies of metabolism, breath tests for diagnostics; 3. Isotope equilibrium effects: measurement of equilibrium effects, investigation of equilibrium conditions, mechanism of drug action, study of natural processes, the water cycle, temperature measurements; 4. Stable isotopes for advanced nuclear reactors: uranium nitride with 15 N as nuclear fuel, 157 Gd for reactor control. In spite of some difficulties of stable isotope use, particularly related to the analytical techniques, which are slow and expensive, the number of papers reporting on this subject is growing steadily, as is the number of scientific meetings organized by the International Isotope Section and the IAEA, the Gordon Conferences, and regional meetings in Germany, France, etc. Large-scale development of stable isotope applications depends on improving their production technologies, as well as those of labeled compounds, and the analytical techniques. (author)
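
    For the isotope dilution analysis listed under item 1, the unknown amount follows from a simple isotope mass balance. A standard textbook form is sketched below; the symbols are generic rather than taken from the paper, with N_x the analyte amount, N_s the spike amount (both counted in moles of the reference isotope), and R_s, R_x, R_m the isotope ratios of spike, sample, and mixture.

```latex
% Generic two-isotope dilution mass balance
\[
  N_x \;=\; N_s \,\frac{R_s - R_m}{R_m - R_x}
\]
```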

  16. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach.
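
    The brute-force baseline mentioned in the last sentence simply generates every product and analyzes each in isolation, which is exponential in the number of features; a schematic sketch (hypothetical feature names, stand-in analysis function) shows why sharing work across configurations pays off.

```python
from itertools import product as configurations

FEATURES = ["LOGGING", "CACHE", "CRYPTO"]  # hypothetical optional features

def analyze(config):
    """Stand-in for an intraprocedural dataflow analysis of one product."""
    return f"analyzed product with features {sorted(config) or 'none'}"

# Brute force: 2^n products must be generated and analyzed separately.
for bits in configurations([False, True], repeat=len(FEATURES)):
    config = {f for f, on in zip(FEATURES, bits) if on}
    print(analyze(config))
```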

  17. Spectrum analysis in lead spectrometer for isotopic fissile assay in used fuel

    International Nuclear Information System (INIS)

    Lee, Y.D.; Park, C.J.; Kim, H.D.; Song, K.C.

    2014-01-01

    The lead slowing-down spectrometer (LSDS) system is under development for analyzing isotopic fissile content, applicable in a hot cell for the pyro process. The fuel assay area and nuclear material composition were selected for simulation. The source mechanism for efficient neutron generation was also determined. Neutrons are produced at the Ta target when it is struck by accelerated electrons. The parameters of the electron accelerator are being researched for cost effectiveness, easy maintenance, and compact size. The basic principle of the LSDS is that each fissile isotope has its own fission structure below the unresolved resonance region. The source neutrons interact with the lead medium and produce a continuous neutron energy spectrum, which generates the dominant fission response of each fissile isotope. Therefore, spectrum analysis in the lead medium and the fuel area is very important for the system to work. The energy spectrum with respect to slowing-down energy and the energy resolution were investigated in lead. A spectrum analysis was done in the presence of the surrounding detectors. In particular, high resonance energies were considered. The spectrum was well organized at each slowing-down energy, and the energy resolution was acceptable for distinguishing the fissions of the different fissile isotopes. Additionally, the LSDS is applicable to the optimum design of spent fuel storage and management. The isotopic fissile content assay will increase the transparency and credibility of spent fuel storage and its re-utilization, as demanded internationally. (author)
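
    The slowing-down physics behind the LSDS reduces to one well-known correlation between the mean neutron energy and the slowing-down time in lead; the constants below indicate the usual order of magnitude only and are not taken from this paper.

```latex
% Mean neutron energy versus slowing-down time t in a lead pile
\[
  E(t) \;\approx\; \frac{K}{(t + t_0)^2}
\]
% K is of the order of 1.6e5 eV.us^2 and t_0 a fraction of a microsecond, so
% each slowing-down time window probes a narrow, predictable energy band.
```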

  18. A Practical Cryogen-Free CO2 Purification and Freezing Technique for Stable Isotope Analysis.

    Science.gov (United States)

    Sakai, Saburo; Matsuda, Shinichi

    2017-04-18

    Since isotopic analysis by mass spectrometry began in the early 1900s, sample gases for light-element isotopic measurements have been purified by the use of cryogens and vacuum-line systems. However, this conventional purification technique can achieve only certain temperatures that depend on the cryogens, and these can be sustained only as long as there is a continuous cryogen supply. Here, we demonstrate a practical cryogen-free CO2 purification technique using an electrically operated cryocooler for stable isotope analysis. This approach is based on portable free-piston Stirling cooling technology and controls the temperature to an accuracy of 0.1 °C over a range from room temperature to -196 °C (liquid-nitrogen temperature). The lowest temperature can be reached in as little as 10 min. We successfully purified CO2 gas generated by the reaction of carbonates with phosphoric acid and found its sublimation point to be -155.6 °C at 0.1 Torr in the vacuum line. This means that the temperature required for CO2 trapping is much higher than the liquid-nitrogen temperature. Our portable cooling system frees stable isotope analysis from the inconvenience of cryogen use. It also offers a new cooling method applicable to a number of fields that use gas measurements.

  19. Field ionization mass spectrometry (FIMS) applied to tracer studies and isotope dilution analysis

    International Nuclear Information System (INIS)

    Anbar, M.; Heck, H.d'A.; McReynolds, J.H.; St John, G.A.

    1975-01-01

    The nonfragmenting nature of field ionization mass spectrometry makes it a preferred technique for the isotopic analysis of multilabeled organic compounds. The possibility of field ionization of nonvolatile thermolabile materials significantly extends the potential uses of this technique beyond those of conventional ionization methods. Multilabeled tracers may be studied in biological systems with a sensitivity comparable to that of radioactive tracers. Isotope dilution analysis may be performed reliably by this technique down to picogram levels. These techniques are illustrated by a number of current studies using multilabeled metabolites and drugs. The scope and limitations of the methodology are discussed.

  20. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  1. Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.

    Science.gov (United States)

    Haramija, Marko

    2018-03-01

    Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
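
    At its core, a software PIS is a filter over stored MS/MS spectra that keeps those containing a diagnostic product ion; the sketch below (hypothetical data layout, instrument-independent) uses the well-known HexNAc oxonium ion at m/z 204.087 as the diagnostic ion. A software NLS would differ only in testing each fragment against the precursor m/z minus a fixed neutral loss.

```python
def precursor_ion_scan(spectra, product_mz, tol=0.05):
    """Software PIS: return the precursor m/z of every MS/MS spectrum that
    contains a fragment within +/- tol of the diagnostic product ion."""
    hits = []
    for precursor, peaks in spectra:  # peaks: list of (fragment_mz, intensity)
        if any(abs(mz - product_mz) <= tol for mz, _ in peaks):
            hits.append(precursor)
    return hits

# Hypothetical spectra; m/z 204.087 is the HexNAc oxonium diagnostic ion
demo = [(933.4, [(204.087, 1.0e4), (366.140, 5.0e3)]),
        (655.2, [(147.070, 2.0e3)])]
print(precursor_ion_scan(demo, 204.087))  # -> [933.4]
```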

  2. A parallel and sensitive software tool for methylation analysis on multicore platforms.

    Science.gov (United States)

    Tárraga, Joaquín; Pérez, Mariano; Orduña, Juan M; Duato, José; Medina, Ignacio; Dopazo, Joaquín

    2015-10-01

    DNA methylation analysis suffers from very long processing times, as the advent of Next-Generation Sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that performs the analysis of these samples. The existing software for methylation analysis does not seem to scale efficiently either with the size of the dataset or with the length of the reads to be analyzed. As the sequencers are expected to provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. We present a new software tool, called HPG-Methyl, which efficiently maps bisulphite sequencing reads onto DNA, analyzing DNA methylation. The strategy used by this software consists of leveraging the speed of the Burrows-Wheeler Transform to map a large number of DNA fragments (reads) rapidly, as well as the accuracy of the Smith-Waterman algorithm, which is employed exclusively to deal with the most ambiguous and shortest reads. Experimental results on platforms with Intel multicore processors show that HPG-Methyl significantly outperforms state-of-the-art software such as Bismark, BS-Seeker or BSMAP in both execution time and sensitivity, particularly for long bisulphite reads. Software in the form of C libraries and functions, together with instructions to compile and execute this software. Available by sftp to anonymous@clariano.uv.es (password 'anonymous'). juan.orduna@uv.es or jdopazo@cipf.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
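
    The enabling trick in bisulphite read mapping with a BWT index is that unmethylated cytosines read as thymines, so reads and reference are converted in silico before alignment; methylation is then called wherever an aligned read retains a C opposite a reference C. The sketch below shows only that conversion step and is not HPG-Methyl's actual code.

```python
def bisulfite_convert(seq, strand="forward"):
    """In-silico bisulfite conversion applied before BWT mapping:
    C -> T for forward-strand reads, G -> A for reverse-strand reads."""
    seq = seq.upper()
    return seq.replace("C", "T") if strand == "forward" else seq.replace("G", "A")

print(bisulfite_convert("ACGTCGATCG"))  # ATGTTGATTG
```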

  3. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    Science.gov (United States)

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) are increasingly being developed. Although the CAQDAS has been there for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions that are associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with CAQDAS. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS (NVivo) in order to provide evidence-based implications of using the software. The key message is that unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS are basically data management packages, which support the researcher during analysis.

  4. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting more attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT(®) imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Before- and after-treatment data were analyzed using t-tests. The reliability test using the intraclass correlation coefficient was stronger for InVivoDental5.0 (0.83-0.98) compared with 3DCeph™ (0.51-0.90). Paired t-test comparison of the two packages shows no statistically significant difference in the measurements made with them. InVivoDental5.0 measurements are more reproducible and user-friendly when compared to 3DCeph™. There is no statistical difference between the two packages in linear or angular measurements. 3DCeph™ is more time-consuming in performing three-dimensional analysis compared with InVivoDental5.0.

  5. Validation of multi-element isotope dilution ICPMS for the analysis of basalts

    Energy Technology Data Exchange (ETDEWEB)

    Willbold, M.; Jochum, K.P.; Raczek, I.; Amini, M.A.; Stoll, B.; Hofmann, A.W. [Max-Planck-Institut fuer Chemie, Mainz (Germany)

    2003-09-01

    In this study we have validated a newly developed multi-element isotope dilution (ID) ICPMS method for the simultaneous analysis of up to 12 trace elements in geological samples. By evaluating the analytical uncertainty of individual components using certified reference materials, we have quantified the overall analytical uncertainty of the multi-element ID ICPMS method at 1-2%. Individual components include sampling/weighing, purity of reagents, purity of spike solutions, calibration of spikes, determination of isotopic ratios, instrumental sources of error, correction of the mass discrimination effect, values of constants, and operator bias. We have used the ID-determined trace elements for internal standardization to indirectly improve the analysis of 14 other, mainly mono-isotopic, trace elements by external calibration. The overall analytical uncertainty for those data is about 2-3%. In addition, we have analyzed USGS and MPI-DING geological reference materials (BHVO-1, BHVO-2, KL2-G, ML3B-G) to quantify the overall bias of the measurement procedure. Trace element analysis of the geological reference materials yielded results that mostly agree within about 2-3% relative to the reference values. Since these results match the conclusions obtained from the investigation of the overall analytical uncertainty, we take this as a measure of the validity of multi-element ID ICPMS. (orig.)
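
    Combining the individual components into the quoted 1-2% overall figure follows the usual root-sum-of-squares rule for independent relative standard uncertainties; the sketch below uses invented component values purely for illustration.

```python
import math

# Hypothetical relative standard uncertainties of individual components, in %
components = {
    "sampling/weighing": 0.05,
    "spike calibration": 0.6,
    "isotope ratio measurement": 0.5,
    "mass discrimination correction": 0.4,
    "reagent purity": 0.2,
}
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty: {u_combined:.2f} %")  # about 0.9 %
```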

  6. Composite waste analysis system

    International Nuclear Information System (INIS)

    Wachter, J.R.; Hagan, R.C.; Bonner, C.A.; Malcom, J.E.; Camp, K.L.

    1993-01-01

    Nondestructive analysis (NDA) of radioactive waste forms an integral component of nuclear materials accountability programs and of waste characterization acceptance criteria. However, waste measurements are often complicated by unknown isotopic compositions and by the potential for concealment of special nuclear materials in a manner that is transparent to gamma-ray measurement instruments. To overcome these complications, a new NDA measurement system has been developed to assay special nuclear material in both transuranic and low-level waste from the same measurement platform. The system incorporates a NaI detector and customized commercial software routines to measure small quantities of radioactive material in low-level waste. Transuranic waste analysis is performed with a coaxial HPGe detector and uses upgraded PC-based segmented gamma scanner software to assay containers up to 55 gal. in volume. Gamma-ray isotopic analysis of both waste forms is also performed with this detector. Finally, a small neutron counter using specialized software is attached to the measurement platform to satisfy safeguards concerns related to nuclear materials that are not sensed by the gamma-ray instruments. This report describes important features and capabilities of the system and presents a series of test measurements that are to be performed to define system parameters.

  7. Choosing your weapons : on sentiment analysis tools for software engineering research

    NARCIS (Netherlands)

    Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.

    2015-01-01

    Recent years have seen an increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by the software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these

  8. Comparison of gas chromatography/isotope ratio mass spectrometry and liquid chromatography/isotope ratio mass spectrometry for carbon stable-isotope analysis of carbohydrates.

    Science.gov (United States)

    Moerdijk-Poortvliet, Tanja C W; Schierbeek, Henk; Houtekamer, Marco; van Engeland, Tom; Derrien, Delphine; Stal, Lucas J; Boschker, Henricus T S

    2015-07-15

    We compared gas chromatography/isotope ratio mass spectrometry (GC/IRMS) and liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) for the measurement of δ(13)C values in carbohydrates. Contrary to GC/IRMS, no derivatisation is needed for LC/IRMS analysis of carbohydrates. Hence, although LC/IRMS is expected to be more accurate and precise, no direct comparison has been reported. GC/IRMS with the aldonitrile penta-acetate (ANPA) derivatisation method was compared with LC/IRMS without derivatisation. A large number of glucose standards and a variety of natural samples were analysed for five neutral carbohydrates at natural abundance as well as at (13)C-enriched levels. Gas chromatography/chemical ionisation mass spectrometry (GC/CIMS) was applied to check for incomplete derivatisation of the carbohydrate, which would impair the accuracy of the GC/IRMS method. The LC/IRMS technique provided excellent precision (±0.08‰ and ±3.1‰ at natural abundance and enrichment levels, respectively) for the glucose standards and this technique proved to be superior to GC/IRMS (±0.62‰ and ±19.8‰ at natural abundance and enrichment levels, respectively). For GC/IRMS measurements the derivatisation correction and the conversion of carbohydrates into CO2 had a considerable effect on the measured δ(13)C values. However, we did not find any significant differences in the accuracy of the two techniques over the full range of natural δ(13)C abundances and (13)C-labelled glucose. The difference in the performance of GC/IRMS and LC/IRMS diminished when the δ(13)C values were measured in natural samples, because the chromatographic performance and background correction became critical factors, particularly for LC/IRMS. The derivatisation of carbohydrates for the GC/IRMS method was complete. Although both LC/IRMS and GC/IRMS are reliable techniques for compound-specific stable carbon isotope analysis of carbohydrates (provided that derivatisation is complete and the
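
    All values compared in this study are conventional delta values, i.e. per-mil deviations of the sample's isotope ratio from that of an agreed standard (VPDB in the case of carbon):

```latex
% Delta notation for carbon isotope ratios, expressed in per mil versus VPDB
\[
  \delta^{13}\mathrm{C} \;=\;
  \left(
    \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}
         {(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1
  \right) \times 1000
\]
```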

  9. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00372086; The ATLAS collaboration

    2016-01-01

    The calibration of the ATLAS Pixel detector at the LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings, and to measure the tuning performance through a subset of scans. An analysis framework has been set up to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework that controls all aspects of the Pixel detector scans and analyses is called the Calibration Console. The introduction of a new layer, equipped with new Front-End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  10. Planning and Analysis of the Company’s Financial Performances by Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Meri BOSHKOSKA

    2017-06-01

    Information technology includes a wide range of software solutions that help managers in decision-making processes in order to increase the company's business performance. Software solutions for financial analysis are a valuable tool for managers in the financial decision-making process. The objective of the study was accomplished by developing software that easily determines the financial performance of the company through the integration of financial indicator analysis and the DuPont profitability analysis model. Through this software, managers are able to calculate the current financial state and visually analyze how their actions will affect the financial performance of the company. This enables them to identify the best ways to improve the company's financial performance. The software can perform a financial analysis and give a clear, useful overview of current business performance, and it can also help in planning the growth of the company. The software can also be used for educational purposes, for students and managers in the field of financial management.
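
    The DuPont model integrated in the software decomposes return on equity into three standard ratios; a minimal sketch with invented figures follows (names and numbers are illustrative, not from the paper).

```python
def dupont_roe(net_income, revenue, total_assets, equity):
    """DuPont decomposition: ROE = net profit margin * asset turnover *
    equity multiplier (financial leverage)."""
    margin = net_income / revenue
    turnover = revenue / total_assets
    leverage = total_assets / equity
    return margin, turnover, leverage, margin * turnover * leverage

m, t, l, roe = dupont_roe(net_income=120, revenue=1500,
                          total_assets=1000, equity=400)
print(f"margin={m:.1%} turnover={t:.2f} leverage={l:.2f} -> ROE={roe:.1%}")
```

    The product of the three factors reproduces net income divided by equity exactly, while each factor isolates one managerial lever: profitability, asset efficiency, and leverage.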

  11. Development of the software dead time methodology for the 4πβ-γ software coincidence system analysis program

    International Nuclear Information System (INIS)

    Toledo, Fabio de; Brancaccio, Franco; Dias, Mauro da Silva

    2009-01-01

    The Laboratorio de Metrologia Nuclear (LMN, Nuclear Metrology Laboratory) at IPEN-CNEN/SP, Sao Paulo, Brazil, developed a new Software Coincidence System (SCS) for 4πβ-γ radioisotope standardization. SCS is composed of the data acquisition hardware, for coincidence data recording, and the coincidence data analysis program that calculates the radioactive activity of the target sample. Due to the intrinsic signal sampling characteristics of the hardware, a single saturated pulse can produce multiple undesired data recordings; pulse pile-up also leads to bad recordings. As the beta counting rates are much greater than the gamma ones, owing to the high beta detection efficiency of the 4π geometry, multiple recordings significantly inflate the beta counts, resulting in a corresponding increase in the calculated activity value. In order to minimize the effect of such bad recordings, a software dead time value was introduced into the coincidence analysis program, under development at LMN, discarding multiple recordings due to pulse pile-up or saturation. This work presents the methodology developed to determine the optimal software dead time value for attaining better accuracy, and discusses the results, pointing to possibilities for software improvement. (author)
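
    The effect of such a software dead time is easy to sketch: after each accepted event, every recording closer than the dead-time window is discarded, so the re-triggers produced by one saturated or piled-up pulse collapse into a single count. This is a schematic non-extending (non-paralyzable) filter with hypothetical timestamps, not the LMN code.

```python
def apply_software_dead_time(timestamps, dead_time):
    """Non-extending dead time: accept an event, then reject everything
    closer than dead_time; saturation re-triggers are thereby dropped."""
    accepted, last = [], None
    for t in sorted(timestamps):
        if last is None or t - last >= dead_time:
            accepted.append(t)
            last = t
    return accepted

# One saturated pulse at 10.0 us recorded three times by the hardware:
events = [10.0, 10.3, 10.6, 42.0, 97.5]
print(apply_software_dead_time(events, dead_time=2.0))  # [10.0, 42.0, 97.5]
```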

  12. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  13. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of the STA aims is to promote the exchange of technical ideas, and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft, where orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  14. Bulk-sample gamma-ray activation analysis (PGNAA) with isotopic neutron sources

    International Nuclear Information System (INIS)

    HASSAN, A.M.

    2009-01-01

    An overview is given of research towards the Prompt Gamma-ray Neutron Activation Analysis (PGNAA) of bulk samples. Some aspects of bulk-sample PGNAA are discussed, where irradiation by isotopic neutron sources is used mostly for in-situ or on-line analysis. The research was carried out in a comparative and/or qualitative way, or by using prior knowledge about the sample material. Sometimes we need to use the assumption that the mass fractions of all determined elements add up to 1. Sensitivity curves are also used for some elements in such complex samples, to estimate the exact percentage concentration values. The uses of 252 Cf, 241 Am/Be and 239 Pu/Be isotopic neutron sources for the elemental investigation of hematite, ilmenite, coal, petroleum, edible oils, phosphates and polluted lake water samples are described.

  15. The isotopic contamination in electromagnetic isotope separators; La contagion isotopique dans les separateurs electromagnetiques d'isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Cassignol, Ch [Commissariat a l' Energie Atomique, Saclay (France).Centre d' Etudes Nucleaires

    1959-07-01

    In the early years of isotope separation, and in particular electromagnetic isotope separation, the need for rapid results led to empirical research. This paper describes fundamental research on electromagnetic isotope separation aimed at a better understanding of isotope separators as well as at improving their performance. The focus is on the study of the principles of isotope contamination and on the remedial actions on the separator that improve the isotope separation ratio. In a first part, the author reviews the functioning of an electromagnetic separator and generalities on isotope contamination. Secondly, he describes the two-stage separation method with two dispersive apparatuses, an electromagnetic separation stage followed by an electrostatic separation stage, separated by a diaphragm. The specifications of the electrostatic stage are given, and its different settings and their consequences for isotope separation are investigated. In a third part, the mechanisms and contamination factors in isotope separation are discussed: natural isotope contamination, contamination by rebounding on the collector, contamination because of low resolution, contamination by chromatism and diffusion effects, and breakdown of the condenser voltage. Analysis of experimental results shows diffusion to be the most important contamination factor in electromagnetic isotope separation. As the contamination factors depend on geometric parameters, the sector angle, the radius of curvature in the magnetic field and the clearance height are discussed in a fourth part. The better understanding of the mechanisms of the different contamination factors and the study of influential parameters such as pressure and geometric parameters lead to a global scheme of isotope contamination and to optimal separator design and experimental parameters. Finally, the global scheme of isotope contamination and the hypotheses on optimal specifications and experimental parameters were checked during a

  16. Feasibility study of plutonium isotopic analysis of resin beads by nondestructive gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Li, T.K.

    1985-01-01

    We have initiated a feasibility study on the use of nondestructive low-energy gamma-ray spectroscopy for plutonium isotopic analysis on resin beads. Seven resin bead samples were measured, with each sample containing an average of 9 μg of plutonium; the isotopic compositions of the samples varied over a wide range. The gamma-ray spectroscopy results, obtained from 4-h counting-time measurements, were compared with mass spectrometry results. The average ratios of gamma-ray spectroscopy to mass spectrometry were 1.014 ± 0.025 for 238Pu/239Pu, 0.996 ± 0.018 for 240Pu/239Pu, and 0.980 ± 0.038 for 241Pu/239Pu. The rapid, automated, and accurate nondestructive isotopic analysis of resin beads may be very useful to process technicians and International Atomic Energy Agency inspectors. 3 refs., 1 fig., 3 tabs

  17. Modularity analysis of automotive control software

    OpenAIRE

    Dajsuren, Y.; Brand, van den, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software. For automotive software, modularity is recognized as being a crucial quality a...

  18. On the interference of Kr during carbon isotope analysis of methane using continuous-flow combustion–isotope ratio mass spectrometry

    Directory of Open Access Journals (Sweden)

    J. Schmitt

    2013-05-01

    Full Text Available Stable carbon isotope analysis of methane (δ13C of CH4 on atmospheric samples is one key method to constrain the current and past atmospheric CH4 budget. A frequently applied measurement technique is gas chromatography (GC isotope ratio mass spectrometry (IRMS coupled to a combustion-preconcentration unit. This report shows that the atmospheric trace gas krypton (Kr can severely interfere during the mass spectrometric measurement, leading to significant biases in δ13C of CH4, if krypton is not sufficiently separated during the analysis. According to our experiments, the krypton interference is likely composed of two individual effects, with the lateral tailing of the doubly charged 86Kr peak affecting the neighbouring m/z 44 and partially the m/z 45 Faraday cups. Additionally, a broad signal affecting m/z 45 and especially m/z 46 is assumed to result from scattered ions of singly charged krypton. The introduced bias in the measured isotope ratios is dependent on the chromatographic separation, the krypton-to-CH4 mixing ratio in the sample, the focusing of the mass spectrometer as well as the detector configuration and can amount to up to several per mil in δ13C. Apart from technical solutions to avoid this interference, we present correction routines to a posteriori remove the bias.

  19. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
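
    As a flavor of the kind of routine such a suite provides, the sketch below computes a radial distribution function for one frame of particles in a cubic periodic box using plain NumPy; it is a generic illustration of the analysis, not Freud's actual API.

```python
import numpy as np

def radial_distribution(points, box_length, r_max, n_bins=100):
    """g(r) for one frame of N particles in a cubic periodic box."""
    n = len(points)
    deltas = points[:, None, :] - points[None, :, :]
    deltas -= box_length * np.round(deltas / box_length)  # minimum image
    dists = np.linalg.norm(deltas, axis=-1)[np.triu_indices(n, k=1)]
    counts, edges = np.histogram(dists, bins=n_bins, range=(0.0, r_max))
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    density = n / box_length ** 3
    # Normalize unique-pair counts by the ideal-gas expectation.
    return edges[:-1], counts / (shell_vol * density * n / 2.0)

pts = np.random.default_rng(1).uniform(0.0, 10.0, size=(500, 3))
r, g = radial_distribution(pts, box_length=10.0, r_max=5.0)  # g ~ 1 here
```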

  20. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, a software package has been developed to automate the post-counting tasks in comparative INAA; it aims to be more flexible than the available options, integrating itself with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma-separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and the Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
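
    Two of the four statistical tools mentioned, the weighted average and normalized residuals, are compact enough to sketch. The simplified fragment below (NumPy assumed, invented values) shows the quantities such a report would let the analyst double-check; the full normalized-residuals and Rajeval procedures add outlier-handling steps omitted here.

```python
import numpy as np

def weighted_mean(values, sigmas):
    w = 1.0 / np.asarray(sigmas) ** 2
    mean = np.sum(w * np.asarray(values)) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

def normalized_residuals(values, sigmas):
    """Residuals in units of sigma; |r| around 2 or more flags an outlier."""
    mean, _ = weighted_mean(values, sigmas)
    return (np.asarray(values) - mean) / np.asarray(sigmas)

conc = [10.2, 9.8, 10.5, 12.9]  # hypothetical concentrations, mg/kg
unc = [0.3, 0.4, 0.3, 0.5]
print(weighted_mean(conc, unc))
print(normalized_residuals(conc, unc))  # the 12.9 value stands out
```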

  1. Analytical developments in thermal ionization mass spectrometry for the isotopic analysis of very small amounts

    International Nuclear Information System (INIS)

    Mialle, S.

    2011-01-01

    In the framework of the French transmutation project for nuclear waste, experiments consisted of the irradiation of a few milligrams of isotopically enriched powders in a fast neutron reactor. Hence, the isotopic analysis of very small amounts of irradiation products is one of the main issues. The aim of this study was to achieve analytical developments in thermal ionization mass spectrometry in order to accurately analyze these samples. Several avenues were studied, including the new total evaporation method, deposition techniques, electron multiplier potentialities, and a comparison between different isotope measurement techniques. Results showed that it was possible to drastically decrease the amounts needed for analysis, especially for Eu and Nd, while maintaining an uncertainty level in agreement with the project requirements. (author) [fr

  2. Investigation of plutonium abundance and age analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huailong, Wu; Jian, Gong; Fanhua, Hao [China Academy of Engineering Physics, Mianyang (China). Inst. of Nuclear Physics and Chemistry

    2007-06-15

    Based on spectrum analysis software, all the peak counts of the plutonium material are analyzed. Relative efficiency calibration is done using the non-overlapping peaks of 239Pu. By using the known half-lives and yields of the isotopes, the counts of overlapping peaks are apportioned by means of the non-overlapping peaks, and consequently the atom ratios of the isotopes are obtained. A formula relating atom ratio to abundance or age is deduced from the decay characteristics of the plutonium isotopes, from which the abundance and age of the plutonium material are obtained. After repeat measurements of a piece of plutonium equipment were completed, a comparison between our analysis results and the PC-FRAM and the owner's reference results was made. (authors)
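
    The kind of age formula alluded to can be illustrated with the standard two-member decay chain 241Pu -> 241Am: assuming americium-free plutonium at the separation time t = 0, the in-grown atom ratio is a monotonic clock that can be inverted for the age. This is the textbook relation, not necessarily the authors' exact expression.

```latex
% 241Pu (decay constant lambda_Pu) feeding 241Am (lambda_Am),
% with americium-free plutonium at separation (t = 0):
\[
  \frac{N_{\mathrm{Am}}(t)}{N_{\mathrm{Pu}}(t)}
  \;=\; \frac{\lambda_{\mathrm{Pu}}}{\lambda_{\mathrm{Pu}} - \lambda_{\mathrm{Am}}}
  \left( e^{(\lambda_{\mathrm{Pu}} - \lambda_{\mathrm{Am}})\,t} - 1 \right)
\]
% Because the 241Pu half-life (about 14.4 a) is much shorter than that of
% 241Am (about 432.6 a), the ratio grows almost linearly at first, and the
% measured ratio fixes the age t.
```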

  3. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  4. Direct uranium isotope ratio analysis of single micrometer-sized glass particles.

    Science.gov (United States)

    Kappel, Stefanie; Boulyga, Sergei F; Prohaska, Thomas

    2012-11-01

    We present the application of nanosecond laser ablation (LA) coupled to a 'Nu Plasma HR' multi-collector inductively coupled plasma mass spectrometer (MC-ICP-MS) for the direct analysis of U isotope ratios in single, 10-20 μm-sized, U-doped glass particles. Method development included studies with respect to (1) external correction of the measured U isotope ratios in glass particles, (2) the applied laser ablation carrier gas (i.e. Ar versus He), and (3) the accurate determination of the less abundant (236)U/(238)U isotope ratios (i.e. 10(-5)). In addition, a data processing procedure was developed for the evaluation of transient signals, which is of potential use for routine application of the developed method. We demonstrate that the developed method is reliable and well suited for determining the U isotope ratios of individual particles. Analyses of twenty-eight S1 glass particles, measured under optimized conditions, yielded average biases of less than 0.6% from the certified values for the (234)U/(238)U and (235)U/(238)U ratios. Experimental results obtained for the (236)U/(238)U isotope ratios deviated by less than -2.5% from the certified values. Expanded relative total combined standard uncertainties U(c) (k = 2) of 2.6%, 1.4% and 5.8% were calculated for (234)U/(238)U, (235)U/(238)U and (236)U/(238)U, respectively. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Dual element ((15)N/(14)N, (13)C/(12)C) isotope analysis of glyphosate and AMPA by derivatization-gas chromatography isotope ratio mass spectrometry (GC/IRMS) combined with LC/IRMS.

    Science.gov (United States)

    Mogusu, Emmanuel O; Wolbert, J Benjamin; Kujawinski, Dorothea M; Jochmann, Maik A; Elsner, Martin

    2015-07-01

    To assess the sources and degradation of the herbicide glyphosate [N-(phosphonomethyl)glycine] and its metabolite AMPA (aminomethylphosphonic acid), concentration measurements are often inconclusive, and even (13)C/(12)C analysis alone may give limited information. To advance isotope ratio analysis of an additional element, we present compound-specific (15)N/(14)N analysis of glyphosate and AMPA by a two-step derivatization in combination with gas chromatography/isotope ratio mass spectrometry (GC/IRMS). The N-H group was derivatized with isopropyl chloroformate (iso-PCF), and the remaining acidic groups were subsequently methylated with trimethylsilyldiazomethane (TMSD). Iso-PCF treatment above pH 10 indicated decomposition of the derivative. At pH 10, and with an excess of iso-PCF by a factor of 10-24, the greatest yields and accurate (15)N/(14)N ratios were obtained (deviation from elemental analyzer-IRMS: -0.2 ± 0.9‰ for glyphosate; -0.4 ± 0.7‰ for AMPA). Limits for accurate δ(15)N analysis of glyphosate and AMPA were 150 and 250 ng injected, respectively. A combination of δ(15)N and δ(13)C analysis by liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) (1) enabled an improved distinction of commercial glyphosate products and (2) showed that glyphosate isotope values during degradation by MnO2 clearly fell outside the commercial product range. This highlights the potential of combined carbon and nitrogen isotope analysis to trace the sources and degradation of glyphosate.

  6. Spatially resolved δ13C analysis using laser ablation isotope ratio mass spectrometry

    Science.gov (United States)

    Moran, J.; Riha, K. M.; Nims, M. K.; Linley, T. J.; Hess, N. J.; Nico, P. S.

    2014-12-01

    Inherent geochemical, organic matter, and microbial heterogeneity over small spatial scales can complicate studies of carbon dynamics through soils. Stable isotope analysis has a strong history of helping track substrate turnover, delineate rhizosphere activity zones, and identify transitions in vegetation cover, but most traditional isotope approaches are limited in spatial resolution by a combination of physical separation techniques (manual dissection) and IRMS instrument sensitivity. We coupled laser ablation sampling with isotope measurement via IRMS to enable spatially resolved analysis over solid surfaces. Once a targeted sample region is ablated, the resulting particulates are entrained in a helium carrier gas and passed through a combustion reactor where carbon is converted to CO2. Cryotrapping of the resulting CO2 enables a reduction in carrier gas flow, which improves overall measurement sensitivity versus traditional high-flow sample introduction. Currently we perform sample analysis at 50 μm resolution, require 65 ng C per analysis, and achieve measurement precision consistent with other continuous-flow techniques. We will discuss applications of the laser ablation IRMS (LA-IRMS) system to microbial communities and fish ecology studies to demonstrate the merits of this technique and how similar analytical approaches can be transitioned to soil systems. Preliminary efforts at analyzing soil samples will be used to highlight the strengths and limitations of the LA-IRMS approach, paying particular attention to sample preparation requirements, spatial resolution, sample analysis time, and the types of questions most conducive to analysis via LA-IRMS.

  7. Isotopes in heterogeneous catalysis

    CERN Document Server

    Hargreaves, Justin SJ

    2006-01-01

    The purpose of this book is to review the current, state-of-the-art application of isotopic methods to the field of heterogeneous catalysis. Isotopic studies are arguably the ultimate technique among in situ methods for heterogeneous catalysis. In this review volume, chapters have been contributed by experts in the field, and the coverage includes both the application of specific isotopes - Deuterium, Tritium, Carbon-14, Sulfur-35 and Oxygen-18 - and isotopic techniques - determination of surface mobility, steady-state transient isotope kinetic analysis, and positron emission profiling.

  8. Isotope analysis reveals foraging area dichotomy for atlantic leatherback turtles.

    Directory of Open Access Journals (Sweden)

    Stéphane Caut

    BACKGROUND: The leatherback turtle (Dermochelys coriacea) has undergone a dramatic decline over the last 25 years, and this is believed to be primarily the result of mortality associated with fisheries bycatch, followed by egg and nesting female harvest. Atlantic leatherback turtles undertake long migrations across ocean basins from subtropical and tropical nesting beaches to productive frontal areas. Migration between two nesting seasons can last 2 or 3 years, a time period termed the remigration interval (RI). Recent satellite transmitter data revealed that Atlantic leatherbacks follow two major dispersion patterns after the nesting season, through the North Gulf Stream area or more eastward across the North Equatorial Current. However, information on the whole RI is lacking, precluding the accurate identification of feeding areas where conservation measures may need to be applied. METHODOLOGY/PRINCIPAL FINDINGS: Using stable isotopes as dietary tracers, we determined the characteristics of the feeding grounds of leatherback females nesting in French Guiana. During migration, 3-year RI females differed from 2-year RI females in their isotope values, implying differences in their choice of feeding habitats (offshore vs. more coastal) and foraging latitude (North Atlantic vs. West African coasts, respectively). Egg-yolk and blood isotope values are correlated in nesting females, indicating that egg analysis is a useful tool for assessing isotope values in these turtles, including adults when not available. CONCLUSIONS/SIGNIFICANCE: Our results complement previous data on turtle movements during the first year following the nesting season, integrating the diet consumed during the year before nesting. We suggest that the French Guiana leatherback population segregates into two distinct isotopic groupings, and highlight the urgent need to determine the feeding habitats of the turtle in the Atlantic in order to protect this species from incidental take by

  9. Manual on mathematical models in isotope hydrogeology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-10-01

    Methodologies based on the use of naturally occurring isotopes are, at present, an integral part of studies being undertaken for water resources assessment and management. Quantitative evaluations based on the temporal and/or spatial distribution of different isotopic species in hydrological systems require conceptual mathematical formulations. Different types of model can be employed depending on the nature of the hydrological system under investigation, the amount and type of data available, and the required accuracy of the parameter to be estimated. This manual provides an overview of the basic concepts of existing modelling approaches, procedures for their application to different hydrological systems, their limitations and data requirements. Guidance in their practical applications, illustrative case studies and information on existing PC software are also included. While the subject matter of isotope transport modelling and improved quantitative evaluations through natural isotopes in water sciences is still at the development stage, this manual summarizes the methodologies available at present, to assist the practitioner in the proper use within the framework of ongoing isotope hydrological field studies. In view of the widespread use of isotope methods in groundwater hydrology, the methodologies covered in the manual are directed towards hydrogeological applications, although most of the conceptual formulations presented would generally be valid. Refs, figs, tabs.
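
    An illustrative sketch of the lumped-parameter approach such manuals survey, assuming the widely used exponential model: the tracer output is the input convolved with an exponential transit-time distribution of mean residence time tau. All numbers are invented.

      import numpy as np

      # Exponential lumped-parameter model: output = input convolved with
      # g(t) = exp(-t/tau)/tau, the assumed transit-time distribution.
      def exponential_model(c_in, dt, tau):
          t = np.arange(len(c_in)) * dt
          g = np.exp(-t / tau) / tau
          return np.convolve(c_in, g)[:len(c_in)] * dt

      dt, tau = 1.0, 5.0                       # time step and mean residence time (years)
      c_in = np.zeros(50); c_in[0] = 1.0 / dt  # unit tracer pulse as input
      print(exponential_model(c_in, dt, tau)[:5])  # damped, lagged response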

  10. Manual on mathematical models in isotope hydrogeology

    International Nuclear Information System (INIS)

    1996-10-01

    Methodologies based on the use of naturally occurring isotopes are, at present, an integral part of studies being undertaken for water resources assessment and management. Quantitative evaluations based on the temporal and/or spatial distribution of different isotopic species in hydrological systems require conceptual mathematical formulations. Different types of model can be employed depending on the nature of the hydrological system under investigation, the amount and type of data available, and the required accuracy of the parameter to be estimated. This manual provides an overview of the basic concepts of existing modelling approaches, procedures for their application to different hydrological systems, their limitations and data requirements. Guidance in their practical applications, illustrative case studies and information on existing PC software are also included. While the subject matter of isotope transport modelling and improved quantitative evaluations through natural isotopes in water sciences is still at the development stage, this manual summarizes the methodologies available at present, to assist the practitioner in the proper use within the framework of ongoing isotope hydrological field studies. In view of the widespread use of isotope methods in groundwater hydrology, the methodologies covered in the manual are directed towards hydrogeological applications, although most of the conceptual formulations presented would generally be valid. Refs, figs, tabs

  11. Pressurizer pump reliability analysis high flux isotope reactor

    International Nuclear Information System (INIS)

    Merryman, L.; Christie, B.

    1993-01-01

    During a prolonged outage from November 1986 to May 1990, numerous changes were made at the High Flux Isotope Reactor (HFIR). Some of these changes involved the pressurizer pumps. An analysis was performed to calculate the impact of these changes on the pressurizer system availability. The analysis showed that the availability of the pressurizer system dropped from essentially 100% to approximately 96%. The primary reason for the decrease in availability is that off-site power grid disturbances sometimes result in a reactor trip with the present pressurizer pump configuration. Changes are being made to the present pressurizer pump configuration to regain some of the lost availability

  12. Experimental analysis of specification language diversity impact on NPP software diversity

    International Nuclear Information System (INIS)

    Yoo, Chang Sik

    1999-02-01

    In order to increase computer system reliability, software fault tolerance methods have been adopted in some safety critical systems, including NPPs. Prevention of software common-mode failure is a crucial problem in software fault tolerance, but an effective method for this problem has not yet been found. In our research, to find an effective method for the prevention of software common-mode failure, the impact of specification language diversity on NPP software diversity was examined experimentally. Three specification languages were used to compose three requirements specifications, and programmers produced twelve product codes from the specifications. From analysis of the product codes using fault diversity criteria, we concluded that the diverse specification language method would enhance program diversity through diversification of requirements specification imperfections

  13. Synchronized analysis of testbeam data with the Judith software

    CERN Document Server

    McGoldrick, Garrin; Gorišek, Andrej

    2014-01-01

    The Judith software performs pixel detector analysis tasks utilizing two different data streams such as those produced by the reference and tested devices typically found in a testbeam. This software addresses and fixes problems arising from the desynchronization of the two simultaneously triggered data streams by detecting missed triggers in either of the streams. The software can perform all tasks required to generate particle tracks using multiple detector planes: it can align the planes, cluster hits and generate tracks from these clusters. This information can then be used to measure the properties of a particle detector with very fine spatial resolution. It was tested at DESY in the Kartel telescope, a silicon tracking detector, with ATLAS Diamond Beam Monitor modules as a device under test.
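
    A sketch of the kind of missed-trigger detection described above, with invented names rather than Judith's actual API: two trigger-timestamp streams are walked in parallel, and a trigger absent from one stream within a tolerance is flagged.

      # Missed-trigger detection between two simultaneously triggered streams
      # (reference telescope vs. device under test); names and tolerance are
      # assumptions. Timestamps are matched within tol seconds.
      def find_missed_triggers(ts_ref, ts_dut, tol=1e-6):
          i = j = 0
          missed_in_dut, missed_in_ref = [], []
          while i < len(ts_ref) and j < len(ts_dut):
              dt = ts_ref[i] - ts_dut[j]
              if abs(dt) <= tol:            # trigger present in both streams
                  i += 1; j += 1
              elif dt < 0:                  # DUT stream skipped this trigger
                  missed_in_dut.append(ts_ref[i]); i += 1
              else:                         # reference stream skipped this one
                  missed_in_ref.append(ts_dut[j]); j += 1
          missed_in_dut += ts_ref[i:]; missed_in_ref += ts_dut[j:]
          return missed_in_ref, missed_in_dut

      print(find_missed_triggers([0.0, 1.0, 2.0, 3.0], [0.0, 2.0, 3.0]))
      # -> ([], [1.0]): the device under test missed the trigger at t = 1.0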

  14. Linking cases of illegal shootings of the endangered California condor using stable lead isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, Myra E., E-mail: myraf@ucsc.edu [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Kuspa, Zeka E. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Welch, Alacia [National Park Service, Pinnacles National Park, 5000 Highway 146, Paicines, CA 95043 (United States); Eng, Curtis; Clark, Michael [Los Angeles Zoo and Botanical Gardens, 5333 Zoo Drive, Los Angeles, CA 90027 (United States); Burnett, Joseph [Ventana Wildlife Society, 19045 Portola Dr. Ste. F-1, Salinas, CA 93908 (United States); Smith, Donald R. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States)

    2014-10-15

    Lead poisoning is preventing the recovery of the critically endangered California condor (Gymnogyps californianus) and lead isotope analyses have demonstrated that ingestion of spent lead ammunition is the principal source of lead poisoning in condors. Over an 8 month period in 2009, three lead-poisoned condors were independently presented with birdshot embedded in their tissues, evidencing that they had been shot. No information connecting these illegal shooting events existed and the timing of the shooting(s) was unknown. Using lead concentration and stable lead isotope analyses of feathers, blood, and recovered birdshot, we observed that: i) lead isotope ratios of embedded shot from all three birds were measurably indistinguishable from each other, suggesting a common source; ii) lead exposure histories re-constructed from feather analysis suggested that the shooting(s) occurred within the same timeframe; and iii) two of the three condors were lead poisoned from a lead source isotopically indistinguishable from the embedded birdshot, implicating ingestion of this type of birdshot as the source of poisoning. One of the condors was subsequently lead poisoned the following year from ingestion of a lead buckshot (blood lead 556 µg/dL), illustrating that ingested shot poses a substantially greater lead poisoning risk than embedded shot retained in tissue (blood lead ∼20 µg/dL). To our knowledge, this is the first study to use lead isotopes as a tool to retrospectively link wildlife shooting events. - Highlights: • We conducted a case-based analysis of illegal shootings of California condors. • Blood and feather Pb isotopes were used to reconstruct the illegal shooting events. • Embedded birdshot from the three condors had the same Pb isotope ratios. • Feather and blood Pb isotopes indicated that the condors were shot in a common event. • Ingested shot causes substantially greater lead exposure compared to embedded shot.
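
    A sketch of the "indistinguishable within uncertainty" comparison used to link the shootings: two ratio measurements can be treated as consistent when their difference lies within the combined 2-sigma uncertainty. The ratios below are invented placeholders, not the study's data.

      # Consistency test for two isotope-ratio measurements (value, 1-sigma).
      def indistinguishable(r1, s1, r2, s2, k=2.0):
          return abs(r1 - r2) <= k * (s1**2 + s2**2) ** 0.5

      shot_a = (0.8311, 0.0004)   # hypothetical 207Pb/206Pb of shot from bird A
      shot_b = (0.8308, 0.0005)   # hypothetical 207Pb/206Pb of shot from bird B
      print(indistinguishable(*shot_a, *shot_b))   # True -> consistent with a common source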

  15. Tracking transformation processes of organic micropollutants in aquatic environments using multi-element isotope fractionation analysis

    International Nuclear Information System (INIS)

    Hofstetter, Thomas B.; Bolotin, Jakov; Skarpeli-Liati, Marita; Wijker, Reto; Kurt, Zohre; Nishino, Shirley F.; Spain, Jim C.

    2011-01-01

    The quantitative description of enzymatic or abiotic transformations of man-made organic micropollutants in rivers, lakes, and groundwaters is one of the major challenges associated with the risk assessment of water resource contamination. Compound-specific isotope analysis enables one to identify (bio)degradation pathways based on changes in the contaminants' stable isotope ratios even if multiple reactive and non-reactive processes cause concentrations to decrease. Here, we investigated how the magnitude and variability of isotope fractionation in some priority pollutants is determined by the kinetics and mechanisms of important enzymatic and abiotic redox reactions. For nitroaromatic compounds and substituted anilines, we illustrate that competing transformation pathways can be assessed via trends of N and C isotope signatures.
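
    An illustrative sketch of the Rayleigh model that underlies this kind of compound-specific isotope analysis: as the remaining fraction f of a contaminant falls, its delta value shifts according to the pathway-specific enrichment factor epsilon. The epsilon and delta values are assumed, purely for illustration.

      import numpy as np

      # Rayleigh fractionation: delta = (delta0 + 1000) * f**(eps/1000) - 1000,
      # with delta in per mil and eps the enrichment factor (per mil).
      def rayleigh_delta(delta0, f, eps):
          return (delta0 + 1000.0) * f ** (eps / 1000.0) - 1000.0

      f = np.array([1.0, 0.5, 0.1])                # remaining fraction of pollutant
      print(rayleigh_delta(-30.0, f, eps=-15.0))   # delta rises as degradation proceeds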

  16. Requirement analysis of the safety-critical software implementation for the nuclear power plant

    International Nuclear Information System (INIS)

    Chang, Hoon Seon; Jung, Jae Cheon; Kim, Jae Hack; Nam, Sang Ku; Kim, Hang Bae

    2005-01-01

    The safety critical software shall be implemented under strict regulation and standards along with hardware qualification. In general, safety critical software has been implemented using functional block language (FBL) and structured languages like C in real projects. Software design shall comply with such characteristics as modularity, simplicity, minimal use of sub-routines, and exclusion of interrupt logic. To meet these prerequisites, we used a computer-aided software engineering (CASE) tool to substantiate the requirements traceability matrix that had previously been developed manually using word processors or spreadsheets. A coding standard and manual have been developed to ensure the quality of the software development process, covering readability, consistency, and maintainability in compliance with NUREG/CR-6463. A system level preliminary hazard analysis (PHA) is performed by analyzing the preliminary safety analysis report (PSAR) and the FMEA document. The modularity concept is effectively implemented for the overall module configurations and functions using the RTP software development tool. The response time imposed on the basis of the deterministic structure of the safety-critical software was measured

  17. 2H Stable Isotope Analysis of Tooth Enamel: A Pilot Study

    Science.gov (United States)

    Holobinko, Anastasia; Kemp, Helen; Meier-Augenstein, Wolfram; Prowse, Tracy; Ford, Susan

    2010-05-01

    Stable isotope analysis of biogenic tissues such as tooth enamel and bone mineral has become a well recognized and increasingly important method for determining the provenance of human remains, and has been used successfully in bioarchaeological studies as well as forensic investigations (Lee-Thorp, 2008; Meier-Augenstein and Fraser, 2008). In particular, 18O and 2H stable isotopes are well established proxies as environmental indicators of climate (temperature) and source water and are therefore considered indicators of the geographic life trajectories of animals and humans (Hobson et al., 2004; Schwarcz and Walker, 2006). While methodology for 2H analysis of human hair, fingernails, and bone collagen is currently used to determine geographic origin and identify possible migration patterns, studies involving the analysis of 2H in tooth enamel appear to be nonexistent in the scientific literature. The apparent lack of research in this area is believed to have two main reasons. (1) Compared to the mineral calcium hydroxylapatite, Ca10(PO4)6(OH)2, in the bio-apatite that forms tooth enamel carbonate ions replace some of the hydroxyl ions, at a rate of one CO3(2-) ion replacing two OH(-) ions, yet published figures for the degree of substitution vary (Wopenka and Pasteris, 2005). (2) Most probably as a consequence, no published protocols exist for sample preparation and analytical methods to obtain δ2H-values from the hydroxyl fraction of tooth enamel. This dilemma has been addressed through a pilot study to establish the feasibility of 2H stable isotope analysis of ground tooth enamel by continuous-flow isotope ratio mass spectrometry (IRMS) coupled on-line to a high-temperature conversion elemental analyzer (TC/EA). An array of archaeological and modern teeth has been analyzed under different experimental conditions, and results from this pilot study are presented. References: Lee-Thorp, J.A. (2008) Archaeometry, 50, 925-950; Meier-Augenstein, W. and Fraser, I. (2008) Science & Justice

  18. Growth versus metabolic tissue replacement in mouse tissues determined by stable carbon and nitrogen isotope analysis

    Science.gov (United States)

    Macavoy, S. E.; Jamil, T.; Macko, S. A.; Arneson, L. S.

    2003-12-01

    Stable isotope analysis is becoming an extensively used tool in animal ecology. The isotopes most commonly used for analysis in terrestrial systems are those of carbon and nitrogen, due to differential carbon fractionation in C3 and C4 plants, and the approximately 3‰ enrichment in 15N per trophic level. Although isotope signatures in animal tissues presumably reflect the local food web, analysis is often complicated by differential nutrient routing and fractionation by tissues, and by the possibility that large organisms are not in isotopic equilibrium with the foods available in their immediate environment. Additionally, the rate at which organisms incorporate the isotope signature of a food through both growth and metabolic tissue replacement is largely unknown. In this study we have assessed the rate of carbon and nitrogen isotopic turnover in liver, muscle and blood in mice following a diet change. By determining growth rates, we were able to determine the proportion of tissue turnover caused by growth versus that caused by metabolic tissue replacement. Growth was found to account for approximately 10% of observed tissue turnover in sexually mature mice (Mus musculus). Blood carbon was found to have the shortest half-life (16.9 days), followed by muscle (24.7 days). Liver carbon turnover was not as well described by the exponential decay equations as other tissues. However, substantial liver carbon turnover was observed by the 28th day after diet switch. Surprisingly, these tissues primarily reflect the carbon signature of the protein, rather than carbohydrate, source in their diet. The nitrogen signature in all tissues was enriched by 3 - 5‰ over their dietary protein source, depending on tissue type, and the isotopic turnover rates were comparable to those observed in carbon.
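
    A sketch of the exponential turnover model referred to above, assuming first-order kinetics: after a diet switch the tissue delta value relaxes toward the new equilibrium with half-life ln(2)/k. The blood-carbon half-life of 16.9 days is taken from the record; the delta values are invented.

      import numpy as np

      # First-order isotopic turnover after a diet switch.
      def tissue_delta(t, d_init, d_final, half_life):
          k = np.log(2.0) / half_life
          return d_final + (d_init - d_final) * np.exp(-k * t)

      t = np.array([0.0, 16.9, 33.8])   # days; blood-carbon half-life from the study
      print(tissue_delta(t, d_init=-21.0, d_final=-15.0, half_life=16.9))
      # -> [-21.0, -18.0, -16.5]: halfway to equilibrium every 16.9 days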

  19. Contribution of bulk mass spectrometry isotopic analysis to characterization of materials in the framework of CMX-4

    International Nuclear Information System (INIS)

    Kuchkin, A.; Stebelkov, V.; Zhizhin, K.; Lierse von Gostomski, Ch.; Kardinal, Ch.; Loi, E.; Keegan, E.; Kristo, M.J.

    2018-01-01

    Seven laboratories used the results of bulk uranium isotopic analysis by either inductively coupled plasma mass spectrometry (ICP-MS) or thermal ionization mass spectrometry (TIMS) for characterization of the samples in the Nuclear Forensic International Technical Working Group fourth international collaborative material exercise, CMX-4. The measured isotopic compositions of uranium in three exercise samples were compared to identify differences or similarities between the samples. The role of isotopic analyses in the context of a real nuclear forensic investigation is discussed. Several limitations in carrying out ICP-MS or TIMS analysis in CMX-4 are noted. (author)

  20. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element of nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. The data can be numeric and/or character, stored in raw data files, databases, or streams of bytes, or compressed into bits in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems

  1. An isotopic analysis process with optical emission spectrometry on a laser-produced plasma

    International Nuclear Information System (INIS)

    Mauchien, P.; Pietsch, W.; Petit, A.; Briand, A.

    1994-01-01

    The sample that is to be analyzed is irradiated with a laser beam to produce a plasma at the sample surface; the spectrum of the light emitted by the plasma is analyzed and the isotope composition of the sample is derived from the spectrometry. The process is preferentially applied to uranium and plutonium; it is rapid, simpler and cheaper than previous methods, and may be applied to 'in-situ' isotopic analysis in nuclear industry. 2 figs

  2. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  3. Trace, isotopic analysis of micron-sized grains -- Mo, Zr analysis of stardust (SiC and graphite grains).

    Energy Technology Data Exchange (ETDEWEB)

    Pellin, M. J.; Nicolussi, G. K.

    1998-02-19

    Secondary Neutral Mass Spectrometry using resonant laser ionization can provide both high useful yields and high discrimination while maintaining high lateral and depth resolution. An example of the power of the method is the first measurement of the isotopic composition of Mo and Zr in 1-5 µm presolar SiC and graphite grains isolated from the Murchison CM2 meteorite. These grains survived the formation of the Solar System, and isotopic analysis reveals a record of the stellar nucleosynthesis present during their formation. Mo and Zr, though present at less than 10 ppm in some grains, are particularly useful in that among their isotopes are members that can only be formed by distinct nucleosynthetic processes known as the s-, p-, and r-processes. Successful isotopic analysis of these elements requires both high selectivity (since these are trace elements) and high efficiency (since the total number of atoms available is limited). Resonant Ionization Spectroscopy is particularly useful and flexible in this application. While the sensitivity of this technique has often been reported in the past, we focus here on the very low noise properties of the technique. We further demonstrate the efficacy of noise removal by two complementary methods. First, we use the resonant nature of the signal to subtract background signal. Second, we demonstrate that by choosing the appropriate resonance scheme, background can often be dramatically reduced.

  4. Quality control in dual head γ-cameras: comparison between methods and software used for image analysis

    International Nuclear Information System (INIS)

    Nayl E, A.; Fornasier, M. R.; De Denaro, M.; Sulieman, A.; Alkhorayef, M.; Bradley, D.

    2017-10-01

    Patient radiation dose and image quality are the main issues in nuclear medicine (Nm) procedures. Currently, many protocols are used for image acquisition and analysis of quality control (Qc) tests. The National Electrical Manufacturers Association (Nema) methods and protocols are a widely accepted means of providing an accurate description, measurement and reporting of γ-camera performance parameters. However, no standard software is available for image analysis. The aim of this study was to compare the vendor Qc software analysis with three software packages from different developers, downloaded free from the internet: NMQC, Nm Tool kit and ImageJ-Nm Tool kit. The three packages were used for image analysis of some Qc tests for γ-cameras based on Nema protocols, including non-uniformity evaluation. Ten non-uniformity Qc images were taken from a dual head γ-camera (Siemens Symbia) installed in Trieste general hospital (Italy), and analyzed. Excel analysis was used as the baseline calculation of the non-uniformity test according to Nema procedures. The results of the non-uniformity analysis showed good agreement between the three independent packages and the Excel calculation (the average differences were 0.3%, 2.9%, 1.3% and 1.6% for UFOV integral, UFOV differential, CFOV integral and CFOV differential respectively), while significant differences were detected between the vendor Qc software analysis and the Excel analysis (the average differences were 14.6%, 20.7%, 25.7% and 31.9% for UFOV integral, UFOV differential, CFOV integral and CFOV differential respectively). The NMQC software agreed best with the Excel calculations. The variation in the results is due to the different pixel sizes used for analysis in the three packages and the γ-camera Qc software. Therefore, it is important to perform the tests with the vendor Qc software as well as with independent analysis to understand the differences between the values. Moreover, the medical physicist should know
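
    A simplified sketch of the NEMA-style uniformity figures the compared packages compute: integral uniformity = 100*(max-min)/(max+min) over the field of view, and differential uniformity as the worst such figure over sliding 5-pixel windows along rows and columns. The NEMA procedure's smoothing and UFOV/CFOV masking are omitted here, and the test image is synthetic.

      import numpy as np

      def integral_uniformity(pixels):
          return 100.0 * (pixels.max() - pixels.min()) / (pixels.max() + pixels.min())

      def differential_uniformity(img, w=5):
          worst = 0.0
          for a in (img, img.T):                 # rows, then columns
              for row in a:
                  for i in range(len(row) - w + 1):
                      worst = max(worst, integral_uniformity(row[i:i + w]))
          return worst

      img = np.random.default_rng(0).normal(1000.0, 20.0, (64, 64))  # synthetic flood image
      print(integral_uniformity(img), differential_uniformity(img))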

  5. Quality control in dual head γ-cameras: comparison between methods and software used for image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nayl E, A. [Sudan Atomic Energy Commission, Radiation Safety Institute, Khartoum (Sudan); Fornasier, M. R.; De Denaro, M. [Azienda Sanitaria Universitaria Integrata di Trieste, Medical Physics Department, Via Giovanni Sai 7, 34128 Trieste (Italy); Sulieman, A. [Prince Sattam bin Abdulaziz University, College of Applied Medical Sciences, Radiology and Medical Imaging Department, P. O. Box 422, 11942 Al-Kharj (Saudi Arabia); Alkhorayef, M.; Bradley, D., E-mail: abdwsh10@hotmail.com [University of Surrey, Department of Physics, GU2-7XH Guildford, Surrey (United Kingdom)

    2017-10-15

    Patient radiation dose and image quality are the main issues in nuclear medicine (Nm) procedures. Currently, many protocols are used for image acquisition and analysis of quality control (Qc) tests. The National Electrical Manufacturers Association (Nema) methods and protocols are a widely accepted means of providing an accurate description, measurement and reporting of γ-camera performance parameters. However, no standard software is available for image analysis. The aim of this study was to compare the vendor Qc software analysis with three software packages from different developers, downloaded free from the internet: NMQC, Nm Tool kit and ImageJ-Nm Tool kit. The three packages were used for image analysis of some Qc tests for γ-cameras based on Nema protocols, including non-uniformity evaluation. Ten non-uniformity Qc images were taken from a dual head γ-camera (Siemens Symbia) installed in Trieste general hospital (Italy), and analyzed. Excel analysis was used as the baseline calculation of the non-uniformity test according to Nema procedures. The results of the non-uniformity analysis showed good agreement between the three independent packages and the Excel calculation (the average differences were 0.3%, 2.9%, 1.3% and 1.6% for UFOV integral, UFOV differential, CFOV integral and CFOV differential respectively), while significant differences were detected between the vendor Qc software analysis and the Excel analysis (the average differences were 14.6%, 20.7%, 25.7% and 31.9% for UFOV integral, UFOV differential, CFOV integral and CFOV differential respectively). The NMQC software agreed best with the Excel calculations. The variation in the results is due to the different pixel sizes used for analysis in the three packages and the γ-camera Qc software. Therefore, it is important to perform the tests with the vendor Qc software as well as with independent analysis to understand the differences between the values. Moreover, the medical physicist should know

  6. Growth patterns of an intertidal gastropod as revealed by oxygen isotope analysis

    Science.gov (United States)

    Bean, J. R.; Hill, T. M.; Guerra, C.

    2007-12-01

    The size and morphology of mollusk shells are affected by environmental conditions. As a result, it is difficult to assess growth rate, population age structure, and shell morphologies associated with ontogenetic stages, and to compare life history patterns across various environments. Oxygen isotope analysis is a useful tool for estimating minimum ages and growth rates of calcium carbonate secreting organisms. Calcite shell material from members of two northern California populations of the intertidal muricid gastropod Acanthinucella spirata was sampled for isotopic analysis. Individual shells were sampled from apex to margin, thus providing a sequential record of juvenile and adult growth. A. spirata were collected from a sheltered habitat in Tomales Bay and from an exposed reef in Bolinas. Abiotic factors, such as temperature, wave exposure, and substrate consistency, and biotic composition differ significantly between these sites, possibly resulting in local adaptations and variation in life history and growth patterns. Shell morphology of A. spirata changes with age as internal shell margin thickenings of denticle rows associated with external growth bands are irregularly accreted. It is not known when, either seasonally and/or ontogenetically, these thickenings and bands form, or whether inter- or intra-populational variation exists. Preliminary results demonstrate the seasonal oxygen isotopic variability present at the two coastal sites, indicating 5-6 °C changes from winter to summer temperatures; these data are consistent with local intertidal temperature records. Analysis of the seasonal patterns indicates: 1) differences in growth rate and seasonal growth patterns at different ontogenetic stages within populations, and 2) differences in growth patterns and possibly age structure between the two A. spirata populations. These findings indicate that isotopic analyses, in addition to field observations and morphological measurements, are necessary to
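
    As an aside, a sketch of how such shell δ18O values map to temperature, using a calcite paleotemperature equation of the classic Epstein form with commonly quoted coefficients; the calcite and water delta values below are invented, chosen to give a 5-6 °C seasonal spread like that reported.

      # T(C) = 16.0 - 4.14*(dc - dw) + 0.13*(dc - dw)**2, a commonly quoted
      # calcite paleotemperature relation; inputs are in per mil.
      def calcite_temp(d18o_calcite, d18o_water):
          d = d18o_calcite - d18o_water
          return 16.0 - 4.14 * d + 0.13 * d * d

      winter = calcite_temp(1.2, -0.3)    # heavier calcite -> colder water
      summer = calcite_temp(-0.2, -0.3)
      print(round(winter, 1), round(summer, 1), "C")   # ~5.5 C seasonal spread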

  7. The isotopic contamination in electromagnetic isotope separators

    Energy Technology Data Exchange (ETDEWEB)

    Cassignol, Ch. [Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]

    1959-07-01

    In the early years of isotope separation, and in particular electromagnetic isotope separation, the need for rapid results led to empirical research. This paper describes fundamental research on electromagnetic isotope separation aimed at a better understanding of isotope separators as well as at improving their performance. The focus is on the principle of isotope contamination and on remedial actions on the separator to improve the isotope separation ratio. The first part reviews the operation of an electromagnetic separator and generalities on isotope contamination. The second part describes the two-stage separation method with two dispersive apparatus, an electromagnetic separation stage followed by an electrostatic separation stage, separated by a diaphragm. The specifications of the electrostatic stage are given, and its different settings and their consequences on isotope separation are investigated. In the third part, mechanisms and contamination factors in isotope separation are discussed: natural isotope contamination, contamination by rebounding on the collector, contamination because of low resolution, contamination by chromatism and diffusion effects, and breakdown of condenser voltage. Analysis of experimental results shows diffusion to be the most important contamination factor in electromagnetic isotope separation. As the contamination factors depend on geometric parameters, the sector angle, the radius of curvature in the magnetic field and the clearance height are discussed in the fourth part. The better understanding of the mechanisms of the different contamination factors and the study of influential parameters such as pressure and geometric parameters lead to a global scheme of isotope contamination and to optimal separator design and experimental parameters. Finally, the global scheme of isotope contamination and the hypotheses on optimal specifications and experimental parameters have been checked during a

  8. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification and perhaps high-level design are non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  9. Isotopic abundance analysis of carbon, nitrogen and sulfur with a combined elemental analyzer-mass spectrometer system

    International Nuclear Information System (INIS)

    Pichlmayer, F.; Blochberger, K.

    1988-01-01

    Stable isotope ratio measurements of carbon, nitrogen and sulfur are of growing interest as an analytical tool in many fields of research, but applications were somewhat hindered in the past by the fact that cumbersome sample preparation was necessary. A method has therefore been developed, consisting essentially of coupling an elemental analyzer with an isotope mass spectrometer, enabling fast and reliable conversion of C-, N- and S-compounds in any solid or liquid sample into the measuring gases carbon dioxide, nitrogen and sulfur dioxide for on-line isotopic analysis. The experimental set-up and the main characteristics are described briefly, and examples of application in environmental research, food analysis and clinical diagnosis are given. (orig.)

  10. Tracing fetal and childhood exposure to lead using isotope analysis of deciduous teeth

    International Nuclear Information System (INIS)

    Shepherd, Thomas J.; Dirks, Wendy; Roberts, Nick M.W.; Patel, Jaiminkumar G.; Hodgson, Susan; Pless-Mulloli, Tanja; Walton, Pamela; Parrish, Randall R.

    2016-01-01

    We report progress in using the isotopic composition and concentration of Pb in the dentine and enamel of deciduous teeth to provide a high resolution time frame of exposure to Pb during fetal development and early childhood. Isotope measurements (total Pb and 208Pb/206Pb, 207Pb/206Pb ratios) were acquired by laser ablation inductively coupled plasma mass spectrometry at contiguous 100 micron intervals across thin sections of the teeth, from the outer enamel surface to the pulp cavity. Teeth samples (n=10) were selected from two cohorts of children, aged 5–8 years, living in NE England. By integrating the isotope data with histological analysis of the teeth, using the daily incremental lines in dentine, we were able to assign true estimated ages to each ablation point (first 2–3 years for molars, first 1–2 years for incisors, plus pre-natal growth). Significant differences were observed in the isotope composition and concentration of Pb between children, reflecting differences in the timing and sources of exposure during early childhood. Those born in 2000, after the withdrawal of leaded petrol in 1999, have the lowest dentine Pb levels (<0.2 µg Pb/g), with 208Pb/206Pb (mean ±2σ: 2.126–2.079) and 207Pb/206Pb (mean ±2σ: 0.879–0.856) ratios that correlate very closely with modern day Western European industrial aerosols (PM10, PM2.5), suggesting that diffuse airborne pollution was probably the primary source and exposure pathway. Legacy lead, if present, is insignificant. For those born in 1997, dentine lead levels are typically higher (>0.4 µg Pb/g), with 208Pb/206Pb (mean ±2σ: 2.145–2.117) and 207Pb/206Pb (mean ±2σ: 0.898–0.882) ratios that can be modelled as a binary mix between industrial aerosols and leaded petrol emissions. Short duration, high intensity exposure events (1–2 months) were readily identified, together with evidence that dentine provides a good proxy for childhood changes in the isotope composition of blood Pb. Our pilot

  11. A field study on root cause analysis of defects in space software

    International Nuclear Information System (INIS)

    Silva, Nuno; Cunha, João Carlos; Vieira, Marco

    2017-01-01

    Critical systems, such as space systems, are developed under strict requirements envisaging high integrity in accordance to specific standards. For such software systems, an independent assessment is put into effect (Independent Software Verification and Validation – ISVV) after the regular development lifecycle and V&V activities, aiming at finding residual faults and raising confidence in the software. However, it has been observed that there is still a significant number of defects remaining at this stage, questioning the effectiveness of the previous engineering processes. This paper presents a root cause analysis of 1070 defects found in four space software projects during ISVV, by applying an improved Orthogonal Defect Classification (ODC) taxonomy and examining the defect types, triggers and impacts, in order to identify why they reached such a later stage in the development. The paper also puts forward proposals for modifications to both the software development (to prevent defects) and the V&V activities (to better detect defects) and an assessment methodology for future works on root cause analysis. - Highlights: • Root cause analysis of space software defects by using an enhanced ODC taxonomy. • Prioritization of the root causes according to the more important defect impacts. • Identification of improvements to systems engineering and development processes. • Improvements to V&V activities as means to reduce the occurrence of defects. • Generic process to achieve the defects root causes and the corrections suggestions.

  12. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the “1C: Enterprise 8” platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for implementation and further maintenance of software. The research data allow the creation of new forecast models to plan further software distribution.

  13. Designing flexible, ''chemist-friendly'' software to control a radiochemistry autosynthesizer

    International Nuclear Information System (INIS)

    Feliu, A.L.

    1989-01-01

    To enhance the utility of process control software to control radiochemistry autosynthesizers used with short-lived positron-emitting isotopes, a scheme is proposed by which routine executive-level tasks, hardware control operations, and chemical procedures have been segregated. This strategy can lead to chemist-friendly control programs for any desired hardware configuration, as illustrated in new software designed to exploit the features and flexibility of the CTI/Siemens Chemical Process Control Unit. (author)
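
    A sketch of the layering the record proposes, separating executive tasks, hardware control, and chemist-facing procedure steps so a chemist edits only the recipe layer; all class and method names here are invented for illustration.

      # Three segregated layers: hardware control, chemistry steps, executive.
      class HardwareLayer:
          def open_valve(self, n):  print(f"valve {n} open")
          def heat(self, temp_c):   print(f"reactor -> {temp_c} C")

      class ChemistryLayer:
          """Chemist-facing steps, expressed in chemical terms only."""
          def __init__(self, hw):   self.hw = hw
          def add_reagent(self, n): self.hw.open_valve(n)
          def react(self, temp_c):  self.hw.heat(temp_c)

      class Executive:
          """Routine run management: sequencing of the recipe steps."""
          def run(self, steps):
              for step in steps:
                  step()

      chem = ChemistryLayer(HardwareLayer())
      Executive().run([lambda: chem.add_reagent(1), lambda: chem.react(95)])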

  14. An ion beam analysis software based on ImageJ

    International Nuclear Information System (INIS)

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation
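
    A sketch of list-mode (event-by-event) processing with an energy gate, of the kind the plugin provides, though not the plugin's own code: each event carries (x, y, energy), a gate selects events, and a 2D map is accumulated from the survivors. The event list is synthetic.

      import numpy as np

      rng = np.random.default_rng(1)
      events = np.column_stack([
          rng.integers(0, 128, 10_000),       # x channel
          rng.integers(0, 128, 10_000),       # y channel
          rng.normal(1500.0, 200.0, 10_000),  # energy (arbitrary units)
      ])

      lo, hi = 1400.0, 1700.0                 # energy gate
      gated = events[(events[:, 2] >= lo) & (events[:, 2] <= hi)]
      hist_map, _, _ = np.histogram2d(gated[:, 0], gated[:, 1],
                                      bins=(128, 128), range=((0, 128), (0, 128)))
      print(gated.shape[0], "events in gate;", int(hist_map.sum()), "binned")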

  15. An ion beam analysis software based on ImageJ

    Energy Technology Data Exchange (ETDEWEB)

    Udalagama, C., E-mail: chammika@nus.edu.sg [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore); Chen, X.; Bettiol, A.A.; Watt, F. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore)

    2013-07-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.

  16. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
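
    A sketch of two of the information-theoretic parameters such tools rank features by, information gain IG(X;Y) = H(Y) - H(Y|X) and symmetrical uncertainty SU = 2*IG/(H(X)+H(Y)), computed for already-discretized vectors; the toy data are invented and this is not IMMAN's own code.

      import numpy as np
      from collections import Counter

      def entropy(xs):
          n = len(xs)
          return -sum((c / n) * np.log2(c / n) for c in Counter(xs).values())

      def info_gain(x, y):
          n = len(y)
          h_cond = sum((sum(1 for v in x if v == vx) / n) *
                       entropy([yi for xi, yi in zip(x, y) if xi == vx])
                       for vx in set(x))
          return entropy(y) - h_cond          # H(Y) - H(Y|X)

      def symmetrical_uncertainty(x, y):
          return 2.0 * info_gain(x, y) / (entropy(x) + entropy(y))

      x = [0, 0, 1, 1, 2, 2]; y = [0, 0, 1, 1, 1, 1]   # discretized feature, class
      print(info_gain(x, y), symmetrical_uncertainty(x, y))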

  17. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
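
    A sketch in the spirit of the study, using scikit-learn's tree learner rather than the paper's own generator: a small tree learns to flag modules likely to need high development effort from two metrics. The tiny dataset (source lines, change count) is invented.

      from sklearn.tree import DecisionTreeClassifier

      # Metrics per module: [source lines, number of changes]; label 1 = high effort.
      X = [[300, 2], [450, 3], [5000, 40], [7000, 55], [1200, 8], [6500, 60]]
      y = [0, 0, 1, 1, 0, 1]

      tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
      print(tree.predict([[4800, 35]]))       # -> [1], flagged as high effort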

  18. Essentials of iron, chromium, and calcium isotope analysis of natural materials by thermal ionization mass spectrometry

    Science.gov (United States)

    Fantle, M.S.; Bullen, T.D.

    2009-01-01

    The use of isotopes to understand the behavior of metals in geological, hydrological, and biological systems has rapidly expanded in recent years. One of the mass spectrometric techniques used to analyze metal isotopes is thermal ionization mass spectrometry, or TIMS. While TIMS has been a useful analytical technique for the measurement of isotopic composition for decades and TIMS instruments are widely distributed, there are significant difficulties associated with using TIMS to analyze isotopes of the lighter alkaline earth elements and transition metals. Overcoming these difficulties to produce relatively long-lived and stable ion beams from microgram-sized samples is a non-trivial task. We focus here on TIMS analysis of three geologically and environmentally important elements (Fe, Cr, and Ca) and present an in-depth look at several key aspects that we feel have the greatest potential to trouble new users. Our discussion includes accessible descriptions of different analytical approaches and issues, including filament loading procedures, collector cup configurations, peak shapes and interferences, and the use of isotopic double spikes and related error estimation. Building on previous work, we present quantitative simulations, applied specifically in this study to Fe and Ca, that explore the effects of (1) time-variable evaporation of isotopically homogeneous spots from a filament and (2) interferences on the isotope ratios derived from a double spike subtraction routine. We discuss how and to what extent interferences at spike masses, as well as at other measured masses, affect the double spike-subtracted isotope ratio of interest (44Ca/40Ca in the case presented, though a similar analysis can be used to evaluate 56Fe/54Fe and 53Cr/52Cr). The conclusions of these simulations are neither intuitive nor immediately obvious, making this examination useful for those who are developing new methodologies. While all simulations are carried out in the context of a

  19. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    Bishop, P.G.; Esp, D.G.

    1988-08-01

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are that: The scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely-defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful, through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of systems constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)

  20. Reconstructing diet by stable isotope analysis: Two case studies from Bronze Age and Early Medieval Lower Austria

    International Nuclear Information System (INIS)

    Rumpelmayr, K.

    2012-01-01

    Carbon and nitrogen stable isotope analysis is nowadays a method frequently applied for the reconstruction of past human diets. The principles of this technique were developed in the late 1970s and 1980s, when it was shown that the isotopic composition of an animal's body reflected that of its diet. Given that the investigated material (often bone collagen) is well enough preserved, several aspects of diet can be investigated through carbon and nitrogen isotopic signatures - expressed as δ13C and δ15N values - e.g. whether nutrition was based on C3 or C4 plants. Furthermore, these signatures can be used for the detection of a marine component in the diet, and they contain information about the trophic level of an individual. The goal of the work presented in this talk was to investigate certain aspects of diet using carbon and nitrogen stable isotope analysis of human and animal skeletal remains from Austrian archaeological sites. Two sites (both in Lower Austria) were selected for this study, the Bronze Age cemetery of Gemeinlebarn and the Early Medieval settlement of Thunau/Gars am Kamp. Previous archaeological and anthropological examinations suggested that both sites were inhabited by socially differentiated populations. Hence, during the stable isotope analysis special attention was paid to the detection of variation in nutritional habits due to sociogenic or gender-related differences. δ13C and δ15N values were measured in collagen, extracted from bone samples, by means of elemental analyzer-isotope ratio mass spectrometry (EA-IRMS). The obtained stable isotope data were examined for significant differences between social groups and the sexes using statistical hypothesis testing (MANOVA and ANOVA). (author)
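
    As an aside, a sketch of a standard use of the δ15N trophic shift in such diet studies: estimating a trophic level from the offset between a consumer and its dietary baseline. The 3.4 per mil enrichment and the delta values are textbook-style assumptions, not data from these sites.

      # Trophic level from the delta-15N offset to a dietary baseline.
      def trophic_level(d15n_consumer, d15n_base, base_level=1.0, enrichment=3.4):
          return base_level + (d15n_consumer - d15n_base) / enrichment

      print(trophic_level(d15n_consumer=10.2, d15n_base=3.4))   # -> 3.0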

  1. Evaluation of Kilauea Eruptions By Using Stable Isotope Analysis

    Science.gov (United States)

    Rahimi, K. E.; Bursik, M. I.

    2016-12-01

    Kilauea, on the island of Hawaii, is a large volcanic edifice with numerous named vents scattered across its surface. Halema`uma`u crater sits within Kilauea caldera, above the magma reservoir, which is the main source of lava feeding most vents on Kilauea volcano. Halema`uma`u crater produces basaltic explosive activity ranging from weak emission to sub-Plinian. Changes in eruption style are thought to be due to the interplay between external water and magma (phreatomagmatic/phreatic), or to segregation of gas from magma (magmatic) at shallow depths. Since there are three different eruption mechanisms (phreatomagmatic, phreatic, and magmatic), each eruption has its own isotope ratios. The aim of this study is to evaluate the eruption mechanism by using stable isotope analysis. Studying isotope ratios of D/H and δ18O within fluid inclusions and volcanic glass will provide evidence of what drove the eruption. The results would determine the source of water that drove an eruption by correlating the values with water sources (groundwater, rainwater, and magmatic water), since each water source has a diagnostic value of D/H and δ18O. These results will clarify the roles of volatiles in eruptions. The broader application of this research is that these methods could help volcanologists forecast and predict current volcanic activity by monitoring changes in volatile concentrations within deposits.

  2. High Resolution Gamma Ray Analysis of Medical Isotopes

    Science.gov (United States)

    Chillery, Thomas

    2015-10-01

    Compton-suppressed high-purity germanium detectors at the University of Massachusetts Lowell have been used to study medical radioisotopes produced at the Brookhaven Linac Isotope Producer (BLIP), in particular isotopes such as Pt-191 used for cancer therapy in patients. The ability to precisely analyze the concentrations of such radioisotopes is essential both for production facilities such as Brookhaven and for consumer hospitals across the U.S. Without accurate knowledge of the quantities and strengths of these isotopes, it is possible for doctors to administer incorrect dosages to patients, leading to undesired results. Samples have been produced at Brookhaven and shipped to UML, and the advanced electronics and data acquisition capabilities at UML have been used to extract peak areas in the gamma decay spectra. Levels of Pt isotopes in diluted samples have been quantified, and reaction cross-sections deduced from the irradiation parameters. These provide cross-checks with published work as well as a rigorous quantitative framework, using high-quality, state-of-the-art detection apparatus in use in the experimental nuclear physics community.
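
    A sketch of the arithmetic behind deducing a production cross-section from a gamma peak, assuming the standard activation equation A = N*sigma*phi*(1 - exp(-lambda*t_irr)); every input value below is an invented placeholder, not a BLIP or UML number.

      import numpy as np

      # Peak area -> activity, correcting for live time, efficiency and intensity.
      peak_area  = 1.2e5          # net counts in the photopeak
      t_count    = 3600.0         # counting live time (s)
      efficiency = 0.02           # photopeak detection efficiency
      gamma_int  = 0.50           # gamma emission probability
      activity = peak_area / (t_count * efficiency * gamma_int)   # Bq

      # Activity -> cross-section via the activation equation.
      half_life = 2.8 * 24 * 3600             # s (hypothetical isotope)
      lam   = np.log(2.0) / half_life
      N     = 1.0e20                          # target atoms in the beam spot
      phi   = 1.0e13                          # beam flux (particles/cm^2/s)
      t_irr = 6 * 3600.0                      # irradiation time (s)

      sigma_cm2 = activity / (N * phi * (1.0 - np.exp(-lam * t_irr)))
      print(sigma_cm2 * 1e24, "barn")         # 1 barn = 1e-24 cm^2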

  3. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    Science.gov (United States)

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .

  4. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as functional methods especially suitable for testing reactor analysis codes; a minimal illustration follows below. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for performing the functional/procedural software specification, while entity-relationship diagrams have proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
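
    A minimal illustration of the two selected black-box methods, assuming a hypothetical validation routine with a specified input range of [0.0, 5.0] (the routine, the range and all test values are invented for illustration):

        # Equivalence class partitioning and boundary value analysis
        # applied to a hypothetical routine that accepts a value in
        # the specified range [0.0, 5.0].
        def validate_input(value: float) -> bool:
            """Return True if value lies in the specified range."""
            return 0.0 <= value <= 5.0

        # Equivalence classes: one representative is assumed to behave
        # like every other member of its class.
        assert validate_input(2.5) is True       # valid class
        assert validate_input(-3.0) is False     # invalid class (below)
        assert validate_input(9.0) is False      # invalid class (above)

        # Boundary values: test at, just below and just above each
        # boundary, where comparison errors typically hide.
        for value, expected in [(-0.001, False), (0.0, True), (0.001, True),
                                (4.999, True), (5.0, True), (5.001, False)]:
            assert validate_input(value) is expected
        print("all equivalence-class and boundary-value cases pass")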

  5. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    Full Text Available This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which served as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and reduced commercial costs of GPS devices, including mobile devices, can be achieved by using software defined radio (SDR) technology. The SDR approach can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (Digital Signal Processing) or FPGA (Field Programmable Gate Array) circuit, which allows a simple change of digital signal processing algorithms and a simple change of the receiver parameters. The starting point of the research is the signal generated on the satellite, whose structure is shown in the paper. Based on the GPS signal structure, a receiver is realized with the task of extracting the appropriate signal from the spectrum and detecting it. Based on collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the carrier frequencies L1 and L2. Since the SPS is used in the civil service, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated with spread spectrum technology and lies below the noise level. Such signals often interfere with signals from the environment, which makes it difficult for a receiver to perform proper detection and signal processing. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post processing, i. e
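
    As a minimal sketch of the acquisition stage described above, the following Python fragment illustrates FFT-based parallel code-phase search, the standard SDR technique for detecting a spread-spectrum signal buried in noise. A real receiver would generate the satellite's actual C/A code and loop over Doppler bins; here a random ±1 sequence stands in for the replica code and Doppler is ignored:

        import numpy as np

        n = 2046                                   # samples per 1 ms block
        rng = np.random.default_rng(0)
        ca_code = rng.choice([-1.0, 1.0], n)       # stand-in replica code
        true_shift, amplitude = 700, 0.5           # signal below noise level
        received = np.roll(ca_code, true_shift) * amplitude \
                   + rng.normal(0.0, 1.0, n)

        # Circular correlation in the frequency domain: one FFT pair
        # tests every possible code phase at once.
        corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(ca_code)))
        peak = int(np.argmax(np.abs(corr) ** 2))
        print(f"detected code phase: {peak} samples (true: {true_shift})")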

  6. Mobility and diet in Neolithic, Bronze Age and Iron Age Germany : evidence from multiple isotope analysis

    NARCIS (Netherlands)

    Oelze, Viktoria Martha

    2012-01-01

    Prehistoric human diet can be reconstructed by the analysis of carbon (C), nitrogen (N) and sulphur (S) stable isotopes in bone, whereas ancient mobility and provenance can be studied using the isotopes of strontium (Sr) and oxygen (O) in tooth enamel, and of sulphur in bone. Although thirty years

  7. Software testability and its application to avionic software

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where requirements of 10^-7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10^-9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program, and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.
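
    A rough sketch of the idea behind sensitivity-style testability estimation, using invented code and inputs: inject a fault into one statement, run random inputs, and estimate how often the fault propagates to an observable output. A low score flags code that could hide faults from black-box testing:

        import random

        def original(x, y):
            t = x * y                # statement under analysis
            return t if t > 10 else 0

        def mutated(x, y):
            t = x * y + 1            # injected fault in the same statement
            return t if t > 10 else 0

        random.seed(1)
        trials = propagated = 0
        for _ in range(10_000):
            x, y = random.randint(0, 5), random.randint(0, 5)
            trials += 1
            if original(x, y) != mutated(x, y):
                propagated += 1      # fault reached the output
        print(f"estimated statement testability: {propagated / trials:.3f}")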

  8. Microbial degradation of alpha-cypermethrin in soil by compound-specific stable isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zemin [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Shen, Xiaoli [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Department of Environmental Engineering, Quzhou University, Quzhou 324000 (China); Zhang, Xi-Chang [Laboratory for Teaching in Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Liu, Weiping [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Yang, Fangxing, E-mail: fxyang@zju.edu.cn [MOE Key Laboratory of Environmental Remediation and Ecosystem Health, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou 310058 (China); Department of Effect-Directed Analysis, Helmholtz Center for Environmental Research – UFZ, Leipzig 04318 (Germany)

    2015-09-15

    Highlights: • Alpha-cypermethrin (α-CP) can be degraded by microorganisms in soil. • Biodegradation of α-CP resulted in carbon isotope fractionation. • A relationship was found between carbon isotope ratios and concentrations of α-CP. • An enrichment factor ε of α-CP was determined as −1.87‰. • CSIA is applicable to assess biodegradation of α-CP. - Abstract: To assess microbial degradation of alpha-cypermethrin in soil, attenuation of alpha-cypermethrin was investigated by compound-specific stable isotope analysis. The variations of the residual concentrations and stable carbon isotope ratios of alpha-cypermethrin were measured in unsterilized and sterilized soils spiked with alpha-cypermethrin. After an 80-day incubation, the concentrations of alpha-cypermethrin decreased to 0.47 and 3.41 mg/kg in the unsterilized soils spiked with 2 and 10 mg/kg, while they decreased to 1.43 and 6.61 mg/kg in the sterilized soils. Meanwhile, the carbon isotope ratios shifted to −29.14 ± 0.22‰ and −29.86 ± 0.33‰ in the unsterilized soils spiked with 2 and 10 mg/kg, respectively. The results revealed that microbial degradation contributed to the attenuation of alpha-cypermethrin and induced the carbon isotope fractionation. In order to quantitatively assess microbial degradation, a relationship between the carbon isotope ratios and residual concentrations of alpha-cypermethrin was established according to the Rayleigh equation. An enrichment factor, ε = −1.87‰, was obtained, which can be employed to assess microbial degradation of alpha-cypermethrin, as sketched below. The significant carbon isotope fractionation during microbial degradation suggests that CSIA is a proper approach to qualitatively detect and quantitatively assess biodegradation during the attenuation of alpha-cypermethrin in the field.
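
    A minimal sketch of how the reported enrichment factor can be used, assuming an illustrative initial δ13C value (the −30.63‰ starting value below is invented; only ε = −1.87‰ comes from the study):

        import math

        def biodegraded_fraction(delta0, delta_t, eps):
            """Fraction degraded, B = 1 - f, from the Rayleigh relation
            ln(f) = (delta_t - delta0) / eps (all values in permil)."""
            f = math.exp((delta_t - delta0) / eps)   # remaining fraction
            return 1.0 - f

        eps = -1.87                                   # permil, from this study
        print(biodegraded_fraction(-30.63, -29.14, eps))  # ~0.55 degraded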

  9. Elemental and isotopic characterization of Japanese and Philippine polished rice samples using instrumental neutron activation analysis and isotope ratio mass spectrometry

    International Nuclear Information System (INIS)

    Pabroa, Preciosa Corazon B.; Sucgang, Raymond J.; Mendoza, Norman dS.; Ebihara, Mitsuru

    2011-01-01

    Rice is a staple food for most Asian countries such as the Philippines and Japan, and as such its elemental and isotopic content is of interest to consumers. Its elemental content may reflect the macronutrient reduction during milling or probable uptake of toxic elements. The three Japanese and four Philippine polished rice samples in this study mostly came from rice bought from supermarkets. These rice samples were washed, dried and ground to fine powder. Instrumental neutron activation analysis (INAA), a very sensitive non-destructive multi-element analytical technique, was used for the elemental analysis of the samples, and isotope-ratio mass spectrometry (IRMS) was used to obtain the isotopic signatures of the samples. Results show that, compared with the unpolished rice standard NIES CRM10b, the polished Japanese and Philippine rice samples show concentrations of Mg, Mn, K and Na reduced by as much as 1/3 to 1/10. Levels of Ca and Zn are not greatly affected. Arsenic is found in all the Japanese rice tested at an average concentration of 0.103 μg/g and in three out of four of the Philippine rice samples at an average concentration of 0.070 μg/g. Arsenic contamination may have been introduced from the fertilizer used in rice fields. Higher levels of Br are seen in two of the Philippine rice samples at 14 and 34 μg/g, with the most probable source being the pesticide methyl bromide. Isotope ratios of δ13C show the signature of a C3 plant, with a possibly narrow distinguishable signature for Japanese rice within −27.5 to −28.5‰ and Philippine rice within −29 to −30‰. More rice samples will be analyzed to gain a better understanding of isotopic signatures to distinguish inter-varietal and/or geographical differences. The elemental composition of soils from the rice sample sources will be determined for a better understanding of uptake mechanisms. (author)
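
    A minimal sketch of the δ13C notation behind these signatures, with an illustrative sample ratio (the VPDB reference ratio is commonly quoted as about 0.011180; the classification ranges restate the values reported above):

        R_VPDB = 0.011180                    # commonly quoted 13C/12C of VPDB

        def delta13C(r_sample):
            """delta13C in permil relative to VPDB."""
            return (r_sample / R_VPDB - 1.0) * 1000.0

        d = delta13C(0.010855)               # illustrative 13C/12C ratio
        origin = ("Japanese-like (-27.5 to -28.5)" if -28.5 <= d <= -27.5
                  else "Philippine-like (-29 to -30)" if -30.0 <= d <= -29.0
                  else "other")
        print(f"delta13C = {d:.1f} permil -> {origin}")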

  10. Redox substoichiometry in isotope dilution analysis Pt. 4

    International Nuclear Information System (INIS)

    Kambara, T.; Yoshioka, H.; Ugai, Y.

    1980-01-01

    The oxidation reaction of antimony(III) with potassium dichromate has been investigated radiometrically. The quantitative oxidation of antimony(III) was found to be undisturbed even in the presence of large amounts of tin(IV). On the basis of these results, redox substoichiometric isotope dilution analysis using potassium dichromate as the oxidizing agent was proposed for the determination of antimony in metallic tin. An antimony content of 1.22±0.05 μg in metallic tin (10 mg) was determined without separation of the matrix element. (author)
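
    A minimal sketch of the substoichiometric isotope dilution relation that underlies such a determination, with invented activities chosen to reproduce the reported 1.22 μg result (the classic relation assumes equal substoichiometric portions separated from the spiked sample and from the pure spike):

        def isotope_dilution_mass(y_spike_ug, a_spike, a_sample):
            """Unknown analyte mass x = y * (a1/a2 - 1), where a1 and a2
            are the activities of equal substoichiometric separates."""
            return y_spike_ug * (a_spike / a_sample - 1.0)

        # 1.0 ug spike; activities (counts/s) are illustrative
        print(isotope_dilution_mass(1.0, 2220.0, 1000.0))  # -> 1.22 ug Sb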

  11. Modularity analysis of automotive control software

    NARCIS (Netherlands)

    Dajsuren, Y.; Brand, van den M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and

  12. Testing sequential extraction methods for the analysis of multiple stable isotope systems from a bone sample

    Science.gov (United States)

    Sahlstedt, Elina; Arppe, Laura

    2017-04-01

    Stable isotope compositions of bones, analysed either from the mineral phase (hydroxyapatite) or from the organic phase (mainly collagen), carry important climatological and ecological information and are therefore widely used in paleontological and archaeological research. For the analysis of the stable isotope compositions, both phases, hydroxyapatite and collagen, have their more or less well-established separation and analytical techniques. Recent developments in IRMS and wet chemical extraction methods have facilitated the analysis of very small bone fractions (500 μg or less starting material) for the O isotope composition of phosphate (PO4). However, the uniqueness and (pre-)historical value of each archaeological and paleontological find leave precious little material available for stable isotope analyses, encouraging further development of microanalytical methods for stable isotope analyses. Here we present the first results in developing extraction methods for combining collagen C- and N-isotope analyses with PO4 O-isotope analyses from a single bone sample fraction. We tested sequential extraction starting with dilute acid demineralization and collection of both the collagen and PO4 fractions, followed by a further purification step with H2O2 (PO4 fraction). First results show that bone sample separates as small as 2 mg may be analysed for their δ15N, δ13C and δ18O(PO4) values. The method may be incorporated in detailed investigations of sequentially developing skeletal material such as teeth, potentially allowing for the investigation of interannual variability in climatological/environmental signals or of the early life history of an individual.

  13. 39Ar Detection at the 10^-16 Isotopic Abundance Level with Atom Trap Trace Analysis

    Science.gov (United States)

    Jiang, W.; Williams, W.; Bailey, K.; Davis, A. M.; Hu, S.-M.; Lu, Z.-T.; O'Connor, T. P.; Purtschert, R.; Sturchio, N. C.; Sun, Y. R.; Mueller, P.

    2011-03-01

    Atom trap trace analysis, a laser-based atom counting method, has been applied to analyze atmospheric 39Ar (half-life = 269 yr), a cosmogenic isotope with an isotopic abundance of 8×10^-16. In addition to the superior selectivity demonstrated in this work, the counting rate and efficiency of atom trap trace analysis have been improved by 2 orders of magnitude over prior results. The significant applications of this new analytical capability lie in radioisotope dating of ice and water samples and in the development of dark matter detectors.

  14. Analysis and separation of boron isotopes; Analyse et separation des isotopes du bore

    Energy Technology Data Exchange (ETDEWEB)

    Perie, M [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1966-11-01

    The nuclear applications of boron-10 justify the study of a method for measuring its isotopic abundance, as well as very small traces of boron, in different materials. A systematic study of the thermionic emission of BO2Na2+ has been carried out. In the presence of a slight excess of alkalis, the thermionic emission is considerably reduced. On the other hand, the addition of a sodium hydroxide-glycerol (or mannitol) mixture to borax permits an intense and stable beam to be obtained. These results have permitted the establishment of an operative method for the analysis of traces of boron by isotopic dilution. Furthermore, the needs for boron-10 in the nuclear industry justify the study of procedures for the separation of boron isotopes. A considerable isotopic effect has been exhibited in the chemical exchange reaction between methyl borate and a borate salt in solution. In the case of exchange between methyl borate and sodium borate, the elementary separation factor α is: α = (11B/10B)vap. / (11B/10B)liq. = 1.033. The high value of this elementary effect has been multiplied in a distillation column in which the problem of regeneration of the reactive has been resolved. An alternative procedure replacing the alkali borate by a borate of a volatile base, for example diethylamine, has also been studied (α = 1.025 in a hydro-methanolic medium with 2.2 per cent water). (author)
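
    A minimal sketch of how the elementary separation factor multiplies along a countercurrent cascade (a Fenske-style total-reflux estimate; the natural 10B abundance of 19.9 atom % and the 90% target are illustrative):

        import math

        def theoretical_stages(alpha, x_feed, x_product):
            """Stages needed if the abundance ratio x/(1-x) grows by a
            factor alpha per theoretical stage (total reflux)."""
            r_feed = x_feed / (1.0 - x_feed)
            r_prod = x_product / (1.0 - x_product)
            return math.log(r_prod / r_feed) / math.log(alpha)

        # alpha = 1.033 from the exchange reaction reported above
        print(f"{theoretical_stages(1.033, 0.199, 0.90):.0f} stages")  # ~111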

  15. Thermal-hydraulic software development for nuclear waste transportation cask design and analysis

    International Nuclear Information System (INIS)

    Brown, N.N.; Burns, S.P.; Gianoulakis, S.E.; Klein, D.E.

    1991-01-01

    This paper describes the development of a state-of-the-art thermal-hydraulic software package intended for spent fuel and high-level nuclear waste transportation cask design and analysis. The objectives of this software development effort are threefold: (1) to take advantage of advancements in computer hardware and software to provide a more efficient user interface, (2) to provide a tool for reducing inefficient conservatism in spent fuel and high-level waste shipping cask design by including convection as well as conduction and radiation heat transfer modeling capabilities, and (3) to provide a thermal-hydraulic analysis package which is developed under a rigorous quality assurance program established at Sandia National Laboratories. 20 refs., 5 figs., 2 tabs

  16. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    International Nuclear Information System (INIS)

    Tsiflikas, Ilias; Biermann, Christina; Thomas, Christoph; Ketelsen, Dominik; Claussen, Claus D.; Heuschmid, Martin

    2012-01-01

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of advanced vessel analysis software for the diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. The acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, the time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good to very good sensitivity values, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods, without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863; see the sketch below). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable
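
    A minimal sketch of the Cohen's kappa statistic behind the reported agreement values (the 5x5 confusion matrix of stenosis grades I-V below is invented):

        import numpy as np

        def cohens_kappa(confusion):
            """Observed agreement corrected for chance agreement."""
            c = np.asarray(confusion, dtype=float)
            n = c.sum()
            p_obs = np.trace(c) / n                                  # observed
            p_exp = (c.sum(axis=0) * c.sum(axis=1)).sum() / n ** 2   # chance
            return (p_obs - p_exp) / (1.0 - p_exp)

        # rows: reader 1 grades I..V, columns: reader 2 grades I..V
        m = [[20,  2,  0, 0, 0],
             [ 1, 25,  3, 0, 0],
             [ 0,  2, 10, 1, 0],
             [ 0,  0,  1, 9, 0],
             [ 0,  0,  0, 0, 6]]
        print(f"kappa = {cohens_kappa(m):.3f}")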

  17. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    Energy Technology Data Exchange (ETDEWEB)

    Tsiflikas, Ilias, E-mail: ilias.tsiflikas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Biermann, Christina, E-mail: christina.biermann@siemens.com [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Siemens AG, Siemens Healthcare Consulting, Allee am Röthelheimpark 3A, 91052 Erlangen (Germany); Thomas, Christoph, E-mail: christoph.thomas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Ketelsen, Dominik, E-mail: dominik.ketelsen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Claussen, Claus D., E-mail: claus.claussen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Heuschmid, Martin, E-mail: martin.heuschmid@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany)

    2012-09-15

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of advanced vessel analysis software for the diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. The acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, the time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good to very good sensitivity values, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods, without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable.

  18. Forensic analysis of explosives using isotope ratio mass spectrometry (IRMS)--discrimination of ammonium nitrate sources.

    Science.gov (United States)

    Benson, Sarah J; Lennard, Christopher J; Maynard, Philip; Hill, David M; Andrew, Anita S; Roux, Claude

    2009-06-01

    An evaluation was undertaken to determine whether isotope ratio mass spectrometry (IRMS) could assist in the investigation of complex forensic cases by providing a level of discrimination not achievable using traditional forensic techniques. The focus of the research was on ammonium nitrate (AN), a common oxidiser used in improvised explosive mixtures. The potential value of IRMS for attributing Australian AN samples to their manufacturing source was demonstrated through the development of a preliminary AN classification scheme based on nitrogen isotopes. Although the discrimination using nitrogen isotopes alone was limited, and only relevant to samples from the three Australian manufacturers during the evaluated time period, the classification scheme has potential as an investigative aid. Combining oxygen and hydrogen stable isotope values permitted the differentiation of AN prills from three different Australian manufacturers. Samples from five different overseas sources could be differentiated using a combination of the nitrogen, oxygen and hydrogen isotope values. Limited differentiation between Australian and overseas prills was achieved for the samples analysed. The comparison of nitrogen isotope values from intact AN prill samples with those from post-blast AN prill residues highlighted that the nitrogen isotopic composition of the prills was not maintained post-blast, hence limiting the technique to the analysis of un-reacted explosive material.

  19. Optimization and application of ICPMS with dynamic reaction cell for precise determination of 44Ca/40Ca isotope ratios.

    Science.gov (United States)

    Boulyga, Sergei F; Klötzli, Urs; Stingeder, Gerhard; Prohaska, Thomas

    2007-10-15

    An inductively coupled plasma mass spectrometer with dynamic reaction cell (ICP-DRC-MS) was optimized for determining 44Ca/40Ca isotope ratios in aqueous solutions with respect to (i) repeatability, (ii) robustness, and (iii) stability. Ammonia as reaction gas allowed both the removal of the 40Ar+ interference on 40Ca+ and collisional damping of ion density fluctuations of the ion beam extracted from the ICP. The effect of laboratory conditions as well as of ICP-DRC-MS parameters such as nebulizer gas flow rate, rf power, lens potential, dwell time, or DRC parameters on precision and mass bias was studied. Precision (calculated using the "unbiased" or "n − 1" method) of a single isotope ratio measurement of a 60 ng g(-1) calcium solution (analysis time of 6 min) is routinely achievable in the range of 0.03-0.05%, which corresponds to a standard error of the mean value (n = 6) of 0.012-0.020%. These experimentally observed RSDs were close to the theoretical precision given by counting statistics (see the sketch below). Accuracy of the measured isotope ratios was assessed by comparative measurements of the same samples by ICP-DRC-MS and thermal ionization mass spectrometry (TIMS), using isotope dilution with a 43Ca-48Ca double spike. The analysis time in both cases was 1 h per analysis (10 blocks, each 6 min). The δ44Ca values measured by TIMS and ICP-DRC-MS with double-spike calibration in two samples (Ca ICP standard solution and digested NIST 1486 bone meal) coincided within the obtained precision. Although the applied isotope dilution with the 43Ca-48Ca double spike compensates for time-dependent deviations of mass bias and allows accurate results to be achieved, this approach makes it necessary to measure an additional isotope pair, reducing the overall analysis time per isotope or increasing the total analysis time. Further development of external calibration by using a bracketing method would allow a wider use of ICP-DRC-MS for routine calcium isotopic measurements, but it
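
    A minimal sketch of the counting-statistics (shot-noise) limit the observed precision is compared against, with illustrative count rates: for Poisson counting, the relative standard deviation of a two-isotope ratio is approximately sqrt(1/N_minor + 1/N_major):

        import math

        def shot_noise_rsd(rate_minor_cps, rate_major_cps, t_seconds):
            """Shot-noise-limited RSD of a ratio of two ion signals."""
            n_minor = rate_minor_cps * t_seconds
            n_major = rate_major_cps * t_seconds
            return math.sqrt(1.0 / n_minor + 1.0 / n_major)

        # illustrative: 6 min acquisition, minor isotope ~2% of major
        rsd = shot_noise_rsd(2e4, 1e6, 360.0)
        print(f"{rsd * 100:.3f} % RSD")   # ~0.04 %, matching the observed range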

  20. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. With this purpose, the Computing, Software and Analysis challenge 2006 (CSA06) started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain: prompt reconstruction, data streaming, iterative execution of calibration and alignment, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  1. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. With this purpose, the Computing, Software and Analysis challenge 2006 (CSA06) started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain: prompt reconstruction, data streaming, iterative execution of calibration and alignment, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  2. Symplectic Tracking of Multi-Isotopic Heavy-Ion Beams in SixTrack

    CERN Document Server

    Hermes, Pascal; De Maria, Riccardo

    2016-01-01

    The software SixTrack provides symplectic proton tracking over a large number of turns. The code is used for the tracking of beam halo particles and the simulation of their interaction with the collimators to study the efficiency of the LHC collimation system. Tracking simulations for heavy-ion beams require taking into account the mass-to-charge ratio of each particle, because heavy ions can be subject to fragmentation at their passage through the collimators. In this paper we present the derivation of a Hamiltonian for multi-isotopic heavy-ion beams and the symplectic tracking maps derived from it. The resulting tracking maps were implemented in the tracking software SixTrack. With this modification, SixTrack can be used to natively track heavy-ion beams of multiple isotopes through a magnetic accelerator lattice.
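
    A toy sketch of the essential ingredient described above, under invented lattice values: a symplectic drift-kick map in which the magnetic kick is rescaled for each ion species, here approximated by its charge-to-mass ratio relative to the reference isotope:

        def track(x, xp, charge, mass, q_ref, m_ref, k1, l_drift, n_turns):
            """One thin quadrupole kick plus one drift per turn; both
            maps are individually symplectic."""
            scale = (charge / mass) / (q_ref / m_ref)   # rigidity proxy
            for _ in range(n_turns):
                xp -= k1 * scale * x     # thin-lens kick
                x += l_drift * xp        # drift
            return x, xp

        # reference 208Pb82+ versus a 207Pb82+ fragment (invented optics)
        print(track(1e-3, 0.0, 82, 208, 82, 208, 0.1, 10.0, 100))
        print(track(1e-3, 0.0, 82, 207, 82, 208, 0.1, 10.0, 100))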

  3. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of knickpoint-dense areas with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.

  4. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  5. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  6. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  7. Mathematical modeling and multicriterion optimization for photonuclear production of the 67cu isotope

    International Nuclear Information System (INIS)

    Dikij, N.P.; Rudychev, Y.V.; Fedorchenko, D.V.; Khazhmuradov, M.A.

    2014-01-01

    This paper considers a method for 67Cu isotope production using electron bremsstrahlung via the 68Zn(gamma, p)67Cu reaction. The facility for 67Cu isotope production contains an electron accelerator, an electron-gamma converter and a zinc target. To optimize this facility, we developed a three-dimensional model of the converter and the target. Using this model, we performed mathematical modeling of zinc target irradiation and of the thermal-hydraulic processes inside the target for various parameters of the electron beam and converter configurations. For mathematical modeling of radiation processes we used the MCNPX software. Thermal-hydraulic simulation utilized the commercial SolidWorks software with the Flow Simulation module. Mathematical modeling revealed that efficient 67Cu isotope production needs a smaller beam diameter and higher electron energy. Under these conditions the target heat power also increases, so additional cooling is necessary. If the beam diameter and the electron energy are fixed, the most effective way to satisfy the operating parameters while retaining an efficient isotope yield is to optimize the photonuclear spectra of the target by varying the converter thickness. We developed an algorithm for multicriterion optimization and performed the optimization of the facility, taking into account the coupled radiation and heat transfer processes.

  8. Simulation and optimization of stable isotope 18O separation by water vacuum distillation

    International Nuclear Information System (INIS)

    Chen Yuyan; Qin Chuanjiang; Xiao Bin; Xu Jing'an

    2012-01-01

    In this research, a stable isotope 18O separation column was set up for water vacuum distillation, with a packing height of 20 m and a column diameter of 0.1 m. The self-developed special packing named PAC-18O was packed inside the column. Firstly, a model was created using the Aspen Plus software, and the simulation results were validated against test results. Secondly, a set of simulation results was generated by Aspen Plus, and the optimal operating conditions were obtained using an artificial neural network (ANN) and the Statistica software. The combined impact of column pressure and withdrawal velocity on the abundance of the isotope 18O was studied. The final results show that the abundance of the isotope 18O increases as column pressure and withdrawal velocity decrease. In addition, the optimal column pressure and a formula relating the abundance of the isotope 18O to the withdrawal velocity were obtained. The conclusion is that this method of simulation and optimization can be applied to 18O industrial design and can be adopted in traditional distillation processes to realize optimized design. (authors)

  9. Chromatographic speciation of Cr(III)-species, inter-species equilibrium isotope fractionation and improved chemical purification strategies for high-precision isotope analysis

    DEFF Research Database (Denmark)

    Larsen, Kirsten Kolbjørn; Wielandt, Daniel Kim Peel; Schiller, Martin

    2016-01-01

    Chromatographic purification of chromium (Cr), which is required for high-precision isotope analysis, is complicated by the presence of multiple Cr-species with different effective charges in the acid digested sample aliquots. The differing ion exchange selectivity and sluggish reaction rates of ...

  10. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality of the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to the SMART MMIS software, in terms of software testing organization, documentation, procedures, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility to which the software testing concept of this paper is applied

  11. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful for software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets; a minimal sketch of such a comparison is given below. Results showed most of the mac ...
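
    A minimal sketch of this kind of comparison (synthetic data stands in for a public defect dataset; scikit-learn is assumed to be available):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB

        # stand-in for a public defect dataset (e.g. static code metrics),
        # imbalanced as defect data typically is
        X, y = make_classification(n_samples=500, n_features=20,
                                   weights=[0.8], random_state=0)
        for model in (GaussianNB(), LogisticRegression(max_iter=1000),
                      RandomForestClassifier(random_state=0)):
            scores = cross_val_score(model, X, y, cv=5, scoring="f1")
            print(f"{type(model).__name__:24s} mean F1 = {scores.mean():.3f}")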

  12. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    International Nuclear Information System (INIS)

    1991-01-01

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one in the family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and a similar structure. The intention was to provide the user with maximum flexibility, and at the same time a simple and logical organization that requires minimum digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases. Those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme.

  13. Tracing fetal and childhood exposure to lead using isotope analysis of deciduous teeth

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, Thomas J. [Centre for Oral Health Research, School of Dental Sciences, Newcastle University, Newcastle upon Tyne (United Kingdom); British Geological Survey, Keyworth, Nottingham (United Kingdom); Dirks, Wendy [Department of Anthropology, Durham University, Durham (United Kingdom); Roberts, Nick M.W. [NERC Isotope Geosciences Laboratory, British Geological Survey, Nottingham (United Kingdom); Patel, Jaiminkumar G. [Leeds Dental Institute, University Leeds, Leeds (United Kingdom); Hodgson, Susan [MRC-PHE Centre for Environment and Health, Department of Epidemiology and Biostatistics, Imperial College London (United Kingdom); Pless-Mulloli, Tanja [Institute of Health and Society, Newcastle University, Newcastle upon Tyne (United Kingdom); Walton, Pamela [Centre for Oral Health Research, School of Dental Sciences, Newcastle University, Newcastle upon Tyne (United Kingdom); Parrish, Randall R. [British Geological Survey, Keyworth, Nottingham (United Kingdom)

    2016-04-15

    We report progress in using the isotopic composition and concentration of Pb in the dentine and enamel of deciduous teeth to provide a high-resolution time frame of exposure to Pb during fetal development and early childhood. Isotope measurements (total Pb and 208Pb/206Pb, 207Pb/206Pb ratios) were acquired by laser ablation inductively coupled plasma mass spectrometry at contiguous 100 micron intervals across thin sections of the teeth, from the outer enamel surface to the pulp cavity. Teeth samples (n=10) were selected from two cohorts of children, aged 5–8 years, living in NE England. By integrating the isotope data with histological analysis of the teeth, using the daily incremental lines in dentine, we were able to assign true estimated ages to each ablation point (first 2–3 years for molars, first 1–2 years for incisors, plus pre-natal growth). Significant differences were observed in the isotope composition and concentration of Pb between children, reflecting differences in the timing and sources of exposure during early childhood. Those born in 2000, after the withdrawal of leaded petrol in 1999, have the lowest dentine Pb levels (<0.2 µg Pb/g), with 208Pb/206Pb (mean ±2σ: 2.126–2.079) and 207Pb/206Pb (mean ±2σ: 0.879–0.856) ratios that correlate very closely with modern-day Western European industrial aerosols (PM10, PM2.5), suggesting that diffuse airborne pollution was probably the primary source and exposure pathway. Legacy lead, if present, is insignificant. For those born in 1997, dentine lead levels are typically higher (>0.4 µg Pb/g), with 208Pb/206Pb (mean ±2σ: 2.145–2.117) and 207Pb/206Pb (mean ±2σ: 0.898–0.882) ratios that can be modelled as a binary mix between industrial aerosols and leaded petrol emissions (a sketch of such a mixing estimate is given below). Short-duration, high-intensity exposure events (1–2 months) were readily identified, together with evidence that dentine provides a good proxy for childhood
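
    A minimal sketch of the binary mixing estimate mentioned above. The endmember ratios below are illustrative, and simple linear mixing of the 208Pb/206Pb ratio is assumed (strictly, ratio mixing is also weighted by the Pb concentration of each endmember):

        def petrol_fraction(r_sample, r_aerosol=2.10, r_petrol=2.23):
            """Fraction of Pb from the petrol endmember for a measured
            208Pb/206Pb ratio (two-endmember linear mixing)."""
            return (r_sample - r_aerosol) / (r_petrol - r_aerosol)

        print(f"{petrol_fraction(2.13):.2f}")   # ~0.23 petrol-derived Pb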

  14. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  15. Progress in the analysis and interpretation of N2O isotopes: Potential and future challenges

    Science.gov (United States)

    Mohn, Joachim; Tuzson, Béla; Zellweger, Christoph; Harris, Eliza; Ibraim, Erkan; Yu, Longfei; Emmenegger, Lukas

    2017-04-01

    In recent years, research on nitrous oxide (N2O) stable isotopes has advanced significantly, addressing an increasing number of research questions in biogeochemical and atmospheric sciences [1]. An important milestone was the development of quantum cascade laser based spectroscopic devices [2], which are inherently specific for structural isomers (15N14N16O vs. 14N15N16O) and capable of collecting real-time data with high temporal resolution, complementary to the well-established isotope-ratio mass-spectrometry (IRMS) method. In combination with automated preconcentration, optical isotope ratio spectroscopy (OIRS) has been applied to disentangle source processes in suburban, rural and pristine environments [e.g. 3, 4]. Within the European Metrology Research Programme (EMRP) ENV52 project "Metrology for high-impact greenhouse gases (HIGHGAS)", the quality of N2O stable isotope analysis by OIRS, the comparability between laboratories, and the traceability to the international isotope ratio scales have been addressed. An inter-laboratory comparison between eleven IRMS and OIRS laboratories, organised within HIGHGAS, indicated limited comparability for 15N site preference, i.e. the difference between the 15N abundance in the central (N*NO) and end (*NNO) positions [5]. In addition, the accuracy of the NH4NO3 decomposition reaction, which provides the link between 15N site preference and the international 15N/14N scale, was found to be limited by non-quantitative NH4NO3 decomposition in combination with substantially different isotope enrichment factors for the two nitrogen atoms [6]. Results of the HIGHGAS project indicate that the following research tasks have to be completed to foster research on N2O isotopes: 1) develop improved techniques to link the 15N and 18O abundances and the 15N site preference in N2O to the international stable isotope ratio scales; 2) provide N2O reference materials, pure and diluted in an air matrix, to improve inter-laboratory compatibility. These tasks

  16. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  17. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena

  18. Test results of a new detector system for gamma ray isotopic measurements

    International Nuclear Information System (INIS)

    Malcom, J.E.; Bonner, C.A.; Hurd, J.R.; Fleissner,

    1993-01-01

    A new type of gamma-ray detector system for isotopic measurements has been developed. This new system, a ''Duo detector'' array, consists of two intrinsic germanium detectors, a planar followed by a coaxial, mounted on the same axis within a single cryostat assembly. This configuration allows the isotopic analysis system to take advantage of spectral data collected simultaneously from different gamma-ray energy regimes. Princeton Gamma Tech (PGT) produced several prototypes of this Duo detector array, which were then tested by Rocky Flats personnel until the design was optimized. One application for this detector design is in automated, roboticized NDA systems such as those being developed at the Los Alamos TA-55 Plutonium Facility. The Duo detector design reduces the space necessary for the isotopic instrument by a factor of two (only one liquid nitrogen dewar is needed), and also reduces the complexity of the mechanical systems and controlling software. Data will be presented on measurements of nuclear material with a Duo detector for a wide variety of matrices. Results indicate that the maximum count rate can be increased up to 100,000 counts per second while maintaining excellent resolution and energy rate product

  19. Isotopic analysis of uranium hexafluoride highly enriched in U-235

    International Nuclear Information System (INIS)

    Chaussy, L.; Boyer, R.

    1968-01-01

    Isotopic analysis of uranium in the form of the hexafluoride by mass spectrometry gives raw results which are not very accurate. Using a linear interpolation method applied to two standards, it is possible to correct for this inaccuracy as long as the isotopic concentrations are less than about 10 per cent in U-235 (a sketch of the two-standard interpolation is given below). Above this level, the interpolation formula overestimates the results, especially if the enrichment of the analyzed samples is higher than 1.3 with respect to the standards. A formula is proposed for correcting the interpolation equation and for extending its field of application to high values of the enrichment (≅2) and of the concentration. It is shown that by using this correction the results obtained have an accuracy which depends practically only on that of the standards, taking into account the dispersion in the measurements. (authors)
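
    A minimal sketch of the two-standard linear interpolation assessed above, with invented measured and true abundances: the measured values of two bracketing standards of known isotopic abundance define a linear correction that is applied to the sample's measured value:

        def interpolate(meas, std1_meas, std1_true, std2_meas, std2_true):
            """Linear two-standard correction of a measured abundance."""
            slope = (std2_true - std1_true) / (std2_meas - std1_meas)
            return std1_true + slope * (meas - std1_meas)

        # standards at 0.711 % and 3.000 % U-235 (true), measurements biased
        print(f"{interpolate(1.95, 0.718, 0.711, 3.035, 3.000):.3f} % U-235")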

  20. Evaluation of the performance of high temperature conversion reactors for compound-specific oxygen stable isotope analysis.

    Science.gov (United States)

    Hitzfeld, Kristina L; Gehre, Matthias; Richnow, Hans-Hermann

    2017-05-01

    In this study, conversion conditions for oxygen gas chromatography high temperature conversion (HTC) isotope ratio mass spectrometry (IRMS) are characterised using qualitative mass spectrometry (IonTrap). It is shown that the physical and chemical properties of a given reactor design impact HTC and thus the ability to accurately measure oxygen isotope ratios. Commercially available and custom-built tube-in-tube reactors were used to elucidate (i) by-product formation (carbon dioxide, water, small organic molecules), (ii) secondary sources of oxygen (leakage, metal oxides, ceramic material), and (iii) required reactor conditions (conditioning, reduction, stability). The suitability of the available HTC approach for compound-specific isotope analysis of oxygen in volatile organic molecules like methyl tert-butyl ether is assessed. The main problems impeding accurate analysis are non-quantitative HTC and significant carbon dioxide by-product formation. An evaluation strategy combining mass spectrometric analysis of HTC products and IRMS 18O/16O monitoring is proposed for future method development.

  1. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    Science.gov (United States)

    Wada, Yoshihisa; Tsuji, Hiroshi

    In order to analyze the success/failure factors in offshore software development services by structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former works for generating and verifying hypotheses to find factors and causalities. The latter works for verifying factors introduced by theory to build the model without heuristics. Applying the proposed combined approaches to the questionnaire responses of skilled project managers, this paper found that the vendor property has higher causality for success than the software and project properties.

  2. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  3. Cl and C isotope analysis to assess the effectiveness of chlorinated ethene degradation by zero-valent iron: Evidence from dual element and product isotope values

    International Nuclear Information System (INIS)

    Audí-Miró, Carme; Cretnik, Stefan; Otero, Neus; Palau, Jordi; Shouakar-Stash, Orfan; Soler, Albert

    2013-01-01

    Highlights: ► TCE and cis-DCE Cl isotope fractionation was investigated for the first time with ZVI. ► A C–Cl bond is broken in the rate-limiting step during chlorinated ethene dechlorination by ZVI. ► The dual C/Cl isotope plot is a promising tool to discriminate abiotic degradation. ► Product-related carbon isotopic fractionation gives evidence of abiotic degradation. ► Hydrogenolysis and β-dichloroelimination pathways occur simultaneously. - Abstract: This study investigated C and, for the first time, Cl isotope fractionation of trichloroethene (TCE) and cis-dichloroethene (cis-DCE) during reductive dechlorination by cast zero-valent iron (ZVI). Hydrogenolysis and β-dichloroelimination pathways occurred as parallel reactions, with ethene and ethane deriving from the β-dichloroelimination pathway. Carbon isotope fractionation of TCE and cis-DCE was consistent across the different batches of Fe studied. Transformation of TCE and cis-DCE showed Cl isotopic enrichment factors (εCl) of −2.6‰ ± 0.1‰ (TCE) and −6.2‰ ± 0.8‰ (cis-DCE), with Apparent Kinetic Isotope Effects for Cl (AKIECl) of 1.008 ± 0.001 (TCE) and 1.013 ± 0.002 (cis-DCE). This indicates that C–Cl bond breakage is rate-determining in TCE and cis-DCE transformation by ZVI. Two approaches were investigated to evaluate whether isotope fractionation analysis can distinguish the effectiveness of transformation by ZVI as opposed to natural biodegradation. (i) Dual isotope plots. This study reports the first dual (C, Cl) element isotope plots for TCE and cis-DCE degradation by ZVI. The pattern for cis-DCE differs markedly from that reported for biodegradation of the same compound by KB-1, a commercially available Dehalococcoides-containing culture. The different trends suggest an expedient approach to distinguish abiotic and biotic transformation, but this needs to be confirmed in future studies. (ii) Product-related isotope fractionation. Carbon isotope ratios of the hydrogenolysis product cis

  4. Use of alpha spectrometry for analysis of U-isotopes in some granite samples

    International Nuclear Information System (INIS)

    El-Galy, M.M.; Desouky, O.A.; Khattab, M.R.; Issa, F.A.

    2011-01-01

    The present study aims to use α-spectrometry at NMA. A radiochemical technique for the analysis of U-isotopes was applied to granite samples from the Gabal Gattar and El Missikat localities and to IAEA reference soil samples. Several steps of sample preparation, radiochemical separation, and source preparation were performed before analysis. After ashing, each sample was leached with HNO 3 , HF and H 2 O 2 acids. The ashed sample was spiked with a uranium tracer ( 232 U) for chemical yield and activity calculation. Uranium was then extracted from the matrix elements with trioctylphosphine oxide (TOPO) and stripped with 1 M NH 4 F/0.1 M HCl solution. The uranium fraction was purified by co-precipitation with LaF 3 to ensure complete removal of thorium and of traces of resolution-degrading elements. This was followed by a final clean-up step using an anion exchanger. The pure uranium fraction was electrodeposited on a stainless steel disc from an HCl/oxalate solution. The results obtained for the soil reference samples indicate general agreement among the α-spectrometry techniques of NMA, EAEA and IAEA for the analysis of U-isotopes. Granite samples with high radioactivity levels require dilution to bring the U-isotopes within the detection limits of α-spectrometry. (author)

  5. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    Science.gov (United States)

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, a Peripheral Pulse Analyzer (PPA) has been used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2, and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect responses, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than the subjective analysis; the higher response rates have been manually verified to be true positives. This indicates the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected, in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common. This not only validates the software utility as giving a consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).

  6. A method of uranium isotopes concentration analysis

    International Nuclear Information System (INIS)

    Lin Yuangen; Jiang Meng; Wu Changli; Duan Zhanyuan; Guo Chunying

    2010-01-01

    A basic method for uranium isotope concentration analysis is described in this paper. An iterative method is used to calculate the relative efficiency curve by analyzing the characteristic γ energy spectrum of 235 U, 232 U and the daughter nuclides of 238 U; the relative activities can then be calculated and, finally, the uranium isotope concentrations worked out. The result is validated by experiment. (authors)
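
    The record only outlines the iteration; the Python sketch below shows one plausible form of it, alternating between a log-linear relative-efficiency fit and activity updates. All peak energies, areas and emission probabilities are illustrative placeholders, not evaluated nuclear data.

        import numpy as np

        # Hypothetical gamma lines: (energy keV, net peak area, emission
        # probability, isotope index 0 = 235U, 1 = 238U via 234mPa daughter).
        lines = [
            (143.8, 12400.0, 0.110, 0),
            (163.3, 5600.0, 0.0508, 0),
            (185.7, 57200.0, 0.572, 0),
            (766.4, 310.0, 3.2e-3, 1),
            (1001.0, 980.0, 8.4e-3, 1),
        ]

        acts = np.array([1.0, 1.0])          # relative activities, arbitrary start
        for _ in range(50):
            # Fit ln(eff) = a + b*ln(E) over all lines, given current activities.
            E = np.array([l[0] for l in lines])
            y = np.log([area / (p * acts[i]) for _, area, p, i in lines])
            b, a = np.polyfit(np.log(E), y, 1)
            eff = np.exp(a + b * np.log(E))
            # Update each isotope's relative activity from its own lines.
            for iso in (0, 1):
                est = [l[1] / (l[2] * e) for l, e in zip(lines, eff) if l[3] == iso]
                acts[iso] = np.mean(est)
            acts = acts / acts[0]            # fix the arbitrary overall scale

        # Convert the activity ratio to an atom ratio via half-lives (years).
        half_life = np.array([7.04e8, 4.468e9])
        atoms = acts * half_life
        print("approximate 235U atom fraction:", atoms[0] / atoms.sum())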

  7. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Abstract Background Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop the Microsoft .NET framework, and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots) and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF). The software architecture employed allows for rapid changes to be made to either the Graphical User Interface (GUI) or to the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses versus MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software). Conclusion We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.
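
    As a reminder of the arithmetic such programs automate, here is a minimal Python sketch of inverse-variance pooling with a DerSimonian-Laird random-effects step. The input effect sizes and variances are invented, and Meta-Analyst's own routines are of course much richer.

        import numpy as np

        def pool(effects, variances):
            """Inverse-variance pooling plus a DerSimonian-Laird step."""
            y, v = np.asarray(effects), np.asarray(variances)
            w = 1.0 / v
            fixed = np.sum(w * y) / np.sum(w)
            # DerSimonian-Laird estimate of between-study variance tau^2.
            q = np.sum(w * (y - fixed) ** 2)
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)
            w_re = 1.0 / (v + tau2)
            random = np.sum(w_re * y) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return fixed, random, (random - 1.96 * se, random + 1.96 * se)

        # Illustrative log odds ratios and variances from five studies.
        print(pool([-0.4, -0.2, -0.6, 0.1, -0.3],
                   [0.04, 0.06, 0.09, 0.05, 0.07]))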

  8. Calcium isotope effects in ion exchange electromigration and calcium isotope analysis by thermo-ionization mass spectrometry

    International Nuclear Information System (INIS)

    Fujii, Y.; Hoshi, J.; Iwamoto, H.; Okamoto, M.; Kakihana, H.

    1985-01-01

    Calcium ions were made to electromigrate along a cation exchange membrane. The abundance ratios of the calcium isotopes (Ca-40, 42, 43, 44, 48) in the migrated bands were measured by thermo-ionization mass spectrometry. The lighter isotopes were enriched in the front part of the migrated band. The increments in the isotope abundance ratios were found to be proportional to the mass difference of the isotopes. The observed epsilon-values per unit mass difference (epsilon/ΔM) were 1.26 × 10⁻⁴ (at 20 °C), 1.85 × 10⁻⁴ (at 25 °C) and 2.4 × 10⁻⁴ (at 40 °C). The mass spectrometry was improved by using a low temperature for the evaporation of CaI 2 . (orig.)

  9. Free-drop analysis of the transport container for hydrogen isotopes

    International Nuclear Information System (INIS)

    Lee, M. S.; Hong, C. S.; Baek, S. W.; Ahn, D. H.; Kim, K. R.; Lee, S. H.; Lim, S. P.; Jung, H. S.

    2002-01-01

    The vessel used for the transport of radioactive materials containing hydrogen isotopes is evaluated for hypothetical accident conditions according to national regulations. Computational analysis is a cost-effective tool to minimize testing and streamline the regulatory procedures, and it supports experimental programs to qualify the container for the safe transport of radioactive materials. The numerical analysis of a 9 m free-drop onto a flat, unyielding, horizontal surface has been performed using the explicit finite element computer program ABAQUS. In particular, free-drop simulations for a 30° tilted orientation are evaluated in detail.

  10. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis.
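
    DVP itself is not distributed with the record; a minimal sketch of Kanade-Lucas-Tomasi tracking, on which DVP is based, can be written with OpenCV as below. The video file name and seed coordinates are hypothetical.

        import cv2
        import numpy as np

        # Hypothetical file; any recording with visible markers will do.
        cap = cv2.VideoCapture("aquatic_exercise.avi")
        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        # Seed points: the digitised marker centres in frame 1 (invented here).
        points = np.array([[[320.0, 240.0]], [[400.0, 260.0]]], dtype=np.float32)

        lk_params = dict(winSize=(21, 21), maxLevel=3,
                         criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                                   30, 0.01))

        trajectories = [points.reshape(-1, 2).copy()]
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            points, status, err = cv2.calcOpticalFlowPyrLK(
                prev_gray, gray, points, None, **lk_params)
            # In the study an operator corrected points drifting > 4 px from
            # the reference; here we simply keep the tracker's estimate.
            prev_gray = gray
            trajectories.append(points.reshape(-1, 2).copy())

        cap.release()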

  11. BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device

    Science.gov (United States)

    Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.

    2010-12-01

    The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, also called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods of the order of a minute is crucial to reaching the mission goals. A specific device, the Basic Angle Monitoring, will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.

  12. Keratin decomposition by trogid beetles: evidence from a feeding experiment and stable isotope analysis

    Science.gov (United States)

    Sugiura, Shinji; Ikeda, Hiroshi

    2014-03-01

    The decomposition of vertebrate carcasses is an important ecosystem function. Soft tissues of dead vertebrates are rapidly decomposed by diverse animals. However, decomposition of hard tissues such as hairs and feathers is much slower because only a few animals can digest keratin, a protein that is concentrated in hairs and feathers. Although beetles of the family Trogidae are considered keratin feeders, their ecological function has rarely been explored. Here, we investigated the keratin-decomposition function of trogid beetles in heron-breeding colonies where keratin was frequently supplied as feathers. Three trogid species were collected from the colonies and observed feeding on heron feathers under laboratory conditions. We also measured the nitrogen (δ15N) and carbon (δ13C) stable isotope ratios of two trogid species that were maintained on a constant diet (feathers from one heron individual) during 70 days under laboratory conditions. We compared the isotopic signatures of the trogids with the feathers to investigate isotopic shifts from the feathers to the consumers for δ15N and δ13C. We used mixing models (MixSIR and SIAR) to estimate the main diets of individual field-collected trogid beetles. The analysis indicated that heron feathers were more important as food for trogid beetles than were soft tissues under field conditions. Together, the feeding experiment and stable isotope analysis provided strong evidence of keratin decomposition by trogid beetles.

  13. Intramolecular carbon and nitrogen isotope analysis by quantitative dry fragmentation of the phenylurea herbicide isoproturon in a combined injector/capillary reactor prior to GC separation.

    Science.gov (United States)

    Penning, Holger; Elsner, Martin

    2007-11-01

    Potentially, compound-specific isotope analysis may provide unique information on source and fate of pesticides in natural systems. Yet for isotope analysis, LC-based methods that are based on the use of organic solvents often cannot be used and GC-based analysis is frequently not possible due to thermolability of the analyte. A typical example of a compound with such properties is isoproturon (3-(4-isopropylphenyl)-1,1-dimethylurea), belonging to the worldwide extensively used phenylurea herbicides. To make isoproturon accessible to carbon and nitrogen isotope analysis, we developed a GC-based method during which isoproturon was quantitatively fragmented to dimethylamine and 4-isopropylphenylisocyanate. Fragmentation occurred only partially in the injector but was mainly achieved on a heated capillary column. The fragments were then chromatographically separated and individually measured by isotope ratio mass spectrometry. The reliability of the method was tested in hydrolysis experiments with three isotopically different batches of isoproturon. For all three products, the same isotope fractionation factors were observed during conversion and the difference in isotope composition between the batches was preserved. This study demonstrates that fragmentation of phenylurea herbicides does not only make them accessible to isotope analysis but even enables determination of intramolecular isotope fractionation.

  14. Analyzing the State of Static Analysis : A Large-Scale Evaluation in Open Source Software

    NARCIS (Netherlands)

    Beller, M.; Bholanath, R.; McIntosh, S.; Zaidman, A.E.

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use

  15. An application of nitrogen microwave-induced plasma mass spectrometry to isotope dilution analysis of selenium in marine organisms

    Energy Technology Data Exchange (ETDEWEB)

    Shirasaki, Toshihiro [Hitachi Instruments Engineering Co. Ltd., Hitachinaka, Ibaraki (Japan); Yoshinaga, Jun; Morita, Masatoshi; Okumoto, Toyoharu; Oishi, Konosuke

    1996-01-01

    Nitrogen microwave-induced plasma mass spectrometry was studied for its applicability to the isotope dilution analysis of selenium in biological samples. Spectroscopic interference by calcium, which is present in high concentrations in biological samples, was investigated. No detectable background spectrum was observed for the major selenium isotopes 78 Se and 80 Se. No detectable interferences by sodium, potassium, calcium and phosphorus on the isotope ratio 80 Se/ 78 Se were observed up to a concentration of 200 mg/ml. The method was applied to the analysis of selenium in biological reference materials of marine organisms. The results showed good agreement between the certified and found values. (author).

  16. Determination of geographic provenance of cotton fibres using multi-isotope profiles and multivariate statistical analysis

    Science.gov (United States)

    Daeid, N. Nic; Meier-Augenstein, W.; Kemp, H. F.

    2012-04-01

    The analysis of cotton fibres can be particularly challenging within a forensic science context where discrimination of one fibre from another is of importance. Normally cotton fibre analysis examines the morphological structure of the recovered material and compares this with that of a known fibre from a particular source of interest. However, the conventional microscopic and chemical analysis of fibres and any associated dyes is generally unsuccessful because of the similar morphology of the fibres. Analysis of the dyes which may have been applied to the cotton fibre can also be undertaken though this can be difficult and unproductive in terms of discriminating one fibre from another. In the study presented here we have explored the potential for Isotope Ratio Mass Spectrometry (IRMS) to be utilised as an additional tool for cotton fibre analysis in an attempt to reveal further discriminatory information. This work has concentrated on un-dyed cotton fibres of known origin in order to expose the potential of the analytical technique. We report the results of a pilot study aimed at testing the hypothesis that multi-element stable isotope analysis of cotton fibres in conjunction with multivariate statistical analysis of the resulting isotopic abundance data using well established chemometric techniques permits sample provenancing based on the determination of where the cotton was grown and as such will facilitate sample discrimination. To date there is no recorded literature of this type of application of IRMS to cotton samples, which may be of forensic science relevance.
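
    One common chemometric choice for such multi-element isotope profiles is principal component analysis: standardise each isotope axis, project the samples onto the leading components, and look for provenance groups separating in score space. The Python sketch below uses invented delta values purely for illustration.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Hypothetical profiles (rows: fibre samples; columns: delta values,
        # e.g. d2H, d18O, d13C, d15N). Numbers are illustrative only.
        X = np.array([
            [-60.1, 24.3, -26.5, 3.1],   # region A
            [-58.7, 24.9, -26.2, 3.4],   # region A
            [-95.2, 18.1, -27.9, 5.8],   # region B
            [-93.8, 17.6, -28.1, 6.0],   # region B
        ])

        # Standardise, then project; well-separated clusters in PC space
        # suggest distinct growing regions.
        scores = PCA(n_components=2).fit_transform(
            StandardScaler().fit_transform(X))
        print(scores)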

  17. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact with existing or reduced budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers and sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with the existing system components of a facility looking to automate its NAA operations. This limits the availability of automation to the few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an architectural framework (OpenNAA) and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current systems while providing a solid path for development into the future. (author)

  18. Separation of polybrominated diphenyl ethers in fish for compound-specific stable carbon isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Yan-Hong [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Graduate University of Chinese Academy of Sciences, Beijing, 100049 (China); Luo, Xiao-Jun, E-mail: luoxiaoj@gig.ac.cn [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Chen, Hua-Shan; Wu, Jiang-Ping; Chen, She-Jun; Mai, Bi-Xian [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China)

    2012-05-15

    A separation and isotopic analysis method was developed to accurately measure the stable carbon isotope ratios of polybrominated diphenyl ethers (PBDEs) with three to six substituted bromine atoms in fish samples. Sample extracts were treated with concentrated sulfuric acid to remove lipids, purified using complex silica gel column chromatography, and finally processed using alumina/silica (Al/Si) gel column chromatography. The purities of extracts were verified by gas chromatography and mass spectrometry (GC-MS) in the full-scan mode. The average recoveries of all compounds across the purification method were between 60% and 110%, with the exception of BDE-154. The stable carbon isotopic compositions of PBDEs can be measured with a standard deviation of less than 0.5‰. No significant isotopic fractionation was found during the purification of the main PBDE congeners. A significant change in the stable carbon isotope ratio of BDE-47 was observed in fish carcasses compared to the original isotopic signatures, implying that PBDE stable carbon isotopic compositions can be used to trace the biotransformation of PBDEs in biota. - Highlights: ► A method for the purification of PBDEs for CSIA was developed. ► The δ 13 C of PBDE congeners can be measured with a standard deviation of less than 0.5‰. ► Common carp were exposed to a PBDE mixture to investigate debromination. ► Ratios of the δ 13 C values can be used to trace the debromination of PBDE in fish.

  19. Software applications for flux balance analysis.

    Science.gov (United States)

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user-interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform basic features of model creation and FBA simulation. COBRA toolbox, OptFlux and FASIMU are versatile to support advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide the user friendly interfaces in model handling. In terms of software architecture, FBA-SimVis and OptFlux have the flexible environments as they enable the plug-in/add-on feature to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services such as central model repository and assistance to collaborative efforts was observed among the web-based applications with the help of advanced web-technologies. Furthermore, most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications was made for the benefit of potential tool developers.
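
    At its core, FBA is a linear program: maximise a biomass objective c·v subject to the steady-state constraint S v = 0 and flux bounds. The self-contained Python sketch below solves a three-reaction toy network with scipy; it is not the API of any of the tools reviewed, and the network is invented.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: A_ext -> A -> B -> biomass. Stoichiometric matrix S
        # (rows: internal metabolites A, B; columns: reactions v1..v3).
        S = np.array([
            [1.0, -1.0, 0.0],   # A: produced by uptake v1, consumed by v2
            [0.0, 1.0, -1.0],   # B: produced by v2, consumed by biomass v3
        ])
        bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10

        # FBA: maximise biomass flux v3 (linprog minimises, hence -1).
        res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal biomass flux:", -res.fun, "flux vector:", res.x)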

  20. The 8th edition of the Table of Isotopes

    International Nuclear Information System (INIS)

    Firestone, R.B.; Shirley, V.S.; Baglin, C.M.; Chu, S.Y.F.; Zipkin, J.

    1997-01-01

    A new edition of the Table of Isotopes has been published by John Wiley and Sons, Inc. The two-volume, 3168 page, cloth-bound edition contains nuclear structure and decay data for over 3100 isotopes and isomers. Approximately 24,000 references are cited and the appendices have been updated and expanded. The book is packaged with an interactive CD-ROM that contains the Table of Isotopes in Adobe Acrobat Portable Document Format for convenient viewing on PC and Macintosh personal computers and Unix workstations. The CD-ROM also contains the Table of Superdeformed Nuclear Bands and Fission Isomers; Tables of Atoms, Atomic Nuclei, and Subatomic Particles; the Evaluated Nuclear Structure Data File (ENSDF) and the ENSDF Manual; the Nuclear Science Reference file (NSR); and Adobe Acrobat Reader software. (author)

  1. Individual economical value of plutonium isotopes and analysis of the reprocessing of irradiated fuel

    International Nuclear Information System (INIS)

    Gomes, I.C.; Rubini, L.A.; Barroso, D.E.G.

    1983-01-01

    An economic analysis of plutonium recycle in a PWR reactor, without any modification, is carried out, assuming an open market for plutonium. The individual values of the plutonium isotopes are determined by solving a system of four equations in which the unknowns are the Pu-239, Pu-240, Pu-241 and Pu-242 values. The equations are obtained by equating the cost of the plutonium fuel cycle for four different isotope mixtures to the cost of the uranium fuel cycle. (E.G.) [pt
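
    The four-equation system described can be made concrete with a short Python sketch. The isotope weight fractions and fuel-cycle credits below are invented placeholders, chosen only to illustrate the linear algebra.

        import numpy as np

        # Rows: weight fractions of (Pu-239, Pu-240, Pu-241, Pu-242) in four
        # hypothetical recycle mixtures; entirely illustrative compositions.
        F = np.array([
            [0.70, 0.20, 0.08, 0.02],
            [0.55, 0.28, 0.12, 0.05],
            [0.62, 0.18, 0.16, 0.04],
            [0.48, 0.30, 0.14, 0.08],
        ])

        # Right-hand side: the credit (currency units per kg of Pu mixture)
        # that makes each Pu fuel cycle cost equal the uranium cycle cost.
        credit = np.array([9800.0, 9200.0, 10400.0, 8700.0])

        # Solve F @ values = credit for the per-isotope values.
        values = np.linalg.solve(F, credit)
        for iso, v in zip(("Pu-239", "Pu-240", "Pu-241", "Pu-242"), values):
            print(f"{iso}: {v:10.1f} per kg")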

  2. Isotope Dilution - Thermal Ionisation Mass Spectrometric Analysis for Tin in a Fly Ash Material

    International Nuclear Information System (INIS)

    Hernandez, C.; Fernandez, M.; Quejido, A. L.

    2006-01-01

    Isotope dilution-thermal ionisation mass spectrometry (ID-TIMS) analysis has been applied to the determination of tin in a fly ash sample supplied by the EC Joint Research Centre (Ispra, Italy). The proposed procedure includes the silica gel/phosphoric acid technique for tin thermal ionisation activation and a strict heating protocol for isotope ratio measurements. The instrumental mass discrimination factor was determined beforehand by measuring a natural tin standard solution. Spike solutions were prepared from 112Sn-enriched metal and quantified by reverse isotope dilution analysis. Two sample aliquots were spiked, and tin was extracted with 4.5 M HCl during 25 min of ultrasound exposure. Due to the complex matrix of this fly ash material, a two-step purification stage using ion-exchange chromatography was required prior to TIMS analysis. The results obtained for the two sample-spike blends (10.10 ± 0.55 and 10.50 ± 0.64 μmol g-1) are comparable, both in value and in uncertainty. Good reproducibility is also observed between measurements. As a primary method, and given the lack of a fly ash reference material certified for tin content, the proposed ID-TIMS procedure can be used to validate more routine methodologies applied to tin determination in this kind of material. (Author) 75 refs.
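
    The record does not spell out the isotope dilution equation it relies on; the following is the standard form, stated as a sketch with assumed notation: isotope 1 is the spike isotope (112Sn), isotope 2 a reference isotope, a denotes isotope abundances in the sample (x) and spike (y), and R_b is the measured amount ratio n(1)/n(2) in the blend.

        n_x \;=\; n_y \,\frac{a_{1y} - R_b\,a_{2y}}{R_b\,a_{2x} - a_{1x}}

    With the spike amount n_y fixed by reverse isotope dilution against a natural standard, the measured blend ratio R_b yields the amount of tin n_x in the aliquot directly, which is why ID-TIMS qualifies as a primary, ratio-based method.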

  3. Water-hydrogen isotope exchange process analysis

    International Nuclear Information System (INIS)

    Fedorchenko, O.; Alekseev, I.; Uborsky, V.

    2008-01-01

    A numerical method is needed to solve the equation system describing the general case of heterogeneous isotope exchange between gaseous hydrogen and liquid water in a column. A computer model of the column that merely outputs the isotope compositions in the flows leaving the column is, like the experimental column itself, a 'black box' to a certain extent: the solution is not transparent and occasionally not fully comprehensible. The approximate analytical solution was derived from the ZXY-diagram (McCabe-Thiele diagram), which illustrates the solution of the renewed computer model called 'EVIO-4.2'. Several 'unusual' results and dependences have been analyzed and explained. (authors)

  4. UPVapor: Cofrentes nuclear power plant production results analysis software

    International Nuclear Information System (INIS)

    Curiel, M.; Palomo, M. J.; Baraza, A.; Vaquer, J.

    2010-10-01

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables in a user-defined time period, according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of non-valid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load a preselection directly and to develop quick monitoring of a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable against another within a defined time. As an option, users can filter previous data depending on a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network framework. (Author)

  5. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The former analyse the evolution of up to twenty plant variables in a user-defined time period, according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of non-valid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load a preselection directly and to develop quick monitoring of a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable against another within a defined time. As an option, users can filter previous data depending on a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network framework. (Author)

  6. Isotopic analysis of a single Pb particle by using laser ablation TOF-MS

    Energy Technology Data Exchange (ETDEWEB)

    Choi, I. H.; Yoo, H. S. [Chungbuk National Univ., Cheongju (Korea, Republic of); Song, K. S. [KAERI, Daejeon (Korea, Republic of)

    2008-11-15

    Laser ablation can be applied to the direct isotopic analysis of solid samples due to the following advantages. Because laser ablation is a very powerful ionization source, no additional ionization source is required, and one-step vaporization and ionization of samples is possible. Due to the small size of a laser beam, local trace analysis is possible. Also, contamination or loss of samples is reduced because no sample preparation process is needed. In this study, Pb particles with a size of ∼150 μm were analyzed by LA-TOF-MS, and the second harmonic of the Nd:YAG laser, 532 nm, was used for the laser ablation. First, the Pb ion signal was measured as a function of the matrix. For loading a Pb particle, a silicon wafer, cotton textile, and Ta metal plate were used as base plates. As a result, the silicon wafer was identified as the best matrix for the analysis of Pb, giving good resolution, and its measured isotopic ratios agree with the natural abundance within 5%. The figure shows a mass spectrum of Pb on a silicon wafer. When resonance laser ablation was applied, the detection sensitivity increased by more than a factor of 10. In the experiment with the cotton textile, the mass resolution for Pb was more than 500, which was enough to resolve the isotopes, and the method is applicable to real swipe samples in various fields such as environmental analysis, industry, and nuclear forensics.

  7. A simple cleanup method for the isolation of nitrate from natural water samples for O isotopes analysis

    International Nuclear Information System (INIS)

    Haberhauer, G.; Blochberger, K.

    1999-09-01

    The analysis of the O-isotopic composition of nitrate has many potential applications in studies of environmental processes. O-isotope analysis of nitrate requires samples free of other oxygen-containing compounds. More than 100% of non-NO 3 - oxygen relative to NO 3 - oxygen can still be found in forest soil water samples after cleanup if improper cleanup strategies, e.g., adsorption onto activated carbon, are used. Such non-NO 3 - oxygen compounds will bias O-isotopic data. Therefore, an efficient cleanup method was developed to isolate nitrate from natural water samples. In a multistep cleanup procedure using adsorption onto water-insoluble poly(vinylpyrrolidone), removal of almost all other oxygen-containing compounds, such as fulvic acids, and isolation of nitrate were achieved. The method supplied samples free of non-NO 3 - oxygen, which can be directly combusted to CO 2 for subsequent O-isotope analysis. (author)

  8. Uranium isotope separation using styrene cation exchangers

    International Nuclear Information System (INIS)

    Kahovec, J.

    1980-01-01

    The separation of 235 U and 238 U isotopes is carried out either by simple isotope exchange in a uranium-cation exchanger (sulphonated styrene-divinylbenzene resin) system, or by a combination of isotope exchange in a uranium-cation exchanger (Dowex 50, Amberlite IR-120) system and a chemical reaction. A review is presented of the elution agents used, the degree of cation exchanger cross-linking, column lengths, and 235 U enrichment. The results are described of studies of the isotope effect in a U(IV)-U(VI)-cation exchanger system conducted by Japanese and Romanian authors (isotope exchange kinetics, frontal analysis, reverse (indirect) frontal analysis). (H.S.)

  9. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.
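
    SAPHIRE's own algorithms are not reproduced in the record; as a toy illustration of the kind of computation a PRA code performs, the Python sketch below evaluates the top-event probability of a small fault tree from its minimal cut sets by exhaustive enumeration. Event names, probabilities and cut sets are invented, and production tools use cut-set approximations (e.g. the min-cut upper bound) rather than enumeration.

        from itertools import product

        # Toy PRA fragment: top event occurs if (pump fails AND valve fails)
        # OR power fails. Basic-event probabilities are illustrative only.
        basic = {"pump": 1e-3, "valve": 2e-3, "power": 5e-5}
        cut_sets = [{"pump", "valve"}, {"power"}]     # minimal cut sets

        def top_event_probability(basic, cut_sets):
            """Exact probability by summing over all basic-event states
            (fine for tiny trees; real tools work from cut sets instead)."""
            events = sorted(basic)
            p_top = 0.0
            for states in product([0, 1], repeat=len(events)):
                failed = {e for e, s in zip(events, states) if s}
                if any(cs <= failed for cs in cut_sets):
                    p = 1.0
                    for e, s in zip(events, states):
                        p *= basic[e] if s else 1.0 - basic[e]
                    p_top += p
            return p_top

        # ~ p_power + p_pump * p_valve for these rare events
        print(top_event_probability(basic, cut_sets))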

  10. Software for analysis of waveforms acquired by digital Doppler broadening spectrometer

    International Nuclear Information System (INIS)

    Vlcek, M; Čížek, J; Procházka, I

    2013-01-01

    A high-resolution digital spectrometer for coincidence measurement of Doppler broadening of positron annihilation radiation was recently developed and tested. In this spectrometer, pulses from high-purity Ge (HPGe) detectors are sampled in real time by fast digitizers and subsequently analyzed off-line by software. We present a description of the software routines used for pulse shape analysis in two spectrometer configurations: (i) a semi-digital setup in which detector pulses shaped in spectroscopic amplifiers (SAs) are digitized; (ii) a pure digital setup in which pulses from detector pre-amplifiers are digitized directly. The software developed in this work will be freely available in the form of source code and pre-compiled binaries.
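
    The record gives no implementation detail; the Python sketch below, with an invented synthetic waveform, shows only the most basic off-line step of such software (baseline subtraction plus peak-amplitude extraction). Real routines add shaping, e.g. trapezoidal filtering, and pile-up or shape rejection.

        import numpy as np

        def pulse_height(samples, n_baseline=100):
            """Estimate pulse height: subtract the pre-trigger baseline,
            then take the peak amplitude of the corrected waveform."""
            baseline = np.mean(samples[:n_baseline])
            return (samples - baseline).max()

        # Synthetic pre-amplifier-like pulse: flat baseline, fast rise,
        # slow exponential decay. All parameters are arbitrary.
        t = np.arange(2000, dtype=float)
        rise = np.maximum(t - 500.0, 0.0)
        pulse = 150.0 + 80.0 * (1.0 - np.exp(-rise / 10.0)) * np.exp(-rise / 5000.0)
        print(pulse_height(pulse))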

  11. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
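
    ImatraNMR's sources are not reproduced in the record; as an illustration of what batch integration with CSV output involves, here is a minimal Python sketch. The file names (sample_001.txt, sample_002.txt), the two-column ASCII format and the integration regions are all hypothetical.

        import csv
        import numpy as np

        # Integrate the same ppm regions across many 1D spectra, write a CSV.
        regions = {"anomeric": (5.4, 5.1), "methyl": (1.3, 0.8)}  # ppm (high, low)

        def integrate(ppm, intensity, high, low):
            mask = (ppm <= high) & (ppm >= low)
            # abs() because ppm axes are usually stored in descending order
            return abs(np.trapz(intensity[mask], ppm[mask]))

        with open("integrals.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["spectrum"] + list(regions))
            for name in ["sample_001.txt", "sample_002.txt"]:
                ppm, intensity = np.loadtxt(name, unpack=True)
                writer.writerow([name] + [integrate(ppm, intensity, *r)
                                          for r in regions.values()])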

  12. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  13. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  14. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  15. Conditional CO2 flux analysis of a managed grassland with the aid of stable isotopes

    Science.gov (United States)

    Zeeman, M. J.; Tuzson, B.; Emmenegger, L.; Knohl, A.; Buchmann, N.; Eugster, W.

    2009-04-01

    Short-statured managed ecosystems, such as agricultural grasslands, exhibit rapid temporal changes in carbon dioxide assimilation and respiration fluxes, for which measurements of the net CO2 flux, e.g. by the eddy covariance (EC) method, give only limited insight. We have therefore adapted a recently proposed concept for conditional EC flux analysis from forests to grasslands, in order to identify and quantify daytime sub-canopy respiration fluxes. To validate the concept, high-frequency (≈5 Hz) stable carbon isotope analysis of CO2 was used. We made eddy covariance measurements of CO2 and its isotopologues during four days in August 2007, using a novel quantum cascade laser absorption spectrometer capable of high-time-resolution stable isotope analysis. The effects of a grass cut during the measurement period could be detected and resulted in a sub-canopy source conditional flux classification, for which the isotope composition of the CO2 could be confirmed to be of a respiration source. However, the conditional flux method did not work for an undisturbed grassland canopy. We attribute this to the flux measurement height, which was chosen well above the roughness sublayer, where the natural isotopic tracer (δ13C) of respiration was too well mixed with background air.
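
    For readers unfamiliar with the EC method, the Python sketch below computes a flux as the covariance of vertical wind and CO2 fluctuations, and adds a crude sign-based partition of the instantaneous products. This is only a cartoon of the idea behind conditional flux analysis, not the published algorithm, and all numbers are synthetic.

        import numpy as np

        def ec_flux(w, c):
            """EC flux as the covariance of vertical wind w (m/s) and CO2
            density c over one averaging period."""
            return np.mean((w - w.mean()) * (c - c.mean()))

        def conditional_fluxes(w, c):
            """Split the flux by sign of the instantaneous product,
            separating upward (respiration-like) from downward
            (assimilation-like) contributions."""
            wc = (w - w.mean()) * (c - c.mean())
            return wc[wc > 0].sum() / wc.size, wc[wc < 0].sum() / wc.size

        rng = np.random.default_rng(0)
        w = rng.normal(0.0, 0.3, 18000)                    # ~30 min at 10 Hz
        c = 700.0 - 0.5 * w + rng.normal(0.0, 2.0, 18000)  # net downward flux
        print(ec_flux(w, c), conditional_fluxes(w, c))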

  16. Development of Stable Isotope Analysis Technology for Epidemiological Study of Migratory Birds in Connection with Avian Influenza

    International Nuclear Information System (INIS)

    Kim, Jongyun; Park, Jongho; Han, Sunho; Song, Kyuseok; Ko, Yongkwon; Bae, Inae; Cho, Mihyun; Jung, Gahee; Yeom, Ina

    2012-03-01

    In order to clarify correlations between the spread of avian influenza and the migratory routes of birds, various conventional methods, including ringing, gene analysis, geolocators and satellite tracking, are used together. We first report on the estimation of the origin of migratory birds in Korea based on the statistical method of stable isotope ratio analysis of feathers. The analysis suggests that the migratory birds in the Junam reservoir came from two different regions. However, it is not easy to determine the breeding ground of northern pintails from the current data, because the degree of precision or accuracy can be influenced by many factors. For this reason, this statistical analysis can have scientific significance only if the reliability of the whole measurement system is improved. Furthermore, the existing databases are not sufficient to prepare a base map of regional isotope ratios, because a database of the stable isotope ratios of oxygen and hydrogen in rainwater in Korea still has to be constructed. Though the research has focused on hydrogen and oxygen until now, the investigation of other elements, such as carbon, sulfur and nitrogen, that can describe metabolic processes or regional characteristics is also a worthwhile subject. It is believed that this research will improve the resolution of detection of the migratory pathways and habitats of birds

  17. Development of Stable Isotope Analysis Technology for Epidemiological Study of Migratory Birds in Connection with Avian Influenza

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jongyun; Park, Jongho; Han, Sunho; Song, Kyuseok; Ko, Yongkwon; Bae, Inae; Cho, Mihyun; Jung, Gahee; Yeom, Ina

    2012-03-15

    In order to clarify correlations between the spread of avian influenza and the migratory routes of birds, various conventional methods, including ringing, gene analysis, geolocators and satellite tracking, are used together. We first report on the estimation of the origin of migratory birds in Korea based on the statistical method of stable isotope ratio analysis of feathers. The analysis suggests that the migratory birds in the Junam reservoir came from two different regions. However, it is not easy to determine the breeding ground of northern pintails from the current data, because the degree of precision or accuracy can be influenced by many factors. For this reason, this statistical analysis can have scientific significance only if the reliability of the whole measurement system is improved. Furthermore, the existing databases are not sufficient to prepare a base map of regional isotope ratios, because a database of the stable isotope ratios of oxygen and hydrogen in rainwater in Korea still has to be constructed. Though the research has focused on hydrogen and oxygen until now, the investigation of other elements, such as carbon, sulfur and nitrogen, that can describe metabolic processes or regional characteristics is also a worthwhile subject. It is believed that this research will improve the resolution of detection of the migratory pathways and habitats of birds.

  18. WinDAM C earthen embankment internal erosion analysis software

    Science.gov (United States)

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing earthen embankment dams and proposed earthen embankment dams, Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting f...

  19. Redox substoichiometry in isotope dilution analysis Pt. 2

    International Nuclear Information System (INIS)

    Kambara, T.; Suzuki, J.; Yoshioka, H.; Nakajima, N.

    1978-01-01

    Isotope dilution analysis using the redox substoichiometric principle has been applied to the determination of antimony content in metallic zinc. As the substoichiometric reaction, the oxidation of trivalent to pentavalent antimony with potassium permanganate was used, followed by separation of these species by the BHPA extraction of trivalent antimony. Determination of antimony contents less than 0.5 μg was found to be possible with good accuracy, without separation of zinc ions. The antimony content in a metallic zinc sample was determined to be 19.7 ± 0.8 ppm, in good agreement with the results obtained by the other analytical methods. (author)

  20. Software analysis by simulation for nuclear plant availability and safety goals

    International Nuclear Information System (INIS)

    Lapassat, A.M.; Segalard, J.; Salichon, M.; Le Meur, M.; Boulc'h, J.

    1988-01-01

    The use of microprocessors for the monitoring, protection and safety of nuclear reactors became a reality in the eighties. The authorities responsible for reactor safety systems have recognized the necessity of ensuring the correct functioning of reactor control systems. The problems raised by the analysis of software first led us to develop an entirely software-based tool for the verification and validation of programs and specifications. The CEA (French Atomic Energy Commission), responsible for reliable distributed techniques in nuclear plants, discusses in this paper the software test and simulation tools used to analyse real-time software. The O.S.T. tool is part of a larger programme of support for the design and evaluation of system fault tolerance, of which the European ESPRIT project SMART no. 1609 (System Measurement and Architecture Technique) will be the kernel [fr

  1. Software Safety Analysis of Digital Protection System Requirements Using a Qualitative Formal Method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Kwon, Kee-Choon; Cha, Sung-Deok

    2004-01-01

    The safety analysis of requirements is a key problem area in the development of software for the digital protection systems of a nuclear power plant. When specifying requirements for software of the digital protection systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault-tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. A framework for the requirements engineering process is proposed that consists of a qualitative method for requirements specification, called the qualitative formal method (QFM), and a safety analysis method for the requirements based on causality information, called the causal requirements safety analysis (CRSA). CRSA is a technique that qualitatively evaluates causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationship among them. The QFM and CRSA processes are described using shutdown system 2 of the Wolsong nuclear power plants as the digital protection system example

  2. A Computational Drug Metabolite Detection Using the Stable Isotopic Mass-Shift Filtering with High Resolution Mass Spectrometry in Pioglitazone and Flurbiprofen

    Directory of Open Access Journals (Sweden)

    Yohei Miyamoto

    2013-09-01

    The identification of metabolites in drug discovery is important. At present, radioisotopes and mass spectrometry are both widely used. However, rapid and comprehensive identification is still laborious and difficult. In this study, we developed new analytical software and employed a stable isotope as a tool to identify drug metabolites using mass spectrometry. A deuterium-labeled compound and a non-labeled compound were both metabolized in human liver microsomes and analyzed by liquid chromatography/time-of-flight mass spectrometry (LC-TOF-MS). We computationally aligned the two different MS data sets and filtered ions having a specific mass shift, equal to the mass of the labeled isotopes, between those data sets using our own software. For pioglitazone and flurbiprofen, eight and four metabolites, respectively, were identified with calculations of mass and formulas and chemical structural fragmentation analysis. With high-resolution MS, the approach became more accurate. The approach detected two unexpected metabolites of pioglitazone, i.e., the hydroxypropanamide form and the aldehyde hydrolysis form, which other approaches, such as metabolite-biotransformation list matching and mass defect filtering, could not detect. We demonstrated that the approach using computational alignment and stable isotopic mass-shift filtering is able to identify drug metabolites and is useful in drug discovery.
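
    The paper's software is not shown; the Python sketch below illustrates the core filtering idea on invented centroided m/z lists, assuming singly charged ions and a d4-labeled analogue (shift of 4 × 1.00628 Da). The real implementation also aligns retention times between the two runs.

        import numpy as np

        D_H_SHIFT = 1.00628   # mass difference between deuterium and protium (Da)

        def mass_shift_filter(mz_unlabeled, mz_labeled, n_labels, tol=0.005):
            """Keep ions from the unlabeled run that have a partner in the
            labeled run at m/z + n*1.00628 (singly charged assumed), i.e.
            candidate drug-derived metabolites retaining n deuterium atoms."""
            hits = []
            labeled = np.asarray(mz_labeled)
            for mz in mz_unlabeled:
                target = mz + n_labels * D_H_SHIFT
                if np.any(np.abs(labeled - target) <= tol):
                    hits.append(mz)
            return hits

        # Illustrative centroided m/z lists from the two incubations.
        unlabeled = [357.124, 373.119, 389.114, 412.200]
        labeled = [361.149, 377.144, 393.139]          # d4-shifted partners
        print(mass_shift_filter(unlabeled, labeled, n_labels=4))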

  3. Preparation of Authigenic Pyrite from Methane-bearing Sediments for In Situ Sulfur Isotope Analysis Using SIMS.

    Science.gov (United States)

    Lin, Zhiyong; Sun, Xiaoming; Peckmann, Jörn; Lu, Yang; Strauss, Harald; Xu, Li; Lu, Hongfeng; Teichert, Barbara M A

    2017-08-31

    Different sulfur isotope compositions of authigenic pyrite typically result from the sulfate-driven anaerobic oxidation of methane (SO4-AOM) and organiclastic sulfate reduction (OSR) in marine sediments. However, unravelling the complex pyritization sequence is a challenge because of the coexistence of different sequentially formed pyrite phases. This manuscript describes a sample preparation procedure that enables the use of secondary ion mass spectroscopy (SIMS) to obtain in situ δ 34 S values of various pyrite generations. This allows researchers to constrain how SO4-AOM affects pyritization in methane-bearing sediments. SIMS analysis revealed an extreme range in δ 34 S values, spanning from -41.6 to +114.8‰, which is much wider than the range of δ 34 S values obtained by the traditional bulk sulfur isotope analysis of the same samples. Pyrite in the shallow sediment mainly consists of 34 S-depleted framboids, suggesting early diagenetic formation by OSR. Deeper in the sediment, more pyrite occurs as overgrowths and euhedral crystals, which display much higher SIMS δ 34 S values than the framboids. Such 34 S-enriched pyrite is related to enhanced SO4-AOM at the sulfate-methane transition zone, postdating OSR. High-resolution in situ SIMS sulfur isotope analyses allow for the reconstruction of the pyritization processes, which cannot be resolved by bulk sulfur isotope analysis.

  4. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, comprising programming and non-programming software, developed mainly on Bayesian or frequentist theory. Most types of software are easy to operate and master, calculate exactly, or produce excellent graphs. However, no single type of software performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial resources, and then consider combining BUGS with R (or Stata) to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  5. Investigating the provenance of un-dyed spun cotton fibre using multi-isotope profiles and chemometric analysis.

    Science.gov (United States)

    Daéid, Niamh Nic; Meier-Augenstein, Wolfram; Kemp, Helen F

    2011-07-15

    The analysis of un-dyed spun cotton fibres can be challenging within a forensic science context where discrimination of one fibre from another is of importance. Conventional microscopic and chemical analysis of these fibres is generally unsuccessful because of their similar morphology. In this work we have explored the potential of isotope ratio mass spectrometry (IRMS) as a tool for spun cotton fibre analysis in an attempt to reveal any discriminatory information available. Seven different batches of un-dyed spun cotton fibre from four different countries were analysed. A combination of the hydrogen and oxygen isotopic data facilitated the correct association of the samples, demonstrating, for the first time, the applicability of IRMS to fibre analysis in this way. Copyright © 2011 John Wiley & Sons, Ltd.

  6. Development of data acquisition and analysis software for multichannel detectors

    International Nuclear Information System (INIS)

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With the help of outstanding graphics capabilities, easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and the efficiency of data analysis. 2 refs., 6 figs

  7. Application of TIMS in isotope correlations for determining the isotope ratios of plutonium

    International Nuclear Information System (INIS)

    Alamelu, D.; Aggarwal, S.K.

    2003-01-01

    Thermal ionisation mass spectrometry (TIMS) is a well-recognized technique for determining the isotopic composition of Pu in irradiated nuclear fuel samples. Other mass spectrometric methods, such as ICP-MS and SIMS, can also be employed for the isotopic analysis of Pu. In the event of the non-availability of a mass spectrometer, techniques such as gamma spectrometry and alpha spectrometry can be used, although their applicability is limited since data cannot be obtained for all Pu isotopes.

  8. Stable oxygen and hydrogen isotopes of brines - comparing isotope ratio mass spectrometry and isotope ratio infrared spectroscopy

    Science.gov (United States)

    Ahrens, Christian; Koeniger, Paul; van Geldern, Robert; Stadler, Susanne

    2013-04-01

    Today's standard analytical methods for high-precision stable isotope analysis of fluids are gas-water equilibration and high-temperature pyrolysis coupled to isotope ratio mass spectrometers (IRMS). In recent years, relatively new laser-based instruments have entered the market that promise high-precision isotope data on nearly any medium. This optical technique is referred to as isotope ratio infrared spectroscopy (IRIS). The objective of this study is to evaluate the capability of this new instrument type for highly saline solutions and to compare the analytical results with traditional IRMS analysis. It has been shown for the equilibration method that the presence of salts influences the measured isotope values depending on the salt type and concentration (see Lécuyer et al., 2009; Martineau, 2012). This so-called 'isotope salt effect' arises because dissolved salts change the activity of water in the fluid and therefore shift the isotope ratios measured by the equilibration method; consequently, correction factors have to be applied to these analytical data. Direct conversion techniques like pyrolysis or the new laser instruments measure the water molecules of the sample directly and should therefore not suffer from the salt effect, i.e. no corrections of raw values are necessary. However, high salt concentrations might cause technical problems with the analytical hardware and may require labor-intensive sample preparation (e.g. vacuum distillation). This study evaluates the isotope salt effect for the IRMS equilibration technique (Thermo Gasbench II coupled to a Delta Plus XP) and a laser-based IRIS instrument with liquid injection (Picarro L2120-i). Synthetic salt solutions (NaCl, KCl, CaCl2, MgCl2, MgSO4, CaSO4) and natural brines collected from the Stassfurt Salt Anticline (Germany; Stadler et al., 2012) were analysed with both techniques. Salt concentrations ranged from seawater salinity

  9. Using stable isotope analysis to discriminate gasoline on the basis of its origin.

    Science.gov (United States)

    Heo, Su-Young; Shin, Woo-Jin; Lee, Sin-Woo; Bong, Yeon-Sik; Lee, Kwang-Sik

    2012-03-15

    Leakage of gasoline and diesel from underground tanks has led to a severe environmental problem in many countries. Tracing the production origin of gasoline and diesel is required to enable the development of dispute resolution and appropriate remediation strategies for the oil-contaminated sites. We investigated the bulk and compound-specific isotopic compositions of gasoline produced by four oil companies in South Korea: S-Oil, SK, GS and Hyundai. The relative abundance of several compounds in gasoline was determined by the peak height of the major ion (m/z 44). The δ(13)C(Bulk) and δD(Bulk) values of gasoline produced by S-Oil were significantly different from those of SK, GS and Hyundai. In particular, the compound-specific isotopic value (δ(13)C(CSIA)) of methyl tert-butyl ether (MTBE) in S-Oil gasoline was significantly lower than that of gasoline produced by other oil companies. The abundance of several compounds in gasoline, such as n-pentane, MTBE, n-hexane, toluene, ethylbenzene and o-xylene, differed widely among gasoline from different oil companies. This study shows that gasoline can be forensically discriminated according to the oil company responsible for its manufacture using stable isotope analysis combined with multivariate statistical analysis. Copyright © 2012 John Wiley & Sons, Ltd.
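
    The multivariate step that the authors combine with the isotope data could look roughly like the following scikit-learn sketch; the δ13C/δD feature values and refinery labels are invented placeholders, not the study's measurements.

        # Linear discriminant analysis on isotope features of gasoline samples.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Features per sample: [d13C_bulk, dD_bulk, d13C_MTBE], all in permil
        X = np.array([
            [-27.9, -95.0, -30.5], [-28.1, -96.0, -30.8],  # refinery A
            [-26.5, -80.0, -26.0], [-26.7, -81.5, -26.2],  # refinery B
            [-26.4, -78.0, -25.5], [-26.6, -79.0, -25.9],  # refinery C
        ])
        y = np.array(["A", "A", "B", "B", "C", "C"])

        clf = LinearDiscriminantAnalysis().fit(X, y)
        print(clf.predict([[-28.0, -95.5, -30.6]]))  # -> ['A']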

  10. High precision isotopic ratio analysis of volatile metal chelates

    International Nuclear Information System (INIS)

    Hachey, D.L.; Blais, J.C.; Klein, P.D.

    1980-01-01

    High precision isotope ratio measurements have been made for a series of volatile alkaline earth and transition metal chelates using conventional GC/MS instrumentation. Electron ionization was used for the alkaline earth chelates, whereas isobutane chemical ionization was used for the transition metal studies. Natural isotopic abundances were determined for a series of Mg, Ca, Cr, Fe, Ni, Cu, Cd, and Zn chelates. Absolute accuracy ranged between 0.01 and 1.19 at.%. Absolute precision ranged between ±0.01 and 0.27 at.% (RSD ±0.07-10.26%) for elements that contained as many as eight natural isotopes. Calibration curves were prepared using natural abundance metals and their enriched 50Cr, 60Ni, and 65Cu isotopes covering the range 0.1-1010.7 at.% excess. A separate multiple isotope calibration curve was similarly prepared using enriched 60Ni (0.02-2.15 at.% excess) and 62Ni (0.23-18.5 at.% excess). The samples were analyzed by GC/CI/MS. Human plasma, containing enriched 26Mg and 44Ca, was analyzed by EI/MS. 1 figure, 5 tables

  11. The conflict between cheetahs and humans on Namibian farmland elucidated by stable isotope diet analysis.

    Directory of Open Access Journals (Sweden)

    Christian C Voigt

    Large areas of Namibia are covered by farmland, which is also used by game and predator species. Because it can cause conflicts with farmers when predators, such as cheetahs (Acinonyx jubatus), hunt livestock, we assessed whether livestock constitutes a significant part of the cheetah diet by analysing the stable isotope composition of blood and tissue samples of cheetahs and their potential prey species. According to isotopic similarities, we defined three isotopic categories of potential prey: members of a C4 food web with high δ15N values (gemsbok, cattle, springhare and guinea fowl) and those with low δ15N values (hartebeest, warthog), and members of a C3 food web, namely browsers (eland, kudu, springbok, steenbok and scrub hare). We quantified the trophic discrimination of heavy isotopes in cheetah muscle in 9 captive individuals and measured an enrichment for 15N (3.2‰) but not for 13C in relation to food. We captured 53 free-ranging cheetahs, of which 23 were members of groups. Cheetahs of the same group were isotopically distinct from members of other groups, indicating that group members shared their prey. Solitary males (n = 21) and males in bachelor groups (n = 11) fed mostly on hartebeest and warthogs, followed by browsers in the case of solitary males and by grazers with high δ15N values in the case of bachelor groups. Female cheetahs (n = 9) predominantly fed on browsers and also preyed on hartebeest and warthogs. Mixing models suggested that the isotopic prey category that included cattle was only important, if at all, for males living in bachelor groups. Stable isotope analysis of fur, muscle, red blood cells and blood plasma in 9 free-ranging cheetahs identified most individuals as isotopic specialists, focussing on isotopically distinct prey categories as their food.

  12. The Conflict between Cheetahs and Humans on Namibian Farmland Elucidated by Stable Isotope Diet Analysis

    Science.gov (United States)

    Voigt, Christian C.; Thalwitzer, Susanne; Melzheimer, Jörg; Blanc, Anne-Sophie; Jago, Mark; Wachter, Bettina

    2014-01-01

    Large areas of Namibia are covered by farmland, which is also used by game and predator species. Because it can cause conflicts with farmers when predators, such as cheetahs (Acinonyx jubatus), hunt livestock, we assessed whether livestock constitutes a significant part of the cheetah diet by analysing the stable isotope composition of blood and tissue samples of cheetahs and their potential prey species. According to isotopic similarities, we defined three isotopic categories of potential prey: members of a C4 food web with high δ15N values (gemsbok, cattle, springhare and guinea fowl) and those with low δ15N values (hartebeest, warthog), and members of a C3 food web, namely browsers (eland, kudu, springbok, steenbok and scrub hare). We quantified the trophic discrimination of heavy isotopes in cheetah muscle in 9 captive individuals and measured an enrichment for 15N (3.2‰) but not for 13C in relation to food. We captured 53 free-ranging cheetahs, of which 23 were members of groups. Cheetahs of the same group were isotopically distinct from members of other groups, indicating that group members shared their prey. Solitary males (n = 21) and males in bachelor groups (n = 11) fed mostly on hartebeest and warthogs, followed by browsers in the case of solitary males and by grazers with high δ15N values in the case of bachelor groups. Female cheetahs (n = 9) predominantly fed on browsers and also preyed on hartebeest and warthogs. Mixing models suggested that the isotopic prey category that included cattle was only important, if at all, for males living in bachelor groups. Stable isotope analysis of fur, muscle, red blood cells and blood plasma in 9 free-ranging cheetahs identified most individuals as isotopic specialists, focussing on isotopically distinct prey categories as their food. PMID:25162403
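
    The Bayesian mixing models used in the two records above are too large for a short example, but the underlying mass-balance idea can be sketched as a two-source, one-isotope model. The prey δ15N values below are invented; the 3.2‰ trophic discrimination factor is the one reported above for cheetah muscle.

        # Two-source delta-15N mixing model with trophic discrimination correction.
        def two_source_fraction(delta_predator, delta_src1, delta_src2, tdf=3.2):
            """Returns the dietary proportion of source 1, clamped to [0, 1]."""
            corrected = delta_predator - tdf  # remove 15N trophic enrichment
            f1 = (corrected - delta_src2) / (delta_src1 - delta_src2)
            return min(max(f1, 0.0), 1.0)

        # Hypothetical d15N endmembers: high-d15N grazers vs. browsers
        print(round(two_source_fraction(12.0, delta_src1=10.5, delta_src2=7.0), 2))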

  13. Oxygen isotope analysis of phosphate: improved precision using TC/EA CF-IRMS.

    Science.gov (United States)

    LaPorte, D F; Holmden, C; Patterson, W P; Prokopiuk, T; Eglington, B M

    2009-06-01

    Oxygen isotope values of biogenic apatite have long demonstrated considerable promise for paleothermometry because of the abundance of material in the fossil record and the greater resistance of apatite to diagenesis compared with carbonate. Unfortunately, this promise has not been fully realized because of the relatively poor precision of isotopic measurements and the exceedingly small size of some substrates for analysis. Building on previous work, we demonstrate that it is possible to improve the precision of δ18O(PO4) measurements using a 'reverse-plumbed' thermal conversion elemental analyzer (TC/EA) coupled to a continuous flow isotope ratio mass spectrometer (CF-IRMS) via a helium stream. This modification to the flow of helium through the TC/EA, and careful location of the packing of glassy carbon fragments relative to the hot spot in the reactor, leads to narrower, more symmetrically distributed CO elution peaks with diminished tailing. In addition, we describe our apatite purification chemistry that uses nitric acid and cation exchange resin. The purification chemistry is optimized for processing small samples, minimizing isotopic fractionation of PO4(3-) and permitting Ca, Sr and Nd to be eluted and purified further for the measurement of δ44Ca and 87Sr/86Sr in modern biogenic apatite and 143Nd/144Nd in fossil apatite. Our methodology yields an external precision of ±0.15‰ (1σ) for δ18O(PO4). The uncertainty is related to the preparation of the Ag3PO4 salt, conversion to CO gas in a reverse-plumbed TC/EA, analysis of oxygen isotopes using a CF-IRMS, and uncertainty in constructing calibration lines that convert raw δ18O data to the VSMOW scale. Matrix matching of samples and standards for the purpose of calibration to the VSMOW scale was determined to be unnecessary. Our method requires only slightly modified equipment that is widely available. This fact, and the
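
    The final step mentioned above, converting raw δ18O data to the VSMOW scale with a calibration line, can be sketched as a two-point normalization; the standard values here are placeholders, not the laboratory's reference materials.

        # Two-point calibration line from two reference standards.
        def vsmow_calibration(raw_std1, true_std1, raw_std2, true_std2):
            slope = (true_std2 - true_std1) / (raw_std2 - raw_std1)
            intercept = true_std1 - slope * raw_std1
            return lambda raw: slope * raw + intercept

        to_vsmow = vsmow_calibration(raw_std1=5.1, true_std1=5.5,
                                     raw_std2=21.0, true_std2=21.7)
        print(round(to_vsmow(14.2), 2))  # raw sample value mapped to the VSMOW scale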

  14. Development of the Free-space Optical Communications Analysis Software (FOCAS)

    Science.gov (United States)

    Jeganathan, M.; Mecherle, G.; Lesh, J.

    1998-01-01

    The Free-space Optical Communications Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool for analyzing optical communications links.

  15. Isotope analysis in the transmission electron microscope.

    Science.gov (United States)

    Susi, Toma; Hofer, Christoph; Argentero, Giacomo; Leuthner, Gregor T; Pennycook, Timothy J; Mangler, Clemens; Meyer, Jannik C; Kotakoski, Jani

    2016-10-10

    The Ångström-sized probe of the scanning transmission electron microscope can visualize and collect spectra from single atoms. This can unambiguously resolve the chemical structure of materials, but not their isotopic composition. Here we differentiate between two isotopes of the same element by quantifying how likely the energetic imaging electrons are to eject atoms. First, we measure the displacement probability in graphene grown from either 12C or 13C and describe the process using a quantum mechanical model of lattice vibrations coupled with density functional theory simulations. We then test our spatial resolution in a mixed sample by ejecting individual atoms from nanoscale areas spanning an interface region that is far from atomically sharp, mapping the isotope concentration with a precision better than 20%. Although we use a scanning instrument, our method may be applicable to any atomic resolution transmission electron microscope and to other low-dimensional materials.
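
    A back-of-envelope sketch of the quantification idea: if the displacement rates of pure 12C and pure 13C graphene under the beam are known, the 13C fraction of a mixed region follows from its observed rate by linear interpolation. The rates are invented, and the paper's actual analysis rests on a quantum mechanical model of lattice vibrations rather than this simple mixing assumption.

        # Isotope fraction from per-dose atom displacement rates.
        def isotope_fraction_13c(rate_mixed, rate_12c, rate_13c):
            """Assumes the mixed-area rate is a linear blend of the pure rates;
            the heavier 13C is displaced less readily, so rate_13c < rate_12c."""
            return (rate_mixed - rate_12c) / (rate_13c - rate_12c)

        print(isotope_fraction_13c(rate_mixed=0.7, rate_12c=1.0, rate_13c=0.2))  # -> 0.375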

  16. Comparison of analysis methods for burnup credit applications

    International Nuclear Information System (INIS)

    Sanders, T.L.; Brady, M.C.; Renier, J.P.; Parks, C.V.

    1989-01-01

    The current approach used for the development and certification of spent fuel storage and transport casks requires an assumption of fresh fuel isotopics in the criticality safety analysis. However, it has been shown that there is a considerable reactivity reduction when isotopics representative of the depleted (or burned) fuel are used in the criticality analysis. Thus, by taking credit for the burned state of the fuel (i.e., burnup credit), a cask designer could achieve a significant increase in payload. Accurate prediction of keff for spent fuel arrays depends both on the criticality safety analysis and on the prediction of the spent fuel isotopics via a depletion analysis. Spent fuel isotopics can be obtained from detailed multidimensional reactor analyses, e.g. the code PDQ, or from point reactor burnup models. These reactor calculations help verify the adequacy of the isotopics and determine Δkeff biases for various analysis assumptions (with and without fission products, actinide absorbers, burnable poison rods, etc.). New software developed to interface PDQ multidimensional isotopics with KENO V.a reactor and cask models is described. Analyses similar to those performed for the reactor cases are carried out with a representative burnup credit cask model using the North Anna fuel. This paper presents the analysis methodology that has been developed for evaluating the physics issues associated with burnup credit. It is applicable to the validation and characterization of fuel isotopics as well as to determining the influence of various analysis assumptions in terms of Δkeff. The methodology is used in the calculation of reactor restart criticals and the analysis of a typical burnup credit cask

  17. Quantification of the carbonaceous matter origin in submicron marine aerosol particles by dual carbon isotope analysis

    Science.gov (United States)

    Ceburnis, D.; Garbaras, A.; Szidat, S.; Rinaldi, M.; Fahrni, S.; Perron, N.; Wacker, L.; Leinert, S.; Remeikis, V.; Facchini, M. C.; Prevot, A. S. H.; Jennings, S. G.; O'Dowd, C. D.

    2011-01-01

    Dual carbon isotope analysis has been performed for the first time, demonstrating its potential for apportioning organic matter between three principal sources, marine, terrestrial (non-fossil) and fossil fuel, on the basis of their unique isotopic signatures. The results presented here, utilising dual carbon isotope analysis, provide conclusive evidence of a dominant biogenic fraction in organic aerosol over biologically active oceans. In particular, the NE Atlantic, which is also subject to notable anthropogenic influences via pollution transport processes, was found to contain 80% organic aerosol matter of biogenic origin directly linked to plankton emissions. The remaining carbonaceous aerosol was of fossil-fuel origin. By contrast, for polluted air advecting out from Europe into the NE Atlantic, the source apportionment is 30% marine biogenic, 40% fossil fuel, and 30% continental non-fossil fuel. The dominant marine organic aerosol source in the atmosphere has significant implications for climate change feedback processes.
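
    The three-source apportionment described above can be sketched as a small linear system combining a radiocarbon balance (fraction modern, fM) with a δ13C balance; the endmember signatures and measured values below are illustrative assumptions, not the study's numbers.

        # Dual-carbon-isotope source apportionment as a 3x3 linear system.
        import numpy as np

        #              marine  terrestrial  fossil
        A = np.array([[  1.00,    1.00,      1.00],   # fractions sum to 1
                      [  1.05,    1.05,      0.00],   # fM of each endmember
                      [-21.00,  -27.00,    -27.50]])  # d13C of each endmember
        b = np.array([1.0, 0.84, -23.5])  # measured: total, fM, d13C of the aerosol

        f_marine, f_terr, f_fossil = np.linalg.solve(A, b)
        print(f"marine {f_marine:.2f}, terrestrial {f_terr:.2f}, fossil {f_fossil:.2f}")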

  18. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  19. Meta-DiSc: a software for meta-analysis of test accuracy data.

    Science.gov (United States)

    Zamora, Javier; Abraira, Victor; Muriel, Alfonso; Khan, Khalid; Coomarasamy, Arri

    2006-07-12

    Systematic reviews and meta-analyses of test accuracy studies are increasingly being recognised as central in guiding clinical practice. However, there is currently no dedicated and comprehensive software for meta-analysis of diagnostic data. In this article, we present Meta-DiSc, a Windows-based, user-friendly, freely available (for academic use) software that we have developed, piloted, and validated to perform diagnostic meta-analysis. Meta-DiSc a) allows exploration of heterogeneity, with a variety of statistics including chi-square, I-squared and Spearman correlation tests, b) implements meta-regression techniques to explore the relationships between study characteristics and accuracy estimates, c) performs statistical pooling of sensitivities, specificities, likelihood ratios and diagnostic odds ratios using fixed and random effects models, both overall and in subgroups and d) produces high quality figures, including forest plots and summary receiver operating characteristic curves that can be exported for use in manuscripts for publication. All computational algorithms have been validated through comparison with different statistical tools and published meta-analyses. Meta-DiSc has a Graphical User Interface with roll-down menus, dialog boxes, and online help facilities. Meta-DiSc is a comprehensive and dedicated test accuracy meta-analysis software. It has already been used and cited in several meta-analyses published in high-ranking journals. The software is publicly available at http://www.hrc.es/investigacion/metadisc_en.htm.
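
    As a flavor of one computation such software automates, here is a minimal fixed-effect pooling of sensitivities by inverse-variance weighting on the logit scale; the study counts are invented, and this is not a reproduction of Meta-DiSc's validated algorithms.

        # Fixed-effect pooled sensitivity on the logit scale.
        import math

        studies = [(45, 5), (90, 20), (30, 10)]  # (true positives, false negatives)

        num = den = 0.0
        for tp, fn in studies:
            sens = tp / (tp + fn)
            logit = math.log(sens / (1 - sens))
            var = 1 / tp + 1 / fn  # approximate variance of the logit
            num += logit / var     # inverse-variance weighting
            den += 1 / var

        pooled = 1 / (1 + math.exp(-num / den))
        print(f"pooled sensitivity: {pooled:.3f}")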

  20. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high-resolution AFM imaging. A high yield of DNA dolphins was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license, and the open architecture of the editor makes it easy for the scientific community to contribute new tools and functionalities. Documentation, tutorials and software will be made available online.

  1. The software safety analysis based on SFTA for reactor power regulating system in nuclear power plant

    International Nuclear Information System (INIS)

    Liu Zhaohui; Yang Xiaohua; Liao Longtao; Wu Zhiqiang

    2015-01-01

    The digitalized instrumentation and control (I and C) systems of nuclear power plants provide many advantages. However, digital control systems introduce failure modes that differ from those of analog control systems. While the cost effectiveness and flexibility of software are widely recognized, it is very difficult to achieve and prove high levels of dependability and safety assurance for the functions performed by process control software, because of the very flexibility and potential complexity of the software itself. Software safety analysis (SSA) is one way to improve software safety by identifying the system hazards caused by software failure. This paper describes the application of software fault tree analysis (SFTA) at the software design phase. First, we evaluated all the software modules of the reactor power regulating system in a nuclear power plant and identified various hazards. SFTA was then applied to critical modules selected in the previous step. Finally, we identified new hazards that had not been found during the earlier document evaluation, which proved helpful for our design. (author)

  2. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  3. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N.; and others

    1997-08-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  4. Isotopic abundance in atom trap trace analysis

    Science.gov (United States)

    Lu, Zheng-Tian; Hu, Shiu-Ming; Jiang, Wei; Mueller, Peter

    2014-03-18

    A method and system for detecting the ratios and amounts of noble gas isotopes. The method and system are designed to measure noble gas isotopes in water and ice, which helps reveal the geological age of the samples and trace their movements. The system combines a cooled discharge source, a beam collimator, a beam slower and a magneto-optic trap with a laser that applies resonance-frequency energy to the noble gas atoms to be quenched and detected.

  5. Isotopic biases for actinide-only burnup credit

    International Nuclear Information System (INIS)

    Rahimi, M.; Lancaster, D.; Hoeffer, B.; Nichols, M.

    1997-01-01

    The primary purpose of this paper is to present the new methodology for establishing bias and uncertainty associated with isotopic prediction in spent fuel assemblies for burnup credit analysis. The analysis applies to the design of criticality control systems for spent fuel casks. A total of 54 spent fuel samples were modeled and analyzed using the Shielding Analyses Sequence (SAS2H). Multiple regression analysis and a trending test were performed to develop isotopic correction factors for 10 actinide burnup credit isotopes. 5 refs., 1 tab
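
    A minimal sketch of how burnup-dependent isotopic correction factors of this kind can be fitted and applied; the benchmark values are invented and the plain linear regression is only a stand-in for the paper's multiple regression and trending analysis.

        # Fit measured/calculated isotopic ratios against burnup, then correct.
        import numpy as np

        burnup = np.array([10.0, 20.0, 30.0, 40.0])    # GWd/MTU
        ratio_mc = np.array([0.97, 0.95, 0.94, 0.92])  # measured / calculated

        slope, intercept = np.polyfit(burnup, ratio_mc, 1)

        def corrected_concentration(calculated, bu):
            """Scale a depletion-code prediction by the fitted correction factor."""
            return calculated * (slope * bu + intercept)

        print(round(corrected_concentration(calculated=1.0, bu=25.0), 3))  # -> 0.945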

  6. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  7. Diode laser based resonance ionization mass spectrometry for spectroscopy and trace analysis of uranium isotopes

    International Nuclear Information System (INIS)

    Hakimi, Amin

    2013-01-01

    In this doctoral thesis, the upgrade and optimization of a diode laser system for high-resolution resonance ionization mass spectrometry is described. A frequency-control system based on a double-interferometric approach was optimized, allowing absolute stabilization down to 1 MHz as well as frequency detunings of several GHz within a second for up to three lasers in parallel. This laser system was used for spectroscopic studies on uranium isotopes, yielding precise and unambiguous level energies, total angular momenta, hyperfine constants and isotope shifts. Furthermore, an efficient excitation scheme that can be operated with commercial diode lasers was developed. The performance of the complete laser mass spectrometer was optimized and characterized for the ultra-trace analysis of the uranium isotope 236U, which serves as a neutron flux dosimeter and as a tracer for anthropogenic radioactive contamination in the environment. Using synthetic samples, an isotope selectivity of 236U/238U = 4.5(1.5) × 10^-9 was demonstrated.

  8. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different from the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest be known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  9. A novel high-temperature combustion based system for stable isotope analysis of dissolved organic carbon in aqueous samples: I. Development and validation.

    NARCIS (Netherlands)

    Federherr, E.; Cerli, C.; Kirkels, F. M. S. A.; Kalbitz, K.; Kupka, H. J.; Dunsbach, R.; Lange, L.; Schmidt, T. C.

    2014-01-01

    RATIONALE: Traditionally, dissolved organic carbon (DOC) stable isotope analysis (SIA) is performed using either offline sample preparation followed by elemental analyzer/isotope ratio mass spectrometry (EA/IRMS) or a wet chemical oxidation (WCO)-based device coupled to an isotope ratio mass

  10. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    Background: [...] organization- and application stove-pipes that inhibit the adoption of telemedical solutions. To which extent can a software ecosystem approach to telemedicine alleviate this? Objective: In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements [...]. Method: For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and, for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results: We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization [...]; we also report experience in creating and evolving the 4S telemedicine ecosystem. Conclusion: The concept of software ecosystem architecture can be used analytically and constructively in, respectively, the analysis and design of software ecosystems.

  11. On structural identifiability analysis of the cascaded linear dynamic systems in isotopically non-stationary 13C labelling experiments.

    Science.gov (United States)

    Lin, Weilu; Wang, Zejian; Huang, Mingzhi; Zhuang, Yingping; Zhang, Siliang

    2018-06-01

    The isotopically non-stationary 13C labelling experiment, an emerging experimental technique, can estimate the intracellular fluxes of a cell culture during an isotopic transient period. However, to the best of our knowledge, the structural identifiability analysis of non-stationary isotope experiments is not well addressed in the literature. In this work, a local structural identifiability analysis for the non-stationary cumomer balance equations is conducted based on the Taylor series approach. The numerical rank of the Jacobian matrix of the finite extended time derivatives of the measured fractions with respect to the free parameters is taken as the criterion. It turns out that a single time point suffices for the structural identifiability analysis of the cascaded linear dynamic system of non-stationary isotope experiments. The equivalence between the local structural identifiability of the cascaded linear dynamic systems and the local optimum condition of the nonlinear least squares problem is elucidated in this work. Optimal measurement sets can then be determined for the metabolic network. Two simulated metabolic networks are adopted to demonstrate the utility of the proposed method. Copyright © 2018 Elsevier Inc. All rights reserved.
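
    A sketch of the numerical-rank criterion described above, applied to a deliberately non-identifiable toy output rather than the cumomer balance equations; the analytic time derivatives, step sizes and tolerance are illustrative choices.

        # Rank test: Jacobian of extended time derivatives w.r.t. parameters.
        import numpy as np

        def output_derivs(p, t=1.0):
            """d^k y/dt^k (k = 0, 1, 2) at one time point for the toy output
            y(t) = exp(-p0*p1*t); in the real case these come from the model."""
            a = p[0] * p[1]
            y = np.exp(-a * t)
            return np.array([y, -a * y, a**2 * y])

        def jacobian_rank(p, eps=1e-6, tol=1e-8):
            cols = [(output_derivs(p + eps * e) - output_derivs(p - eps * e)) / (2 * eps)
                    for e in np.eye(len(p))]  # central differences per parameter
            return np.linalg.matrix_rank(np.column_stack(cols), tol=tol)

        # p0 and p1 enter only through their product, so only one combination
        # is locally identifiable: rank 1 < 2 free parameters.
        print(jacobian_rank(np.array([0.5, 2.0])))  # -> 1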

  12. A coverage and slicing dependencies analysis for seeking software security defects.

    Science.gov (United States)

    He, Hui; Zhang, Dongyan; Liu, Min; Zhang, Weizhe; Gao, Dongmin

    2014-01-01

    Software security defects have a serious impact on software quality and reliability, and security flaws are a major hidden danger to the operation of a system. As the scale of software increases, its vulnerabilities become much more difficult to find, and once they are exploited they may lead to great loss. In this situation, the concept of Software Assurance has been put forward, and automated fault localization techniques are one part of Software Assurance research. Current automated fault localization methods include coverage-based fault localization (CBFL) and program slicing, each with its own strengths and shortcomings. In this paper, we put forward a new method, the Reverse Data Dependence Analysis Model, which integrates the two methods by analyzing the program structure, and on this basis we propose a new automated fault localization method. The method not only preserves full automation but also narrows the basic localization unit to a single statement, which makes localization more accurate. Through several experiments, we show that our method is more effective, and we analyze the effectiveness of the existing methods on different faults.
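
    For the coverage-based half (CBFL) that the paper builds on, a standard suspiciousness metric such as Ochiai fits in a few lines; the toy coverage matrix is invented, and the paper's Reverse Data Dependence Analysis Model itself is not reproduced here.

        # Ochiai suspiciousness score per statement from pass/fail coverage.
        import math

        # coverage[test] = (set of executed statement ids, test passed?)
        coverage = [({1, 2, 3}, True), ({1, 3, 4}, False), ({1, 4}, False), ({2, 3}, True)]
        total_failed = sum(1 for _, passed in coverage if not passed)

        def ochiai(stmt):
            failed_cov = sum(1 for stmts, p in coverage if stmt in stmts and not p)
            passed_cov = sum(1 for stmts, p in coverage if stmt in stmts and p)
            denom = math.sqrt(total_failed * (failed_cov + passed_cov))
            return failed_cov / denom if denom else 0.0

        for s in sorted({s for stmts, _ in coverage for s in stmts}):
            print(s, round(ochiai(s), 3))  # statement 4 ranks most suspicious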

  13. Stable isotope analysis of the human body: what isotopes in our tissue can reveal, and what not

    Energy Technology Data Exchange (ETDEWEB)

    Goerger, Marlene [Technische Univ. Darmstadt (Germany)]

    2016-07-01

    Most isotopes in the natural environment are stable, but there are also radioactive isotopes. Primordial radionuclides are nuclides that have existed since the formation of the Earth's crust. Cosmogenic radionuclides are generated by cosmic radiation (protons, electrons, ionized atoms), for instance C-14. Radiogenic nuclides are daughter products of radioactive nuclei. Anthropogenic radionuclides are generated by human activities. Deviations from a 'normal' isotope distribution are used for environmental impact analysis and forensic purposes. The human provenance project was stopped.

  14. Analysis of carbon stable isotope to determine the origin and migration of gaseous hydrocarbon in the Brazilian sedimentary basins

    International Nuclear Information System (INIS)

    Takaki, T.; Rodrigues, R.

    1986-01-01

    The carbon isotopic composition of natural gases is analysed to determine the origin and migration of gaseous hydrocarbons in Brazilian sedimentary basins. The carbon isotopic ratio of methane from natural gases depends on the process of gas formation and the stage of organic matter maturation. In geochemical surface exploration, biogenic gases are differentiated from thermogenic gases because the latter are isotopically heavier. As the isotopic composition of methane does not change during migration, gases that migrated from deeper and more mature source rocks are identified by their relative 13C enrichment. The methane was separated by chromatography and the isotopic analysis was done with a mass spectrometer. (M.C.K.)
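
    The interpretive rule described above can be caricatured as a simple threshold classifier on methane δ13C; the cutoffs are rough values commonly quoted in the literature, not thresholds derived in this study.

        # Rough methane-origin classification from d13C (permil, VPDB).
        def classify_methane(d13c):
            if d13c < -60:
                return "biogenic"       # strongly 13C-depleted
            if d13c > -50:
                return "thermogenic"    # 13C-enriched, more mature source
            return "mixed/indeterminate"

        for v in (-72.0, -55.0, -42.0):
            print(v, "->", classify_methane(v))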

  15. Use of azeotropic distillation for isotopic analysis of deuterium in soil water and saturated saline solutions

    International Nuclear Information System (INIS)

    Santos, Antonio Vieira dos.

    1995-05-01

    The azeotropic distillation technique was adapted to extract water from soil and from saturated saline solutions similar to sea water for the isotopic determination of deuterium (D). A soil test was used to assess the precision and reliability of the methodology for extracting soil water for stable isotope analysis, comparing azeotropic distillation with the traditional method of heating under vacuum. The methodology has proved very useful for several kinds of soil and saturated saline solutions. The apparatus has no memory effect, and the chemical reagents do not affect the isotopic composition of the extracted water. (author). 43 refs., 10 figs., 12 tabs

  16. Automatic measurement system for light element isotope analysis

    International Nuclear Information System (INIS)

    Satake, Hiroshi; Ikegami, Kouichi.

    1990-01-01

    An automatic measurement system for light element isotope analysis was developed by installing a specially designed, computer-controlled inlet system. The microcomputer system contains specific interface boards for the inlet system and the mass spectrometer, a Micromass 602E. All components of the inlet and computer system are easily available in Japan. A maximum of ten samples can be measured automatically. About 160 minutes are required for 10 measurements of δ18O values of CO2; thus about four samples can be measured per hour with this system, compared with roughly three per hour by manual operation. The automated analysis system clearly has an advantage over the conventional method. This paper describes the details of the automated system, including the apparatus used, the control procedure and the corrections applied for reliable measurement. (author)

  17. A Simple Plasma Retinol Isotope Ratio Method for Estimating β-Carotene Relative Bioefficacy in Humans: Validation with the Use of Model-Based Compartmental Analysis.

    Science.gov (United States)

    Ford, Jennifer Lynn; Green, Joanne Balmer; Lietz, Georg; Oxley, Anthony; Green, Michael H

    2017-09-01

    Background: Provitamin A carotenoids are an important source of dietary vitamin A for many populations. Thus, accurate and simple methods for estimating carotenoid bioefficacy are needed to evaluate the vitamin A value of test solutions and plant sources. β-Carotene bioefficacy is often estimated from the ratio of the areas under plasma isotope response curves after subjects ingest labeled β-carotene and a labeled retinyl acetate reference dose [isotope reference method (IRM)], but to our knowledge, the method has not yet been evaluated for accuracy. Objectives: Our objectives were to develop and test a physiologically based compartmental model that includes both absorptive and postabsorptive β-carotene bioconversion and to use the model to evaluate the accuracy of the IRM and a simple plasma retinol isotope ratio [RIR; labeled β-carotene-derived retinol/labeled reference-dose-derived retinol in one plasma sample] for estimating relative bioefficacy. Methods: We used model-based compartmental analysis (Simulation, Analysis and Modeling software) to develop and apply a model that provided known values for β-carotene bioefficacy. Theoretical data for 10 subjects were generated by the model and used to determine bioefficacy by RIR and IRM; predictions were compared with known values. We also applied RIR and IRM to previously published data. Results: Plasma RIR accurately predicted β-carotene relative bioefficacy at 14 d or later. IRM also accurately predicted bioefficacy by 14 d, except that, when there was substantial postabsorptive bioconversion, IRM underestimated bioefficacy. Based on our model, 1-d predictions of relative bioefficacy include absorptive plus a portion of early postabsorptive conversion. Conclusion: The plasma RIR is a simple tracer method that accurately predicts β-carotene relative bioefficacy based on analysis of one blood sample obtained at ≥14 d after co-ingestion of labeled β-carotene and retinyl acetate. The method also provides
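
    A toy comparison of the two estimators on invented tracer data: the IRM as a ratio of areas under the plasma isotope response curves (trapezoidal integration) and the RIR as a single-sample ratio at 14 d.

        # IRM (AUC ratio) vs. RIR (one sample at day 14) on made-up data.
        import numpy as np

        days = np.array([1.0, 3, 7, 14, 21, 28])
        carotene = np.array([0.8, 1.4, 1.2, 0.9, 0.7, 0.5])   # labeled beta-carotene-derived retinol
        reference = np.array([1.6, 2.4, 2.0, 1.5, 1.2, 0.9])  # labeled reference-dose-derived retinol

        def auc(y, x):
            """Trapezoidal area under the curve."""
            return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

        irm = auc(carotene, days) / auc(reference, days)
        rir_14d = carotene[3] / reference[3]  # day-14 sample
        print(f"IRM: {irm:.2f}  RIR (14 d): {rir_14d:.2f}")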

  18. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: mario@ipen.br; mo.menezes@gmail.com

    2005-07-01

    Free/Open Source Software (FOSS) has been used in science since long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed-source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to demonstrate the high quality and productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed-source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of easy programming. (author)

  19. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    International Nuclear Information System (INIS)

    Menezes, Mario Olimpio de

    2005-01-01

    Free/Open Source Software (FOSS) has been used in science since long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed-source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to demonstrate the high quality and productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed-source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of easy programming. (author)

  20. Sample preparation techniques of biological material for isotope analysis

    International Nuclear Information System (INIS)

    Axmann, H.; Sebastianelli, A.; Arrillaga, J.L.

    1990-01-01

    Sample preparation is an essential step in all isotope-aided experiments but is often not given enough attention. The methods of sample preparation are very important for obtaining reliable and precise analytical data and for the further interpretation of results. The size of a sample required for chemical analysis is usually very small (10-1500 mg). On the other hand, the amount of plant material harvested from plots in a field experiment is often bulky (several kilograms), and the entire sample is too large for processing. In addition, while approaching maturity many crops show not only differences in physical consistency but also non-uniformity in 15N content among plant parts, requiring fractionation or separation into parts (vegetative and reproductive), e.g. shoots and spikes for small grain cereals, shoots and pods for grain legumes, and tops and roots or beets (including crown) for sugar beet. In any case, the ultimate goal of these procedures is to obtain representative subsamples from greenhouse or field experiments for chemical analysis. Before harvesting an isotope-aided experiment, the method of sampling has to be selected. It should be based on the type of information required in relation to the objectives of the research and on the availability of resources (staff, sample preparation equipment, analytical facilities, chemicals and supplies, etc.). 10 refs, 3 figs, 3 tabs